CN113159103B - Image matching method, device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113159103B
CN113159103B (application CN202110209671.9A)
Authority
CN
China
Prior art keywords
template
contour
edge
image
searched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110209671.9A
Other languages
Chinese (zh)
Other versions
CN113159103A (en)
Inventor
张翔
刘吉刚
王月
王升
孙仲旭
章登极
吴丰礼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Topstar Technology Co Ltd
Original Assignee
Guangdong Topstar Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Topstar Technology Co Ltd filed Critical Guangdong Topstar Technology Co Ltd
Priority to CN202110209671.9A priority Critical patent/CN113159103B/en
Publication of CN113159103A publication Critical patent/CN113159103A/en
Application granted granted Critical
Publication of CN113159103B publication Critical patent/CN113159103B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/752Contour matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/02Affine transformations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform


Abstract

The embodiment of the invention discloses an image matching method, an image matching device, electronic equipment and a storage medium. The method extracts template edges through pyramid layering and the creation of variable-scale, multi-angle template images, and builds a multi-layer, variable-scale, multi-angle set of template edge contour points, so that the template edge contour of the template image contains scale information. This solves the problems of missing scale information and lack of scale invariance in image shape matching, and improves matching accuracy. Meanwhile, because a data structure containing scale features in the template edge contour information is complex to construct and time-consuming to match, the image to be searched is layered according to the pyramid layering of the template image, and a coarse-to-fine similarity matching strategy is applied at each layer. This reduces the amount of data to be matched at the lower layers and improves the speed of shape edge matching under multi-scale features while ensuring matching precision.

Description

Image matching method, device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to an image matching method, an image matching device, electronic equipment and a storage medium.
Background
With the development of computer vision, image shape matching is applied in a growing number of fields, such as object recognition, optical character recognition (OCR), image retrieval, medical image analysis, and robot navigation.
At present, shape matching methods fall into two main categories: one computes the difference values of image invariants under various transformations; the other minimizes the matching error by finding local correspondences between the image to be searched and the template image. The first category suits global feature description but loses some important shape information, such as scale features, so its matching precision is low. The second tightly combines the global shape with local shape and is robust to image translation, rotation, scale change and slight geometric deformation, but its computational complexity is high, which reduces image matching efficiency.
Disclosure of Invention
The embodiment of the invention provides an image matching method, an image matching device, electronic equipment and a storage medium, which are used for solving the problems of low matching precision and low matching efficiency in the shape matching process.
In a first aspect, an embodiment of the present invention provides an image matching method, where the method includes:
Determining a template edge contour information set of a template image; the template edge contour information set characterizes the template edge contour of the multi-scale multi-rotation angles under different pyramid layering extracted from the template image;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents the edge contour to be searched under different pyramid layering extracted from the image to be searched according to the pyramid layering number of the template image;
aiming at the pyramid hierarchical structure from top to bottom, based on the contour matching search information of the current hierarchy, carrying out similarity matching on the contour of the template edge of the current hierarchy on the contour of the edge to be searched of the current hierarchy to obtain a contour matching result of the current hierarchy;
determining, according to the contour matching result of the current layer, the contour matching search information to be used when the template edge contour of the next layer is similarity-matched against the edge contour to be searched of the next layer, and jumping to the next layer to perform that similarity matching, until the bottom layer of the pyramid is reached;
identifying a target shape indicated by a template image in the image to be searched according to an outline matching result when the pyramid bottom layer similarity matching is finished;
The profile matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the profile of the edge of the template is matched in similarity; the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
In a second aspect, an embodiment of the present invention further provides an image matching apparatus, where the apparatus includes:
the template information determining module is used for determining a template edge contour information set of the template image; the template edge contour information set characterizes the template edge contour of the multi-scale multi-rotation angles under different pyramid layering extracted from the template image;
the to-be-searched information determining module is used for determining an edge contour information set to be searched of the to-be-searched image; the edge contour information set to be searched represents the edge contour to be searched under different pyramid layering extracted from the image to be searched according to the pyramid layering number of the template image;
the contour similarity matching module is used for matching the contour of the template edge of the current layering on the contour of the edge to be searched of the current layering according to the contour matching search information of the current layering aiming at the pyramid layering structure from top to bottom, so as to obtain a contour matching result of the current layering;
the contour lower-layer matching mapping module is used for determining, according to the contour matching result of the current layer, the contour matching search information to be used when the template edge contour of the next layer is similarity-matched against the edge contour to be searched of the next layer, and for jumping to the next layer to perform that similarity matching, until the bottom layer of the pyramid is reached;
the image recognition module to be searched is used for recognizing the target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom layer similarity matching is finished;
the profile matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the profile of the edge of the template is matched in similarity; the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processing devices;
a storage means for storing one or more programs;
when the one or more programs are executed by the one or more processing devices, the one or more processing devices implement the image matching method according to any one of the embodiments of the present invention.
In a fourth aspect, there is also provided in an embodiment of the present application a computer-readable storage medium having stored thereon a computer program which, when executed by a processing device, implements the image matching method according to any one of the embodiments of the present application.
According to the image matching method provided by the embodiment of the application, the template image is pyramid-layered and variable-scale, multi-angle template images are created to extract template edges, building a multi-layer, variable-scale, multi-angle set of template edge contour points. The template edge contour of the template image thus contains scale information, which solves the problems of missing scale information and lack of scale invariance in image shape matching and improves matching accuracy. Meanwhile, considering that a data structure for template edge contour matching that contains scale features is complex to construct and time-consuming to match, the image to be searched is layered according to the pyramid layering of the template image, and a coarse-to-fine similarity matching performance strategy is applied at each layer of the image pyramid. This reduces the amount of matching data at the lower layers, lowers algorithm complexity, and improves the speed of shape edge matching under multi-scale features while ensuring matching precision.
The foregoing summary is merely an overview of the technical solutions of the present invention. To enable a clearer understanding of the technical means of the invention so that it can be implemented in accordance with the description, and to make the above and other objects, features and advantages of the invention more readily apparent, specific embodiments of the invention are set forth below.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of an image matching method provided in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a template edge profile information set including image scale features provided in an embodiment of the present invention;
FIG. 3 is a general flow diagram of image matching provided in an embodiment of the present invention;
FIG. 4 is a flow chart of creating a template edge contour information set provided in an embodiment of the present invention;
FIG. 5 is a schematic diagram of edge extraction of a template image provided in an embodiment of the present invention;
FIG. 6 is a schematic diagram of a differential processing of the direction of a template image provided in an embodiment of the present invention;
FIG. 7 is a schematic illustration of a gradient magnitude image of a template image provided in an embodiment of the present invention;
FIG. 8 is a schematic diagram of image gradient directions of a template image provided in an embodiment of the present invention;
FIG. 9 is a schematic illustration of non-maximum suppression of a template image provided in an embodiment of the present invention;
FIG. 10 is a schematic diagram of an adaptive hysteresis thresholding provided in an embodiment of the present invention;
FIG. 11 is a schematic illustration of the effect of adaptive hysteresis thresholding on a template image provided in an embodiment of the present invention;
FIG. 12 is a schematic flow chart of pyramid adaptive layering of a template edge contour image provided in an embodiment of the present invention;
FIG. 13 is a flow chart of the construction of an edge profile information set to be searched provided in an embodiment of the present invention;
FIG. 14 is a flow chart of another image matching method provided in an embodiment of the present invention;
FIG. 15 is a schematic diagram of sliding a template edge contour over a corresponding edge contour to be searched according to an embodiment of the present invention;
FIG. 16 is a schematic diagram of similarity matching of template edge contours to be searched in each pyramid hierarchy provided in an embodiment of the present invention;
FIG. 17 is a schematic diagram of mapping profile matching search information from an upper layer to a lower layer of a pyramid according to an embodiment of the present invention;
FIG. 18 is a schematic diagram of a comparison of front and back acceleration for similarity matching using a performance enhancement strategy according to an embodiment of the present invention;
fig. 19 is a block diagram of an image matching apparatus provided in an embodiment of the present invention;
fig. 20 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations (or steps) can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Fig. 1 is a flowchart of an image matching method provided in an embodiment of the present application. The technical solution of the embodiment is applicable to the situation of matching the shapes between the images, and the method can be executed by an image matching device which can be realized in a software and/or hardware mode and is integrated on any electronic equipment with a network communication function. As shown in fig. 1, the image matching method in the embodiment of the present application may include the following steps:
s110, determining a template edge contour information set of a template image; the template edge profile information set characterizes the multi-scale multi-rotation angle template edge profile under different pyramid hierarchies extracted from the template image.
Image shape matching usually measures the similarity between shapes according to some criterion, and the shape matching result of two images can be represented by a numerical value called the shape similarity. The larger the value of the shape similarity, the more similar the shapes of the two images; the smaller the value, the less similar they are. The similarity reported for shape matching is the result corresponding to the maximum value found during the matching process.
Referring to fig. 2, a template image including a target shape may be subjected to pyramid layering to obtain a plurality of layered template images, and template edge contour features of the template image of each layer are described from a multi-scale multi-rotation angle after pyramid layering, so as to create a template edge contour information set of the template image. Therefore, the template edge contour features with multiple layers, variable scales and multiple angles can be constructed, so that the template edge contour contains scale information, and the problem of scale invariance in the image matching process is solved.
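The multi-layer, variable-scale, multi-angle template construction described above can be sketched as follows. This is a simplified illustration in plain NumPy: the template edge contour is represented as an N×2 array of (x, y) points whose coordinates halve at each pyramid layer, and each entry of the set is the contour rotated and scaled about its center of gravity. All function and parameter names are illustrative, not taken from the patent.

```python
import numpy as np

def transform_contour(points, scale, angle_deg):
    """Rotate and scale contour points (N x 2, (x, y)) about their center of gravity."""
    center = points.mean(axis=0)
    a = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    return (points - center) @ (scale * rot.T) + center

def build_template_set(contour, num_levels, scales, angles_deg):
    """Template edge contour information set: one point set per (level, scale, angle)."""
    template_set = {}
    level_contour = np.asarray(contour, float)
    for level in range(num_levels):
        for s in scales:
            for ang in angles_deg:
                template_set[(level, s, ang)] = transform_contour(level_contour, s, ang)
        level_contour = level_contour / 2.0  # coordinates shrink with each pyramid level
    return template_set

square = np.array([[0, 0], [4, 0], [4, 4], [0, 4]], float)
tset = build_template_set(square, num_levels=3, scales=[0.9, 1.0, 1.1], angles_deg=[0, 90])
print(len(tset))  # 3 levels x 3 scales x 2 angles = 18 entries
```

In a real system the contour points at each layer would be re-extracted from a properly smoothed and downsampled image rather than merely rescaled, but the structure of the set (keyed by layer, scale and rotation angle) is the point being illustrated.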
S120, determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched characterizes edge contours to be searched under different pyramid layering extracted from the images to be searched according to the pyramid layering number of the template images.
Because the image to be searched is required to be matched with the template edge contour information of the template image, after the image to be searched is acquired, pyramid layering can be carried out on the image to be searched according to the pyramid layering number of the template image similar to the processing of the template image, so that a plurality of layered images to be searched are obtained. Furthermore, the corresponding edge contour points to be searched can be extracted from each layered image to be searched, the edge contour to be searched of each layer is constructed, and the same-layer matching of the template image and the image to be searched is realized by using the edge contour.
S130, aiming at the pyramid hierarchical structure from top to bottom, based on the contour matching search information of the current hierarchy, performing similarity matching on the template edge contour of the current hierarchy on the edge contour to be searched of the current hierarchy to obtain a contour matching result of the current hierarchy.
The contour matching search information comprises the region to be matched, the scale range to be matched and the angle range to be matched when the template edge contours undergo similarity matching. The region to be matched limits the search positions at which the template edge contour is compared against the same-layer edge contour to be searched, avoiding wasted similarity matching in regions outside it. Likewise, the scale range to be matched restricts which scales of the template edge contour are tried, and the angle range to be matched restricts which rotation angles of the template edge contour are tried, avoiding wasted similarity matching at unsuitable scales or angles.
When the template edge contour is subjected to similarity matching with the edge contour to be searched, the similarity between the template edge contour and the overlapping contour region of the edge contour to be searched can be counted, the greater the similarity is, the more similar the template edge contour and the overlapping contour region of the edge contour to be searched are, and the similarity is 1 when the template edge contour and the overlapping contour region of the edge contour to be searched are completely matched. The contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched. For the template edge contours of the same hierarchy and the edge contours to be searched, similarity matching is respectively carried out on the same edge contour to be searched through the template edge contours with different scales and different rotation angles, a series of similarity measurement values between the contours can be obtained, and a position area, an angle and a scale corresponding to the maximum value of the similarity measurement values are taken as the contour matching result of the current hierarchy.
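As a hedged illustration of such a similarity measure: a common choice in shape-based matching (assumed here, the patent does not fix this exact formula) is the mean cosine of the gradient-direction differences over the overlapping contour points, which equals 1 only on a perfect match. The candidate with the maximum score then becomes the contour matching result of the current layer.

```python
import numpy as np

def contour_similarity(template_dirs, search_dirs):
    """Mean cosine of gradient-direction differences over overlapping contour
    points; equals 1.0 only when every direction pair agrees exactly."""
    diff = np.asarray(template_dirs) - np.asarray(search_dirs)
    return float(np.mean(np.cos(diff)))

def best_match(candidates):
    """Take the (position, scale, angle) with the maximum similarity as the
    contour matching result of the current pyramid layer."""
    return max(candidates, key=lambda c: c["score"])

cands = [
    {"pos": (10, 12), "scale": 1.0, "angle": 0.0,
     "score": contour_similarity([0.0, 0.5], [0.0, 0.5])},  # perfect agreement -> 1.0
    {"pos": (40, 7), "scale": 1.1, "angle": 15.0,
     "score": contour_similarity([0.0, 0.5], [0.7, 1.2])},  # offset directions -> < 1.0
]
print(best_match(cands)["pos"])  # (10, 12)
```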
And S140, determining contour matching search information used by the edge contour of the template of the next layer when the next layer performs similarity matching on the edge contour to be searched according to the contour matching result of the current layer, and skipping to the next layer for similarity matching until reaching the bottom layer of the pyramid.
Referring to fig. 3, when matching a template image against an image to be searched, an image pyramid is introduced to speed up contour matching between the template edge contour and the edge contour to be searched. In the top-to-bottom pyramid layering, layers closer to the pyramid bottom contain more edge contour feature detail in both the template edge contour and the edge contour to be searched, so similarity matching there takes longer. Therefore, following the top-to-bottom order of the pyramid, a coarse similarity match between the template edge contour and the edge contour to be searched is first performed at the upper pyramid layers, yielding a rough contour matching result.
Referring to fig. 3, after the contour matching result is obtained at the upper layer of the pyramid, new contour matching search information can be obtained by mapping to the lower layer of the pyramid based on the contour matching result obtained at the upper layer of the pyramid, so that when similarity matching is carried out between the template edge contour and the edge contour to be searched at the lower layer of the pyramid, the region to be matched, the scale range to be matched and the angle range to be matched at the lower layer of the pyramid are effectively determined, and the similarity matching of the position, the useless angle and the useless scale at the lower layer of the pyramid is greatly reduced, thereby accelerating the similarity matching speed at the lower layer of the pyramid.
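The upper-to-lower mapping can be sketched as follows. This is a minimal illustration: the center-of-gravity position doubles when descending one pyramid layer, and only a small neighborhood of positions, scales and angles is kept as the lower layer's contour matching search information. The margin and step values are assumptions, not taken from the patent.

```python
def map_to_lower_level(result, pos_margin=2, scale_step=0.05, angle_step=5.0):
    """Map an upper-layer contour matching result to lower-layer contour
    matching search information (region, scale range, angle range)."""
    (cx, cy), s, a = result["pos"], result["scale"], result["angle"]
    return {
        "region": ((2 * cx - pos_margin, 2 * cy - pos_margin),
                   (2 * cx + pos_margin, 2 * cy + pos_margin)),
        "scale_range": (s - scale_step, s + scale_step),
        "angle_range": (a - angle_step, a + angle_step),
    }

info = map_to_lower_level({"pos": (10, 12), "scale": 1.0, "angle": 15.0})
print(info["region"])  # ((18, 22), (22, 26))
```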
And S150, identifying the target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom layer similarity matching is finished.
Referring to fig. 3, according to the pyramid layering from top to bottom, the contour matching search information used when the edge contour of the pyramid bottom template performs similarity matching on the edge contour to be searched can be determined by using the contour matching result obtained by performing similarity matching on the edge contour to be searched on the template edge contour determined on the second layer of the pyramid through downward mapping. And then, similarity matching can be carried out on the template edge contour of the pyramid bottom layer on the edge contour to be searched corresponding to the pyramid bottom layer, so that a contour matching result is obtained when the pyramid bottom layer similarity matching is finished.
The contour matching result can comprise the position of the center of gravity of the contour, the scale and the rotation angle of the contour of the template edge of the pyramid bottom layer when the contour of the edge to be searched of the pyramid bottom layer completes similarity matching. And drawing a target shape indicated by the template image in the image to be searched based on the center position of the contour indicated by the contour matching result at the end of the pyramid bottom layer similarity matching, the scale and the rotation angle of the used template edge contour, and realizing shape matching between images.
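Placing the target shape into the image to be searched from the bottom-layer result amounts to a similarity transform of the template contour: scale and rotate it, then translate it to the matched center-of-gravity position. A sketch with illustrative names follows (a simplified transform, not the patent's exact drawing procedure):

```python
import numpy as np

def project_template_contour(points, scale, angle_deg, center):
    """Place the template edge contour into the image to be searched using the
    bottom-layer matching result (center-of-gravity position, scale, angle)."""
    pts = np.asarray(points, float)
    pts = pts - pts.mean(axis=0)  # move contour to its own center of gravity
    a = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    return pts @ (scale * rot.T) + np.asarray(center, float)

shape = project_template_contour([[0, 0], [2, 0], [2, 2], [0, 2]],
                                 scale=2.0, angle_deg=0.0, center=(50.0, 60.0))
print(shape.mean(axis=0))  # the drawn contour is centered at (50, 60)
```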
According to the image matching method provided by the embodiment of the application, the template image is pyramid-layered and variable-scale, multi-angle template images are created to extract template edges, building a multi-layer, variable-scale, multi-angle set of template edge contour points, so that the template edge contour of the template image contains scale information. This solves the problems of missing scale information and lack of scale invariance in image shape matching and improves matching accuracy. Meanwhile, considering that the data structure for template edge contour matching containing scale features is complex and time-consuming to match, the image to be searched is layered according to the pyramid layering of the template image, and coarse-to-fine similarity matching is realized at each layer of the image pyramid. This performance improvement strategy greatly reduces the amount of matching data at the lower layers, lowers algorithm complexity, and improves the speed of shape edge matching under multi-scale features while ensuring matching precision.
Fig. 4 is a flowchart for creating a template edge profile information set according to an embodiment of the present application, where the technical solution of the present embodiment is further optimized based on the foregoing embodiment, and the technical solution of the present embodiment may be combined with each of the alternatives in one or more foregoing embodiments. As shown in fig. 4, the process of creating the template edge profile information set provided in the embodiment of the present application may include the following steps S410 to S430:
S410, pyramid self-adaptive layering is carried out on the template image to obtain a plurality of layered template images.
In an alternative of this embodiment, referring to fig. 5, pyramid adaptive layering is performed on a template image to obtain a plurality of layered template images, which may include the following steps A1-A4:
A1, non-maximum suppression processing is performed on the pixel points in the template image according to the gradient amplitude and the gradient direction of the template image, so as to obtain a non-maximum-suppressed template image.
Referring to fig. 3, after the template image is acquired, it is preprocessed using a separable Gaussian filter. The Gaussian filter is a linear filter that effectively suppresses noise and smooths the image: it generates a kernel from the Gaussian function and then convolves that kernel with the image to be processed. The two-dimensional Gaussian kernel function is:

G(x, y) = (1 / (2πσ²)) · e^(−(x² + y²) / (2σ²))

where x and y are the coordinates of a pixel in the template image; e is the natural constant, approximately equal to 2.71828; and σ is the standard deviation. The smaller σ is, the larger the center coefficient of the generated kernel and the smaller the surrounding coefficients, so the smoothing of the template image is weaker; conversely, the larger σ is, the smaller the differences between the kernel coefficients (approaching a mean filter), and the more obvious the smoothing effect on the template image. The preprocessing of the template image is: L(x, y) = G(x, y) ∗ I(x, y), where I(x, y) is the template image, G(x, y) is the generated Gaussian kernel, ∗ denotes convolution, and L(x, y) is the template image after Gaussian smoothing.
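The Gaussian kernel formula above can be turned into a small sketch (plain NumPy, with the kernel normalized so its coefficients sum to 1; the kernel size is an assumption for illustration). It also demonstrates the σ behavior described: a small σ concentrates weight at the center, a large σ approaches a mean filter.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """2-D Gaussian kernel G(x, y) = 1/(2*pi*sigma^2) * exp(-(x^2+y^2)/(2*sigma^2)),
    normalized so the coefficients sum to 1."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    return g / g.sum()

k_small = gaussian_kernel(5, 0.5)  # small sigma: weight concentrated at the center
k_large = gaussian_kernel(5, 3.0)  # large sigma: nearly uniform, close to a mean filter
print(k_small[2, 2] > k_large[2, 2])  # True
```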
Referring to fig. 3 and 6, after preprocessing the acquired template image with the separable Gaussian filter, image direction differencing can be performed on the preprocessed template image to obtain difference images of the template image in the x-direction and the y-direction. For example, Sobel edge detection is essentially first-order filtering; because the first derivative is sensitive to lines and noise, the image must be smoothed before Sobel edge detection to reduce the influence of noise on edge detection. The Sobel operator contains two 3×3 matrices, the convolution kernels for the x-direction and the y-direction respectively. Convolving the Gaussian-smoothed template image with the Sobel operator yields the difference images of the template image in the x-direction and the y-direction.
Referring to fig. 3 and 7, the difference values at pixel coordinates (x, y) can be extracted from the x-direction and y-direction difference images of the template image, and the gradient magnitude F(x, y) is obtained as the square root of the sum of their squares: F(x, y) = sqrt(G_x(x, y)² + G_y(x, y)²), where G_x(x, y) and G_y(x, y) are the difference values at pixel coordinates (x, y) in the x-direction and y-direction difference images respectively, and F(x, y) is the gradient magnitude at that coordinate; the generated image is the gradient magnitude map. Meanwhile, the gradient direction θ can be obtained by the arctangent function atan2(y, x): θ = atan2(G_y(x, y), G_x(x, y)), where θ is the gradient direction at the corresponding coordinate.
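The Sobel differencing and the gradient magnitude and direction computations can be sketched together (a NumPy illustration; the correlation-style 3×3 filtering and edge padding are assumptions, not the patent's implementation):

```python
import numpy as np

# Sobel x-direction and y-direction 3x3 kernels
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=np.float64)

def filter3x3(image, kernel):
    """3x3 filtering (correlation form, as in typical Sobel implementations)."""
    h, w = image.shape
    padded = np.pad(image.astype(np.float64), 1, mode='edge')
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return out

def sobel_gradients(image):
    """Return G_x, G_y, F(x, y) = sqrt(G_x^2 + G_y^2), and theta = atan2(G_y, G_x)."""
    gx = filter3x3(image, SOBEL_X)
    gy = filter3x3(image, SOBEL_Y)
    magnitude = np.sqrt(gx**2 + gy**2)
    direction = np.degrees(np.arctan2(gy, gx))  # degrees in (-180, 180]
    return gx, gy, magnitude, direction
```

On a vertical step edge the y-gradient vanishes and the direction is 0°, matching the horizontal-gradient case described below.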
Edges detected by the Sobel operator are sometimes too thick for the edge information to be used directly, so pixels whose gradient is not a local maximum need to be suppressed and only the maximum gradient retained, achieving the purpose of thin edges. Pixels with insufficient gradient are likely mere transition points of an edge. By the definition of a local maximum of a binary function (if f(x, y) ≤ f(x0, y0) for all (x, y) in some neighborhood of the point (x0, y0), then f has the maximum value f(x0, y0) at (x0, y0)), non-maximum suppression can be performed on the pixel points of the template image based on its gradient magnitude and gradient direction.
Referring to fig. 8, non-maximum suppression of the template image can be achieved by the following operations:
(1) According to the computed image gradient direction, the gradient at a center pixel can be assigned to one of four directions by angle: the horizontal gradient direction, the vertical gradient direction, the upper-right diagonal gradient direction, and the lower-right diagonal gradient direction. The horizontal gradient direction is the part between straight lines 3 and 4 in fig. 8, specifically (θ > −22.5 && θ < 22.5) || (θ < −157.5 || θ > 157.5); the vertical gradient direction is the part between straight lines 1 and 2 in fig. 8, specifically (θ >= 67.5 && θ <= 112.5) || (θ >= −112.5 && θ <= −67.5); the upper-right diagonal gradient direction is the part between straight lines 2 and 4 in fig. 8, specifically (θ >= 22.5 && θ < 67.5) || (θ >= −157.5 && θ < −112.5); the lower-right diagonal gradient direction is the part between straight lines 1 and 3 in fig. 8, specifically (θ >= −67.5 && θ <= −22.5) || (θ > 112.5 && θ <= 157.5).
(2) After the gradient direction of each center pixel has been assigned to one of the four directions, the gradient values along the corresponding gradient line within the pixel's 8-neighborhood are compared. For example, 4 gradient lines are defined by region in fig. 8: a horizontal gradient line, a vertical gradient line, a lower-left-to-upper-right 45° gradient line, and an upper-left-to-lower-right 45° gradient line. When the gradient value of the center point is greater than the gradient values of the two endpoints on its gradient line, the center pixel's gradient value is the maximum within the 8-neighborhood, and the pixel value corresponding to the center point is kept as a selected pixel; when the gradient value of the center point is less than or equal to the gradient value of either endpoint on the gradient line, the pixel value corresponding to the center point is set to 0. Fig. 9 shows an image after non-maximum suppression.
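The four-direction comparison above can be sketched as follows (a simplified NumPy version; leaving border pixels suppressed is an assumption):

```python
import numpy as np

def non_max_suppression(magnitude, direction):
    """Keep a pixel only if its gradient magnitude strictly exceeds both
    neighbors along the quantized gradient line in its 8-neighborhood.
    `direction` is in degrees in (-180, 180]."""
    h, w = magnitude.shape
    out = np.zeros_like(magnitude)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            t = direction[i, j]
            if (-22.5 < t < 22.5) or t < -157.5 or t > 157.5:
                a, b = magnitude[i, j - 1], magnitude[i, j + 1]      # horizontal
            elif 67.5 <= abs(t) <= 112.5:
                a, b = magnitude[i - 1, j], magnitude[i + 1, j]      # vertical
            elif (22.5 <= t < 67.5) or (-157.5 <= t < -112.5):
                a, b = magnitude[i - 1, j + 1], magnitude[i + 1, j - 1]  # upper-right diagonal
            else:
                a, b = magnitude[i - 1, j - 1], magnitude[i + 1, j + 1]  # lower-right diagonal
            if magnitude[i, j] > a and magnitude[i, j] > b:
                out[i, j] = magnitude[i, j]
    return out
```

A thick three-column ridge with horizontal gradients collapses to its single strongest column, which is exactly the thin-edge effect described above.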
And A2, adaptively determining a hysteresis threshold of the template image according to the gradient amplitude of the template image.
In order to reduce human error caused by manually setting thresholds, the high and low thresholds are obtained adaptively from the image gradient magnitude. Referring to fig. 10, intra-class variance minimization can be introduced into the edge extraction process to select the high and low thresholds adaptively, reducing the interference caused by manually set thresholds. The specific method is as follows: obtain the gradient magnitude map of the template image and quantize the gradient magnitudes into L classes, where L = 256, dividing the 8-bit grayscale range into [0, 1, …, 255]. These classes are further divided into three categories C0, C1 and C2, where C0 contains the non-edge pixel gradients, i.e. the gradient magnitude classes [0, 1, …, k]; C1 contains the weak-edge pixel gradients, i.e. the gradient magnitude classes [k+1, k+2, …, m]; and C2 contains the strong-edge pixel gradients, i.e. the gradient magnitude classes [m+1, m+2, …, L−1]. The high and low thresholds (m, k) are obtained by minimizing the intra-class variance over the gradient magnitude histogram. For example, for the "Lena" image shown in fig. 7, the high and low hysteresis thresholds obtained by this method are 150 and 92.
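The patent does not give the exact minimization formula, so the sketch below uses a two-threshold Otsu-style criterion (the total within-class variance over the three classes C0, C1 and C2 of the gradient histogram) as one plausible reading:

```python
import numpy as np

def adaptive_hysteresis_thresholds(magnitude, L=256):
    """Pick (low k, high m) minimizing the total within-class variance of
    C0 = [0..k], C1 = [k+1..m], C2 = [m+1..L-1] over the gradient histogram.
    Brute-force search over all (k, m), sped up with prefix sums."""
    mag = np.clip(magnitude, 0, L - 1).astype(int)
    hist = np.bincount(mag.ravel(), minlength=L).astype(np.float64)
    p = hist / hist.sum()
    levels = np.arange(L, dtype=np.float64)
    w = np.cumsum(p)            # class weights
    s = np.cumsum(p * levels)   # first moments
    s2 = np.cumsum(p * levels**2)

    def class_var(lo, hi):  # weighted within-class variance of levels [lo, hi]
        cw = w[hi] - (w[lo - 1] if lo > 0 else 0.0)
        if cw <= 0:
            return 0.0
        cs = s[hi] - (s[lo - 1] if lo > 0 else 0.0)
        cs2 = s2[hi] - (s2[lo - 1] if lo > 0 else 0.0)
        return cs2 - cs * cs / cw

    best, best_km = np.inf, (0, 1)
    for k in range(0, L - 2):
        for m in range(k + 1, L - 1):
            v = class_var(0, k) + class_var(k + 1, m) + class_var(m + 1, L - 1)
            if v < best:
                best, best_km = v, (k, m)
    return best_km  # (low threshold k, high threshold m)
```

On a histogram with three well-separated gradient clusters, the search recovers thresholds that isolate each cluster.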
And A3, performing edge point division processing on the template image subjected to non-maximum value inhibition processing according to a hysteresis threshold of the template image to obtain a template edge contour image of the template image.
Referring to fig. 10, hysteresis thresholding (dual-threshold segmentation) assumes two classes of edges in the image: among the edge points remaining after non-maximum suppression, those whose gradient value exceeds the high threshold are called strong edges, those whose gradient value is below the high threshold but above the low threshold are called weak edges, and those whose gradient is below the low threshold are not edges. A strong edge point is necessarily an edge point, so the high threshold must be set high enough to require a sufficiently large gradient value (a sufficiently strong change) at the pixel. A weak edge may be a true edge, but may also be noise present in the image. When a strong edge point exists in the 8-neighborhood around a weak edge point, the weak edge point is promoted to a strong edge point, thereby completing the strong edge. After the hysteresis thresholds are determined, edge point division is performed with them on the template image after non-maximum suppression, as in the effect diagram of fig. 11.
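The strong/weak classification and the weak-edge promotion can be sketched as a breadth-first search over 8-neighborhoods (strict inequalities at the thresholds are an assumption):

```python
import numpy as np
from collections import deque

def hysteresis(magnitude, low, high):
    """Classify NMS output: > high is a strong edge, (low, high] is a weak
    edge; then promote weak pixels 8-connected to a strong pixel."""
    strong = magnitude > high
    weak = (magnitude > low) & ~strong
    out = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    h, w = magnitude.shape
    while q:
        i, j = q.popleft()
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and weak[ni, nj] and not out[ni, nj]:
                    out[ni, nj] = True   # weak point becomes a strong edge point
                    q.append((ni, nj))
    return out
```

A weak run attached to a strong point is kept in full (promotion chains along the edge), while an isolated weak pixel is discarded as noise.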
And A4, carrying out pyramid self-adaptive layering on the template image based on the template edge contour point number in the template edge contour image of the template image to obtain a plurality of layered template images.
In an alternative to this embodiment, referring to FIG. 12, pyramid adaptive layering of template images may include the following steps B1-B2:
and B1, pyramid layering is carried out on the input template image, and the number of edge contour points after layering is counted.
And B2, stopping pyramid layering if the number of the edge contour points after layering is smaller than a preset pyramid top layer edge point threshold value, taking the upper layer of image as a pyramid top layer, and determining the pyramid layering number.
Referring to fig. 12, a pyramid top-layer edge point threshold is set, for example 20. Pyramid layering is performed on the template image, and the number of edge contour points is counted after each layering step. Whether the number of edge contour points after layering is smaller than the preset image pyramid top-layer edge point threshold is then judged. When it is, layering stops, the image of the previous layer is taken as the pyramid top layer, and the number of pyramid layers is determined, realizing adaptive layering of the image pyramid.
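The layering loop of steps B1 and B2 might look like this (downsampling a binary edge map by 2×2 any-pooling is an assumed stand-in for the pyramid reduction; the threshold of 20 follows the example above):

```python
import numpy as np

def adaptive_pyramid_levels(edge_image, top_threshold=20):
    """Halve the edge map repeatedly, counting edge points per layer; stop
    when the count drops below the top-layer threshold, keeping the
    previous layer as the pyramid top. Returns the number of layers."""
    levels = 1
    current = (edge_image > 0)
    while True:
        h, w = current.shape
        if h < 2 or w < 2:
            break
        # 2x2 block 'any' pooling: a block is an edge if any pixel in it is
        down = current[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).any(axis=(1, 3))
        if down.sum() < top_threshold:
            break  # the previous layer becomes the pyramid top
        current = down
        levels += 1
    return levels
```

For a 64×64 square contour (252 edge points), the counts per layer are 252, 124, 60, 28, then 12 < 20, so layering stops at 4 layers.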
S420, carrying out scale configuration on each layered template image, and carrying out multi-rotation angle configuration on the scale-divided template images.
Referring to fig. 3, given the scale interval and step length for shape matching, the pyramid-layered template image set is subjected to scale division processing to obtain a multi-scale template image expression of each pyramid layer's template image, i.e. for each layer of the pyramid template image set, a set ranging from the minimum-scale image to the maximum-scale image of the current layer's template image.
Referring to fig. 3, an orthogonal template must also be matched against non-orthogonal regions of the image to be searched, so given the angle interval and step length for shape matching, rotation angle configuration is performed on the template images of different scales according to the angles defined by that interval and step, so that rotating the template image allows the matching angle to be determined. Considering that a digital image usually exists in matrix form, the rotation angle configuration of the scale-divided template images can be carried out through image affine transformation; the specific flow of image affine transformation is as follows:
(1) Constructing affine transformation matrix
The transformation of an image from one two-dimensional coordinate system to another is accomplished by an affine transformation, which here involves rotation and translation of the image. Viewed in the spatial three-dimensional coordinate system of the image, rotating the image is equivalent to rotating it about the image Z axis and translating the rotation center point, so an affine transformation matrix containing the image's rotation and translation is finally constructed. The corresponding affine transformation expression is: u = a1·x + b1·y + c1, v = a2·x + b2·y + c2.
In the above formula, (u, v) are the coordinates after affine transformation of the image and (x, y) are the image coordinates of the original template or of the image to be searched. (c1, c2) are the translation coordinates of the image rotation center relative to the rotation center of the original template or image to be searched, and (a1, a2, b1, b2) are the parameters forming the rotation matrix within the affine transformation matrix, containing the rotation and scale-change information of the image. Since the x and y axes of the image coordinate system are orthogonal, the parameters (a1, a2, b1, b2) satisfy a1² + a2² = b1² + b2² and a1·b1 + a2·b2 = 0.
(2) Calculating affine transformation matrix
Since the rotation of the image is performed about the Z axis of the image space coordinate system, a rotation matrix for the affine transformation is computed from the rotation angle information of the image obtained in the first step, recorded as: R = [cos θ, −sin θ; sin θ, cos θ].
In the above equation, θ is the rotation angle about the Z axis of the image space coordinate system. The rotation center of the image is defined as the coordinate center of the image, taking half of the image row and column values, i.e. (cols/2, rows/2). Finally the affine transformation matrix containing the image rotation and translation information is obtained, recorded as: M = [cos θ, −sin θ, c1; sin θ, cos θ, c2].
According to the angle start point, range and step length information of the image from the first step, a series of image affine transformation matrices formed by the multiple rotation angles can be obtained, recorded as: (u_i, v_i)^T = M(θ_i)·(x_i, y_i, 1)^T, i = 1, 2, …, n.
In the above formula, i is the index of a pixel point in the image to be rotated, i = 1, 2, …, n, where n is the number of pixel points in the image to be rotated; (x_i, y_i) is the coordinate position of pixel point i in the image to be rotated, θ_i is its rotation angle, and (u_i, v_i) is its coordinate position after rotation.
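The affine matrix construction can be sketched as follows (angles in degrees; the rotation center convention (cols/2, rows/2) follows the text, and the function names are illustrative):

```python
import numpy as np

def rotation_affine_matrix(theta_deg, rows, cols):
    """Build the 2x3 affine matrix rotating an image by theta about its
    coordinate center (cols/2, rows/2), which stays fixed."""
    t = np.radians(theta_deg)
    c, s = np.cos(t), np.sin(t)
    cx, cy = cols / 2.0, rows / 2.0
    # (u, v) = R (x - cx, y - cy) + (cx, cy), expanded into [R | T] form
    return np.array([[c, -s, cx - c * cx + s * cy],
                     [s,  c, cy - s * cx - c * cy]])

def apply_affine(M, x, y):
    """Map one point (x, y) to (u, v) through the 2x3 affine matrix."""
    u = M[0, 0] * x + M[0, 1] * y + M[0, 2]
    v = M[1, 0] * x + M[1, 1] * y + M[1, 2]
    return u, v
```

The matrix satisfies the orthogonality constraint a1·b1 + a2·b2 = 0 stated above, and the center maps to itself.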
S430, corresponding template edge contour information is extracted from each template image of the layered, multi-scale, multi-rotation-angle template image sets to construct the template edge contour information set of the template image.
Edge extraction is performed to obtain the template edge contour point information of each template image in the template image set containing image scale features; the specific extraction follows the process of steps A1-A3 and is not repeated here. The edge contour information may include the position of the contour's center of gravity, the pixel position of each edge contour point relative to the contour's center of gravity, the edge contour point gradient magnitude, and the edge contour point lateral and longitudinal gradients. The center of gravity of the edge contour point set is calculated by summing the row and column coordinates of all edge contour points and dividing by the number of counted edge contour points. The edge contour centroid (x̄, ȳ) is calculated as: x̄ = (1/n)·Σx_i, ȳ = (1/n)·Σy_i.
In the above formula, the row and column coordinates of every edge pixel point are summed, n is the number of edge pixel points, and the mean of the row and column coordinates is computed. The coordinates of each edge contour point relative to the contour centroid are then obtained as: x_i′ = x_i − x̄, y_i′ = y_i − ȳ.
gradients of the edge contour points in the x-direction and the y-direction are obtained from the x-direction and y-direction gradient images generated by the image direction difference. The gradient magnitude of the edge contour points is derived from the gradient magnitude calculated previously.
After the corresponding template edge contour information is extracted from each template image in the layered, multi-scale, multi-rotation-angle template image set, the template edge contour information is stored in a structure; for convenient access, all template edge contour information can be organized as a linear table. The template edge contour information set of the template image is finally constructed and pre-stored, so that the template image does not need to be processed anew each time image matching is performed on an image to be searched.
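The contour record described above (centroid, relative coordinates, per-point gradients and magnitudes) can be sketched as follows (the dictionary field names are illustrative, not the patent's structure layout):

```python
import numpy as np

def contour_info(edge_points, gx, gy):
    """Collect one template edge contour record: contour centroid, point
    coordinates relative to the centroid, and each point's x/y gradient
    and gradient magnitude, read from the direction-difference images."""
    pts = np.asarray(edge_points, dtype=np.float64)  # (n, 2) as (row, col)
    centroid = pts.mean(axis=0)                      # sum of coords / n
    rel = pts - centroid                             # position relative to centroid
    rows = pts[:, 0].astype(int)
    cols = pts[:, 1].astype(int)
    tx, ty = gx[rows, cols], gy[rows, cols]
    mag = np.hypot(tx, ty)
    return {"centroid": centroid, "rel_points": rel,
            "grad_x": tx, "grad_y": ty, "magnitude": mag}
```

A list of such records, one per layer, scale and angle, forms the pre-stored linear table mentioned above.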
Fig. 13 is a flowchart of a construction of an edge profile information set to be searched provided in an embodiment of the present application, where the technical solution of the present embodiment is further optimized based on the foregoing embodiment, and the technical solution of the present embodiment may be combined with each alternative solution in one or more foregoing embodiments. As shown in fig. 13, the construction process of the edge profile information set to be searched provided in the embodiment of the present application may include the following steps:
and S1310, pyramid layering is carried out on the image to be searched according to the pyramid layering number of the template image, so that a plurality of layered images to be searched are obtained.
S1320, extracting corresponding edge contour information to be searched from each layered image to be searched to construct an edge contour information set to be searched of the image to be searched.
The edge contour information comprises the gravity center position of the contour, the pixel position of an edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point and the transverse gradient and the longitudinal gradient of the edge contour point.
In an alternative of this embodiment, extracting the corresponding edge contour information to be searched from each layered image to be searched may include: performing non-maximum suppression on the pixel points of each layered image to be searched according to its gradient magnitude and gradient direction; and performing edge point division processing on the non-maximum-suppressed image to be searched to obtain the edge contour image to be searched of the image to be searched, thereby obtaining the corresponding edge contour information to be searched.
The difference between the construction process of the edge profile information set to be searched provided in the embodiment of the present application and the construction process of the template edge profile information set is that the construction of the edge profile information set to be searched does not perform the configuration processing of the multi-rotation angles in a multi-scale manner, and in particular, technical details which are not described in detail in the embodiment of the present application can be referred to the construction process of the template edge profile information provided in any embodiment of the present application.
Fig. 14 is a flowchart of another image matching method provided in the embodiment of the present application, where the technical solution of the present embodiment is further optimized based on the foregoing embodiment, and the technical solution of the present embodiment may be combined with each of the alternatives in one or more foregoing embodiments. As shown in fig. 14, the image matching method provided in the embodiment of the present application may include the following steps:
s1410, determining a template edge contour information set of a template image; the template edge profile information set characterizes the multi-scale multi-rotation angle template edge profile under different pyramid hierarchies extracted from the template image.
S1420, determining an edge contour information set to be searched of an image to be searched; the to-be-searched edge contour information set represents to-be-searched edge contours under different pyramid layers extracted from the to-be-searched image according to the pyramid layer number of the template image.
S1430, aiming at the pyramid hierarchical structure from top to bottom, based on the contour matching search information of the current hierarchy, respectively carrying out sliding traversal on the template edge contours of different scales and different rotation angles of the current hierarchy on the edge contours to be searched corresponding to the current hierarchy.
The contour matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the edge contours of the templates are matched in similarity.
Referring to fig. 15 and 16, when the current layer is the top layer of the pyramid, the contour matching search information of the current layer takes the entire contour area of the edge contour to be searched as the region to be matched, and the initialized scale range and angle range as the scale range and angle range to be matched, respectively. The edge contour to be searched at the pyramid top layer and the template edge contours containing scale and angle are then taken as the inputs of the top-layer traversal matching; since the position coordinates of the template edge contour points are relative to the center of gravity of the template edge contour, the center of gravity of the template edge contour is moved over the edge contour to be searched to calculate the similarity during the pyramid top-layer traversal matching.
Referring to fig. 15 and 16, when the current layer is not the pyramid top layer, the contour matching search information of the current layer is determined by mapping, according to the preset matching search mapping mode, the contour matching result obtained on the edge contour to be searched of the previous layer onto the edge contour to be searched of the current layer. The region to be matched in the edge contour to be searched indicated by the current layer's contour matching search information, and the template edge contours within the scale range and angle range to be matched indicated by that information, are then taken as the inputs of the current layer's traversal matching.
S1440, the template edge contours of different scales and different rotation angles are each slid and traversed, and the similarity between each such template edge contour and the edge contour to be searched is calculated.
Referring to fig. 9, for the similarity matching of each layer, the center of the black cross is the center of gravity of the template edge contour; this center of gravity traverses the edge contour image to be searched from the upper-left corner to the lower-right corner, and at each move the similarity between the template edge contour and the corresponding region of the edge contour image to be searched is computed. The greater the similarity, the more alike the two are, the similarity being 1 when they match perfectly. The template edge contour similarity measure function is: s = (1/n)·Σ_i (t′_i·t_i + u′_i·u_i) / (‖d′_i‖·‖e_i‖).
In the above formula, n is the number of edge points used in the calculation; d′_i = (t′_i, u′_i) is the gradient of an edge point in the edge contour image to be searched, and e_i = (t_i, u_i) is the gradient of the corresponding edge contour point in the template edge contour image; t′_i and t_i are respectively the x-direction gradients of the corresponding edge contour points of the edge contour image to be searched and the template edge contour image, and u′_i and u_i are respectively their y-direction gradients.
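The similarity measure can be sketched as a normalized gradient dot product over corresponding contour points (assuming the point correspondences have already been resolved by the sliding position):

```python
import numpy as np

def edge_similarity(tpl_gx, tpl_gy, img_gx, img_gy):
    """Mean normalized dot product of template and search-image gradients at
    corresponding contour points; 1.0 for a perfect match, -1.0 for
    gradients of opposite direction."""
    dot = tpl_gx * img_gx + tpl_gy * img_gy
    norms = np.hypot(tpl_gx, tpl_gy) * np.hypot(img_gx, img_gy)
    valid = norms > 0                     # skip zero-gradient points
    return np.sum(dot[valid] / norms[valid]) / len(tpl_gx)
```

Because each term is normalized by the gradient magnitudes, the score is invariant to illumination changes that scale both gradients.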
S1450, determining a contour matching result of the current hierarchical lower template edge contour on the edge contour to be searched based on the calculated similarity between the template edge contour with different scales and different rotation angles and the edge contour to be searched.
Referring to fig. 16, for each pyramid layer, the template edge contours of different scales and different angles are traversed over the edge contour image to be searched according to that layer's contour matching search information, yielding a series of similarities; the position, angle and scale corresponding to the maximum similarity are taken as the contour matching result of the current layer's template edge contour on the edge contour to be searched. The contour matching result includes the center-of-gravity position, the scale and the rotation angle of the template edge contour at the match on the edge contour to be searched.
S1460, determining contour matching search information used by the template edge contour of the next layer when the contour of the next layer is matched with the contour of the edge to be searched according to the contour matching result of the current layer, and skipping to the next layer for similarity matching until reaching the bottom layer of the pyramid.
Referring to fig. 17, when determining the contour matching result of the current hierarchy, the contour matching search information used when the template edge contour of the next hierarchy performs similarity matching on the edge contour to be searched in the next hierarchy may be determined by matching the mapping according to the preset matching search mapping mode. The matching search mapping mode is used for mapping out contour matching search information used for carrying out similarity matching on the contour of the edge to be searched in the lower layer according to the contour matching result of the contour of the edge to be searched in the upper layer.
Similarity matching of the next layer's template edge contour is performed on the basis of the next layer's contour matching search information, proceeding from top to bottom until every pyramid layer has undergone similarity matching. By introducing pyramid layering, the matching result of an upper layer is mapped down to the region to be matched of the lower layer, which effectively reduces the lower layer's matching area and speeds up lower-layer matching.
Referring to fig. 16 and 17, the matching search mapping mode includes mapping the region to be matched in the contour matching search information from the upper pyramid layer to the lower layer. The center of gravity of the template edge contour is traversal-matched along the upper-layer edge contour image to be searched to obtain the highest-scoring upper-layer matching position (x, y); the corresponding position in the lower layer is (2x, 2y), and the region to be matched in the lower layer is accordingly:
In the above formula, (x′, y′) is the upper-left corner coordinate and (x″, y″) the lower-right corner coordinate of the mapped region in the lower-layer edge contour image to be searched; thus the position of the region to be matched in the contour matching search information for the lower pyramid layer can be determined.
Referring to fig. 16 and 17, the matching search mapping manner further includes mapping the range of angles to be matched in the contour matching search information from the upper layer to the lower layer of the pyramid. The calculation formula of the angle range of the mapping of the angles to the pyramid lower layer according to the matching search mapping mode is as follows:
The above formula is the mapping formula for the angles to be matched; angle_next_pre is the lower-layer mapping start angle, angle_next_aft is the lower-layer mapping end angle, and numLevels is the determined number of pyramid layers of the image to be searched.
Referring to fig. 16 and 17, the matching search mapping manner further includes mapping the scale to be matched in the contour matching search information from the upper layer to the lower layer of the pyramid. The calculation formula of mapping the scale to the pyramid lower layer according to the matching search mapping mode is as follows:
The above formula is the mapping formula for the scale to be matched; scale_next_pre is the lower-layer mapping start scale, scale_next_aft is the lower-layer mapping end scale, and scale_step is the scale step given when the template was created.
S1470, identifying the target shape indicated by the template image in the image to be searched according to the contour matching result when the pyramid bottom-layer similarity matching is finished.
On the basis of the foregoing embodiment, optionally, before identifying the target shape indicated by the template image in the image to be searched according to the contour matching result at the end of the pyramid bottom-layer similarity matching, the method further includes:
When the next layer is determined to lie in the bottom two layers of the pyramid, the scale to be matched in the contour matching search information of the current layer is used directly as the scale to be matched in the contour matching search information of the next layer, cutting off the lower-layer scale mapping adjustment in advance.
Referring to fig. 3, with the last two layers of the pyramid as the scale division boundary, it is judged whether the layer below the current layer lies in the last two layers of the pyramid. If it does not, the scale to be matched in the contour matching search information used when the template edge contour is matched on the next layer's edge contour to be searched is determined by the preset matching search mapping, i.e. mapped from the scale in the contour matching result on the previous layer's edge contour to be searched. If it does, the scale to be matched in the next layer's contour matching search information is taken directly as the scale in the current layer's contour matching result; that is, the scale information is fixed over the last two pyramid layers, realizing an early scale cutoff. The early scale cutoff reduces the amount of scale-matching data in the next layer's similarity matching and improves the shape edge matching speed while maintaining matching precision.
On the basis of the foregoing embodiment, optionally, before identifying the target shape indicated by the template image in the image to be searched according to the contour matching result at the end of the pyramid bottom-layer similarity matching, the method further includes:
When the current layer is determined not to be the pyramid top layer, the angle range to be matched in the current layer's contour matching search information is corrected to a number of angles selected with a variable angle step, and the template edge contours of the current layer are similarity-matched on the corresponding edge contour to be searched based on the angles to be matched indicated by the corrected contour matching search information.
Referring to fig. 3, after the angle range to be matched in the current layer's contour matching search information is determined, in order to improve the current layer's matching speed and reduce the number of angles to be matched, the angle range to be matched is corrected to a plurality of angles selected with an angle step that varies with the number of pyramid layers. The variable-angle-step matching strategy is as follows:
In the above formula, numLevels is the determined number of pyramid layers, and angle_next_pre and angle_next_aft are the start and end angles mapped to the current layer. The angle step varies with the number of pyramid layers, and the current layer is matched at three angles, namely angle_next_pre, angle_next_aft and angle, thereby reducing the number of angle matches.
On the basis of the above embodiment, optionally, sliding the template edge profiles of the current hierarchy with different scales and different rotation angles on the edge profiles to be searched corresponding to the current hierarchy may include:
When the current layer is determined not to be the pyramid top layer, the template edge contours of different scales and different rotation angles of the current layer are slid over the corresponding edge contour to be searched at intervals, traversing according to the edge contour point interval number of the current layer.
Referring to fig. 3, in order to reduce the matching time of the current layer while keeping the number of edge contour points matched in the previous layer unchanged, the edge contour points of the current layer are visited at intervals of a certain number of pixels during matching, reducing the number of matched points and improving the matching speed. With the number of retained edge contour points fixed at 100, and given the total number of template edge contour points of the current layer, the interval number used when traversal-matching the current layer's edge contour points is P_delete_num = P_total / 100, where P_total is the total number of template edge contour points of the current layer and P_delete_num is the edge contour point interval used for the matching calculation. By this strategy of deleting edge contour points on non-top pyramid layers, information redundancy and computational complexity can be effectively reduced.
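The interval-sampling rule can be sketched as follows (the fixed target of 100 retained points follows the text; the function name is illustrative):

```python
def interval_sample(points, target=100):
    """Visit contour points at a fixed interval so roughly `target` points
    remain: step = P_total // target, at least 1."""
    step = max(len(points) // target, 1)
    return points[::step]
```

A 1000-point contour is thus reduced to 100 matched points, while contours already at or below the target are kept whole.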
Referring to fig. 18, in the serial case, with the above performance improvement strategy of the present embodiment, the shape matching time including scale features of the image is reduced from 2.999 s to 200 ms, the matching speed is improved by 93.3%, and the final matching scores of the two differ by only 0.0002, which indicates that the performance improvement strategy can effectively increase the speed of shape matching with scale features on the premise of ensuring matching accuracy.
According to the image matching method provided by the embodiment of the application, the template image is pyramid-layered, multi-angle template images are created at varying scales for edge extraction, and a multi-level, multi-scale, multi-angle set of template edge contour points is created, so that the template edge contour of the template image contains scale information. This solves the problems of missing scale information and lack of scale invariance in image shape matching, and improves the accuracy of image matching. Meanwhile, considering that the data structure for template edge contour matching with scale features is complex to construct and the matching process is time-consuming, the image to be searched is layered according to the pyramid level count of the template image, and a coarse-to-fine similarity matching performance improvement strategy is applied at each level of the image pyramid. This reduces the amount of matching data at the lower levels, lowers the algorithm complexity, and improves the speed of shape edge matching under multi-scale features on the premise of ensuring matching accuracy.
Fig. 19 is a block diagram of an image matching apparatus provided in an embodiment of the present application. The technical scheme of the embodiment can be suitable for the situation of matching the shapes among images, and the device can be realized in a software and/or hardware mode and is integrated on any electronic equipment with a network communication function. As shown in fig. 19, the image matching apparatus in the embodiment of the present application may include: template information determination module 1910, to-be-searched information determination module 1920, contour similarity matching module 1930, contour lower-layer matching mapping module 1940, and to-be-searched image recognition module 1950. Wherein:
a template information determining module 1910 for determining a template edge profile information set of a template image; the template edge contour information set characterizes the template edge contour of the multi-scale multi-rotation angles under different pyramid layering extracted from the template image;
the to-be-searched information determining module 1920 is configured to determine a to-be-searched edge profile information set of the to-be-searched image; the edge contour information set to be searched represents the edge contour to be searched under different pyramid layering extracted from the image to be searched according to the pyramid layering number of the template image;
The contour similarity matching module 1930 is configured to perform similarity matching on the contour of the edge to be searched of the current hierarchy based on the contour matching search information of the current hierarchy for the pyramid hierarchy from top to bottom, so as to obtain a contour matching result of the current hierarchy;
the contour lower layer matching mapping module 1940 is configured to determine contour matching search information used when the contour of the template edge of the next layer matches the contour of the edge to be searched in the next layer according to the contour matching result of the current layer, and skip to the next layer for similarity matching until reaching the bottom layer of the pyramid;
the image to be searched identification module 1950 is configured to identify a target shape indicated by a template image in the image to be searched according to a contour matching result at the end of the pyramid bottom level similarity matching;
the profile matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the profile of the edge of the template is matched in similarity; the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
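The two record types described above (contour matching search information and contour matching result) can be sketched as plain data classes. The class and field names are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class ContourSearchInfo:
    """Per-level search constraints used when matching a template contour."""
    region: Tuple[int, int, int, int]   # region to be matched (x, y, w, h)
    scale_range: Tuple[float, float]    # scale range to be matched
    angle_range: Tuple[float, float]    # angle range to be matched


@dataclass
class ContourMatchResult:
    """Best match found at one pyramid level."""
    center: Tuple[float, float]         # gravity-center position of the contour
    scale: float                        # matched scale
    angle: float                        # matched rotation angle
    score: float                        # similarity score of the match
```

A top-level search would initialize ContourSearchInfo with the full image region and the full initial scale/angle ranges; lower levels receive a narrowed version mapped from the ContourMatchResult of the level above.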
Based on the above embodiment, optionally, the template information determining module 1910 includes:
pyramid self-adaptive layering is carried out on the template images, so that a plurality of layered template images are obtained; carrying out scale configuration on each layered template image, and carrying out multi-rotation angle configuration on the scale-divided template images;
respectively extracting corresponding template edge contour information from each layered template image set with multiple scales and multiple rotation angles to construct a template edge contour information set of the template image;
the edge contour information comprises the gravity center position of the contour, the pixel position of an edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
On the basis of the above embodiment, optionally, pyramid adaptive layering is performed on the template image to obtain a plurality of layered template images, including:
according to the edge gradient amplitude and gradient direction of the template image, performing non-maximum value inhibition treatment on pixel points in the template image; and adaptively determining a hysteresis threshold of the template image according to the gradient amplitude of the template image;
according to a hysteresis threshold of the template image, performing edge point division processing on the template image subjected to non-maximum value inhibition processing to obtain a template edge contour image of the template image;
And carrying out pyramid self-adaptive layering on the template image based on the template edge contour points in the template edge contour image of the template image to obtain a plurality of layered template images.
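The adaptive layering rule above (stop subdividing when too few edge contour points remain) can be sketched as follows. This is a minimal NumPy stand-in: the gradient-magnitude edge test replaces the patent's non-maximum suppression and adaptive hysteresis pipeline, and the threshold values are assumptions.

```python
import numpy as np


def adaptive_pyramid_levels(template_gray, min_edge_points=30, max_levels=6):
    """Estimate how many pyramid levels a template supports.

    Keep halving the template while the edge image of the current level
    still has at least `min_edge_points` edge pixels; the level before
    the count drops below the threshold becomes the pyramid top.
    """
    img = np.asarray(template_gray, dtype=np.float64)
    levels = 1
    while levels < max_levels and min(img.shape) >= 4:
        img = img[::2, ::2]                     # crude 2x downsample
        gy, gx = np.gradient(img)               # gradients along rows, cols
        mag = np.hypot(gx, gy)
        edge_points = int(np.count_nonzero(mag > mag.mean() + mag.std()))
        if edge_points < min_edge_points:
            break                               # too few edge points: stop here
        levels += 1
    return levels
```

This mirrors the claimed behaviour: a detail-rich template yields more levels, while a small or sparse template stops layering early so the top level still carries enough contour information to match reliably.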
On the basis of the above embodiment, optionally, the information to be searched determining module 1920 includes:
pyramid layering is carried out on the images to be searched according to the pyramid layering number of the template images, so that a plurality of layered images to be searched are obtained;
extracting corresponding edge contour information to be searched from each layered image to be searched to construct an edge contour information set to be searched of the image to be searched;
the edge contour information comprises the gravity center position of the contour, the pixel position of an edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
On the basis of the foregoing embodiment, optionally, extracting the corresponding edge profile information to be searched from each layered image to be searched includes:
according to the gradient amplitude and gradient direction of each layered image to be searched, performing non-maximum suppression on pixel points in the image to be searched;
and carrying out edge point dividing processing on the non-maximum value suppressed image to be searched to obtain an edge contour image to be searched of the image to be searched, so as to obtain corresponding edge contour information to be searched.
Optionally, based on the above embodiment, the contour similarity matching module 1930 includes:
based on the contour matching search information of the current hierarchy, respectively sliding and traversing the template edge contours of the current hierarchy with different scales and different rotation angles on the edge contours to be searched corresponding to the current hierarchy;
calculating the similarity between the template edge profiles of different scales and the edge profiles to be searched by using the sliding traversal of the template edge profiles of different rotation angles of different scales respectively;
and determining the contour matching result of the current hierarchical lower template edge contour on the edge contour to be searched based on the calculated similarity between the template edge contour with different scales and different rotation angles and the edge contour to be searched.
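The similarity evaluated at each sliding position can be sketched with the standard normalized gradient dot-product score used in shape-based matching; it is shown here as an illustrative stand-in for the patent's similarity measure, and all names are assumptions.

```python
import numpy as np


def similarity_at(template_pts, template_grads, search_gx, search_gy, cx, cy):
    """Score one placement of a template contour on the search image.

    template_pts: (dx, dy) offsets of edge points relative to the contour
    gravity center; template_grads: unit gradient vectors at those points;
    search_gx / search_gy: gradient images of the search layer; (cx, cy):
    candidate gravity-center position. Returns the mean cosine between
    template and search gradients (1.0 = perfect agreement).
    """
    h, w = search_gx.shape
    score = 0.0
    for (dx, dy), (tx, ty) in zip(template_pts, template_grads):
        x, y = int(cx + dx), int(cy + dy)
        if 0 <= x < w and 0 <= y < h:
            sx, sy = search_gx[y, x], search_gy[y, x]
            norm = np.hypot(sx, sy)
            if norm > 1e-9:
                score += (tx * sx + ty * sy) / norm  # cosine of gradient angle
    return score / len(template_pts)
```

The sliding traversal then simply evaluates this score over the region to be matched, for each candidate scale and rotation angle, and keeps the best-scoring placement as the level's contour matching result.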
On the basis of the above embodiment, optionally, in the case that the current hierarchy is the top layer of the pyramid, the profile matching search information of the current hierarchy includes the entire profile area of the edge profile to be searched as the area to be matched, and the scale range and the angle range set by initialization are respectively used as the scale range to be matched and the angle range to be matched;
and under the condition that the current hierarchy is a non-pyramid top layer, the contour matching search information of the current hierarchy comprises mapping and determining the contour of the edge to be searched corresponding to the current hierarchy according to a preset matching search mapping mode according to the contour matching result in the contour of the edge to be searched corresponding to the previous hierarchy.
On the basis of the above embodiment, optionally, the matching search mapping manner is used for mapping out contour matching search information used for performing similarity matching on the contour of the edge to be searched in the lower layer according to the contour matching result of the contour of the edge to be searched in the upper layer.
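The upper-to-lower mapping can be sketched as follows. The margin values (position, scale, and angle neighborhoods) are illustrative assumptions; only the center-doubling follows directly from halving resolution between pyramid levels.

```python
def map_to_next_level(result_center, result_scale, result_angle,
                      pos_margin=5, scale_margin=0.1, angle_step=2.0):
    """Project an upper-level match down to the next (finer) level.

    The matched gravity-center position is doubled (the finer level has
    twice the resolution), and small neighborhoods are opened around the
    matched scale and angle to form the next level's search information.
    """
    cx, cy = result_center
    region = (2 * cx - pos_margin, 2 * cy - pos_margin,
              2 * pos_margin, 2 * pos_margin)                # (x, y, w, h)
    scale_range = (result_scale - scale_margin, result_scale + scale_margin)
    angle_range = (result_angle - angle_step, result_angle + angle_step)
    return region, scale_range, angle_range
```

Because each level only searches this narrowed neighborhood instead of the whole image and full parameter ranges, the cost of the fine bottom levels stays small.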
On the basis of the above embodiment, optionally, the apparatus further includes:
and when the next hierarchy is determined to be positioned at the secondary bottom layer of the pyramid, directly taking the dimension to be matched in the contour matching search information of the current hierarchy as the dimension to be matched in the contour matching search information of the next hierarchy, so that the advance cut-off of the scale mapping adjustment of the lower hierarchy is realized.
On the basis of the above embodiment, optionally, the apparatus further includes:
when the current layering is determined to be positioned on the non-pyramid top layer, correcting the angle range to be matched in the contour matching search information of the current layering as follows: and a plurality of angles to be matched are selected from the angle range to be matched according to the angle step length of the pyramid layer number.
On the basis of the foregoing embodiment, optionally, when it is determined that the current hierarchy is located at the top layer of the non-pyramid, sliding and traversing the template edge contours of different scales and different rotation angles of the current hierarchy on the edge contours to be searched corresponding to the current hierarchy, where the sliding and traversing includes:
And controlling the template edge profiles of different scales and different rotation angles of the current layering, and performing interval sliding traversal on the edge profile to be searched corresponding to the current layering according to the interval number of the edge profile points corresponding to the current layering.
The image matching device provided in the embodiment of the present application may perform the image matching method provided in any embodiment of the present application, and has the corresponding functions and beneficial effects of performing the image matching method, and technical details not described in detail in the foregoing embodiment may be referred to the image matching method provided in any embodiment of the present application.
Fig. 20 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 20, an electronic device provided in an embodiment of the present application includes: one or more processors 2010 and a memory device 2020; the electronic device may have one or more processors 2010, one processor 2010 being taken as an example in fig. 20; the storage 2020 is used to store one or more programs; the one or more programs are executed by the one or more processors 2010 to cause the one or more processors 2010 to implement an image matching method as in any of the embodiments of the present application.
The electronic device may further include: an input device 2030 and an output device 2040.
The processor 2010, the storage 2020, the input device 2030 and the output device 2040 in the electronic apparatus may be connected by a bus or other means, for example by a bus connection in fig. 20.
The storage 2020 in the electronic device is used as a computer readable storage medium, and may be used to store one or more programs, which may be software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the image matching method provided in the embodiments of the present invention. The processor 2010 executes various functional applications of the electronic device and data processing, namely, implements the image matching method in the above-described method embodiment, by running software programs, instructions, and modules stored in the storage 2020.
The storage 2020 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for a function; the storage data area may store data created according to the use of the electronic device, etc. Further, the storage 2020 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 2020 may further include memory located remotely from processor 2010, which may be connected to the device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 2030 may be used for receiving input numeric or character information and generating key signal inputs related to user settings and function control of the electronic apparatus. The output device 2040 may include a display device such as a display screen.
And, when one or more programs included in the above-described electronic device are executed by the one or more processors 2010, the programs perform the following operations:
determining a template edge contour information set of a template image; the template edge contour information set characterizes the template edge contour of the multi-scale multi-rotation angles under different pyramid layering extracted from the template image;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents the edge contour to be searched under different pyramid layering extracted from the image to be searched according to the pyramid layering number of the template image;
aiming at the pyramid hierarchical structure from top to bottom, based on the contour matching search information of the current hierarchy, carrying out similarity matching on the contour of the template edge of the current hierarchy on the contour of the edge to be searched of the current hierarchy to obtain a contour matching result of the current hierarchy;
determining contour matching search information used when the contour of the template edge of the next layer is subjected to similarity matching corresponding to the contour of the edge to be searched in the next layer according to the contour matching result of the current layer, and using the contour matching search information when the contour of the template edge of the next layer is subjected to similarity matching to jump to the next layer until reaching the bottom layer of the pyramid;
Identifying a target shape indicated by a template image in the image to be searched according to an outline matching result when the pyramid bottom layer similarity matching is finished;
the profile matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the profile of the edge of the template is matched in similarity; the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
Of course, those skilled in the art will appreciate that the program(s) may also perform the associated operations of the image matching method provided in any embodiment of the present invention when the program(s) included in the electronic device are executed by the one or more processors 2010.
In an embodiment of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program for executing an image matching method when executed by a processor, the method including:
determining a template edge contour information set of a template image; the template edge contour information set characterizes the template edge contour of the multi-scale multi-rotation angles under different pyramid layering extracted from the template image;
Determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents the edge contour to be searched under different pyramid layering extracted from the image to be searched according to the pyramid layering number of the template image;
aiming at the pyramid hierarchical structure from top to bottom, based on the contour matching search information of the current hierarchy, carrying out similarity matching on the contour of the template edge of the current hierarchy on the contour of the edge to be searched of the current hierarchy to obtain a contour matching result of the current hierarchy;
determining contour matching search information used when the contour of the template edge of the next layer is subjected to similarity matching corresponding to the contour of the edge to be searched in the next layer according to the contour matching result of the current layer, and using the contour matching search information when the contour of the template edge of the next layer is subjected to similarity matching to jump to the next layer until reaching the bottom layer of the pyramid;
identifying a target shape indicated by a template image in the image to be searched according to an outline matching result when the pyramid bottom layer similarity matching is finished;
the profile matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the profile of the edge of the template is matched in similarity; the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the edge contour to be searched is matched.
Optionally, the program may be further configured to perform the image matching method provided in any embodiment of the present invention when executed by a processor.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access Memory (Random Access Memory, RAM), a Read-Only Memory (ROM), an erasable programmable Read-Only Memory (Erasable Programmable Read Only Memory, EPROM), a flash Memory, an optical fiber, a portable CD-ROM, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to: electromagnetic signals, optical signals, or any suitable combination of the preceding. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, radio frequency (RadioFrequency, RF), and the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (12)

1. A method of image matching, the method comprising:
determining a template edge contour information set of a template image; the template edge contour information set characterizes the template edge contour of the multi-scale multi-rotation angles under different pyramid layering extracted from the template image;
determining an edge contour information set to be searched of an image to be searched; the edge contour information set to be searched represents the edge contour to be searched under different pyramid layering extracted from the image to be searched according to the pyramid layering number of the template image;
aiming at the pyramid hierarchical structure from top to bottom, based on the contour matching search information of the current hierarchy, carrying out similarity matching on the contour of the template edge of the current hierarchy on the contour of the edge to be searched of the current hierarchy to obtain a contour matching result of the current hierarchy;
determining contour matching search information used when the contour of the template edge of the next layer is subjected to similarity matching corresponding to the contour of the edge to be searched in the next layer according to the contour matching result of the current layer, and using the contour matching search information when the contour of the template edge of the next layer is subjected to similarity matching to jump to the next layer until reaching the bottom layer of the pyramid;
identifying a target shape indicated by a template image in the image to be searched according to an outline matching result when the pyramid bottom layer similarity matching is finished;
The profile matching search information comprises a region to be matched, a scale range to be matched and an angle range to be matched when the profile of the edge of the template is matched in similarity; the contour matching result comprises the gravity center position, the scale and the rotation angle of the template edge contour when the contour of the edge to be searched is matched;
determining a template edge profile information set of a template image, comprising:
pyramid self-adaptive layering is carried out on the template images, so that a plurality of layered template images are obtained; carrying out scale configuration on each layered template image, and carrying out multi-rotation angle configuration on the scale-divided template images;
respectively extracting corresponding template edge contour information from each layered template image set with multiple scales and multiple rotation angles to construct a template edge contour information set of the template image;
the edge contour information comprises the gravity center position of the contour, the pixel position of an edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point;
pyramid self-adaption layering is carried out on the template image, a plurality of layered template images are obtained, and the pyramid self-adaption layering method comprises the following steps:
according to the edge gradient amplitude and gradient direction of the template image, performing non-maximum value inhibition treatment on pixel points in the template image; and adaptively determining a hysteresis threshold of the template image according to the gradient amplitude of the template image;
According to a hysteresis threshold of the template image, performing edge point division processing on the template image subjected to non-maximum value inhibition processing to obtain a template edge contour image of the template image;
pyramid self-adaptive layering is carried out on the template image based on the template edge contour points in the template edge contour image of the template image to obtain a plurality of layered template images;
the template image is subjected to pyramid self-adaptive layering by the template edge contour point number in the template edge contour image based on the template image to obtain a plurality of layered template images, and the method comprises the following steps:
pyramid layering is carried out on the input template image, and the number of edge contour points after layering is counted;
and stopping pyramid layering if the number of the edge contour points after layering is smaller than a preset pyramid top layer edge point threshold value, taking the upper layer of image as a pyramid top layer, and determining the pyramid layering number.
2. The method of claim 1, wherein determining the set of edge profile information to be searched for the image to be searched comprises:
pyramid layering is carried out on the images to be searched according to the pyramid layering number of the template images, so that a plurality of layered images to be searched are obtained;
extracting corresponding edge contour information to be searched from each layered image to be searched to construct an edge contour information set to be searched of the image to be searched;
The edge contour information comprises the gravity center position of the contour, the pixel position of an edge contour point relative to the gravity center of the contour, the gradient amplitude of the edge contour point, and the transverse gradient and the longitudinal gradient of the edge contour point.
3. The method of claim 2, wherein extracting corresponding edge profile information to be searched from each layered image to be searched comprises:
according to the gradient amplitude and gradient direction of each layered image to be searched, performing non-maximum suppression on pixel points in the image to be searched;
and carrying out edge point dividing processing on the non-maximum value suppressed image to be searched to obtain an edge contour image to be searched of the image to be searched, so as to obtain corresponding edge contour information to be searched.
4. The method of claim 1, wherein matching the template edge contours of the current hierarchy to similarity on the edge contours to be searched of the current hierarchy based on the contour matching search information of the current hierarchy, comprises:
based on the contour matching search information of the current hierarchy, respectively sliding and traversing the template edge contours of the current hierarchy with different scales and different rotation angles on the edge contours to be searched corresponding to the current hierarchy;
Calculating the similarity between the template edge profiles of different scales and the edge profiles to be searched by using the sliding traversal of the template edge profiles of different rotation angles of different scales respectively;
and determining the contour matching result of the current hierarchical lower template edge contour on the edge contour to be searched based on the calculated similarity between the template edge contour with different scales and different rotation angles and the edge contour to be searched.
5. The method of claim 4, wherein:
under the condition that the current layering is the pyramid top layer, the contour matching search information of the current layering comprises the whole contour area of the edge contour to be searched as a region to be matched, and the scale range and the angle range which are set in an initialized mode are respectively used as a scale range to be matched and an angle range to be matched;
and under the condition that the current hierarchy is a non-pyramid top layer, the contour matching search information of the current hierarchy comprises mapping and determining the contour of the edge to be searched corresponding to the current hierarchy according to a preset matching search mapping mode according to the contour matching result in the contour of the edge to be searched corresponding to the previous hierarchy.
6. The method of claim 5, wherein the matching search mapping mode maps the contour matching result obtained on the edge contour to be searched of an upper layer into the contour matching search information used for similarity matching on the edge contour to be searched of the layer below it.
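The claims do not fix the exact mapping formula, but a plausible reading (assumed here, not stated in the patent) is: double the centre-of-gravity coordinates, since each lower pyramid layer has twice the resolution, and narrow the scale and angle ranges to small neighbourhoods around the matched values. A sketch with illustrative names:

```python
def map_search_info(result, angle_step, scale_margin=0.02, angle_margin=2, radius=2):
    """Map an upper-layer contour matching result to the search info of the
    next (finer) layer. result = (cx, cy, scale, angle): centre of gravity,
    scale and rotation angle of the best match on the upper layer."""
    cx, cy, scale, angle = result
    return {
        # coordinates double going down one pyramid layer; search only a
        # small window around the mapped centre of gravity
        "region": (2 * cx - radius, 2 * cy - radius,
                   2 * cx + radius, 2 * cy + radius),
        # scale and angle are already roughly known; refine locally
        "scale_range": (scale - scale_margin, scale + scale_margin),
        "angle_range": (angle - angle_margin * angle_step,
                        angle + angle_margin * angle_step),
    }
```

This is what makes the coarse-to-fine scheme fast: the bottom layer is never searched exhaustively, only in the small mapped neighbourhood.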
7. The method according to claim 1, further comprising:
when the next layer is determined to be the second-to-bottom pyramid layer, directly taking the scale to be matched in the contour matching search information of the current layer as the scale to be matched in the contour matching search information of the next layer, thereby cutting off the scale mapping adjustment for the lower layers early.
8. The method of claim 1, wherein, when the current layer is determined not to be the top pyramid layer, the angle range to be matched in the contour matching search information of the current layer is refined into a plurality of angles to be matched, selected from the angle range to be matched at an angle step determined by the number of pyramid layers.
9. The method of claim 4, wherein, when the current layer is determined not to be the top pyramid layer, sliding and traversing the template edge contours of the current layer at different scales and different rotation angles over the corresponding edge contour to be searched of the current layer comprises:
controlling the template edge contours of the current layer at the different scales and different rotation angles to slide over the edge contour to be searched of the current layer at intervals, according to the edge contour point interval number corresponding to the current layer.
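The claim text leaves the exact interval scheme open; one reading is that the sliding positions themselves are visited with a stride of several pixels rather than one. A trivial sketch of that reading, with illustrative names:

```python
def strided_positions(region, interval):
    """Generate sliding positions over region = (x0, y0, x1, y1), stepping
    by `interval` pixels instead of one -- an interval sliding traversal
    that trades a small localisation error (recovered on finer layers)
    for a large reduction in evaluated positions."""
    x0, y0, x1, y1 = region
    for cy in range(y0, y1, interval):
        for cx in range(x0, x1, interval):
            yield (cx, cy)
```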
10. An image matching apparatus, the apparatus comprising:
a template information determining module, configured to determine a template edge contour information set of a template image; wherein the template edge contour information set characterizes multi-scale, multi-rotation-angle template edge contours extracted from the template image under different pyramid layers;
the template information determining module is further configured to perform adaptive pyramid layering on the template image to obtain a plurality of layered template images, perform scale configuration on each layered template image, and perform multi-rotation-angle configuration on the scaled template images;
and to extract corresponding template edge contour information from each multi-scale, multi-rotation-angle layered template image set, so as to construct the template edge contour information set of the template image;
wherein the edge contour information comprises the position of the contour's center of gravity, the pixel positions of the edge contour points relative to that center of gravity, the gradient magnitudes of the edge contour points, and the horizontal and vertical gradients of the edge contour points;
wherein performing adaptive pyramid layering on the template image to obtain the plurality of layered template images comprises:
performing non-maximum suppression on the pixels of the template image according to its edge gradient magnitudes and gradient directions, and adaptively determining a hysteresis threshold for the template image according to its gradient magnitudes;
performing edge point division on the non-maximum-suppressed template image according to the hysteresis threshold, to obtain a template edge contour image of the template image;
and performing adaptive pyramid layering on the template image based on the number of template edge contour points in its template edge contour image, to obtain the plurality of layered template images;
wherein performing adaptive pyramid layering based on the number of template edge contour points comprises:
performing pyramid layering on the input template image and counting the number of edge contour points after each layering;
and stopping the pyramid layering when the number of edge contour points after a layering is smaller than a preset top-layer edge point threshold, taking the image of the layer above as the pyramid top layer, and thereby determining the number of pyramid layers;
a to-be-searched information determining module, configured to determine an edge contour information set to be searched of an image to be searched; wherein the edge contour information set to be searched characterizes the edge contours to be searched extracted from the image to be searched under different pyramid layers, according to the number of pyramid layers of the template image;
a contour similarity matching module, configured to, for the top-to-bottom pyramid layered structure, perform similarity matching of the template edge contour of the current layer on the edge contour to be searched of the current layer according to the contour matching search information of the current layer, so as to obtain a contour matching result of the current layer;
a contour lower-layer matching mapping module, configured to determine, according to the contour matching result of the current layer, the contour matching search information to be used when performing similarity matching of the template edge contour of the next layer on the corresponding edge contour to be searched of the next layer, and to jump to the next layer for similarity matching until the bottom pyramid layer is reached;
and an image recognition module, configured to recognize, in the image to be searched, the target shape indicated by the template image according to the contour matching result obtained when the similarity matching at the bottom pyramid layer finishes;
wherein the contour matching search information comprises a region to be matched, a scale range to be matched, and an angle range to be matched for the similarity matching of the template edge contour; and the contour matching result comprises the position of the center of gravity, the scale, and the rotation angle of the template edge contour when it matches the edge contour to be searched.
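The adaptive pyramid layering of claim 10 can be sketched as follows. For brevity the edge count here uses a plain gradient-magnitude threshold as a stand-in for the full NMS + adaptive-hysteresis extraction described above; names and the decimation scheme are illustrative:

```python
import numpy as np

def count_edge_points(img, thresh=50.0):
    """Rough edge-point count from gradient magnitude (a simplified
    stand-in for the NMS + hysteresis edge extraction of claim 10)."""
    gy, gx = np.gradient(img.astype(float))
    return int(np.count_nonzero(np.hypot(gx, gy) > thresh))

def adaptive_pyramid(img, top_edge_thresh=64, max_layers=8):
    """Halve the resolution until the edge-point count of the next layer
    would fall below the top-layer threshold; the last kept layer becomes
    the pyramid top, which fixes the number of pyramid layers."""
    layers = [np.asarray(img)]
    for _ in range(max_layers - 1):
        nxt = layers[-1][::2, ::2]            # naive 2x decimation
        if count_edge_points(nxt) < top_edge_thresh:
            break                              # previous layer is the top
        layers.append(nxt)
    return layers
```

Stopping on the edge-point count, rather than on a fixed layer number, keeps enough contour points on the top layer for the coarse match to remain discriminative.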
11. An electronic device, comprising:
one or more processing devices;
a storage means for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processing devices, the one or more processing devices are caused to implement the image matching method of any one of claims 1-9.
12. A computer-readable storage medium having a computer program stored thereon, characterized in that the program, when executed by a processing device, implements the image matching method of any one of claims 1-9.
CN202110209671.9A 2021-02-24 2021-02-24 Image matching method, device, electronic equipment and storage medium Active CN113159103B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110209671.9A CN113159103B (en) 2021-02-24 2021-02-24 Image matching method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113159103A CN113159103A (en) 2021-07-23
CN113159103B true CN113159103B (en) 2023-12-05

Family

ID=76883883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110209671.9A Active CN113159103B (en) 2021-02-24 2021-02-24 Image matching method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113159103B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113869441B (en) * 2021-10-10 2022-09-27 青岛星科瑞升信息科技有限公司 Multi-scale target positioning method based on template matching
CN114792373B (en) * 2022-04-24 2022-11-25 广东天太机器人有限公司 Visual identification spraying method and system of industrial robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101872475A (en) * 2009-04-22 2010-10-27 中国科学院自动化研究所 Method for automatically registering scanned document images
CN102073874A (en) * 2010-12-29 2011-05-25 中国资源卫星应用中心 Geometric constraint-attached spaceflight three-line-array charged coupled device (CCD) camera multi-image stereo matching method
CN102654902A (en) * 2012-01-16 2012-09-05 江南大学 Contour vector feature-based embedded real-time image matching method
CN105930858A (en) * 2016-04-06 2016-09-07 吴晓军 Fast high-precision geometric template matching method enabling rotation and scaling functions
CN110378376A (en) * 2019-06-12 2019-10-25 西安交通大学 A kind of oil filler object recognition and detection method based on machine vision
WO2021017361A1 (en) * 2019-07-31 2021-02-04 苏州中科全象智能科技有限公司 Template matching algorithm based on edge and gradient feature
CN112396640A (en) * 2020-11-11 2021-02-23 广东拓斯达科技股份有限公司 Image registration method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Medical image registration based on wavelet pyramid and contour features; Zhang Shi; Tang Min; Dong Jianwei; Computer Simulation (No. 05); full text *
High-performance template matching algorithm based on edge geometric features; Wu Xiaojun; Zou Guanghua; Chinese Journal of Scientific Instrument (No. 07); full text *

Similar Documents

Publication Publication Date Title
Mukhopadhyay et al. A survey of Hough Transform
JP6216508B2 (en) Method for recognition and pose determination of 3D objects in 3D scenes
US9141871B2 (en) Systems, methods, and software implementing affine-invariant feature detection implementing iterative searching of an affine space
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
WO2017219391A1 (en) Face recognition system based on three-dimensional data
WO2019169635A1 (en) Object recognition
CN106981077B (en) Infrared image and visible light image registration method based on DCE and LSS
US9619733B2 (en) Method for generating a hierarchical structured pattern based descriptor and method and device for recognizing object using the same
WO2022179002A1 (en) Image matching method and apparatus, electronic device, and storage medium
CN113111212B (en) Image matching method, device, equipment and storage medium
CN112348836B (en) Method and device for automatically extracting building outline
CN111476251A (en) Remote sensing image matching method and device
CN113159103B (en) Image matching method, device, electronic equipment and storage medium
CN108388902B (en) Composite 3D descriptor construction method combining global framework point and local SHOT characteristics
CN112926592B (en) Trademark retrieval method and device based on improved Fast algorithm
CN111310688A (en) Finger vein identification method based on multi-angle imaging
CN111783722B (en) Lane line extraction method of laser point cloud and electronic equipment
JPWO2012070474A1 (en) Information representation method of object or shape
CN113420648B (en) Target detection method and system with rotation adaptability
Sahin et al. Iterative hough forest with histogram of control points for 6 dof object registration from depth images
CN104268550A (en) Feature extraction method and device
Carrilho et al. Extraction of building roof planes with stratified random sample consensus
Liang et al. Sketch-based retrieval using content-aware hashing
CN117080142B (en) Positioning method for center point of alignment mark and wafer bonding method
CN113658235B (en) Accurate registration method of optical remote sensing image based on VGG network and Gaussian difference network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant