CN109544509B - Workpiece positioning method and device based on secondary template matching and storage device - Google Patents


Publication number
CN109544509B
Authority
CN
China
Prior art keywords
image
workpiece
image block
coordinate
template
Prior art date
Legal status
Active
Application number
CN201811231405.0A
Other languages
Chinese (zh)
Other versions
CN109544509A (en)
Inventor
陈鑫
曹卫华
谭畅
刘振焘
刘勇
张浩阳
Current Assignee
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date
Filing date
Publication date
Application filed by China University of Geosciences
Priority to CN201811231405.0A
Publication of CN109544509A
Application granted
Publication of CN109544509B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a workpiece positioning method, device and storage device based on secondary template matching. The method comprises: acquiring a workpiece image with a calibrated camera and then correcting the image; locating the characteristic region of the workpiece through two rounds of template matching; converting the two-dimensional coordinates of the characteristic region's centroid into three-dimensional coordinates in the original image through a series of steps; and finally calculating the three-dimensional coordinates of the centroid of the workpiece to be positioned by solid geometry, completing the positioning. A workpiece positioning device and a storage device based on secondary template matching implement the method. The beneficial effects of the invention are: matching templates twice improves the positioning speed and accuracy; the method is suitable for most workpieces, can be continuously adjusted and optimized for actual conditions through repeated template segmentation, and is highly practical.

Description

Workpiece positioning method and device based on secondary template matching and storage device
Technical Field
The invention relates to the field of image recognition, in particular to a workpiece positioning method and device based on secondary template matching and a storage device.
Background
Currently, machine vision template matching techniques are widely used across industrial sites. Existing template matching methods fall into three categories: matching based on gray values, matching based on geometric features, and matching based on gradient direction. Gray-value-based matching depends on a good illumination environment and does not tolerate rotation; geometric-feature-based matching places higher demands on image quality, but its results are more robust to changes such as illumination and rotation; gradient-direction-based matching can mistake surface defects for noise, resulting in a low matching rate.
Some workpieces are located outdoors, where the illumination is unstable, so gray-value-based template matching is difficult to apply to their positioning. Meanwhile, workpiece surfaces are usually stained to varying degrees by years of use, which limits the application of gradient-direction-based template matching. Geometric-feature-based matching can effectively address both problems, but using it directly leads to an excessive amount of computation and low precision in template matching.
A prior-art chip positioning method based on template matching (classification G06K9/00 (2006.01)) specifically comprises: step 1, making a template; step 2, preprocessing the picture to be positioned to increase the contrast between the background and the chip matrix; step 3, segmenting the preprocessed picture into blob blocks, using the blob area and the side lengths of each blob's minimum circumscribed rectangle to eliminate chips with joined-crystal and missing-material defects, and obtaining the center position coordinates of the minimum circumscribed rectangles of the remaining blobs together with the angle between the short edge and the horizontal direction; step 4, matching the chip template on the picture to be positioned at the centers and angles obtained in step 3, locating the position and angle of the chip. The method is mainly suited to positioning chips during chip manufacture; by screening before matching, it can quickly and accurately locate qualified chips and eliminate defective ones.
A prior-art MELF component positioning and detection method based on template matching (main classification G06K9/62 (2006.01), further classification G06T7/00 (2006.01)) comprises: establishing template images over a range of angles; obtaining a reduced component image; obtaining distance-transform images of the reduced component image and of the original component image; obtaining the final best-matching template image and best-matching position; extracting key edge points from the edge image containing interference points and forming a minimum circumscribed rectangle; setting an offset from the minimum circumscribed rectangle and counting the number of interior non-zero pixels; concluding that the component position is correct and that the component length and width are within tolerance; and finishing the positioning and detection process and outputting the component position information.
Disclosure of Invention
In order to solve the above problems, the present invention provides a workpiece positioning method, device and storage device based on secondary template matching. The workpiece positioning method based on secondary template matching mainly comprises the following steps:
S101: calibrating the camera by the Zhang Zhengyou calibration method, and completing image acquisition of the workpiece to be positioned with the calibrated camera to obtain a qualified first workpiece image to be positioned;
S102: carrying out image correction on the first workpiece image to be positioned by an image self-correction method to obtain a corrected second workpiece image to be positioned;
S103: according to a preset first template, screening out a first image block corresponding to the first template from the corrected second workpiece image to be positioned by a template matching algorithm, to obtain the center position coordinates of the first image block in the second workpiece image to be positioned; the first template is all or part of an image of the workpiece to be positioned; the area corresponding to the first image block on the second workpiece image to be positioned is the target area of the workpiece to be positioned;
S104: according to a preset second template, screening out a second image block corresponding to the second template from the first image block by a template matching algorithm, to obtain the center position coordinates of the second image block in the first image block; the second template is a part of the first template and is a local characteristic region of the workpiece image to be positioned;
S105: calculating the center position coordinates of the second image block in the second workpiece image to be positioned by a linear coordinate conversion method, according to the center position coordinates of the second image block in the first image block;
S106: calculating the circle center coordinates of a fitted arc of the second image block by an edge fitting algorithm, according to the center position coordinates of the second image block in the second workpiece image to be positioned; and calculating the centroid coordinates of the second image block by a stereo matching method, according to the circle center coordinates of the fitted arc; the circle center coordinates are two-dimensional coordinates in the image coordinate system, and the centroid coordinates are three-dimensional coordinates in the world coordinate system;
S107: calculating the three-dimensional coordinates of the workpiece centroid according to the centroid coordinates of the second image block, completing the workpiece positioning.
Further, in step S102, the self-correction method adopts a method of combining global self-correction and local self-correction.
Further, in step S103, the template matching algorithm adopts a geometric feature-based template segmentation matching algorithm.
Further, in step S105, the formula for calculating the center position coordinates (s_bui, s_bvi) of the second image block in the second workpiece image to be positioned by linear coordinate conversion is formula (1):
s_bui = t_bu - W_T/2 + s_tui,   s_bvi = t_bv - H_T/2 + s_tvi   (1)
wherein (s_bui, s_bvi) are the center position coordinates of the second image block in the second workpiece image to be positioned, (t_bu, t_bv) are the center coordinates of the first image block in the second workpiece image to be positioned, (s_tui, s_tvi) are the center position coordinates of the second image block in the first image block, W_T and H_T are the width and height of the first template, i = 1, 2, …, n, and n is the number of second image blocks.
Further, in step S106, the steps of calculating the center coordinates of the fitted arc of the second image block by an edge fitting algorithm are:
S201: draw a circle centered at (s_bui, s_bvi) with radius r to obtain a circular region, r being a preset value;
S202: perform the arc fitting operation within the circular region to obtain the center O_i of the fitted arc, i = 1, 2, …, n, n being the number of second image blocks;
S203: determine the coefficient values of the ellipse equation on which the fitted arc lies, the ellipse equation being formula (2):
Ax² + Bxy + Cy² + Dx + Ey + F = 0   (2)
wherein A, B, C, D, E, F are the coefficients of the ellipse equation on which the fitted arc lies, calculated as follows: discretize the circle centered at (s_bui, s_bvi) with radius r, randomly take 5 different coordinate points on it, and solve for the coefficients A, B, C, D, E, F by the least square method; the least-squares objective is formula (3):
f(A, B, C, D, E, F) = Σ_j (A·x_j² + B·x_j·y_j + C·y_j² + D·x_j + E·y_j + F)²   (3)
wherein (x_j, y_j) are the coordinates of the discrete points on the circle centered at (s_bui, s_bvi) with radius r, j running over the sampled points;
S204: from the coefficient values A, B, C, D, E, F, calculate the center coordinates (X_0, Y_0) of the fitted arc by formula (4):
X_0 = (BE - 2CD) / (4AC - B²),   Y_0 = (BD - 2AE) / (4AC - B²)   (4)
Further, in step S106, according to the center coordinates of the fitted arc of the second image block, the formula for calculating the centroid coordinates P(x, y, z) of the second image block by the stereo matching method is formula (5):
AP = b   (5)
in the formula,
A = [ u_l·m_l(3,1) - m_l(1,1)   u_l·m_l(3,2) - m_l(1,2)   u_l·m_l(3,3) - m_l(1,3) ]
    [ v_l·m_l(3,1) - m_l(2,1)   v_l·m_l(3,2) - m_l(2,2)   v_l·m_l(3,3) - m_l(2,3) ]
    [ u_r·m_r(3,1) - m_r(1,1)   u_r·m_r(3,2) - m_r(1,2)   u_r·m_r(3,3) - m_r(1,3) ]
    [ v_r·m_r(3,1) - m_r(2,1)   v_r·m_r(3,2) - m_r(2,2)   v_r·m_r(3,3) - m_r(2,3) ]
P = [x y z]^T
b = [ m_l(1,4) - u_l·m_l(3,4),  m_l(2,4) - v_l·m_l(3,4),  m_r(1,4) - u_r·m_r(3,4),  m_r(2,4) - v_r·m_r(3,4) ]^T
wherein (u_l, v_l) and (u_r, v_r) are the pixel coordinates of the fitted center in the left and right images, m_l(j,k) and m_r(j,k) are the elements of M_l and M_r, and M_l and M_r, the projection matrices of the left and right cameras, are known quantities.
Further, in step S107, the formula for calculating the centroid coordinates of the workpiece from the centroid coordinates of the second image blocks is formula (6):
x = (1/n)·Σ_{i=1..n} x_i,   y = (1/n)·Σ_{i=1..n} y_i   (6)
in the above formula, (x, y, z) are the centroid coordinates of the workpiece, (x_i, y_i, z) are the centroid coordinates of the second image blocks (all lying at the common depth z, which is taken directly as the depth of the workpiece centroid), i = 1, 2, …, n, and n is the number of second image blocks.
A storage device stores instructions and data for implementing a secondary template matching-based workpiece positioning method.
A workpiece positioning apparatus based on secondary template matching, comprising: a processor and the storage device; the processor loads and executes the instructions and data in the storage device to realize a workpiece positioning method based on secondary template matching.
The beneficial effects of the technical scheme provided by the invention are: matching templates twice improves the positioning speed and accuracy; the method is suitable for most workpieces, can be continuously adjusted and optimized for actual conditions through repeated template segmentation, and is highly practical.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flow chart of a method for positioning a workpiece based on quadratic template matching according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the hardware fitting panel template in an embodiment of the invention;
FIG. 3 is a schematic diagram of coordinate positions during linear coordinate conversion in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the operation of the hardware device in an embodiment of the present invention.
Detailed Description
For a more clear understanding of the technical features, objects and effects of the present invention, embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
The embodiment of the invention provides a workpiece positioning method and device based on secondary template matching and a storage device.
Referring to fig. 1, fig. 1 is a flowchart of a workpiece positioning method based on secondary template matching according to an embodiment of the present invention, which specifically includes the following steps:
S101: calibrating the camera by the Zhang Zhengyou calibration method, and completing image acquisition of the workpiece to be positioned with the calibrated camera to obtain a qualified first workpiece image to be positioned; the camera comprises a left camera and a right camera; the first workpiece image to be positioned comprises a first left image to be positioned acquired by the left camera and a first right image to be positioned acquired by the right camera;
S102: carrying out image correction on the qualified first workpiece image to be positioned by an image self-correction method to obtain a corrected second workpiece image to be positioned; the second workpiece image to be positioned comprises a second left image to be positioned and a second right image to be positioned;
S103: according to a preset first template, screening out a first image block corresponding to the first template from the corrected second workpiece image to be positioned by a template matching algorithm, to obtain the center position coordinates of the first image block in the second workpiece image to be positioned; the first template is all or part of an image of the workpiece to be positioned; the area corresponding to the first image block on the second workpiece image to be positioned is the target area of the workpiece to be positioned;
S104: according to a preset second template, screening out a second image block corresponding to the second template from the first image block by a template matching algorithm, to obtain the position coordinates of the second image block in the first image block; the second template is a part of the first template and is a local characteristic region of the workpiece image to be positioned;
S105: calculating the position coordinates of the second image block in the second workpiece image to be positioned by a linear coordinate conversion method, according to the position coordinates of the second image block in the first image block;
S106: calculating the circle center coordinates of a fitted arc of the second image block by an edge fitting algorithm, according to the position coordinates of the second image block in the second workpiece image to be positioned; and calculating the centroid coordinates of the second image block by a stereo matching method, according to the circle center coordinates of the fitted arc; the circle center coordinates are two-dimensional coordinates in the image coordinate system, and the centroid coordinates are three-dimensional coordinates in the world coordinate system;
S107: calculating the centroid coordinates of the workpiece according to the centroid coordinates of the second image block, completing the workpiece positioning.
In step S102, the self-calibration method adopts a method of combining global self-calibration and local self-calibration.
In step S103, the template matching algorithm adopts a geometric feature-based template segmentation matching algorithm.
In step S105, the position coordinates (s_bui, s_bvi) of the second image block in the second workpiece image to be positioned are calculated by linear coordinate conversion according to formula (1):
s_bui = t_bu - W_T/2 + s_tui,   s_bvi = t_bv - H_T/2 + s_tvi   (1)
wherein (s_bui, s_bvi) are the center position coordinates of the second image block in the second workpiece image to be positioned, (t_bu, t_bv) are the center coordinates of the first image block in the second workpiece image to be positioned, (s_tui, s_tvi) are the center position coordinates of the second image block in the first image block, W_T and H_T are the width and height of the first template, i = 1, 2, …, n, and n is the number of second image blocks.
In step S106, the steps of calculating the center coordinates of the fitted arc of the second image block by an edge fitting algorithm are as follows:
S201: draw a circle centered at (s_bui, s_bvi) with radius r to obtain a circular region, r being a preset value;
S202: perform the arc fitting operation within the circular region to obtain the center O_i of the fitted arc, i = 1, 2, …, n, n being the number of second image blocks;
S203: determine the coefficient values of the ellipse equation on which the fitted arc lies, the ellipse equation being formula (2):
Ax² + Bxy + Cy² + Dx + Ey + F = 0   (2)
wherein A, B, C, D, E, F are the coefficients of the ellipse equation on which the fitted arc lies, calculated as follows: discretize the circle centered at (s_bui, s_bvi) with radius r, randomly take 5 different coordinate points on it, and solve for the coefficients A, B, C, D, E, F by the least square method; the least-squares objective is formula (3):
f(A, B, C, D, E, F) = Σ_j (A·x_j² + B·x_j·y_j + C·y_j² + D·x_j + E·y_j + F)²   (3)
wherein (x_j, y_j) are the coordinates of the discrete points on the circle centered at (s_bui, s_bvi) with radius r, j running over the sampled points;
S204: from the coefficient values A, B, C, D, E, F, calculate the center coordinates (X_0, Y_0) of the fitted arc by formula (4):
X_0 = (BE - 2CD) / (4AC - B²),   Y_0 = (BD - 2AE) / (4AC - B²)   (4)
In step S106, according to the center coordinates of the fitted arc of the second image block, the centroid coordinates P(x, y, z) of the second image block are calculated by the stereo matching method from formula (5):
AP = b   (5)
in the formula,
A = [ u_l·m_l(3,1) - m_l(1,1)   u_l·m_l(3,2) - m_l(1,2)   u_l·m_l(3,3) - m_l(1,3) ]
    [ v_l·m_l(3,1) - m_l(2,1)   v_l·m_l(3,2) - m_l(2,2)   v_l·m_l(3,3) - m_l(2,3) ]
    [ u_r·m_r(3,1) - m_r(1,1)   u_r·m_r(3,2) - m_r(1,2)   u_r·m_r(3,3) - m_r(1,3) ]
    [ v_r·m_r(3,1) - m_r(2,1)   v_r·m_r(3,2) - m_r(2,2)   v_r·m_r(3,3) - m_r(2,3) ]
P = [x y z]^T
b = [ m_l(1,4) - u_l·m_l(3,4),  m_l(2,4) - v_l·m_l(3,4),  m_r(1,4) - u_r·m_r(3,4),  m_r(2,4) - v_r·m_r(3,4) ]^T
wherein (u_l, v_l) and (u_r, v_r) are the pixel coordinates of the fitted center in the left and right images, m_l(j,k) and m_r(j,k) are the elements of M_l and M_r, and M_l and M_r, the projection matrices of the left and right cameras, are known quantities.
In step S107, the centroid coordinates of the workpiece are calculated from the centroid coordinates of the second image blocks by formula (6):
x = (1/n)·Σ_{i=1..n} x_i,   y = (1/n)·Σ_{i=1..n} y_i   (6)
in the above formula, (x, y, z) are the centroid coordinates of the workpiece, (x_i, y_i, z) are the centroid coordinates of the second image blocks (all lying at the common depth z, which is taken directly as the depth of the workpiece centroid), i = 1, 2, …, n, and n is the number of second image blocks.
To describe the technical scheme of the invention more concretely, the scheme is applied to positioning the tube busbar fitting panel of a substation, demonstrating its feasibility:
according to the steps of the method, the calibration of the camera is completed firstly. The camera calibration is mainly used for acquiring internal parameters of the camera, including the focal length f of the camera, the distortion coefficient k and the pixel size dx、dyAnd image principal point cx、cy. The calibration of the binocular camera (the camera includes a left camera and a right camera) needs to solve the internal reference of the left camera and the right camera, and also needs to solve the relative pose relationship between the left camera and the right camera, including a rotation matrix R and a translation vector T. During calibration, a calibration picture is collected firstly. In order to acquire more accurate calibration data, the number of the acquired pictures is 15-20, and the calibration plate for acquiring the pictures basically covers all binocular vision. And after the calibration picture is acquired, completing camera calibration by using a Zhangyingyou calibration method, and solving the relation between the internal parameters of the camera and the relative pose between the two cameras.
After camera calibration is completed, a workpiece image B is collected and self-corrected to improve image quality and facilitate subsequent template matching. To improve the accuracy and efficiency of template matching, a template segmentation matching algorithm based on geometric features is adopted to complete identification of the busbar fitting panel. The specific steps are as follows:
(1) Create the hardware fitting panel template shown in FIG. 2 and pre-store it in the program. The hardware panel template is the first template in the technical scheme of the invention.
(2) Perform template segmentation matching on geometric features. Search the whole image for the hardware panel template and record the coordinates of the template center point on the whole image, B_T(t_bu, t_bv).
Then, taking the screw holes on the fitting panel as local features, create a screw hole template (the screw hole template is the second template in the technical scheme of the invention). Search the busbar fitting panel template image found in the first pass for the screw hole template, and record the position coordinates of the screw hole template centers in the panel template image: T_S1(s_tu1, s_tv1), T_S2(s_tu2, s_tv2), T_S3(s_tu3, s_tv3), T_S4(s_tu4, s_tv4).
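The two matching passes above can be sketched as follows. The patent uses geometric-feature matching; as a compact stand-in (an assumption on our part), the sketch below uses normalized cross-correlation, which preserves the two-pass structure: match the panel template in the full image, then match the screw-hole template inside the returned block.

```python
import numpy as np

def match_template(image, template):
    """Exhaustive normalized cross-correlation search over grayscale
    arrays. Returns the (row, col) of the best match's top-left corner
    and the matched image block."""
    H, W = image.shape
    h, w = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -2.0, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            win = image[r:r + h, c:c + w]
            wz = win - win.mean()
            denom = np.sqrt((wz * wz).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    r, c = best_pos
    return best_pos, image[r:r + h, c:c + w]
```

A first call on the whole image yields the panel block; a second call on that block yields each screw-hole position in block-local coordinates, which step S105 then maps back to the full image.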
Convert the screw hole template center coordinates under the fitting panel template image into coordinates on the whole image by linear coordinate conversion: B_S1(s_bu1, s_bv1), B_S2(s_bu2, s_bv2), B_S3(s_bu3, s_bv3), B_S4(s_bu4, s_bv4). The steps of the linear coordinate conversion are as follows:
The original image has size W_B × H_B and the fitting panel template has size W_T × H_T. The center coordinates of the screw hole templates matched within the panel template image are T_S1(s_tu1, s_tv1), T_S2(s_tu2, s_tv2), T_S3(s_tu3, s_tv3), T_S4(s_tu4, s_tv4); linear coordinate conversion maps them to center coordinates in the original image, B_S1(s_bu1, s_bv1), B_S2(s_bu2, s_bv2), B_S3(s_bu3, s_bv3), B_S4(s_bu4, s_bv4), as shown in FIG. 3.
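The linear coordinate conversion just described reduces to one line per axis: the top-left corner of the matched panel template in the full image is its center minus half the template size, and the block-local screw-hole coordinates are offset by that corner. A minimal sketch (function and parameter names are ours), assuming the panel center and local centers are measured as above:

```python
def to_full_image(panel_center, template_size, local_center):
    """Map a point given in panel-template image coordinates (origin at
    the template's top-left corner) to full-image coordinates."""
    t_bu, t_bv = panel_center   # matched panel center in the full image
    W_T, H_T = template_size    # panel template width and height
    s_tu, s_tv = local_center   # screw-hole center inside the template
    return (t_bu - W_T / 2 + s_tu, t_bv - H_T / 2 + s_tv)
```

For example, a panel matched at center (100, 80) with a 40 × 20 template maps the local point (10, 5) to (90.0, 75.0).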
Take each of the solved points B_S1, B_S2, B_S3, B_S4 as a circle center and draw a circle with a designated radius r, obtaining 4 circular regions. Fit an arc within each circular region to obtain the arc centers O_1, O_2, O_3, O_4. The arc fitting process is as follows:
the equation of the ellipse can be expressed as formula (7):
Ax2+Bxy+Cy2+Dx+Ey+F=0 (7)
according to the above equation, only 5 points on the contour edge are needed to determine the ellipse parameters. For the entire edge curve, the least squares method may be used for the calculation. The above equation is directly applied to carry out least square processing on the discrete points after edge detection, so that each coefficient in the equation can be obtained, and the calculation formula is as the formula (8):
Figure BDA0001837253440000091
then, based on the extreme value principle, to minimize the value of f (a, B, C, D, E), the formula (9) is obtained:
Figure BDA0001837253440000092
from this, a system of linear equations is obtained, and then the values of the coefficients A, B, C, D, E, F are obtained by solving the system of linear equations.
The coordinates of the center of the ellipse (X) can be solved according to the value of A, B, C, D, E, F0,Y0) As in equation (10):
Figure BDA0001837253440000101
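Formulas (7) to (10) can be checked numerically. The sketch below is our own minimal version; it fixes F = 1 to remove the conic's scale ambiguity (a choice the patent does not spell out), fits the remaining coefficients by least squares, and recovers the center via formula (10):

```python
import numpy as np

def fit_ellipse_center(xs, ys):
    """Least-squares fit of A x^2 + B xy + C y^2 + D x + E y + 1 = 0
    to edge points, then the conic center from formula (10)."""
    M = np.column_stack([xs * xs, xs * ys, ys * ys, xs, ys])
    # Solve M [A B C D E]^T = -1 in the least-squares sense (F fixed to 1).
    A, B, C, D, E = np.linalg.lstsq(M, -np.ones(len(xs)), rcond=None)[0]
    det = 4 * A * C - B * B
    return ((B * E - 2 * C * D) / det, (B * D - 2 * A * E) / det)
```

On points sampled from a circle of center (3, 4) and radius 2, the recovered center is (3, 4) to numerical precision.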
because the left image and the right image respectively obtain 4 fitted circle centers, after the 4 circle centers of the left image and the right image are matched by using a stereo matching method, the three-dimensional coordinates of the 4 circle centers can be obtained by using a least square method.
The three-dimensional coordinates are extracted as follows:
Assume the center of the circular hole on the busbar fitting surface is point P(x, y, z), and its imaging coordinates on the two camera planes are P_l(u_l, v_l) and P_r(u_r, v_r). From the pinhole imaging model, formula (11):
z_l·[u_l, v_l, 1]^T = M_l·[x, y, z, 1]^T,   z_r·[u_r, v_r, 1]^T = M_r·[x, y, z, 1]^T   (11)
wherein M_l and M_r are the projection matrices of the left and right cameras respectively. Eliminating z_l and z_r from the above gives formula (12):
AP = b   (12)
in the above formula:
A = [ u_l·m_l(3,1) - m_l(1,1)   u_l·m_l(3,2) - m_l(1,2)   u_l·m_l(3,3) - m_l(1,3) ]
    [ v_l·m_l(3,1) - m_l(2,1)   v_l·m_l(3,2) - m_l(2,2)   v_l·m_l(3,3) - m_l(2,3) ]
    [ u_r·m_r(3,1) - m_r(1,1)   u_r·m_r(3,2) - m_r(1,2)   u_r·m_r(3,3) - m_r(1,3) ]
    [ v_r·m_r(3,1) - m_r(2,1)   v_r·m_r(3,2) - m_r(2,2)   v_r·m_r(3,3) - m_r(2,3) ]
P = [x y z]^T
b = [ m_l(1,4) - u_l·m_l(3,4),  m_l(2,4) - v_l·m_l(3,4),  m_r(1,4) - u_r·m_r(3,4),  m_r(2,4) - v_r·m_r(3,4) ]^T
wherein m_l(j,k) and m_r(j,k) are the elements of M_l and M_r. The three-dimensional coordinates of point P in the world coordinate system follow from the least square method, formula (13):
P = (A^T·A)^(-1)·A^T·b   (13)
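Formulas (11) to (13) amount to stacking the two rays' linear constraints and solving the normal equations. A minimal sketch (our own; M_l and M_r are assumed to be 3 × 4 NumPy projection matrices from the calibration step):

```python
import numpy as np

def triangulate(M_l, M_r, p_l, p_r):
    """Solve A P = b for P = (x, y, z) via P = (A^T A)^{-1} A^T b, with
    A and b built exactly as in formula (12) from the two projections."""
    rows, rhs = [], []
    for M, (u, v) in ((M_l, p_l), (M_r, p_r)):
        for coeff, m in ((u, M[0]), (v, M[1])):  # u-row, then v-row
            rows.append(coeff * M[2, :3] - m[:3])
            rhs.append(m[3] - coeff * M[2, 3])
    A, b = np.array(rows), np.array(rhs)
    return np.linalg.solve(A.T @ A, A.T @ b)     # formula (13)
```

With noise-free matched points the four stacked equations are consistent, so the least-squares solution is exact; with real fitted circle centers it is the minimum-error estimate.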
In the above formula the matrix A and vector b are known: substituting the calibrated M_l and M_r and the fitted screw hole center pixel coordinates from the corresponding left and right images solves the three-dimensional coordinates R_1, R_2, R_3, R_4 of the 4 screw holes. The centroid coordinates of the fitting panel are then solved from the three-dimensional coordinates of the 4 screw holes as follows:
Let R_1(x_1, y_1, z), R_2(x_2, y_2, z), R_3(x_3, y_3, z), R_4(x_4, y_4, z) be the three-dimensional coordinates of the 4 screw holes, which share the common depth z of the panel plane; the centroid is their mean, formula (14):
O = (R_1 + R_2 + R_3 + R_4) / 4   (14)
The centroid coordinate of the fitting panel is O(x, y, z), where each coordinate value is calculated by formula (15):
x = (x_1 + x_2 + x_3 + x_4) / 4,   y = (y_1 + y_2 + y_3 + y_4) / 4   (15)
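The final averaging step is elementary; a one-function sketch (names ours) over the four triangulated screw-hole centers:

```python
def panel_centroid(hole_centers):
    """Mean of the screw-hole 3-D centers, per formulas (14) and (15)."""
    n = len(hole_centers)
    return tuple(sum(c[k] for c in hole_centers) / n for k in range(3))
```

For a square of holes at depth 2, e.g. (0, 0, 2), (2, 0, 2), (2, 2, 2), (0, 2, 2), the centroid is (1.0, 1.0, 2.0).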
and solving to obtain an accurate three-dimensional coordinate O (x, y, z) of the centroid of the hardware panel, and finishing workpiece positioning.
Referring to fig. 4, fig. 4 is a schematic diagram of a hardware device according to an embodiment of the present invention, where the hardware device specifically includes: a workpiece positioning device 401 based on secondary template matching, a processor 402 and a storage device 403.
A workpiece positioning apparatus 401 based on secondary template matching: the workpiece positioning device 401 based on the secondary template matching realizes the workpiece positioning method based on the secondary template matching.
The processor 402: the processor 402 loads and executes the instructions and data in the storage device 403 to implement the secondary template matching-based workpiece positioning method.
The storage device 403: the storage device 403 stores instructions and data; the storage device 403 is used to implement the workpiece positioning method based on the secondary template matching.
The invention has the beneficial effects that: the two-pass template matching improves positioning speed and accuracy and suits most workpieces; because the template can be segmented repeatedly, the method can be continuously adjusted and optimized to actual conditions, giving it strong practicability.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A workpiece positioning method based on secondary template matching, characterized in that the method comprises the following steps:
s101: calibrating the camera by the Zhang Zhengyou calibration method, and completing image acquisition of the workpiece to be positioned with the calibrated camera to obtain a qualified first workpiece image to be positioned;
s102: carrying out image correction on the first workpiece image to be positioned by adopting an image self-correction method to obtain a corrected second workpiece image to be positioned;
s103: according to a preset first template, screening a first image block corresponding to the first template from the corrected second workpiece image to be positioned by adopting a template matching algorithm to obtain a central position coordinate corresponding to the first image block in the second workpiece image to be positioned; the first template is all or part of an image of a workpiece to be positioned; the corresponding area of the first image block on the second workpiece image to be positioned is the target area of the workpiece to be positioned;
s104: screening a second image block corresponding to a second template from the first image block by adopting a template matching algorithm according to a preset second template to obtain a central position coordinate of the second image block in the first image block; the second template is a part of the first template and is a local characteristic region of the workpiece image to be positioned;
s105: calculating to obtain the center position coordinate of the second image block in the second workpiece image to be positioned by utilizing a linear coordinate conversion method according to the center position coordinate of the second image block in the first image block;
s106: calculating to obtain the center coordinates of a fitting arc of the second image block by adopting an edge fitting algorithm according to the center position coordinates of the second image block in the second workpiece image to be positioned; calculating to obtain the centroid coordinate of the second image block by adopting a stereo matching method according to the center coordinate of the fitting arc of the second image block; the circle center coordinate is a two-dimensional coordinate based on an image coordinate system, and the centroid coordinate is a three-dimensional coordinate based on a world coordinate system;
s107: and calculating to obtain the three-dimensional coordinates of the workpiece centroid according to the centroid coordinates of the second image block, and completing workpiece positioning.
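The two-pass matching of steps S103–S105 can be sketched as below. A brute-force sum-of-squared-differences matcher stands in for whatever matching criterion an implementation would actually use (in OpenCV this would be cv2.matchTemplate); the synthetic image and templates are invented for the example:

```python
import numpy as np

def match_center(image, template):
    """Exhaustive template matching by sum of squared differences (SSD).
    Returns the centre coordinate (x, y) of the best-matching block and
    the matched block itself."""
    H, W = image.shape
    h, w = template.shape
    best_ssd, best_xy = None, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            ssd = np.sum((image[y:y + h, x:x + w] - template) ** 2)
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_xy = ssd, (x, y)
    x, y = best_xy
    return (x + w // 2, y + h // 2), image[y:y + h, x:x + w]

# Synthetic 60x60 image with a distinctive 20x20 patch at (20, 15).
img = np.zeros((60, 60))
img[15:35, 20:40] = np.arange(400.0).reshape(20, 20)
coarse_tpl = img[15:35, 20:40].copy()     # first template (S103)
fine_tpl = coarse_tpl[5:15, 5:15].copy()  # second template (S104)

c1, block = match_center(img, coarse_tpl)    # centre of first image block
c2_local, _ = match_center(block, fine_tpl)  # centre inside first block
# S105: linear conversion of the local centre back to full-image coordinates
top_left = (c1[0] - coarse_tpl.shape[1] // 2, c1[1] - coarse_tpl.shape[0] // 2)
c2_global = (top_left[0] + c2_local[0], top_left[1] + c2_local[1])
```

The two passes trade speed for accuracy in the way the claim describes: the coarse pass restricts the search to the target region, and the fine pass locates the local feature inside it.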
2. The workpiece positioning method based on secondary template matching of claim 1, wherein: in step S102, the self-correction method combines global self-correction and local self-correction.
3. The workpiece positioning method based on secondary template matching of claim 1, wherein: in step S103, the template matching algorithm is a template segmentation matching algorithm based on geometric features.
4. The workpiece positioning method based on secondary template matching of claim 1, wherein: in step S105, the center position coordinate (s_bui, s_bvi) of the second image block in the second workpiece image to be positioned is calculated by the linear coordinate conversion of formula (1):

s_bui = t_bu − W_T/2 + s_tui, s_bvi = t_bv − H_T/2 + s_tvi (1)

where (s_bui, s_bvi) is the center position coordinate of the second image block in the second workpiece image to be positioned, (t_bu, t_bv) is the center coordinate of the first image block in the second workpiece image to be positioned, (s_tui, s_tvi) is the center position coordinate of the second image block in the first image block, W_T and H_T are the width and height of the first image block, and i = 1, 2, …, n, n being the number of second image blocks.
5. The workpiece positioning method based on secondary template matching of claim 1, wherein: in step S106, the center coordinates of the fitted arc of the second image block are calculated by the edge fitting algorithm as follows:

S201: construct a circle centered at (s_bui, s_bvi) with radius r to obtain a circular region, r being a preset value;

S202: perform the arc-fitting operation in the circular region to obtain the center O_i of the fitted arc, i = 1, 2, …, n, n being the number of second image blocks;
S203: determine the coefficients of the ellipse equation on which the fitted arc lies; the ellipse equation is expressed as formula (2):

Ax² + Bxy + Cy² + Dx + Ey + F = 0 (2)
In the formula, A, B, C, D, E, F are the coefficients of the ellipse equation on which the fitted arc lies, calculated as follows: discretize the circle centered at (s_bui, s_bvi) with radius r, randomly take 5 different coordinate points on the circle, and solve for the coefficients A, B, C, D, E, F by the least-squares method; the least-squares objective is shown in formula (3):

min Σ_{j=1}^{m} (A x_j² + B x_j y_j + C y_j² + D x_j + E y_j + F)² (3)

where (x_j, y_j), j = 1, 2, …, m, are the coordinates of the m discrete points on the circle centered at (s_bui, s_bvi) with radius r;
S204: from the coefficient values A, B, C, D, E, F, calculate the center coordinates (X_0, Y_0) of the fitted arc as in formula (4):

X_0 = (BE − 2CD) / (4AC − B²), Y_0 = (BD − 2AE) / (4AC − B²) (4)
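The conic fit and centre extraction of formulas (2)–(4) can be sketched as follows. The fit here uses the smallest right singular vector of the design matrix, a common unconstrained least-squares formulation that may differ in detail from the patent's formula (3); the sampled circle is invented for the example:

```python
import numpy as np

def fit_conic(pts):
    """Least-squares fit of Ax^2 + Bxy + Cy^2 + Dx + Ey + F = 0.
    The coefficient vector is the right singular vector of the design
    matrix with the smallest singular value (overall scale is arbitrary)."""
    x, y = np.asarray(pts, dtype=float).T
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    return np.linalg.svd(D)[2][-1]       # (A, B, C, D, E, F)

def conic_center(coef):
    """Centre of the conic, as in formula (4):
    X0 = (BE - 2CD)/(4AC - B^2), Y0 = (BD - 2AE)/(4AC - B^2)."""
    A, B, C, Dc, E, F = coef
    den = 4 * A * C - B * B
    return (B * E - 2 * C * Dc) / den, (B * Dc - 2 * A * E) / den

# Sample a circle of radius 2 centred at (3, 4); the recovered centre
# is invariant to the conic's arbitrary scaling.
t = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
pts = np.column_stack([3 + 2 * np.cos(t), 4 + 2 * np.sin(t)])
cx, cy = conic_center(fit_conic(pts))
```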
6. The workpiece positioning method based on secondary template matching of claim 1, wherein: in step S106, from the center coordinates of the fitted arc of the second image block, the centroid coordinate P(x, y, z) of the second image block is calculated by the stereo matching method through formula (5):

AP = b (5)

where P = [x y z]ᵀ and, writing m^l_ij and m^r_ij for the entries of the left and right projection matrices and (u_l, v_l), (u_r, v_r) for the fitted circle-center pixel coordinates in the left and right images,

A = [u_l·m^l_31 − m^l_11, u_l·m^l_32 − m^l_12, u_l·m^l_33 − m^l_13;
     v_l·m^l_31 − m^l_21, v_l·m^l_32 − m^l_22, v_l·m^l_33 − m^l_23;
     u_r·m^r_31 − m^r_11, u_r·m^r_32 − m^r_12, u_r·m^r_33 − m^r_13;
     v_r·m^r_31 − m^r_21, v_r·m^r_32 − m^r_22, v_r·m^r_33 − m^r_23]

b = [m^l_14 − u_l·m^l_34; m^l_24 − v_l·m^l_34; m^r_14 − u_r·m^r_34; m^r_24 − v_r·m^r_34]

M_l and M_r, the projection matrices of the left and right cameras, are known quantities.
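The stereo solve of formula (5), combined with the least-squares inversion P = (AᵀA)⁻¹Aᵀb, can be sketched as standard linear (DLT-style) triangulation. The row layout of A and b below is the textbook reconstruction rather than a copy of the patent's unreproduced matrices, and the toy projection matrices and point are invented:

```python
import numpy as np

def triangulate(Ml, Mr, uv_l, uv_r):
    """Build the 4x3 system A P = b from the left/right 3x4 projection
    matrices and matched pixel coordinates, then solve it in the
    least-squares sense, P = (A^T A)^{-1} A^T b."""
    ul, vl = uv_l
    ur, vr = uv_r
    A = np.array([
        ul * Ml[2, :3] - Ml[0, :3],
        vl * Ml[2, :3] - Ml[1, :3],
        ur * Mr[2, :3] - Mr[0, :3],
        vr * Mr[2, :3] - Mr[1, :3],
    ])
    b = np.array([
        Ml[0, 3] - ul * Ml[2, 3],
        Ml[1, 3] - vl * Ml[2, 3],
        Mr[0, 3] - ur * Mr[2, 3],
        Mr[1, 3] - vr * Mr[2, 3],
    ])
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Toy rig: identity left camera, right camera translated by 1 along x.
Ml = np.hstack([np.eye(3), np.zeros((3, 1))])
Mr = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
# Pixel coordinates of the 3-D point (0.5, 0.25, 2.0) in each view.
P = triangulate(Ml, Mr, (0.25, 0.125), (-0.25, 0.125))
```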
7. The workpiece positioning method based on secondary template matching of claim 1, wherein: in step S107, the centroid coordinate of the workpiece is calculated from the centroid coordinates of the second image blocks by formula (6):

x = (1/n) Σ_{i=1}^{n} x_i, y = (1/n) Σ_{i=1}^{n} y_i, z = z (6)

where (x, y, z) is the centroid coordinate of the workpiece, (x_i, y_i, z) is the centroid coordinate of the i-th second image block, and i = 1, 2, …, n, n being the number of second image blocks.
8. A storage device, characterized by: the storage device stores instructions and data for implementing the secondary template matching-based workpiece positioning method of any one of claims 1 to 7.
9. A workpiece positioning apparatus based on secondary template matching, characterized by comprising: a processor and a storage device; the processor loads and executes the instructions and data in the storage device to implement the workpiece positioning method based on secondary template matching of any one of claims 1 to 7.
CN201811231405.0A 2018-10-22 2018-10-22 Workpiece positioning method and device based on secondary template matching and storage device Active CN109544509B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811231405.0A CN109544509B (en) 2018-10-22 2018-10-22 Workpiece positioning method and device based on secondary template matching and storage device


Publications (2)

Publication Number Publication Date
CN109544509A CN109544509A (en) 2019-03-29
CN109544509B true CN109544509B (en) 2020-07-07

Family

ID=65844213

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811231405.0A Active CN109544509B (en) 2018-10-22 2018-10-22 Workpiece positioning method and device based on secondary template matching and storage device

Country Status (1)

Country Link
CN (1) CN109544509B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112734863B (en) * 2021-03-31 2021-07-02 武汉理工大学 Crossed binocular camera calibration method based on automatic positioning

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100483283C (en) * 2007-08-01 2009-04-29 暨南大学 Two-dimensional positioning device based on machine vision
CN102721364B (en) * 2011-03-30 2015-12-02 比亚迪股份有限公司 A kind of localization method of workpiece and device thereof
US10393505B2 (en) * 2013-12-06 2019-08-27 Werth Messtechnik Gmbh Device and method for measuring workpieces
CN104457577A (en) * 2014-12-19 2015-03-25 上海工业自动化仪表研究院 Machine-vision-oriented non-contact type workpiece positioning and measuring method
WO2018120290A1 (en) * 2016-12-26 2018-07-05 华为技术有限公司 Prediction method and device based on template matching


Similar Documents

Publication Publication Date Title
CN111429532B (en) Method for improving camera calibration accuracy by utilizing multi-plane calibration plate
CN109035200B (en) Bolt positioning and pose detection method based on single-eye and double-eye vision cooperation
WO2017128865A1 (en) Multiple lens-based smart mechanical arm and positioning and assembly method
CN107481284A (en) Method, apparatus, terminal and the system of target tracking path accuracy measurement
CN111721259B (en) Underwater robot recovery positioning method based on binocular vision
CN104748683B (en) A kind of on-line automatic measurement apparatus of Digit Control Machine Tool workpiece and measuring method
CN108381549B (en) Binocular vision guide robot rapid grabbing method and device and storage medium
CN107588721A (en) The measuring method and system of a kind of more sizes of part based on binocular vision
CN107767456A (en) A kind of object dimensional method for reconstructing based on RGB D cameras
CN108470356B (en) Target object rapid ranging method based on binocular vision
CN107977996B (en) Space target positioning method based on target calibration positioning model
CN106952262B (en) Ship plate machining precision analysis method based on stereoscopic vision
CN114494045A (en) Large-scale straight gear geometric parameter measuring system and method based on machine vision
US20220230348A1 (en) Method and apparatus for determining a three-dimensional position and pose of a fiducial marker
CN110060304B (en) Method for acquiring three-dimensional information of organism
CN113012234B (en) High-precision camera calibration method based on plane transformation
CN114029946A (en) Method, device and equipment for guiding robot to position and grab based on 3D grating
CN114463442A (en) Calibration method of non-coaxial camera
CN108917640A (en) A kind of laser blind hole depth detection method and its system
CN108596947B (en) Rapid target tracking method suitable for RGB-D camera
CN208254424U (en) A kind of laser blind hole depth detection system
CN115096206A (en) Part size high-precision measurement method based on machine vision
CN111583342A (en) Target rapid positioning method and device based on binocular vision
CN109544509B (en) Workpiece positioning method and device based on secondary template matching and storage device
CN109671084B (en) Method for measuring shape of workpiece

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant