CN111145247B - Position degree detection method based on vision, robot and computer storage medium - Google Patents


Info

Publication number
CN111145247B
CN111145247B (application CN201911311789.1A)
Authority
CN
China
Prior art keywords
feature
center
measured
coordinate system
visual
Prior art date
Legal status
Active
Application number
CN201911311789.1A
Other languages
Chinese (zh)
Other versions
CN111145247A (en)
Inventor
王伦
周瑞松
Current Assignee
Peitian Robot Technology Co Ltd
Original Assignee
Peitian Robot Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Peitian Robot Technology Co Ltd
Priority to CN201911311789.1A
Publication of CN111145247A
Application granted
Publication of CN111145247B

Classifications

    • G06T 7/66 — Image analysis; analysis of geometric attributes of image moments or centre of gravity
    • G01B 11/002 — Measuring arrangements using optical techniques for measuring two or more coordinates
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10004 — Image acquisition modality: still image; photographic image
    • G06T 2207/30108, G06T 2207/30164 — Industrial image inspection: workpiece; machine component
    • Y02P 90/02 — Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention provides a vision-based position degree detection method, a robot, and a computer storage medium. The position degree detection method comprises the following steps: acquiring an image at a reference feature photographing position and an image at a feature photographing position to be measured; acquiring the coordinates, in a visual coordinate system, of the reference visual center of the image at the reference feature photographing position and the reference deviation of the reference feature center relative to the reference visual center, and acquiring the coordinates, in the visual coordinate system, of the visual center to be measured of the image at the feature photographing position to be measured and the deviation to be measured of the feature center to be measured relative to the visual center to be measured; and calculating the coordinates of the reference feature center and of the feature center to be measured in the visual coordinate system to obtain the position degree of the feature center to be measured relative to the reference feature center. Because the position degree of the feature center to be measured is computed from collected images, the requirements on the detection environment are low, the method can be integrated into an automatic production line to realize automatic detection, and wear of the product to be measured is avoided.

Description

Position degree detection method based on vision, robot and computer storage medium
Technical Field
The invention relates to the technical field of machine vision detection, in particular to a vision-based position detection method, a robot and a computer storage medium.
Background
The position degree is an index that limits the amount by which the actual position of a measured element may deviate from its ideal position. Position degree detection is routinely performed when assembling parts, since the position degree determines the assembly accuracy of the parts and the product qualification rate. For example, positioning pins and other fastening devices impose high position degree requirements on assembly, so the position degree of the measured element must be detected accurately.
Existing position degree detection of parts commonly uses a dedicated position degree gauge or a three-coordinate measuring instrument. Detection with a position degree gauge, on the one hand, must be performed manually and cannot be automated; on the other hand, the cost is high, because a dedicated gauge must be manufactured for each detection object and accuracy requirement, and because gauge measurement is contact measurement, it causes wear. Detection with a three-coordinate measuring instrument achieves high accuracy, but the instrument places strict demands on the environment: measurement must be carried out at constant temperature and humidity, so automatic detection cannot be performed on an automated production line. Moreover, a three-coordinate measuring instrument is expensive and demands skilled operators.
Disclosure of Invention
The invention provides a vision-based position degree detection method, a robot, and a computer storage medium, to solve the technical problems that position degree detection in the prior art cannot be automated and is costly.
In order to solve the technical problems, the invention adopts a technical scheme that: provided is a vision-based position degree detection method, comprising: acquiring an image at a reference feature photographing position and an image at a feature photographing position to be detected; acquiring coordinates of a reference visual center of an image in the reference feature photographing position in a visual coordinate system and reference deviation of the reference feature center relative to the reference visual center, and acquiring coordinates of a to-be-measured visual center of the image in the to-be-measured feature photographing position in the visual coordinate system and to-be-measured deviation of the to-be-measured feature center relative to the to-be-measured visual center; and calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the feature center to be detected in the visual coordinate system to obtain the position degree of the feature center to be detected relative to the reference feature center.
In order to solve the technical problems, the invention adopts another technical scheme: there is provided a computer storage medium storing a computer program which, when executed by a processor, implements the position degree detection method described above.
In order to solve the technical problems, the invention adopts another technical scheme: there is provided a robot comprising a clamping system, an image acquisition system, and a main control system, wherein the clamping system clamps and positions the product to be measured, the image acquisition system takes pictures, and the main control system executes the position degree detection method described above.
The beneficial effects of the invention are as follows: compared with the prior art, the vision-based position detection method provided by the invention has the advantages that the images under the shooting positions of the reference features and the images under the shooting positions of the features to be detected are obtained, and the position of the center of the features to be detected relative to the center of the reference features is obtained based on the image analysis and calculation. On one hand, the requirements on the detection environment are low, and the method can be integrated into an automatic production line to detect the position degree in real time and automatically; on the other hand, the abrasion of the product to be tested can be avoided.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings may be obtained according to the provided drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic view of a robot in an embodiment of the present invention;
FIG. 2 is a flow chart of a method for detecting position according to an embodiment of the invention;
FIG. 3 is a schematic diagram of the model of FIG. 2;
fig. 4 is a partial flow schematic of step S102 in fig. 2;
fig. 5 is a schematic flow chart of step S202 in fig. 4;
FIG. 6 is a flowchart of a position detection method according to another embodiment of the present invention;
FIG. 7 is a schematic diagram of the model of FIG. 6;
fig. 8 is a flow chart of step S403 in fig. 6;
fig. 9 is a flowchart of step S501 in fig. 8;
fig. 10 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will now be described clearly and completely. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The present invention provides a robot 100, the robot 100 comprising: clamping system 10, image acquisition system 20 and master control system 30. As shown in fig. 1, in the present embodiment, the clamping system 10 is used for clamping and positioning a product to be measured, the image acquisition system 20 is used for taking a picture, and the main control system 30 is used for performing a position degree detection method to obtain a position degree of a center of a feature to be measured on the product to be measured relative to a reference feature center.
The invention utilizes the image acquisition system 20 to acquire the image of the product to be detected, and utilizes the main control system 30 to analyze and calculate the image of the product to be detected, thereby detecting the position degree of the element on the product to be detected. On the one hand, the clamping system 10, the image acquisition system 20 and the main control system 30 can be arranged on an automatic production line, so that the real-time detection of the position degree of the element is realized; on the other hand, the robot 100 can be prevented from contacting with the product, and abrasion is prevented from being caused to the product.
The invention further provides a vision-based position detection method, which can be used for the main control system 30. Fig. 2 is a flow chart of a position detection method according to an embodiment of the invention, and fig. 3 is a model schematic diagram in fig. 2, as shown in fig. 2 and 3. The position degree detection method of the embodiment specifically comprises the following steps:
step S101: and acquiring an image at the photographing position of the reference feature and an image at the photographing position of the feature to be detected.
Specifically, the main control system 30 controls the image acquisition system 20 to take a picture at the reference feature photographing position and the feature photographing position to be measured, respectively, to acquire an image at the reference feature photographing position and an image at the feature photographing position to be measured.
One or two reference feature photographing positions may be provided, and one, two, or more feature photographing positions to be measured may be provided. As shown in fig. 3, the detailed description takes one reference feature photographing position and one feature photographing position to be measured as an example. The reference feature photographing position is denoted M, and the feature photographing position to be measured is denoted N.
Step S102: acquiring coordinates of a reference visual center of an image in a reference feature photographing position in a visual coordinate system and reference deviation of the reference feature center relative to the reference visual center, and acquiring coordinates of a to-be-measured visual center of the image in the to-be-measured feature photographing position in the visual coordinate system and to-be-measured deviation of the to-be-measured feature center relative to the to-be-measured visual center.
Specifically, in the present embodiment, the main control system 30 calculates the coordinates (x0, y0), in the visual coordinate system, of the reference visual center A of the image at the reference feature photographing position M, and the reference deviation of the reference feature center B relative to the reference visual center A; it likewise calculates the coordinates (x1, y1), in the visual coordinate system, of the visual center C to be measured of the image at the feature photographing position N to be measured, and the deviation to be measured of the feature center D to be measured relative to the visual center C to be measured.
The reference deviation is the deviation of the reference feature center relative to the reference visual center in the visual coordinate system, and the to-be-measured deviation is the deviation of the to-be-measured feature center relative to the to-be-measured visual center in the visual coordinate system.
Further, as shown in fig. 4, fig. 4 is a partial flow diagram of step S102 in fig. 2. In step S102, the step of acquiring the reference deviation of the reference feature center B from the reference visual center includes:
step S201: and acquiring the reference characteristic deviation of the reference characteristic center B of the image at the reference characteristic photographing position M relative to the reference visual center A.
In practice, images acquired with the image acquisition system 20 are typically subject to three sources of error: (1) repeat-positioning error; (2) visual calibration error; (3) visual detection error. Because of these errors, the coordinates of the reference visual center A and the reference feature center B in the visual coordinate system, which the main control system 30 computes by analyzing the image at the reference feature photographing position M, differ from the actual coordinates in the visual coordinate system. The reference feature deviation is therefore: the deviation, obtained by the main control system 30 from analyzing the image at the reference feature photographing position M, between the theoretical coordinates of the reference feature center B and the theoretical coordinates of the reference visual center A in the visual coordinate system.
After obtaining the reference feature deviation of the reference feature center B from the reference visual center a, step S202 is continued to be performed: and calibrating the reference characteristic deviation to obtain the reference deviation.
In order to reduce the recognition error of the robot 100 caused by the acquisition error of the image acquisition system 20, the reference feature deviation acquired by the master control system 30 needs to be calibrated to obtain the reference deviation between the actual coordinates of the reference feature center B in the visual coordinate system and the actual coordinates of the reference visual center a in the visual coordinate system. The embodiment can reduce the influence of the errors of the three aspects on the position degree detection result by calibrating the reference characteristic deviation, and further improve the detection precision of the robot 100.
Alternatively, in the present embodiment, as shown in fig. 4, after the reference deviation is obtained by the above-described method in step S202, it is also necessary to perform the following steps:
step S203: and acquiring the feature deviation to be detected of the feature center to be detected of the image in the feature photographing position to be detected relative to the vision center to be detected.
Step S204: and calibrating the characteristic deviation to be measured, thereby obtaining the deviation to be measured.
In this embodiment, step S203 is substantially the same as step S201, and step S204 is substantially the same as step S202; the difference is that steps S201 and S202 analyze the image at the reference feature photographing position, while steps S203 and S204 analyze the image at the feature photographing position to be measured. The analysis methods are substantially the same and are described above, so they are not repeated here.
In another embodiment, steps S203 and S204 may be performed first to obtain the deviation to be measured, and then steps S201 and S202 performed to obtain the reference deviation. Alternatively, in yet another embodiment, steps S201-S202 and steps S203-S204 may be performed in parallel, so as to shorten the computation time of the master control system 30.
Further, referring to fig. 5, fig. 5 is a flow chart of step S202 in fig. 4. In this embodiment, the step of calibrating the reference feature deviation includes:
step S301: and acquiring an image of the visual calibration plate.
In machine vision, image measurement, photogrammetry, three-dimensional reconstruction, and similar applications, the visual calibration plate is used to establish the geometric model of camera imaging: it corrects lens distortion, determines the conversion between physical size and pixels, and determines the relation between the three-dimensional position of a point on the surface of an object in space and its corresponding point in the image.
In this embodiment, the main control system 30 clamps and positions the visual calibration board on the clamping system 10 first, and then, the main control system 30 controls the image acquisition system 20 to acquire the image of the visual calibration board.
Step S302: analyzing the image, acquiring a calibration model, and calibrating the reference characteristic deviation and the characteristic deviation to be measured by using the calibration model.
Specifically, the main control system 30 analyzes the image of the visual calibration board acquired by the image acquisition system 20 to acquire a calibration model, and then the main control system 30 calibrates the reference feature deviation acquired in step S201 using the calibration model to acquire the reference deviation.
Further, the method in step S301 and step S302 may be used to calibrate the feature deviation to be measured, which is not described herein.
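As a concrete illustration of steps S301 and S302, the sketch below fits a calibration model from plate points and applies it to a raw feature deviation. It assumes the simplest possible model, a per-axis millimetre-per-pixel scale estimated by least squares from a dot grid of known pitch; the patent does not specify its actual calibration model, and every name and number here is illustrative only.

```python
# A minimal sketch of the calibration idea in steps S301-S302, assuming the
# calibration model reduces to a per-axis millimetre-per-pixel scale fitted
# from a dot grid on the calibration plate. The patent does not specify its
# calibration model; every name and number below is illustrative only.

def fit_scale(pixel_pts, physical_pts):
    """Least-squares per-axis scale mapping pixel coords to physical coords."""
    def axis_scale(p, q):
        # minimise sum_i (q_i - s * p_i)^2  =>  s = sum(p*q) / sum(p*p)
        return sum(pi * qi for pi, qi in zip(p, q)) / sum(pi * pi for pi in p)
    sx = axis_scale([p[0] for p in pixel_pts], [q[0] for q in physical_pts])
    sy = axis_scale([p[1] for p in pixel_pts], [q[1] for q in physical_pts])
    return sx, sy

def calibrate_deviation(deviation_px, scale):
    """Convert a raw feature deviation in pixels to physical units."""
    return deviation_px[0] * scale[0], deviation_px[1] * scale[1]

# 2x2 patch of a 5 mm-pitch plate imaged at roughly 0.05 mm/px:
pixel_pts = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
physical_pts = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0), (5.0, 5.0)]
scale = fit_scale(pixel_pts, physical_pts)
print(calibrate_deviation((40.0, 0.0), scale))  # 40 px deviation along x
```

A real system would instead estimate a full camera model (distortion, extrinsics) from many plate views; the per-axis scale above only conveys how a pixel-space feature deviation becomes a physical deviation after calibration.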
In this embodiment, as shown in fig. 3, at the reference feature photographing position M, the master control system 30 calculates the reference feature deviation of the reference feature center B relative to the reference vision center A and calibrates it, thereby obtaining the reference deviation of B relative to A in the visual coordinate system, the vector AB = (xAB, yAB). At the feature photographing position N to be measured, the main control system 30 calculates the feature deviation to be measured of the feature center D to be measured relative to the vision center C to be measured and calibrates it, thereby obtaining the deviation to be measured, the vector CD = (xCD, yCD).
Step S103: and calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the feature center to be detected in the visual coordinate system to obtain the position degree of the feature center to be detected relative to the reference feature center.
Wherein, the master control system 30 can calculate the coordinates of the reference feature center in the visual coordinate system through the coordinates of the reference visual center in the visual coordinate system and the reference deviation of the reference feature center relative to the reference visual center. The master control system 30 can calculate and obtain the coordinates of the feature center to be measured in the visual coordinate system through the coordinates of the feature center to be measured in the visual coordinate system and the deviation of the feature center to be measured relative to the vision center to be measured.
Specifically, as shown in fig. 3, the reference visual center A has coordinates (x0, y0) in the visual coordinate system, and the reference deviation of the reference feature center B relative to A is the vector AB = (xAB, yAB). By the plane vector algorithm, the reference feature center B therefore has coordinates (x0 + xAB, y0 + yAB) in the visual coordinate system. Likewise, the visual center C to be measured has coordinates (x1, y1), and the deviation to be measured of the feature center D to be measured relative to C is the vector CD = (xCD, yCD). By the plane vector algorithm, the feature center D to be measured has coordinates (x1 + xCD, y1 + yCD) in the visual coordinate system.
Since the number of the photographing positions of the reference feature is one in the present embodiment, the degree of the position of the feature center to be measured with respect to the reference feature center is equal to the distance between the coordinates of the reference feature center in the visual coordinate system and the coordinates of the feature center to be measured in the visual coordinate system.
Specifically, in this embodiment, the position degree of the feature center D to be measured relative to the reference feature center B is the distance between the reference feature center B and the feature center D to be measured in the visual coordinate system.
As shown in fig. 3, in the present embodiment, the reference feature center B has coordinates (x0 + xAB, y0 + yAB) in the visual coordinate system, and the feature center D to be measured has coordinates (x1 + xCD, y1 + yCD). From the two-point distance formula, the master control system 30 obtains the position degree of the feature center D to be measured relative to the reference feature center B:

P = sqrt( ((x1 + xCD) - (x0 + xAB))^2 + ((y1 + yCD) - (y0 + yAB))^2 )
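The computation of steps S102-S103 for a single reference feature can be sketched as follows: each feature center is recovered by plane vector addition of its calibrated deviation to the corresponding visual center, and the position degree is the two-point distance. The coordinates in the example are made up for illustration.

```python
# Sketch of steps S102-S103 for one reference feature: feature center =
# visual center + calibrated deviation (plane vector addition); position
# degree = Euclidean distance between the two feature centers.
import math

def feature_center(visual_center, deviation):
    # B = A + vec(AB);  D = C + vec(CD)
    return (visual_center[0] + deviation[0], visual_center[1] + deviation[1])

def position_degree(ref_visual_center, ref_deviation,
                    meas_visual_center, meas_deviation):
    bx, by = feature_center(ref_visual_center, ref_deviation)      # B
    dx, dy = feature_center(meas_visual_center, meas_deviation)    # D
    return math.hypot(dx - bx, dy - by)

# Example with illustrative coordinates in the visual coordinate system:
p = position_degree((10.0, 20.0), (1.0, 2.0),    # A = (x0, y0), vec AB
                    (30.0, 40.0), (-1.0, 1.0))   # C = (x1, y1), vec CD
print(round(p, 3))
```

Here B = (11, 22) and D = (29, 41), so the position degree is sqrt(18^2 + 19^2) = sqrt(685).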
Further, before acquiring the image at the reference feature photographing position and the image at the feature photographing position to be detected, the position degree detection method includes: and the reference feature photographing position and the feature photographing position to be measured are marked through teaching.
Specifically, teaching of the master control system is also required before the position degree detection using the robot 100, so that the master control system 30 acquires the reference feature photographing position M and the feature photographing position N to be measured.
In the present embodiment, before the position degree detection using the robot 100, the master control system 30 needs to be taught to obtain one reference feature photographing position M and one feature photographing position N to be measured. In performing the position degree detection using the robot 100, the main control system 30 controls the image acquisition system 20 to take a picture at the one reference feature photographing position M and the one feature photographing position N to be measured.
Of course, in other embodiments, when a plurality of features are to be detected, the plurality of feature photographing positions N to be measured must all be taught, so that the master control system 30 obtains each of them; during subsequent position degree detection, the image acquisition system 20 is controlled to take pictures at each of these positions. The number of features to be measured and of photographing positions is determined by the positions of the features on the clamping system 10 and by how many features need to be detected, and the present invention is not specifically limited herein.
The teaching of the robot 100 may be performed in a currently common manner, and the embodiment of the present invention is not specifically limited.
With continued reference to fig. 6 and fig. 7, fig. 6 is a flowchart of a method for detecting a position according to another embodiment of the invention, and fig. 7 is a schematic diagram of the model in fig. 6. In this embodiment, the vision-based position degree detection method includes:
step S401: and acquiring an image at the photographing position of the reference feature and an image at the photographing position of the feature to be detected.
Step S401 in the present embodiment is substantially the same as step S101 in the above embodiment, except that in the present embodiment, the number of reference feature photographing positions is two, that is, the main control system 30 controls the image capturing system 20 to photograph at the two reference feature photographing positions, respectively. The reference feature photographing positions are recorded as R and S, and the feature photographing position to be measured is recorded as T.
Step S402: acquiring coordinates of a reference visual center of an image in a reference feature photographing position in a visual coordinate system and reference deviation of the reference feature center relative to the reference visual center, and acquiring coordinates of a to-be-measured visual center of the image in the to-be-measured feature photographing position in the visual coordinate system and to-be-measured deviation of the to-be-measured feature center relative to the to-be-measured visual center.
In this embodiment, the master control system 30 calculates the coordinates (xR, yR) and (xS, yS), in the visual coordinate system, of the reference visual centers R0 and S0 of the images at the reference feature photographing positions R and S, together with the reference deviation of the reference feature center R1 relative to R0 and the reference deviation of the reference feature center S1 relative to S0. It also calculates the coordinates (xT, yT), in the visual coordinate system, of the visual center T0 to be measured of the image at the feature photographing position T to be measured, and the deviation to be measured of the feature center T1 to be measured relative to T0.
The calculation method of each reference deviation and the to-be-measured deviation in step S402 in the present embodiment is substantially the same as the calculation method of each reference deviation and the to-be-measured deviation in step S102 in the above embodiment, and is not repeated herein, referring to the description in the above embodiment.
Specifically, in the present embodiment, as shown in fig. 7, at the reference feature photographing positions R and S, the main control system 230 can calculate the reference feature deviation I of the reference feature center R1 relative to the reference visual center R0 and the reference feature deviation II of the reference feature center S1 relative to the reference visual center S0, and calibrate these deviations, thereby obtaining the reference deviation (Δx_R, Δy_R) of the reference feature center R1 relative to the reference visual center R0 in the visual coordinate system, and the reference deviation (Δx_S, Δy_S) of the reference feature center S1 relative to the reference visual center S0 in the visual coordinate system. At the feature photographing position T to be measured, the main control system 230 can calculate the feature deviation to be measured of the feature center T1 to be measured relative to the visual center T0 to be measured and calibrate it, thereby obtaining the deviation to be measured (Δx_T, Δy_T) of the feature center T1 to be measured relative to the visual center T0 to be measured in the visual coordinate system. The master control system 230 thus obtains the reference deviations (Δx_R, Δy_R) and (Δx_S, Δy_S) and the deviation to be measured (Δx_T, Δy_T).
Then the following step S403 is performed: calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the feature center to be measured in the visual coordinate system, to obtain the position degree of the feature center to be measured relative to the reference feature center.
Wherein, the master control system 30 can calculate the coordinates of the reference feature center in the visual coordinate system from the coordinates of the reference visual center in the visual coordinate system and the reference deviation of the reference feature center relative to the reference visual center. Likewise, the master control system 30 can calculate the coordinates of the feature center to be measured in the visual coordinate system from the coordinates of the visual center to be measured in the visual coordinate system and the deviation of the feature center to be measured relative to the visual center to be measured.
Specifically, as shown in fig. 7, the reference visual centers R0 and S0 have coordinates (x_R, y_R) and (x_S, y_S) in the visual coordinate system, and the reference deviations of the reference feature centers R1 and S1 relative to R0 and S0 in the visual coordinate system are (Δx_R, Δy_R) and (Δx_S, Δy_S). By the plane vector algorithm, the coordinates of the reference feature centers R1 and S1 in the visual coordinate system are (x_R + Δx_R, y_R + Δy_R) and (x_S + Δx_S, y_S + Δy_S). The visual center T0 to be measured has coordinates (x_T, y_T) in the visual coordinate system, and the deviation to be measured of the feature center T1 to be measured relative to the visual center T0 to be measured in the visual coordinate system is (Δx_T, Δy_T). By the plane vector algorithm, the coordinates of the feature center T1 to be measured in the visual coordinate system are (x_T + Δx_T, y_T + Δy_T).
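The plane vector addition above can be sketched in a few lines. This is an illustrative sketch, not the patent's own code; all names and numeric values are made-up assumptions.

```python
# Plane vector addition: the coordinates of a feature center in the visual
# coordinate system are the visual-center coordinates plus the calibrated
# deviation vector. All numeric values below are illustrative assumptions.

def feature_center(visual_center, deviation):
    """Add a calibrated deviation vector to visual-center coordinates."""
    x, y = visual_center
    dx, dy = deviation
    return (x + dx, y + dy)

# Reference visual centers R0, S0 and the visual center T0 to be measured.
R0, S0, T0 = (10.0, 20.0), (50.0, 20.0), (30.0, 40.0)
# Calibrated deviations of R1, S1, T1 relative to R0, S0, T0.
dR, dS, dT = (0.25, -0.5), (-0.25, 0.5), (0.125, 0.25)

R1 = feature_center(R0, dR)   # (10.25, 19.5)
S1 = feature_center(S0, dS)   # (49.75, 20.5)
T1 = feature_center(T0, dT)   # (30.125, 40.25)
```

The same addition applies whether the center is a reference feature or the feature to be measured; only the deviation vector differs.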
Since the number of reference feature photographing positions is two, refer to fig. 8, which is a flowchart of step S403 in fig. 6. In this embodiment, the step of obtaining the position degree of the feature center to be measured relative to the reference feature center includes:
step S501: and establishing a reference coordinate system by using the two reference feature centers, and calculating the conversion relation between the reference coordinate system and the visual coordinate system.
Specifically, the master control system 30 establishes a reference coordinate system with the two reference feature centers R1 and S1, and uses the relation between the coordinates of R1 and S1 in the reference coordinate system and in the visual coordinate system to obtain the conversion relation between the two coordinate systems.
Further, as shown in fig. 9, fig. 9 is a schematic flow chart of step S501 in fig. 8. In this embodiment, the step of establishing the reference coordinate system with the two reference feature centers includes:
step S601: with one of the reference feature centers as the origin.
In the present embodiment, the master control system 30 uses the reference feature center R1 as the origin.
Step S602: with the line connecting the two reference feature centers as the X axis.
In the present embodiment, the master control system 30 uses the line connecting R1 and S1 as the X axis.
Step S603: and establishing a reference coordinate system by taking the direction perpendicular to the X axis as the Y axis.
In this embodiment, the main control system 30 establishes a reference coordinate system with a direction perpendicular to the X axis as the Y axis.
In another embodiment, the master control system 30 may also take the reference feature center S1 as the origin, the line connecting S1 and R1 as the X axis, and a direction perpendicular to the X axis as the Y axis, to establish the reference coordinate system.
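Steps S601 to S603 can be sketched as follows. The names and values are illustrative assumptions, not the patent's code; the frame is represented by its origin plus unit X and Y axes, which together form the rotation part of the conversion relation between the visual and reference coordinate systems.

```python
import math

def build_reference_frame(R1, S1):
    """Reference frame: origin at R1, X axis along R1->S1, Y perpendicular."""
    dx, dy = S1[0] - R1[0], S1[1] - R1[1]
    length = math.hypot(dx, dy)            # distance between the two centers
    ux, uy = dx / length, dy / length      # unit X axis along R1 -> S1
    vx, vy = -uy, ux                       # unit Y axis, perpendicular to X
    return R1, (ux, uy), (vx, vy)

origin, x_axis, y_axis = build_reference_frame((10.0, 20.0), (50.0, 20.0))
# For a horizontal line R1->S1, x_axis is (1.0, 0.0) and y_axis is (0.0, 1.0).
```

Choosing S1 as the origin instead, as in the alternative embodiment, only swaps the two arguments.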
Step S502: and calculating the coordinates of the feature center to be measured in the reference coordinate system.
Specifically, the master control system 30 uses the coordinates (x_T + Δx_T, y_T + Δy_T) of the feature center T1 to be measured in the visual coordinate system and the conversion relation from the visual coordinate system to the reference coordinate system to obtain the coordinates (x', y') of the feature center T1 to be measured in the reference coordinate system.
Step S503: the position degree of the feature center to be measured relative to the reference feature center is equal to the distance between the feature center to be measured and the origin of coordinates of the reference coordinate system.
Specifically, in this embodiment, the position degree of the feature center to be measured relative to the reference feature center is the distance, in the reference coordinate system, between the feature center T1 to be measured and the coordinate origin.
Specifically, in the present embodiment, the coordinates of the feature center T1 to be measured in the reference coordinate system are (x', y'). According to the distance formula between two points, the master control system 230 obtains the distance between the feature center T1 to be measured and the coordinate origin, i.e., the position degree of the feature center T1 to be measured is √(x'² + y'²).
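Putting steps S501 to S503 together gives a short end-to-end sketch of step S403 for two reference features. All names and values are illustrative assumptions, not the patent's code.

```python
import math

def position_degree(R1, S1, T1):
    """Position degree of T1: its distance from the origin of the reference
    frame built with R1 as origin and R1->S1 as the X axis."""
    dx, dy = S1[0] - R1[0], S1[1] - R1[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length        # unit X axis along R1 -> S1
    tx, ty = T1[0] - R1[0], T1[1] - R1[1]    # vector R1 -> T1 in visual frame
    x_prime = tx * ux + ty * uy              # projection onto the X axis
    y_prime = -tx * uy + ty * ux             # projection onto the Y axis
    return math.hypot(x_prime, y_prime)      # sqrt(x'^2 + y'^2)

print(position_degree((0.0, 0.0), (3.0, 0.0), (3.0, 4.0)))  # 5.0
```

Since the rotation into the reference frame preserves lengths, this value equals the straight-line distance between T1 and R1 in the visual coordinate system; the intermediate coordinates (x', y') are kept here to mirror the derivation above.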
Referring to fig. 10, fig. 10 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application. The computer storage medium 40 stores a computer program 41 which, when executed by a processor, implements the position degree detection method of the above embodiments.
The computer storage medium 40 may be a server, a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
In summary, the invention uses the image acquisition system to acquire images at the reference feature photographing position and the feature photographing position to be measured, and uses the main control system to analyze these images, thereby calculating the position degree of the feature center to be measured relative to the reference feature center. Because the detection is performed visually, its environmental requirements are low, so the detection apparatus can be integrated into an automated production line for real-time detection of product position degree. Contact with the product is also avoided, reducing wear on the product.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. A vision-based location detection method, comprising:
acquiring an image at a reference feature photographing position and an image at a feature photographing position to be detected; wherein the number of feature photographing positions to be detected is one;
acquiring coordinates of a reference visual center of the image in the reference feature photographing position in a visual coordinate system;
acquiring a reference feature deviation of a reference feature center of an image at the reference feature photographing position relative to the reference vision center, and calibrating the reference feature deviation to obtain a reference deviation;
acquiring coordinates of a visual center to be detected of an image under the shooting position of the feature to be detected in the visual coordinate system;
acquiring the feature deviation to be detected of the feature center to be detected of the image at the shooting position of the feature to be detected relative to the vision center to be detected, and calibrating the feature deviation to be detected to obtain the deviation to be detected;
and calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the feature center to be detected in the visual coordinate system to obtain the position degree of the feature center to be detected relative to the reference feature center.
2. The method according to claim 1, wherein the step of calibrating the reference feature deviation and the feature deviation to be measured includes:
collecting an image of a visual calibration plate;
analyzing the image, acquiring a calibration model, and calibrating the reference characteristic deviation and the characteristic deviation to be measured by using the calibration model.
3. The method according to claim 1, wherein the number of the reference feature photographing positions is one, and the degree of the position of the feature center to be measured with respect to the reference feature center is equal to a distance between coordinates of the reference feature center in the visual coordinate system and coordinates of the feature center to be measured in the visual coordinate system.
4. The method according to claim 1, wherein the number of the reference feature photographing positions is two, and the step of obtaining the degree of position of the feature center to be measured with respect to the reference feature center includes:
establishing a reference coordinate system by using the two reference feature centers, and calculating the conversion relation between the reference coordinate system and the visual coordinate system;
calculating the coordinates of the feature center to be detected in the reference coordinate system;
the position degree of the feature center to be measured relative to the reference feature center is equal to the distance between the feature center to be measured and the coordinate origin of the reference coordinate system.
5. The method of claim 4, wherein the step of establishing a reference coordinate system with two reference feature centers comprises:
taking one of the reference feature centers as an origin;
taking the connecting line of the two reference feature centers as an X axis;
and establishing the reference coordinate system by taking the direction perpendicular to the X axis as a Y axis.
6. The position degree detection method according to claim 1, characterized in that before acquiring the image in the reference feature photographing position and the image in the feature photographing position to be measured, the position degree detection method comprises: and marking the reference feature photographing position and the feature photographing position to be measured through teaching.
7. A computer storage medium storing a computer program, which, when executed by a processor, is adapted to carry out the position detection method according to any one of claims 1-6.
8. The robot is characterized by comprising a clamping system, an image acquisition system and a main control system, wherein the clamping system is used for clamping and positioning a product to be detected, the image acquisition system is used for taking pictures, and the main control system is used for executing the position degree detection method according to any one of claims 1-6.
CN201911311789.1A 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium Active CN111145247B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911311789.1A CN111145247B (en) 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium


Publications (2)

Publication Number Publication Date
CN111145247A CN111145247A (en) 2020-05-12
CN111145247B true CN111145247B (en) 2023-07-07

Family

ID=70518749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911311789.1A Active CN111145247B (en) 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium

Country Status (1)

Country Link
CN (1) CN111145247B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111805158A (en) * 2020-05-25 2020-10-23 福建星云电子股份有限公司 Self-detection method and system for displacement of clamp of robot
CN116840243B (en) * 2023-09-01 2023-11-28 湖南睿图智能科技有限公司 Correction method and system for machine vision object recognition


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5125035A (en) * 1989-12-18 1992-06-23 Chromalloy Gas Turbine Corporation Five axis generated hole inspection system
JPH06785A (en) * 1992-06-23 1994-01-11 Citizen Watch Co Ltd Correcting method for visual sensor coordinate system
US7194387B1 (en) * 2005-09-30 2007-03-20 Intel Corporation Apparatus and method for determining the minimum zone for an array of features with a true position tolerance
CN102183205A (en) * 2011-01-19 2011-09-14 北京航空航天大学 Method for matching optimal assembly poses of large-sized parts
CN102589429A (en) * 2012-01-20 2012-07-18 天津大学 On-line engine cylinder hexahedral hole set position error detecting method
CN109952176A (en) * 2017-05-26 2019-06-28 深圳配天智能技术研究院有限公司 A kind of robot calibration method, system, robot and storage medium
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN109900207A (en) * 2019-03-12 2019-06-18 精诚工科汽车***有限公司 The tool center point scaling method and system of robot vision tool
CN110017769A (en) * 2019-03-12 2019-07-16 精诚工科汽车***有限公司 Part detection method and system based on industrial robot
CN110068271A (en) * 2019-04-19 2019-07-30 怡得乐电子(杭州)有限公司 The PIN needle position degree detection method of the large size product of sub-pixel precision
CN110207565A (en) * 2019-05-31 2019-09-06 南京理工大学 A kind of high speed warp knit machine crankshaft rigging position degree measurement tooling and measurement method
CN110281238A (en) * 2019-06-17 2019-09-27 深圳视觉龙智能传感器有限公司 Assembly line multi-robot scaling method, device, computer equipment and storage medium
CN110455225A (en) * 2019-08-22 2019-11-15 吉林大学 Rectangle splined spindle concentricity and key position degree measurement method based on structure light vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Dynamic tracking of manipulators using visual feedback from an uncalibrated fixed camera;Yun-hui Liu等;《Proceedings 2006 IEEE International Conference on Robotics and Automation》;20060626;4124-4129 *
Mobile robot localization system using multiple ceiling mounted cameras;R.Visvanathan等;《2015 IEEE SENSORS》;20160107;1-4 *
基于多传感器测量的孔组位置度快速检测方法;李兵等;《计算机集成制造***》;20180531;第24卷(第5期);1089-1097 *
孔组位置度视觉测量技术研究;陈朝;《中国优秀硕士学位论文全文数据库 信息科技辑》;20150915;第2015年卷(第9期);I138-1444 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant