CN113211433A - Separated visual servo control method based on composite characteristics - Google Patents

Separated visual servo control method based on composite characteristics

Info

Publication number
CN113211433A
CN113211433A
Authority
CN
China
Prior art keywords
camera
curve
control
point
points
Prior art date
Legal status
Granted
Application number
CN202110427272.XA
Other languages
Chinese (zh)
Other versions
CN113211433B (en)
Inventor
盛春阳
张垚
卢晓
王海霞
张治国
聂君
宋诗斌
李玉霞
Current Assignee
Shandong University of Science and Technology
Original Assignee
Shandong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Shandong University of Science and Technology
Priority to CN202110427272.XA
Publication of CN113211433A
Application granted
Publication of CN113211433B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a separated visual servo control method based on composite features. First, to accurately express the global geometric information of an object and reduce the interference of environmental noise on image feature extraction, the invention extracts the object's contour features with non-uniform rational B-spline (NURBS) curve fitting. The composite features are then used to control the robot in a translation-rotation-separated manner: the translational and rotational poses of the robot are controlled with the curve-fitting control point features, the line features of the connecting lines of adjacent points, and the distance features between two points. Finally, to address the strong coupling between the attitude control and the position control of the robot, the invention provides a rotation compensation module that compensates the image feature position deviation caused by the rotational motion of the camera, improving the performance of the servo system. The method does not depend on global features, offers good real-time performance, rotation compensation, and high reliability, and constitutes a high-precision robot visual servo control method.

Description

Separated visual servo control method based on composite characteristics
Technical Field
The invention belongs to the field of robot visual servo control, and particularly relates to a separated visual servo control method based on composite characteristics.
Background
Visual servoing is the servo control of the pose of a robot or camera using visual information; its aim is to drive the robot or camera quickly to the desired pose. The design of a visual servo system involves two key problems: the extraction of image features and the design of the visual servo control law. Image features generally fall into three categories: point features, line features, and global features. Point features include corner points, inflection points, centroids, and the like; they have the advantages of a simple model and fast computation, but cope poorly with interference. Line features include straight-line features and curve features: straight-line features require the object to be a standard body, which limits the application scenarios, while curve features are flexible but require extraction of the object's contour information. Global features include image moments, image entropy, photometric moments, and the like; they are robust to environmental changes, but are computationally complex and make the system slow to converge.
Comparing these feature extraction methods, using the object contour curve as the image feature both describes the global feature information of the object well and meets real-time requirements. However, the rotation control of the robot or camera strongly affects the position control: in particular, when the contour features of the target object are extracted by curve fitting, rotational motion changes the positions of the curve control point features greatly, which easily causes the target to be lost. It is therefore both feasible and significant for the performance of the robot servo system to control the robot in a separated manner using composite image features and to compensate the image feature position deviation caused by the attitude control.
Disclosure of Invention
Addressing the problems of existing curve-feature-based robot visual servo methods, the invention provides a separated visual servo control method based on composite features. It uses the point features of the curve-fitting feature points, the line features of the connecting lines of adjacent points, and the distance features between two points to achieve separated control of the translational and rotational poses of the robot, and designs a rotation compensation module to compensate the image feature position deviation caused by the rotational motion of the camera, overcoming the defects of the prior art with good effect.
The invention adopts the following technical scheme:
a separation type visual servo control method based on composite characteristics comprises the steps of image characteristic extraction, the design of a rotation compensation module and the design of a separation type visual servo control law, and comprises the following steps:
s1, inversely calculating a curve control vertex based on a NURBS curve fitting technology, and obtaining curve control point characteristics;
s2, calculating a two-dimensional projection curve of the spatial NURBS curve on the camera imaging plane;
s3, after the real-time fitting operation is carried out on the contour projection curve of the object by a curve fitting method, calculating an interaction matrix based on the characteristics of curve control points by using the obtained curve control points;
s4, selecting curve control point connection characteristics, and calculating an interaction matrix based on the curve control point connection characteristics;
s5, calculating the distance between curve control points as a distance characteristic, and calculating an interaction matrix based on the distance characteristic between the two points;
s6, compensating for the change of the image characteristic position caused by the change of the camera rotation attitude, so that the system obtains better decoupling characteristic and the performance of the servo system is improved;
s7, because the visual servo system adopts a monocular camera, the depth information of the space control vertices cannot be obtained directly; to obtain the depth information of the curve control vertices, the depth information in the interaction matrix is estimated online using the projection variation of the curve control point features on the image plane and the motion velocity of the camera;
s8, controlling the rotational attitude of the camera with the line feature of the connecting line of adjacent curve control points, controlling the translational motion of the robot along the camera Z-axis with the distance feature between two curve control points, controlling the translational motion of the robot along the camera X- and Y-axes with the curve control point features, and adding the rotational-motion compensation amount to the position control of the robot, finally completing the robot servo task.
Preferably, step S1 includes the following sub-steps:
s11, processing the projection contour curve on the camera imaging plane, and extracting the data point cloud coordinates of the projection curve;
s12, dividing the curve point cloud data coordinates into r equal intervals;
s13, selecting an n-point neighborhood of each data point in each interval for polynomial fitting to obtain a piecewise polynomial, and then calculating the curvature of the data points in each piecewise polynomial;
s14, selecting in each interval the data point with the maximum absolute curvature, together with the first and last end points of the curve, as the curve type-value points;
and S15, inversely calculating curve control points according to the De Boor-Cox algorithm.
Preferably, in S15, the type-value points of the NURBS curve of degree k are denoted q_i (i = 0, 1, …, n), and the node vector U = [u_0, u_1, …, u_{n+6}] is calculated as:
[Equation (1) appears as an image in the original.]
where:
[Equation (2) appears as an image in the original.]
after curve type value points and node vectors are obtained, curve control points are inversely calculated according to a De Boor-Cox algorithm, and a NURBS equation system of node interpolation is as follows:
[Equation (3), the NURBS interpolation system, appears as an image in the original.]
where d_j is a curve control point, ω_j is the weight factor of the control point (here ω_j = 1), and B_{j,k}(u_j) is the B-spline basis function derived from the node vector by the De Boor-Cox recursion formula.
After all curve control vertices are obtained, if the accuracy of the fitted curve does not meet the requirement, the number of nodes is increased until the fitting accuracy meets the requirement.
Preferably, in S2, a robot visual servo system exists in space with the camera mounted at the end of the arm, i.e. an eye-in-hand configuration. Define the robot end-effector coordinate system as {T}, the camera coordinate system as {C}, and the robot coordinate system as {R}. When a NURBS curve describes the space curve, the space control vertices are d_i (i = 0, 1, …, n), and the projected control vertices of d_i on the camera imaging plane are denoted ^C d_i (i = 0, 1, …, n). According to the perspective projection model of the camera, at any time t the control vertex d_i of the space curve projects to the control point ^C d_i on the camera image plane, expressed as:
[Equation (4) appears as an image in the original.]
where M is the intrinsic parameter matrix of the camera, obtained by camera calibration; H(q(t)) is the homogeneous transformation matrix from the robot base coordinate system to the robot end coordinate system, and q(t) is the joint variable of the robot at time t; H(q(t)) is computed from the D-H parameters of the robot and the joint angles of the robot at time t;
From the NURBS curve definition and equation (4),
$^{C}C(u,t) = \dfrac{\sum_{j=0}^{n} B_{j,k}(u)\,\omega_j\,{}^{C}d_j(t)}{\sum_{j=0}^{n} B_{j,k}(u)\,\omega_j} \quad (5)$
equation (5) is a two-dimensional projection curve of the spatial NURBS curve on the camera imaging plane at the time t.
Preferably, in S3, let the rigid-body motion velocity in space of the camera mounted at the end of the robot arm be V = (v_c, ω_c), and let P(X_d, Y_d, Z_d) be the coordinates of a space control vertex P relative to the camera; the velocity of the curve control vertex P in the camera coordinate system is:
$\dot{P} = -v_c - \omega_c \times P \quad (6)$
the scalar form of equation (6) is:
$\dot{X}_d = -v_{cx} - \omega_{cy} Z_d + \omega_{cz} Y_d, \quad \dot{Y}_d = -v_{cy} - \omega_{cz} X_d + \omega_{cx} Z_d, \quad \dot{Z}_d = -v_{cz} - \omega_{cx} Y_d + \omega_{cy} X_d \quad (7)$
where v_c = [v_{cx}, v_{cy}, v_{cz}]^T is the linear velocity of the camera and ω_c = [ω_{cx}, ω_{cy}, ω_{cz}]^T is the angular velocity of the camera.
According to the projection perspective relation of the camera, the coordinates of the curve control points on the image normalization plane are expressed as:
$x_{dc} = X_d / Z_d, \quad y_{dc} = Y_d / Z_d \quad (8)$
Differentiating both sides of equation (8) with respect to time:
$\dot{x}_{dc} = \dfrac{\dot{X}_d - x_{dc}\dot{Z}_d}{Z_d}, \quad \dot{y}_{dc} = \dfrac{\dot{Y}_d - y_{dc}\dot{Z}_d}{Z_d} \quad (9)$
Substituting equations (8) and (9) into equation (7) and rearranging gives:
$\begin{bmatrix} \dot{x}_{dc} \\ \dot{y}_{dc} \end{bmatrix} = \begin{bmatrix} -\frac{1}{Z_d} & 0 & \frac{x_{dc}}{Z_d} & x_{dc} y_{dc} & -(1+x_{dc}^2) & y_{dc} \\ 0 & -\frac{1}{Z_d} & \frac{y_{dc}}{Z_d} & 1+y_{dc}^2 & -x_{dc} y_{dc} & -x_{dc} \end{bmatrix} \begin{bmatrix} v_c \\ \omega_c \end{bmatrix} \quad (10)$
Equation (10) relates the change of the control point features on the camera's normalized imaging plane to the motion of the camera at the end of the robot arm. From equation (10), one point image feature corresponds to two components, so to avoid under-actuation when controlling a 6-degree-of-freedom robot arm, at least 3 curve control vertices should be selected as image features. When more than 3 feature points are selected, the pose of the 6-degree-of-freedom robot is determined uniquely and more reliably. Therefore, balancing the real-time performance of the visual servo system, the invention selects 4 NURBS curve control points as image features to perform the servo task.
Preferably, in S4, let points A and B be two adjacent curve control points obtained by NURBS curve fitting of the target object's contour, and let O be the intersection of the camera optical axis and the image normalization plane. The equation of the line l_AB is determined from the image coordinates of the two control points A and B; let the slope of l_AB be k_1 and the slope of the perpendicular line p_1O be k_2 = -1/k_1. From the coordinates of the origin O and the slope k_2, the coordinates of the point p_1 on the focal-length-normalized imaging plane are found;
Let the coordinates of p_1 be p_1(x_{p1}, y_{p1}, 1); the polar coordinate parameters of the point p_1 are then expressed as:
$\rho_1 = \sqrt{x_{p1}^2 + y_{p1}^2}, \quad \alpha_1 = \arctan\left(\dfrac{y_{p1}}{x_{p1}}\right) \quad (11)$
Along the line l_AB, take two points p_2 and p_3 symmetric about p_1, with polar coordinate parameters (ρ_2, α_2) and (ρ_3, α_3) respectively. Substituting the parameters of p_2 and p_3 into the polar coordinate equation of the line gives:
$\rho_2 \cos(\alpha_2 - \alpha) = \rho, \quad \rho_3 \cos(\alpha_3 - \alpha) = \rho \quad (12)$
where α_2 = α + Δα, α_3 = α - Δα, ρ_2 = ρ_3, and Δα is a positive number approaching 0; the parameter α of the line l_AB in polar coordinates is then expressed with p_2 and p_3 as:
$\alpha = \dfrac{\alpha_2 + \alpha_3}{2} \quad (13)$
Differentiating equation (13) with respect to time and rearranging:
$\dot{\alpha} = \dfrac{\dot{\alpha}_2 + \dot{\alpha}_3}{2} \quad (14)$
Substituting the rate of change of the points' polar coordinate parameters into equation (14) and rearranging yields the interaction matrix based on the line polar coordinate parameter α:
[Equation (15) appears as an image in the original.]
where v = [v_x, v_y, v_z]^T is the translational velocity of the camera along the x, y, and z axes, and ω = [ω_x, ω_y, ω_z]^T is the rotational velocity of the camera about the x, y, and z axes,
[Equation (16) appears as an image in the original.]
and z_2, z_3 are the depth values of points p_2 and p_3, obtained by online estimation during camera motion. When the line feature is perpendicular to the camera optical axis, z_2 = z_3, and the change of the line feature parameter α can be regarded approximately as produced only by the change of the camera's rotational attitude; equation (15) is rewritten as:
[Equation (17) appears as an image in the original.]
After the relevant line feature parameters are calculated from the connecting line of the two control points, the camera attitude is controlled using the interaction matrix shown in equation (17).
Preferably, in S5, let A and B be two NURBS curve control vertices on the focal-length-normalized imaging plane, with pixel coordinates A(x_A, y_A) and B(x_B, y_B). From elementary geometry, the distance between the two points A and B is:
$d_{AB} = \sqrt{(x_A - x_B)^2 + (y_A - y_B)^2} \quad (18)$
Differentiating both sides of the above equation with respect to time gives the rate of change of the distance between the two control points:
$\dot{d}_{AB} = \dfrac{(x_A - x_B)(\dot{x}_A - \dot{x}_B) + (y_A - y_B)(\dot{y}_A - \dot{y}_B)}{d_{AB}} \quad (19)$
Substituting the point feature interaction matrix (10) into equation (18) gives the interaction matrix of the distance feature between the two points:
[Equation (20) appears as an image in the original.]
where:
[Equation appears as an image in the original.]
and z_A, z_B are the depth values of the curve control vertices A and B, obtained by online estimation during camera motion.
Preferably, in S6, the rotational motion rate of the camera is obtained from the line feature interaction matrix as:
[Equation (21) appears as an image in the original.]
where L_{lω} is the line feature interaction matrix of the connecting line of the two control points, and ρ and α are the line feature parameters of the control-point connecting line.
[Equation (22) appears as an image in the original.]
The relationship between the rate of rotational motion of the camera and the rate of change of the projected position of the control point feature on the image plane is:
[Equation (23) appears as an image in the original.]
where L_{pω} is the interaction matrix relating the rate of change of the control point features on the imaging plane to the rotational motion rate of the camera; L_{pω} is:
[Equation (24) appears as an image in the original.]
where x_{cn}, y_{cn} are the pixel coordinates of the n-th curve control point on the camera imaging plane;
Using the interaction matrix and the feature variation on the image plane, the rotation compensation amount of the curve control points is designed as:
[Equation (25) appears as an image in the original.]
where L_{pv} is the interaction matrix relating the rate of change of the control point features on the imaging plane to the translational motion rate of the camera; L_{pv} is:
[Equation (26) appears as an image in the original.]
where z_{cn} is the depth value of the n-th curve control point.
Preferably, in S7, let z_{ci} be the depth value of the i-th curve control vertex, with pixel coordinates P_{ci}(x_{ci}, y_{ci}, 1); the estimate of z_{ci} is then obtained as follows:
[Equation (27) appears as an image in the original.]
The depth information z_{lj} of a point P_{lj}(x_{lj}, y_{lj}, 1) on the line feature of the connecting line of two adjacent control points on the camera focal-length-normalized imaging plane is estimated online from the rate of change of the line feature parameters and the motion velocity of the camera; z_{lj} is estimated as follows:
[Equation (28) appears as an image in the original.]
Preferably, in S8, the rotational attitude of the camera is controlled with the line feature of the connecting line of adjacent control points, the translational motion of the robot along the camera Z-axis is controlled with the distance feature between the two control points, the translational motion of the robot along the X- and Y-axes is controlled with the control point features, and a rotational-motion compensation amount is added to the position control of the robot.
Define the error of the curve control points in the image feature space as e_p(t), the angle error of the connecting line of adjacent curve control points as e_α(t), and the distance error between the two control points as e_d(t):
$e_p(t) = f_{ph} - f_{pc}(t) \quad (29)$
$e_\alpha(t) = f_{\alpha h} - f_{\alpha c}(t) \quad (30)$
$e_d(t) = f_{dh} - f_{dc}(t) \quad (31)$
In equation (29), f_{ph} is the desired curve control point feature and f_{pc}(t) is the control point feature at the current camera pose; in equation (30), f_{αh} is the desired angle feature of the connecting line of adjacent curve control points and f_{αc}(t) is that angle feature at the current camera pose; in equation (31), f_{dh} is the desired distance feature between the two control points and f_{dc} is the distance feature between the two control points at the current camera pose,
and the variation terms (shown as an image in the original) are, respectively, the image feature variations of the point feature, of the line feature of the control-point connecting line, and of the between-points distance feature required to complete the servo task;
The aim of the uncalibrated visual servo system is to make each image feature error defined on the image plane converge to 0; therefore, equations (32), (33), and (34) are defined as follows:
$\dot{e}_p(t) = -k_p e_p(t) \quad (32)$
$\dot{e}_\alpha(t) = -k_\alpha e_\alpha(t) \quad (33)$
$\dot{e}_d(t) = -k_d e_d(t) \quad (34)$
where k_p, k_α, k_d are the system controller gain factors;
Substituting equations (32), (33), and (34), together with the definition of the Jacobian, into equations (29), (30), and (31), respectively, gives:
$\dot{q}_p = -k_p J_p^{+} e_p(t) \quad (35)$
$\dot{q}_\alpha = -k_\alpha J_\alpha^{+} e_\alpha(t) \quad (36)$
$\dot{q}_d = -k_d J_d^{+} e_d(t) \quad (37)$
where J_p^+ is the pseudo-inverse of the control point feature interaction matrix J_p, and J_α^+ and J_d^+ are, respectively, the pseudo-inverse of the line feature interaction matrix and the pseudo-inverse of the between-points distance feature interaction matrix;
Take the first 3 columns of the matrix J_p in equation (35) to form the matrix J_pv and the last 3 columns to form the matrix J_pω; the compensation amount of the curve rotation compensation module to the position control is then:
[Equation (38) appears as an image in the original.]
where J_pv^+ is the pseudo-inverse of the matrix J_pv;
Taking the first 2 rows of the compensated position solution (the quantities appear as images in the original) as the X- and Y-axis translational control, the position control amount of the robot end motion is:
[Equation (39) appears as an image in the original.]
The total joint control amount required to perform the servo task is:
[Equation (40) appears as an image in the original.]
the invention has the following beneficial effects:
The invention provides a separated visual servo control method based on composite features which, on the basis of the image features obtained by curve fitting, constructs point features, between-points distance features, and line features of adjacent-point connecting lines to control the translational and rotational poses of the robot. Its advantages are: 1) a composite feature is designed that contains more image information; 2) the translational and rotational control of the robot is partially separated, so the robot converges to the desired pose quickly and smoothly; 3) a rotation compensation module in the position control law compensates the feature motion in the translational direction caused by the robot's rotation, giving the robot good decoupling characteristics and, to a certain extent, preventing the target object from being lost from the camera's field of view.
Drawings
FIG. 1 is a block diagram of a servo system of the present invention;
FIG. 2 is a NURBS curve fitting flow chart of the present invention;
FIG. 3 is a schematic view of a vision servo system of the present invention;
FIG. 4 is a diagram of the line feature of the invention formed by connecting adjacent control points;
Detailed Description
The following description of the embodiments of the present invention will be made with reference to the accompanying drawings:
As shown in fig. 1, a separated visual servo control method based on composite features, taking a six-degree-of-freedom industrial robot servo task as an example, with the camera mounted at the end of the robot arm, i.e. an eye-in-hand configuration, comprises the following steps:
s1, inversely calculating a curve control vertex based on a NURBS curve fitting technology, and obtaining curve control point characteristics;
specifically, step S1 includes the following sub-steps:
s11, processing the projection contour curve on the camera imaging plane, and extracting the data point cloud coordinates of the projection curve;
s12, dividing the curve point cloud data coordinates into r equal intervals;
s13, selecting an n-point neighborhood of each data point in each interval for polynomial fitting to obtain a piecewise polynomial, and then calculating the curvature of the data points in each piecewise polynomial;
s14, selecting in each interval the data point with the maximum absolute curvature, together with the first and last end points of the curve, as the curve type-value points;
and S15, inversely calculating curve control points according to the De Boor-Cox algorithm.
Specifically, in S15, the type-value points of the NURBS curve of degree k are denoted q_i (i = 0, 1, …, n), and the node vector U = [u_0, u_1, …, u_{n+6}] is calculated as:
[Equation (1) appears as an image in the original.]
where:
[Equation (2) appears as an image in the original.]
after curve type value points and node vectors are obtained, curve control points are inversely calculated according to a De Boor-Cox algorithm, and a NURBS equation system of node interpolation is as follows:
[Equation (3), the NURBS interpolation system, appears as an image in the original.]
where d_j is a curve control point, ω_j is the weight factor of the control point (here ω_j = 1), and B_{j,k}(u_j) is the B-spline basis function derived from the node vector by the De Boor-Cox recursion formula.
As shown in fig. 2, after all curve control vertices are obtained, if the accuracy of the fitted curve does not meet the requirement, the number of nodes is increased until the fitting accuracy meets the requirement.
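As a concrete illustration of the S11-S15 flow above, the following minimal sketch selects type-value points by curvature and back-calculates the control points of an interpolating cubic B-spline. It is a sketch only: SciPy's make_interp_spline stands in for the De Boor-Cox back-calculation, the chord-length knot parameterization and the interval count r = 8 are assumptions, and the elliptical test contour replaces a real extracted projection curve.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

def select_type_value_points(points, r=8):
    """S12-S14: split the contour point cloud into r intervals and keep, per
    interval, the point of maximum |curvature|, plus the two end points."""
    x, y = points[:, 0], points[:, 1]
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    kappa = (dx * ddy - dy * ddx) / np.maximum((dx ** 2 + dy ** 2) ** 1.5, 1e-9)
    idx = [seg[np.argmax(np.abs(kappa[seg]))]
           for seg in np.array_split(np.arange(len(points)), r)]
    idx = sorted(set([0] + idx + [len(points) - 1]))
    return points[idx]

def back_calculate_control_points(q, k=3):
    """S15: back-calculate the control points of an interpolating cubic
    B-spline through the type-value points q (chord-length knots)."""
    chord = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(q, axis=0), axis=1))]
    u = chord / chord[-1]
    spl = make_interp_spline(u, q, k=k)  # solves the interpolation system
    return spl.c, spl.t                  # control points d_j, knot vector U

# usage: a noisy-free ellipse stands in for an extracted projection contour
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
cloud = np.c_[320 + 120 * np.cos(t), 240 + 80 * np.sin(t)]
q = select_type_value_points(cloud, r=8)
d, U = back_calculate_control_points(q)
print(q.shape[0], "type-value points ->", d.shape[0], "control points")
```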
S2, calculating a two-dimensional projection curve of the spatial NURBS curve on the camera imaging plane;
Specifically, as shown in fig. 3, a robot visual servo system exists in space with the camera mounted at the end of the arm, i.e. an eye-in-hand configuration. Define the robot end-effector coordinate system as {T}, the camera coordinate system as {C}, and the robot coordinate system as {R}. When a NURBS curve describes the space curve, the space control vertices are d_i (i = 0, 1, …, n), and the projected control vertices of d_i on the camera imaging plane are denoted ^C d_i (i = 0, 1, …, n). According to the perspective projection model of the camera, at any time t the control vertex d_i of the space curve projects to the control point ^C d_i on the camera image plane, expressed as:
[Equation (4) appears as an image in the original.]
where M is the intrinsic parameter matrix of the camera, obtained by camera calibration; H(q(t)) is the homogeneous transformation matrix from the robot base coordinate system to the robot end coordinate system, and q(t) is the joint variable of the robot at time t; H(q(t)) is computed from the D-H parameters of the robot and the joint angles of the robot at time t;
From the NURBS curve definition and equation (4), at time t the two-dimensional projection curve of the spatial NURBS curve on the camera imaging plane can be represented as:
$^{C}C(u,t) = \dfrac{\sum_{j=0}^{n} B_{j,k}(u)\,\omega_j\,{}^{C}d_j(t)}{\sum_{j=0}^{n} B_{j,k}(u)\,\omega_j} \quad (5)$
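The projection of equation (4) can be illustrated with a short sketch. The 4x4 transform H_cam_world below is a hypothetical stand-in for the M·H(q(t)) chain of the patent (which also involves the hand-eye transform), and the intrinsic matrix values are illustrative only.

```python
import numpy as np

def project_control_points(d_world, M, H_cam_world):
    """Transform spatial control vertices into the camera frame and apply the
    pinhole intrinsics M, in the spirit of equation (4)."""
    d_h = np.c_[d_world, np.ones(len(d_world))]      # homogeneous coordinates
    d_cam = (H_cam_world @ d_h.T).T[:, :3]           # control vertices in {C}
    uv = (M @ (d_cam / d_cam[:, 2:3]).T).T           # perspective division + M
    return uv[:, :2]

M = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                      # assumed intrinsics
H_cam_world = np.eye(4)
H_cam_world[2, 3] = 1.5                              # scene 1.5 m ahead (assumed)
d = np.array([[0.10, 0.00, 0.20], [0.00, 0.10, 0.25], [-0.10, 0.05, 0.30]])
print(project_control_points(d, M, H_cam_world))
```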
s3, after the real-time fitting operation is carried out on the contour projection curve of the object by a curve fitting method, calculating an interaction matrix based on the characteristics of curve control points by using the obtained curve control points;
Specifically, let the rigid-body motion velocity in space of the camera mounted at the end of the robot arm be V = (v_c, ω_c), and let P(X_d, Y_d, Z_d) be the coordinates of a space control vertex P relative to the camera; the velocity of the curve control vertex P in the camera coordinate system is:
$\dot{P} = -v_c - \omega_c \times P \quad (6)$
the scalar form of equation (6) is:
$\dot{X}_d = -v_{cx} - \omega_{cy} Z_d + \omega_{cz} Y_d, \quad \dot{Y}_d = -v_{cy} - \omega_{cz} X_d + \omega_{cx} Z_d, \quad \dot{Z}_d = -v_{cz} - \omega_{cx} Y_d + \omega_{cy} X_d \quad (7)$
where v_c = [v_{cx}, v_{cy}, v_{cz}]^T is the linear velocity of the camera and ω_c = [ω_{cx}, ω_{cy}, ω_{cz}]^T is the angular velocity of the camera.
According to the projection perspective relation of the camera, the coordinates of the curve control points on the image normalization plane are expressed as:
$x_{dc} = X_d / Z_d, \quad y_{dc} = Y_d / Z_d \quad (8)$
Differentiating both sides of equation (8) with respect to time:
$\dot{x}_{dc} = \dfrac{\dot{X}_d - x_{dc}\dot{Z}_d}{Z_d}, \quad \dot{y}_{dc} = \dfrac{\dot{Y}_d - y_{dc}\dot{Z}_d}{Z_d} \quad (9)$
Substituting equations (8) and (9) into equation (7) and rearranging gives:
$\begin{bmatrix} \dot{x}_{dc} \\ \dot{y}_{dc} \end{bmatrix} = \begin{bmatrix} -\frac{1}{Z_d} & 0 & \frac{x_{dc}}{Z_d} & x_{dc} y_{dc} & -(1+x_{dc}^2) & y_{dc} \\ 0 & -\frac{1}{Z_d} & \frac{y_{dc}}{Z_d} & 1+y_{dc}^2 & -x_{dc} y_{dc} & -x_{dc} \end{bmatrix} \begin{bmatrix} v_c \\ \omega_c \end{bmatrix} \quad (10)$
Equation (10) relates the change of the control point features on the camera's normalized imaging plane to the motion of the camera at the end of the robot arm. From equation (10), one point image feature corresponds to two components, so to avoid under-actuation when controlling a 6-degree-of-freedom robot arm, at least 3 curve control vertices should be selected as image features. When more than 3 feature points are selected, the pose of the 6-degree-of-freedom robot is determined uniquely and more reliably. Therefore, balancing the real-time performance of the visual servo system, the invention selects 4 NURBS curve control points as image features to perform the servo task.
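A sketch of the point feature interaction matrix of equation (10), stacked for the 4 selected NURBS control points; the function names and sample coordinates are illustrative, and the depths would come from the online estimation of S7.

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """2x6 interaction matrix L_p of one normalized image point (x, y) at
    depth Z, as in equation (10): [x_dot, y_dot]^T = L_p [v_c, w_c]^T."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def stacked_point_matrix(points, depths):
    """Stack the 2x6 blocks of the selected control points (8x6 for 4 points)."""
    return np.vstack([point_interaction_matrix(x, y, Z)
                      for (x, y), Z in zip(points, depths)])

pts = [(0.10, 0.20), (-0.30, 0.15), (0.25, -0.10), (-0.05, -0.20)]
L_p = stacked_point_matrix(pts, depths=[1.2, 1.1, 1.3, 1.25])
print(L_p.shape)  # (8, 6): enough rows for a 6-DOF arm, avoiding under-actuation
```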
S4, selecting curve control point connection characteristics, and calculating an interaction matrix based on the curve control point connection characteristics;
Specifically, as shown in FIG. 4, point A and point B are two adjacent curve control points obtained by NURBS curve fitting of the target object's contour, and O is the intersection of the camera optical axis and the image normalization plane, with coordinates set to O(500, 500). The equation of the line l_AB is determined from the image coordinates of the two control points A and B; let the slope of l_AB be k_1 and the slope of the perpendicular line p_1O be k_2 = -1/k_1. From the coordinates of the origin O and the slope k_2, the coordinates of the point p_1 on the focal-length-normalized imaging plane are found;
Let the coordinates of p_1 be p_1(x_{p1}, y_{p1}, 1); the polar coordinate parameters of the point p_1 are then expressed as:
$\rho_1 = \sqrt{x_{p1}^2 + y_{p1}^2}, \quad \alpha_1 = \arctan\left(\dfrac{y_{p1}}{x_{p1}}\right) \quad (11)$
Along the line l_AB, take two points p_2 and p_3 symmetric about p_1, with polar coordinate parameters (ρ_2, α_2) and (ρ_3, α_3) respectively. Substituting the parameters of p_2 and p_3 into the polar coordinate equation of the line gives:
$\rho_2 \cos(\alpha_2 - \alpha) = \rho, \quad \rho_3 \cos(\alpha_3 - \alpha) = \rho \quad (12)$
where α_2 = α + Δα, α_3 = α - Δα, ρ_2 = ρ_3, and Δα is a positive number approaching 0; the parameter α of the line l_AB in polar coordinates is then expressed with p_2 and p_3 as:
$\alpha = \dfrac{\alpha_2 + \alpha_3}{2} \quad (13)$
Differentiating equation (13) with respect to time and rearranging:
$\dot{\alpha} = \dfrac{\dot{\alpha}_2 + \dot{\alpha}_3}{2} \quad (14)$
Substituting the rate of change of the points' polar coordinate parameters into equation (14) and rearranging yields the interaction matrix based on the line polar coordinate parameter α:
[Equation (15) appears as an image in the original.]
where:
[Equation (16) appears as an image in the original.]
and z_2, z_3 are the depth values of points p_2 and p_3, obtained by online estimation during camera motion. When the line feature is perpendicular to the camera optical axis, z_2 = z_3, and the change of the line feature parameter α can be regarded approximately as produced only by the change of the camera's rotational attitude; equation (15) is rewritten as:
[Equation (17) appears as an image in the original.]
After the relevant line feature parameters are calculated from the connecting line of the two control points, the camera attitude is controlled using the interaction matrix shown in equation (17).
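The p_1/p_2/p_3 construction of equations (11)-(13) can be sketched as below, assuming O is the principal point at the origin of the normalized plane; the offset step and the absence of angle wrap-around handling are simplifications.

```python
import numpy as np

def line_alpha(A, B, d_alpha=1e-3):
    """Polar parameters (rho, alpha) of line AB via the foot of the
    perpendicular p1 from O and the symmetric points p2, p3 (eqs 11-13)."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    u = (B - A) / np.linalg.norm(B - A)       # unit direction of l_AB
    p1 = A - np.dot(A, u) * u                 # foot of perpendicular from O=(0,0)
    rho = np.linalg.norm(p1)
    # p2, p3: symmetric about p1 along the line, at polar angles alpha +/- d_alpha
    s = rho * np.tan(d_alpha)
    p2, p3 = p1 + s * u, p1 - s * u
    a2, a3 = np.arctan2(p2[1], p2[0]), np.arctan2(p3[1], p3[0])
    return rho, 0.5 * (a2 + a3)               # eq (13); no wrap-around handling

rho, alpha = line_alpha((0.10, 0.30), (-0.20, 0.25))
print(rho, np.degrees(alpha))
```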
S5, calculating the distance between curve control points as a distance characteristic, and calculating an interaction matrix based on the distance characteristic between the two points;
Specifically, let A and B be two NURBS curve control vertices on the focal-length-normalized imaging plane, with pixel coordinates A(x_A, y_A) and B(x_B, y_B). From elementary geometry, the distance between the two points A and B is:
$d_{AB} = \sqrt{(x_A - x_B)^2 + (y_A - y_B)^2} \quad (18)$
Differentiating both sides of the above equation with respect to time gives the rate of change of the distance between the two control points:
$\dot{d}_{AB} = \dfrac{(x_A - x_B)(\dot{x}_A - \dot{x}_B) + (y_A - y_B)(\dot{y}_A - \dot{y}_B)}{d_{AB}} \quad (19)$
Substituting the point feature interaction matrix (10) into equation (18) gives the interaction matrix of the distance feature between the two points:
[Equation (20) appears as an image in the original.]
where:
[Equation appears as an image in the original.]
and z_A, z_B are the depth values of the curve control vertices A and B, obtained by online estimation during camera motion.
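A sketch of the between-points distance interaction matrix, obtained by chaining the point interaction matrices of A and B through the derivative in equation (19); the reconstructed equation (10) rows are reused, and the sample coordinates and depths are illustrative.

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """2x6 point feature interaction matrix of equation (10)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def distance_interaction_matrix(A, B, zA, zB):
    """1x6 interaction row L_d with d_dot = L_d V, from equation (19)."""
    (xA, yA), (xB, yB) = A, B
    d = np.hypot(xA - xB, yA - yB)                       # equation (18)
    LA = point_interaction_matrix(xA, yA, zA)
    LB = point_interaction_matrix(xB, yB, zB)
    return ((xA - xB) * (LA[0] - LB[0]) + (yA - yB) * (LA[1] - LB[1])) / d

L_d = distance_interaction_matrix((0.1, 0.2), (-0.3, 0.15), 1.2, 1.1)
print(L_d)   # 6-vector: sensitivity of d_AB to [v, w]
```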
S6, compensating for the change of the image characteristic position caused by the change of the camera rotation attitude, so that the system obtains better decoupling characteristic and the performance of the servo system is improved;
Specifically, the rotational motion rate of the camera is obtained from the line feature interaction matrix as:
[Equation (21) appears as an image in the original.]
where L_{lω} is the line feature interaction matrix of the connecting line of the two control points, and ρ and α are the line feature parameters of the control-point connecting line.
[Equation (22) appears as an image in the original.]
The relationship between the rate of rotational motion of the camera and the rate of change of the projected position of the control point feature on the image plane is:
[Equation (23) appears as an image in the original.]
where L_{pω} is the interaction matrix relating the rate of change of the control point features on the imaging plane to the rotational motion rate of the camera; L_{pω} is:
[Equation (24) appears as an image in the original.]
where x_{cn}, y_{cn} are the pixel coordinates of the n-th curve control point on the camera imaging plane;
Using the interaction matrix and the feature variation on the image plane, the rotation compensation amount of the curve control points is designed as:
[Equation (25) appears as an image in the original.]
where L_{pv} is the interaction matrix relating the rate of change of the control point features on the imaging plane to the translational motion rate of the camera; L_{pv} is:
[Equation (26) appears as an image in the original.]
where z_{cn} is the depth value of the n-th curve control point.
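Since the compensation equation itself is rendered only as an image above, the following sketch shows one plausible reading of the module: predict the control point drift caused by the camera's rotational rate through L_pω, then cancel it with a translational velocity through the pseudo-inverse of L_pv. The split of the point interaction matrix into L_pv and L_pω follows equation (10); everything else is an assumption.

```python
import numpy as np

def rotation_compensation(points, depths, omega_c):
    """Assumed form of the module: predict the image drift of the control
    points caused by camera rotation (L_pw @ omega_c), then cancel it with a
    translational velocity through the pseudo-inverse of L_pv."""
    L_pv, L_pw = [], []
    for (x, y), Z in zip(points, depths):
        L_pv.append([[-1.0 / Z, 0.0, x / Z],
                     [0.0, -1.0 / Z, y / Z]])                 # translational part
        L_pw.append([[x * y, -(1.0 + x * x), y],
                     [1.0 + y * y, -x * y, -x]])              # rotational part
    L_pv, L_pw = np.vstack(L_pv), np.vstack(L_pw)
    drift = L_pw @ omega_c                  # predicted feature drift
    return -np.linalg.pinv(L_pv) @ drift    # compensating (vx, vy, vz)

pts = [(0.10, 0.20), (-0.30, 0.15), (0.25, -0.10), (-0.05, -0.20)]
print(rotation_compensation(pts, [1.2, 1.1, 1.3, 1.25],
                            np.array([0.00, 0.05, 0.10])))
```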
S7, because the visual servo system adopts a monocular camera, the depth information of the space control vertex can not be directly obtained, and in order to obtain the depth information of the curve control vertex, the depth information in the interactive matrix is subjected to online depth estimation by utilizing the projection variation quantity on the image plane of the curve control point characteristic and the motion speed of the camera;
Specifically, in S7, let z_{ci} be the depth value of the i-th curve control vertex, with pixel coordinates P_{ci}(x_{ci}, y_{ci}, 1); the estimate of z_{ci} is then obtained as follows:
[Equation (27) appears as an image in the original.]
The depth information z_{lj} of a point P_{lj}(x_{lj}, y_{lj}, 1) on the line feature of the connecting line of two adjacent control points on the camera focal-length-normalized imaging plane is estimated online from the rate of change of the line feature parameters and the motion velocity of the camera; z_{lj} is estimated as follows:
[Equation (28) appears as an image in the original.]
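The idea of the online estimation can be sketched for a single control point: the rotational contribution to the image velocity in equation (10) is depth-independent, so subtracting it isolates a term proportional to 1/Z that can be solved in least squares. The function below is a sketch of this idea, not the patent's estimator; all sample values are illustrative.

```python
import numpy as np

def estimate_depth(x, y, x_dot, y_dot, v, w):
    """Least-squares depth of one control point from its image velocity and
    the camera velocity, by isolating the 1/Z term of equation (10)."""
    rot = np.array([x * y * w[0] - (1.0 + x * x) * w[1] + y * w[2],
                    (1.0 + y * y) * w[0] - x * y * w[1] - x * w[2]])
    b = np.array([-v[0] + x * v[2], -v[1] + y * v[2]])  # scales with 1/Z
    residual = np.array([x_dot, y_dot]) - rot           # translational part only
    inv_Z = float(b @ residual) / float(b @ b)          # least-squares 1/Z
    return 1.0 / inv_Z

print(estimate_depth(0.1, 0.2, -0.022, 0.011,
                     v=np.array([0.05, 0.0, 0.02]),
                     w=np.array([0.0, 0.0, 0.01])))
```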
s8, in the robot attitude controller, controlling the rotational attitude of the camera with the line feature of the connecting line of adjacent curve control points; in the robot Z-axis translation controller, controlling the translational motion of the robot along the camera Z-axis with the distance feature between two curve control points; in the X- and Y-axis translation controller, controlling the translational motion of the robot along the camera X- and Y-axes with the curve control point features, and adding the rotational-motion compensation amount to the position control of the robot, finally completing the robot servo task.
Specifically, the rotation posture of the camera is controlled by adopting a linear characteristic of a connecting line of adjacent control points, the translational motion of the robot along the Z-axis direction of the camera is controlled by utilizing a distance characteristic between the two control points, the translational motion of the robot along the X, Y-axis direction of the camera is controlled by utilizing the characteristics of the control points, and a rotation motion compensation amount is added in the position control of the robot.
Define the error of the curve control points in the image feature space as e_p(t), the angle error of the connecting line of adjacent curve control points as e_α(t), and the distance error between the two control points as e_d(t):
$e_p(t) = f_{ph} - f_{pc}(t) \quad (29)$
$e_\alpha(t) = f_{\alpha h} - f_{\alpha c}(t) \quad (30)$
$e_d(t) = f_{dh} - f_{dc}(t) \quad (31)$
In equation (29), f_{ph} is the desired curve control point feature and f_{pc}(t) is the control point feature at the current camera pose; in equation (30), f_{αh} is the desired angle feature of the connecting line of adjacent curve control points and f_{αc}(t) is that angle feature at the current camera pose; in equation (31), f_{dh} is the desired distance feature between the two control points and f_{dc} is the distance feature between the two control points at the current camera pose,
and the variation terms (shown as an image in the original) are, respectively, the image feature variations of the point feature, of the line feature of the control-point connecting line, and of the between-points distance feature required to complete the servo task;
The aim of the uncalibrated visual servo system is to make each image feature error defined on the image plane converge to 0; therefore, equations (32), (33), and (34) are defined as follows:
$\dot{e}_p(t) = -k_p e_p(t) \quad (32)$
$\dot{e}_\alpha(t) = -k_\alpha e_\alpha(t) \quad (33)$
$\dot{e}_d(t) = -k_d e_d(t) \quad (34)$
where k_p, k_α, k_d are the system controller gain factors;
Substituting equations (32), (33), and (34), together with the definition of the Jacobian, into equations (29), (30), and (31), respectively, gives:
$\dot{q}_p = -k_p J_p^{+} e_p(t) \quad (35)$
$\dot{q}_\alpha = -k_\alpha J_\alpha^{+} e_\alpha(t) \quad (36)$
$\dot{q}_d = -k_d J_d^{+} e_d(t) \quad (37)$
where J_p(t) is the image Jacobian matrix based on the point features and J_p^+ is the pseudo-inverse of the control point feature interaction matrix J_p; J_α(t) is the image Jacobian matrix based on the line feature of the control-point connecting line and J_α^+ is its pseudo-inverse; J_d(t) is the image Jacobian matrix based on the between-points distance feature and J_d^+ is its pseudo-inverse;
The solution components (shown as images in the original) together form the 6 joint control values required, i.e. the translational and rotational components in the x, y, and z directions: one corresponds to the translational components along the x- and y-axes, one to the translational component along the z-axis, and one to the rotational components about the x-, y-, and z-axes.
Take the first 3 columns of the matrix J_p in equation (35) to form the matrix J_pv and the last 3 columns to form the matrix J_pω; the compensation amount of the curve rotation compensation module to the position control is then:
[Equation (38) appears as an image in the original.]
where J_pv^+ is the pseudo-inverse of the matrix J_pv;
Taking the first 2 rows of the compensated position solution (the quantities appear as images in the original) as the X- and Y-axis translational control, the position control amount of the robot end motion is:
[Equation (39) appears as an image in the original.]
The total joint control amount required to perform the servo task is:
[Equation (40) appears as an image in the original.]
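A compact sketch of the separated control law of S8, combining the three controllers and the rotation compensation. Because equations (38)-(40) are rendered only as images, the composition below (including the assumed form of the compensation term) is a plausible reading, with gains and matrix shapes illustrative.

```python
import numpy as np

def separated_control(e_p, e_alpha, e_d, J_p, J_alpha, J_d,
                      k_p=0.5, k_alpha=0.5, k_d=0.5):
    """Separated law of S8: line feature -> rotation, distance feature -> Z
    translation, point features -> X/Y translation plus an assumed-form
    rotation compensation term."""
    q_rot = -k_alpha * np.linalg.pinv(J_alpha) @ e_alpha  # rotation (eq 36)
    q_z = -k_d * np.linalg.pinv(J_d) @ e_d                # Z translation (eq 37)
    J_pv, J_pw = J_p[:, :3], J_p[:, 3:]                   # split as in the text
    q_xy = -k_p * np.linalg.pinv(J_pv) @ e_p              # X/Y translation
    comp = -np.linalg.pinv(J_pv) @ (J_pw @ q_rot)         # compensation (assumed)
    v = q_xy + comp
    return np.r_[v[:2], q_z[2], q_rot]                    # [vx, vy, vz, wx, wy, wz]

# usage with illustrative shapes: J_p 8x6, J_alpha 1x3, J_d 1x6
rng = np.random.default_rng(0)
u = separated_control(rng.normal(size=8), np.array([0.1]), np.array([0.05]),
                      rng.normal(size=(8, 6)), rng.normal(size=(1, 3)),
                      rng.normal(size=(1, 6)))
print(u)
```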
it is to be understood that the above description is not intended to limit the present invention, and the present invention is not limited to the above examples, and those skilled in the art may make modifications, alterations, additions or substitutions within the spirit and scope of the present invention.

Claims (10)

1. A separated visual servo control method based on composite characteristics is characterized by comprising the following steps:
s1, inversely calculating a curve control vertex based on a NURBS curve fitting technology, and obtaining curve control point characteristics;
s2, calculating a two-dimensional projection curve of the spatial NURBS curve on the camera imaging plane;
s3, after the real-time fitting operation is carried out on the contour projection curve of the object by a curve fitting method, calculating an interaction matrix based on the characteristics of curve control points by using the obtained curve control points;
s4, selecting curve control point connection characteristics, and calculating an interaction matrix based on the curve control point connection characteristics;
s5, calculating the distance between curve control points as a distance characteristic, and calculating an interaction matrix based on the distance characteristic between the two points;
s6, compensating for the change of the image characteristic position caused by the change of the camera rotation posture;
s7, in order to obtain the depth information of the curve control vertex, performing online depth estimation on the depth information in the interaction matrix by using the projection variation on the image plane of the curve control point characteristic and the motion speed of the camera;
s8, controlling the rotational attitude of the camera with the line feature of the connecting line of adjacent curve control points, controlling the translational motion of the robot along the camera Z-axis with the distance feature between two curve control points, controlling the translational motion of the robot along the camera X- and Y-axes with the curve control point features, and adding the rotational-motion compensation amount to the position control of the robot, finally completing the robot servo task.
2. A method as claimed in claim 1, wherein the step S1 comprises the following sub-steps:
s11, processing the projection contour curve on the camera imaging plane, and extracting the data point cloud coordinates of the projection curve;
s12, dividing the curve point cloud data coordinates into r equal intervals;
s13, selecting an n-point neighborhood of each data point in each interval for polynomial fitting to obtain a piecewise polynomial, and then calculating the curvature of the data points in each piecewise polynomial;
s14, selecting in each interval the data point with the maximum absolute curvature, together with the first and last end points of the curve, as the curve type-value points;
and S15, inversely calculating curve control points according to the De Boor-Cox algorithm.
3. The method of claim 2, wherein in step S15, the type-value points of the NURBS curve of degree k are denoted q_i (i = 0, 1, …, n), and the node vector U = [u_0, u_1, …, u_{n+6}] is calculated as:
[Equation (1) appears as an image in the original.]
where:
[Equation (2) appears as an image in the original.]
after curve type value points and node vectors are obtained, curve control points are inversely calculated according to a De Boor-Cox algorithm, and a NURBS equation system of node interpolation is as follows:
[Equation (3), the NURBS interpolation system, appears as an image in the original.]
where d_j is a curve control point, ω_j is the weight factor of the control point (here ω_j = 1), and B_{j,k}(u_j) is the B-spline basis function derived from the node vector by the De Boor-Cox recursion formula;
after all curve control vertices are obtained, if the accuracy of the fitted curve does not meet the requirement, the number of nodes is increased until the fitting accuracy meets the requirement.
4. The separated visual servo control method based on composite features as claimed in claim 1, wherein in S2, a robot visual servo system exists in space with the camera mounted at the end of the arm, i.e. an eye-in-hand configuration; the robot end-effector coordinate system is defined as {T}, the camera coordinate system as {C}, and the robot coordinate system as {R}; when a NURBS curve describes the space curve, the space control vertices are d_i (i = 0, 1, …, n), and the projected control vertices of d_i on the camera imaging plane are denoted ^C d_i (i = 0, 1, …, n); according to the perspective projection model of the camera, at any time t the control vertex d_i of the space curve projects to the control point ^C d_i on the camera image plane, expressed as:
[Equation (4) appears as an image in the original.]
where M is the intrinsic parameter matrix of the camera, obtained by camera calibration; H(q(t)) is the homogeneous transformation matrix from the robot base coordinate system to the robot end coordinate system, and q(t) is the joint variable of the robot at time t; H(q(t)) is computed from the D-H parameters of the robot and the joint angles of the robot at time t;
from the NURBS curve definition and equation (4),
$^{C}C(u,t) = \dfrac{\sum_{j=0}^{n} B_{j,k}(u)\,\omega_j\,{}^{C}d_j(t)}{\sum_{j=0}^{n} B_{j,k}(u)\,\omega_j} \quad (5)$
equation (5) is a two-dimensional projection curve of the spatial NURBS curve on the camera imaging plane at the time t.
5. The visual servo control method of claim 1, wherein in S3, the rigid-body motion velocity in space of the camera mounted at the end of the robot arm is set to V = (v_c, ω_c), and P(X_d, Y_d, Z_d) are the coordinates of a space control vertex P relative to the camera; the velocity of the curve control vertex P in the camera coordinate system is:
$\dot{P} = -v_c - \omega_c \times P \quad (6)$
the scalar form of equation (6) is:
$\dot{X}_d = -v_{cx} - \omega_{cy} Z_d + \omega_{cz} Y_d, \quad \dot{Y}_d = -v_{cy} - \omega_{cz} X_d + \omega_{cx} Z_d, \quad \dot{Z}_d = -v_{cz} - \omega_{cx} Y_d + \omega_{cy} X_d \quad (7)$
where v_c = [v_{cx}, v_{cy}, v_{cz}]^T is the linear velocity of the camera and ω_c = [ω_{cx}, ω_{cy}, ω_{cz}]^T is the angular velocity of the camera;
according to the projection perspective relation of the camera, the coordinates of the curve control points on the image normalization plane are expressed as:
$x_{dc} = X_d / Z_d, \quad y_{dc} = Y_d / Z_d \quad (8)$
differentiating both sides of equation (8) with respect to time:
$\dot{x}_{dc} = \dfrac{\dot{X}_d - x_{dc}\dot{Z}_d}{Z_d}, \quad \dot{y}_{dc} = \dfrac{\dot{Y}_d - y_{dc}\dot{Z}_d}{Z_d} \quad (9)$
substituting equations (8) and (9) into equation (7) and rearranging gives:
$\begin{bmatrix} \dot{x}_{dc} \\ \dot{y}_{dc} \end{bmatrix} = \begin{bmatrix} -\frac{1}{Z_d} & 0 & \frac{x_{dc}}{Z_d} & x_{dc} y_{dc} & -(1+x_{dc}^2) & y_{dc} \\ 0 & -\frac{1}{Z_d} & \frac{y_{dc}}{Z_d} & 1+y_{dc}^2 & -x_{dc} y_{dc} & -x_{dc} \end{bmatrix} \begin{bmatrix} v_c \\ \omega_c \end{bmatrix} \quad (10)$
equation (10) relates the change of the control point features on the camera's normalized imaging plane to the motion of the camera at the end of the robot arm; from equation (10), one point image feature corresponds to two components.
6. The separated visual servo control method based on composite features as claimed in claim 1, wherein in S4, points A and B are two adjacent curve control points obtained by NURBS curve fitting of the target object's contour, and O is the intersection of the camera optical axis and the image normalization plane; the equation of the line l_AB is determined from the image coordinates of the two control points A and B; let the slope of l_AB be k_1 and the slope of the perpendicular line p_1O be k_2 = -1/k_1; from the coordinates of the origin O and the slope k_2, the coordinates of the point p_1 on the focal-length-normalized imaging plane are found;
let the coordinates of p_1 be p_1(x_{p1}, y_{p1}, 1); the polar coordinate parameters of the point p_1 are then expressed as:
$\rho_1 = \sqrt{x_{p1}^2 + y_{p1}^2}, \quad \alpha_1 = \arctan\left(\dfrac{y_{p1}}{x_{p1}}\right) \quad (11)$
along the line l_AB, take two points p_2 and p_3 symmetric about p_1, with polar coordinate parameters (ρ_2, α_2) and (ρ_3, α_3) respectively; substituting the parameters of p_2 and p_3 into the polar coordinate equation of the line gives:
$\rho_2 \cos(\alpha_2 - \alpha) = \rho, \quad \rho_3 \cos(\alpha_3 - \alpha) = \rho \quad (12)$
where α_2 = α + Δα, α_3 = α - Δα, ρ_2 = ρ_3, and Δα is a positive number approaching 0; the parameter α of the line l_AB in polar coordinates is then expressed with p_2 and p_3 as:
$\alpha = \dfrac{\alpha_2 + \alpha_3}{2} \quad (13)$
differentiating equation (13) with respect to time and rearranging:
$\dot{\alpha} = \dfrac{\dot{\alpha}_2 + \dot{\alpha}_3}{2} \quad (14)$
substituting the rate of change of the points' polar coordinate parameters into equation (14) yields the interaction matrix based on the line polar coordinate parameter α:
[Equation (15) appears as an image in the original.]
where v = [v_x, v_y, v_z]^T is the translational velocity of the camera along the x, y, and z axes, and ω = [ω_x, ω_y, ω_z]^T is the rotational velocity of the camera about the x, y, and z axes,
[Equation (16) appears as an image in the original.]
and z_2, z_3 are the depth values of points p_2 and p_3, obtained by online estimation during camera motion; when the line feature is perpendicular to the camera optical axis, z_2 = z_3, and the change of the line feature parameter α can be regarded approximately as produced only by the change of the camera's rotational attitude; equation (15) is rewritten as:
[Equation (17) appears as an image in the original.]
after the relevant line feature parameters are calculated from the connecting line of the two control points, the camera attitude is controlled using the interaction matrix shown in equation (17).
7. The method of claim 1, wherein in step S5, A and B are two NURBS curve control vertices on the focal-length-normalized imaging plane, with pixel coordinates A(x_A, y_A) and B(x_B, y_B); from elementary geometry, the distance between the two points A and B is:
$d_{AB} = \sqrt{(x_A - x_B)^2 + (y_A - y_B)^2} \quad (18)$
differentiating both sides of the above equation with respect to time gives the rate of change of the distance between the two control points:
$\dot{d}_{AB} = \dfrac{(x_A - x_B)(\dot{x}_A - \dot{x}_B) + (y_A - y_B)(\dot{y}_A - \dot{y}_B)}{d_{AB}} \quad (19)$
substituting the point feature interaction matrix (10) into equation (18) gives the interaction matrix of the distance feature between the two points:
[Equation (20) appears as an image in the original.]
where:
[Equation appears as an image in the original.]
and z_A, z_B are the depth values of the curve control vertices A and B, obtained by online estimation during camera motion.
8. The method of claim 1, wherein in step S6, the rotational motion rate of the camera is obtained from the line feature interaction matrix as:
[Equation (21) appears as an image in the original.]
where L_{lω} is the line feature interaction matrix of the connecting line of the two control points, and ρ and α are the line feature parameters of the control-point connecting line;
[Equation (22) appears as an image in the original.]
the relationship between the rate of rotational motion of the camera and the rate of change of the projected position of the control point feature on the image plane is:
[Equation (23) appears as an image in the original.]
where L_{pω} is the interaction matrix relating the rate of change of the control point features on the imaging plane to the rotational motion rate of the camera; L_{pω} is:
[Equation (24) appears as an image in the original.]
where x_{cn}, y_{cn} are the pixel coordinates of the n-th curve control point on the camera imaging plane;
using the interaction matrix and the feature variation on the image plane, the rotation compensation amount of the curve control points is designed as:
[Equation (25) appears as an image in the original.]
where L_{pv} is the interaction matrix relating the rate of change of the control point features on the imaging plane to the translational motion rate of the camera; L_{pv} is:
[Equation (26) appears as an image in the original.]
where z_{cn} is the depth value of the n-th curve control point.
9. The visual servo control method of claim 1, wherein in S7, z_{ci} is the depth value of the i-th curve control vertex, with pixel coordinates P_{ci}(x_{ci}, y_{ci}, 1); the estimate of z_{ci} is then obtained as follows:
[Equation (27) appears as an image in the original.]
the depth information z_{lj} of a point P_{lj}(x_{lj}, y_{lj}, 1) on the line feature of the connecting line of two adjacent control points on the camera focal-length-normalized imaging plane is estimated online from the rate of change of the line feature parameters and the motion velocity of the camera; z_{lj} is estimated as follows:
[Equation (28) appears as an image in the original.]
10. The visual servo method of claim 1, wherein in S8 the error of the curve control points is defined as e_p(t), the angle error of the connecting line of adjacent curve control points as e_α(t), and the distance error between the two control points as e_d(t):
$e_p(t) = f_{ph} - f_{pc}(t) \quad (29)$
$e_\alpha(t) = f_{\alpha h} - f_{\alpha c}(t) \quad (30)$
$e_d(t) = f_{dh} - f_{dc}(t) \quad (31)$
In equation (29), f_{ph} is the desired curve control point feature and f_{pc}(t) is the control point feature at the current camera pose; in equation (30), f_{αh} is the desired angle feature of the connecting line of adjacent curve control points and f_{αc}(t) is that angle feature at the current camera pose; in equation (31), f_{dh} is the desired distance feature between the two control points and f_{dc} is the distance feature between the two control points at the current camera pose,
and the variation terms (shown as an image in the original) are, respectively, the image feature variations of the point feature, of the line feature of the control-point connecting line, and of the between-points distance feature required to complete the servo task;
the following are defined in equations (32), (33) and (34):
$\dot{e}_p(t) = -k_p e_p(t) \quad (32)$
$\dot{e}_\alpha(t) = -k_\alpha e_\alpha(t) \quad (33)$
$\dot{e}_d(t) = -k_d e_d(t) \quad (34)$
where k_p, k_α, k_d are the system controller gain factors;
substituting equations (32), (33), and (34), together with the definition of the Jacobian, into equations (29), (30), and (31), respectively, gives:
$\dot{q}_p = -k_p J_p^{+} e_p(t) \quad (35)$
$\dot{q}_\alpha = -k_\alpha J_\alpha^{+} e_\alpha(t) \quad (36)$
$\dot{q}_d = -k_d J_d^{+} e_d(t) \quad (37)$
where J_p^+ is the pseudo-inverse of the control point feature interaction matrix J_p, and J_α^+ and J_d^+ are, respectively, the pseudo-inverse of the line feature interaction matrix and the pseudo-inverse of the between-points distance feature interaction matrix;
take the first 3 columns of the matrix J_p in equation (35) to form the matrix J_pv and the last 3 columns to form the matrix J_pω; the compensation amount of the curve rotation compensation module to the position control is then:
[Equation (38) appears as an image in the original.]
where J_pv^+ is the pseudo-inverse of the matrix J_pv;
taking the first 2 rows of the compensated position solution (the quantities appear as images in the original) as the X- and Y-axis translational control, the position control amount of the robot end motion is:
[Equation (39) appears as an image in the original.]
total joint control required to perform servo tasks
Figure FDA00030300308600000717
Comprises the following steps:
Figure FDA0003030030860000081
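Putting the pieces together, the sketch below gives one reading of the final combination (the equation images for this step are not reproduced in the text, so the row selection is an interpretation consistent with the separated scheme): in-plane translation from the compensated point-feature law, depth translation from the distance-feature law, and rotation from the angle-feature law.

```python
import numpy as np

def total_control(u_cp, v_cd, w_c, v_comp):
    """Assemble the total control quantity u_c: x-y translation from
    the compensated point-feature law, depth translation from the
    distance-feature law, rotation from the angle-feature law."""
    v_xy = u_cp[:2] + v_comp[:2]                         # first 2 rows of the compensated translation
    v_ct = np.concatenate([v_xy, np.atleast_1d(v_cd)])   # position control quantity
    return np.concatenate([v_ct, np.atleast_1d(w_c)])    # total control quantity u_c
```

Under this reading, the rotation command enters the position channel only through the compensation term, so rotating the camera no longer drags the control points away from their image targets, which is the stated purpose of the rotation compensation module.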
CN202110427272.XA 2021-04-21 2021-04-21 Separated visual servo control method based on composite characteristics Active CN113211433B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110427272.XA CN113211433B (en) 2021-04-21 2021-04-21 Separated visual servo control method based on composite characteristics

Publications (2)

Publication Number Publication Date
CN113211433A 2021-08-06
CN113211433B 2022-09-20

Family

ID=77088106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110427272.XA Active CN113211433B (en) 2021-04-21 2021-04-21 Separated visual servo control method based on composite characteristics

Country Status (1)

Country Link
CN (1) CN113211433B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5317682A (en) * 1989-10-24 1994-05-31 International Business Machines Corporation Parametric curve evaluation method and apparatus for a computer graphics display system
JP2001154719A (en) * 1999-11-30 2001-06-08 Yaskawa Electric Corp Method for interpolating free curves
CN101926678A (en) * 2005-09-30 2010-12-29 Automated systems and methods for harvesting and implanting follicular units
CN102794767A (en) * 2012-08-31 2012-11-28 Vision-guided B-spline trajectory planning method for robot joint space
CN105773620A (en) * 2016-04-26 2016-07-20 Trajectory planning and control method for free curves of industrial robots based on dual quaternions
CN106271281A (en) * 2016-09-27 2017-01-04 Automatic welding system and method with path generation for complex special-shaped workpieces
CN106424877A (en) * 2016-10-15 2017-02-22 Method for generating milling tool paths for a novel robot
CN107901041A (en) * 2017-12-15 2018-04-13 Robot visual servo control method based on hybrid image moments
CN108621167A (en) * 2018-07-23 2018-10-09 Visual servo decoupling control method based on contour edge and inscribed circle features
CN108927807A (en) * 2018-08-14 2018-12-04 Robot visual control method based on point features
CN109048911A (en) * 2018-08-31 2018-12-21 Robot visual control method based on rectangular features
CN109551307A (en) * 2018-11-22 2019-04-02 Method for zoned robotic grinding and polishing of blades by online replacement of the grinding head
CN111366070A (en) * 2018-12-25 2020-07-03 Multi-axis space coordinate system calibration method for a combined line-laser measurement system
CN110136169A (en) * 2019-04-26 2019-08-16 Marker-free planar flexible body deformation tracking method based on NURBS

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YAO ZHANG, NA WANG, HAIXIA WANG, et al.: "Curve Control Points-based Feature Extraction for Visual Servo with rotational pose compensation", IEEE Xplore *
JIAN Jie (简杰): "Research on servo motor control and motion control *** of industrial robots", China Master's Theses Full-text Database *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113985255A (en) * 2021-10-29 2022-01-28 北京航星科技有限公司 Circuit board static test system and test method
CN116339141A (en) * 2023-03-10 2023-06-27 山东科技大学 Mechanical arm global fixed time track tracking sliding mode control method
CN116339141B (en) * 2023-03-10 2023-10-03 山东科技大学 Mechanical arm global fixed time track tracking sliding mode control method

Also Published As

Publication number Publication date
CN113211433B (en) 2022-09-20

Similar Documents

Publication Publication Date Title
CN110116407B (en) Flexible robot position and posture measuring method and device
Chen et al. Adaptive homography-based visual servo tracking for a fixed camera configuration with a camera-in-hand extension
CN113211433B (en) Separated visual servo control method based on composite characteristics
CN108408080A Aircraft wing-body butt-joint assembly device, method and system
CN108621167B Visual servo decoupling control method based on contour edge and inscribed circle features
Caron et al. Multiple camera types simultaneous stereo calibration
CN110775288B (en) Bionic-based flight mechanical neck eye system and control method
KR20210118414A (en) Environment mapping using the state of the robot device
CN113744340A (en) Calibrating cameras with non-central camera models of axial viewpoint offset and computing point projections
CN112000135B Three-axis gimbal visual servo control method based on feedback of the maximum-temperature-point feature of the human face
CN112184812A (en) Method for improving identification and positioning precision of unmanned aerial vehicle camera to Apriltag, positioning method and positioning system
Fang et al. Self-supervised camera self-calibration from video
CN110928311B (en) Indoor mobile robot navigation method based on linear features under panoramic camera
CN117218210A (en) Binocular active vision semi-dense depth estimation method based on bionic eyes
CN113240597B (en) Three-dimensional software image stabilizing method based on visual inertial information fusion
Kim et al. Robust extrinsic calibration for arbitrarily configured dual 3D LiDARs using a single planar board
CN112945233A (en) Global drift-free autonomous robot simultaneous positioning and map building method
Allotta et al. On the use of linear camera-object interaction models in visual servoing
Fuchs et al. Advanced 3-D trailer pose estimation for articulated vehicles
CN115446836A (en) Visual servo method based on mixing of multiple image characteristic information
CN109542094B (en) Mobile robot vision stabilization control without desired images
CN114820984A (en) Three-dimensional reconstruction method and system based on laser radar
CN113379840A (en) Monocular vision pose estimation method based on coplanar target
Alkhalil et al. Stereo visual servoing with decoupling control
Barajas et al. Visual servoing of uav using cuboid model with simultaneous tracking of multiple planar faces

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant