CN110722533A - External parameter calibration-free visual servo tracking of wheeled mobile robot - Google Patents

External parameter calibration-free visual servo tracking of wheeled mobile robot

Info

Publication number
CN110722533A
CN110722533A (application CN201810787728.1A)
Authority
CN
China
Prior art keywords
robot
camera
parameters
track
mobile robot
Prior art date
Legal status
Granted
Application number
CN201810787728.1A
Other languages
Chinese (zh)
Other versions
CN110722533B (en)
Inventor
李宝全
邱雨
安晨亮
Current Assignee
Tianjin Polytechnic University
Original Assignee
Tianjin Polytechnic University
Priority date
Filing date
Publication date
Application filed by Tianjin Polytechnic University
Priority to CN201810787728.1A
Publication of CN110722533A
Application granted
Publication of CN110722533B
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00: Manipulators mounted on wheels or on carriages
    • B25J5/007: Manipulators mounted on wheels or on carriages mounted on wheels
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1605: Simulation of manipulator lay-out, design, modelling of manipulator
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664: Programme controls characterised by motion, path, trajectory planning
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A visual servo trajectory tracking method is designed for the case in which the translational external parameters of the camera are unknown. The relative pose between the current pose and the desired trajectory is obtained by decomposing a homography matrix, trajectory tracking errors are defined from this relative pose, an open-loop error equation is derived, and an adaptive visual servo trajectory tracking control law is designed. Lyapunov techniques and Barbalat's lemma are used to prove rigorously that the tracking errors converge asymptotically to zero. Simulations and comparative experiments demonstrate that, even when the translational camera parameters are unknown, the proposed strategy drives the robot to track the desired trajectory efficiently.

Description

External parameter calibration-free visual servo tracking of wheeled mobile robot
Technical Field
The invention belongs to the technical field of computer vision and mobile robots, and particularly relates to an external parameter calibration-free visual servo tracking control method for wheeled mobile robots.
Background
Visual sensors are widely applied to intelligent agents such as wheeled mobile robots, offering large information content, low cost, and high reliability. Over the past decades, robotic systems have had a tremendous socio-economic impact on human life, for example in social life, industrial engineering, and bioengineering. Combining a mobile robot with a visual sensor enhances its perception of the external environment and enables the robot to perform difficult tasks. The similarity to human visual perception, together with the ability to measure the environment without contact, makes visual sensors a very useful component of various types of robots. A robot visual servo system integrates visual data into the robot controller, improving control performance and making the robot more flexible. In general, visual servoing frameworks can be divided into three categories: position-based visual servoing, image-based visual servoing, and hybrid visual servoing. Many studies address parameter estimation for uncertain system parameters as well as environmental disturbances; for example, Zhao et al. solve the blurred-image problem.
In classical visual servoing, calibrating the intrinsic and extrinsic parameters of the vision system has been a tedious yet necessary process. In a typical visual servoing framework for a mobile robot system, the camera must be placed at a suitable position on the robot to obtain a good field of view, so extrinsic parameters exist between the camera and the robot. Furthermore, inevitable mounting errors when placing the camera on the robot platform introduce unknown external parameters. It is therefore of great interest, and also challenging, to handle uncalibrated external camera-to-robot parameters in the visual servo tracking problem of a wheeled mobile robot.
Visual servoing strategies have been widely used for robotic manipulators, which typically must deal with unknown factors in the environment and the control system. Wang et al. design an adaptive rule to estimate the curve parameters of an object online in a vision-based robotic manipulation system. In [19], the authors use active lighting control on robotic manipulators to guarantee the lighting quality of the environment. In [20], a robust adaptive uncalibrated visual servo controller is designed for the asymptotic regulation of a robotic end-effector. Reference [21] proposes a stereo visual-inertial odometry algorithm with an ensemble of Kalman filters. Wang et al. design a new mechanism for a silicone-rubber soft manipulator using Cosserat rod theory and the Kelvin model. Reference [24] proposes a new strategy for transporting multiple biological cells, which includes an automatic online calibration algorithm for identifying system parameters. Compared with manipulators, because of nonholonomic constraints and underactuation, unknown parameters affect wheeled mobile robots more severely and bring great difficulty to controller design and stability analysis.
For the visual regulation of wheeled mobile robots, nonholonomic constraints and visual uncertainties should be considered, as in the work of [26]. Li et al. use a concurrent-learning strategy and homography decomposition techniques to regulate the mobile robot to the desired pose. In [28], an adaptive visual servoing strategy is designed to drive a mobile robot with natural behavior. There is also some uncalibrated work related to mobile robot visual regulation, such as [29]. In [30], a novel two-stage controller is elaborately designed using an adaptive controller and a backstepping method to complete the robot regulation task under unknown external camera-to-robot parameters and unknown depth.
In addition to visual servo regulation, visual servo tracking control, another branch of visual servoing, is becoming increasingly important. Compared with pose regulation control of mobile robots, trajectory tracking schemes can be combined with motion planning and multi-system constraints, making them more suitable for complex tasks. For example, Chen et al. successfully completed the trajectory tracking task of a vision-equipped mobile robot in [32]. Some work has addressed the difficulties caused by unknown parameters in robot visual servo tracking tasks. In [33], a depth-independent image Jacobian framework is developed for a robot tracking control scheme without position and velocity measurements. In [34] and [35], adaptive controllers are designed to compensate for unknown camera-system and mechanical parameters, respectively, in the visual tracking process. Liang et al. design an adaptive trajectory tracking controller for a wheeled mobile robot in an environment where the camera parameters and the exact feature positions are unknown. Unfortunately, existing methods rarely take uncertain camera-to-robot parameters in the system model into account for visual servo tracking tasks.
Inspired by the trajectory tracking method in [37], a new adaptive tracking controller is proposed herein for a wheeled mobile robot with uncalibrated translational camera-to-robot parameters. After the kinematic model of the robot system is analyzed and homography decomposition is applied, measurable signals are obtained to construct the system tracking errors, and the open-loop dynamics are derived. Subsequently, a motion controller is developed for the trajectory tracking objective under nonholonomic constraints, while an adaptive update law is carefully designed to compensate for the unknown translation parameters. Stability is rigorously analyzed using Lyapunov techniques and the extended Barbalat's lemma. Simulation and comparative experimental results are collected to test the effectiveness of the proposed strategy. The main contribution is that the wheeled mobile robot successfully completes the visual servo tracking task even though the translational external camera parameters and the scene depth are unknown.
Disclosure of Invention
The invention aims to provide a visual servo tracking method for wheeled mobile robots that requires no calibration of the external parameters.
A new adaptive tracking controller is presented for a wheeled mobile robot with uncalibrated translational camera-to-robot parameters. After the kinematic model of the robot system is analyzed and homography decomposition is applied, measurable signals are obtained to construct the system tracking errors, and the open-loop dynamics are derived. Subsequently, a motion controller is developed for the trajectory tracking objective under nonholonomic constraints, while an adaptive update law is carefully designed to compensate for the unknown translation parameters. Stability is rigorously analyzed using Lyapunov techniques and the extended Barbalat's lemma. Simulation and comparative experimental results are collected to test the effectiveness of the proposed strategy. The main contribution is that the wheeled mobile robot successfully completes the visual servo tracking task even though the translational external camera parameters and the scene depth are unknown.
The external parameter calibration-free visual servo tracking of a wheeled mobile robot is characterized by comprising the following steps:
1st, system and kinematic model
1.1, System description
Fig. 1 is a schematic diagram of the system, into which the translational external parameters of the camera relative to the robot are introduced, unlike the conventional assumption that a stationary monocular camera sits directly above the center of the wheeled mobile robot. The current camera frame is denoted by F_c, where the z_c axis is defined along the optical axis. Frame F_r denotes the current frame of the wheeled mobile robot, where the z_r axis points to the front of the robot. Because translational camera-to-robot parameters exist in the system, ^rT_cz and ^rT_cx denote the translation parameters of F_c along the z and x axes, respectively. F_cd and F_rd denote the frames of the camera and the robot on the desired trajectory, respectively. Furthermore, for pose comparison, F_c* and F_r* denote the frames of the camera and the robot at the reference pose, respectively. θ_c(t) and θ_d(t) denote the rotation angles of F_c and F_cd relative to F_c*; it is easy to see that θ_c(t) and θ_d(t) equal the rotation angles of F_r and F_rd relative to F_r*, respectively.
1.2, kinematic model
The origin of F_c* is regarded as a feature point, and its coordinates in the robot coordinate frame F_r are defined as \bar{P} = [\bar{x}, σ, \bar{z}]^T, where σ corresponds to the height difference between the camera and the robot. \bar{P} satisfies the following relation with the motion velocity of the robot:
\dot{\bar{P}} = -v - w \times \bar{P}    (1)
where v and w are the linear velocity of F_r and its angular velocity about itself:
v := [0, 0, v_r]^T,  w := [0, -w_r, 0]^T    (2)
where v_r(t) and w_r(t) are the linear and angular velocities of the mobile robot, respectively. Substituting (2) into (1) yields the kinematic equations of \bar{P} in the z and x directions:
\dot{\bar{z}} = -v_r - w_r \bar{x},  \dot{\bar{x}} = w_r \bar{z}    (3)
For convenience, the unknown translational extrinsic parameters are defined as ^rT_cx := a and ^rT_cz := b, and the coordinates of the origin of F_c* in the camera coordinate frame are defined as ^cP = [^cx, ^cy, ^cz]^T. According to the coordinate transformation rule, ^cP and \bar{P} satisfy the relationship:
\bar{P} = {}^r_cR \, {}^cP + {}^rT_c    (4)
where ^r_cR and ^rT_c denote the rotation matrix and the translation vector of F_c with respect to F_r. Since only translational extrinsic parameters are considered, they take the form:
^r_cR = I_3,  ^rT_c = [a, σ, b]^T    (5)
Next, from (4) one obtains:
^cx = \bar{x} - a,  ^cz = \bar{z} - b    (6)
Substituting (6) into (3) yields the kinematic equations of the origin of F_c* in F_c:
\dot{{}^cz} = -v_r - w_r({}^cx + a),  \dot{{}^cx} = w_r({}^cz + b)    (7)
Similarly, the kinematic equations of the origin of F_c* in the desired camera coordinate frame F_cd are obtained:
\dot{{}^dT_{*z}} = -v_{rd} - w_{rd}({}^dT_{*x} + a),  \dot{{}^dT_{*x}} = w_{rd}({}^dT_{*z} + b)    (8)
where ^dT_{*z}(t) and ^dT_{*x}(t) denote the z and x coordinates of the origin of F_c* in F_cd, and v_rd(t) and w_rd(t) denote the linear and angular velocities of the robot on the desired trajectory, respectively.
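As a sanity check on the moving-frame kinematics above, the following minimal Python sketch (an illustration written for this description, not code from the patent; all variable names are hypothetical) integrates (3) for a static feature point expressed in the moving robot frame and compares the result against direct world-frame geometry.

```python
import numpy as np

dt, steps = 1e-3, 5000
vr, wr = 0.3, 0.2                      # robot linear / angular velocity

# World-frame robot pose (X, Z, theta); a static feature point at the origin.
X, Z, th = 1.0, 2.0, 0.4
Pw = np.zeros(2)

def in_robot_frame(X, Z, th):
    """Planar coordinates (x_bar, z_bar) of the static point in frame F_r."""
    ux = np.array([np.cos(th), -np.sin(th)])   # robot x-axis in the world
    uz = np.array([np.sin(th),  np.cos(th)])   # robot z-axis (forward)
    d = Pw - np.array([X, Z])
    return np.array([ux @ d, uz @ d])

p = in_robot_frame(X, Z, th)
for _ in range(steps):
    # moving-frame kinematics (3): x_bar' = wr*z_bar, z_bar' = -vr - wr*x_bar
    p += dt * np.array([wr * p[1], -vr - wr * p[0]])
    # ground-truth world-frame update of the robot pose
    X += dt * vr * np.sin(th)
    Z += dt * vr * np.cos(th)
    th -= dt * wr                      # w = [0, -wr, 0]^T gives theta_dot = -wr

print(np.allclose(p, in_robot_frame(X, Z, th), atol=1e-2))  # True
```

The same integration carried out in the frame F_cd with (v_rd, w_rd) reproduces (8).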
2nd, control development
2.1, open-loop error equation
Assuming several coplanar feature points exist in the scene, estimation and decomposition of the homography matrix between the reference image and the image on the desired trajectory yield, up to scale, the pose of F_cd relative to F_c*: ^dT*_{z/d*}(t), ^dT*_{x/d*}(t), θ_d(t). Likewise, from the reference image and the current image, the scaled pose of F_c relative to F_c* is obtained: ^cT*_{z/d*}(t), ^cT*_{x/d*}(t), θ_c(t).
The trajectory tracking errors are defined as
e_1 := {}^cT^*_{z/d^*} - {}^dT^*_{z/d^*},  e_2 := {}^cT^*_{x/d^*} - {}^dT^*_{x/d^*},  e_3 := θ_c - θ_d    (9)
It is easy to see that when e_1, e_2, e_3 converge to 0, the robot tracks the desired trajectory. Differentiating both sides of (9) with respect to time and substituting (7) and (8) yields the open-loop error equation (10), in which the angular kinematic equations of the robot relating \dot{θ}_c, \dot{θ}_d to w_r, w_rd are used. In addition, \dot{{}^dT^*_{z/d^*}}(t) and \dot{{}^dT^*_{x/d^*}}(t) can be obtained by forward-backward time differencing.
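The scaled pose signals above come from standard homography estimation and decomposition between image pairs. Below is a minimal Python sketch with OpenCV on synthetic data (the patent's implementation uses OpenCV from VC++; the numbers and point layout here are illustrative assumptions, not the patent's):

```python
import cv2
import numpy as np

K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])   # assumed intrinsics
# Four coplanar feature points on the plane z = 1 in the reference camera
# frame, so the plane distance d* = 1 and the scaled translation t/d* = t.
P = np.array([[-0.2, -0.1, 1.0], [0.2, -0.1, 1.0],
              [0.2, 0.1, 1.0], [-0.2, 0.1, 1.0]])

def project(P, R, t):
    p = (K @ (R @ P.T + t[:, None])).T
    return (p[:, :2] / p[:, 2:]).astype(np.float32)

th = 0.1                               # relative yaw between the two views
R = np.array([[np.cos(th), 0, np.sin(th)],
              [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([0.05, 0.0, 0.02])

pts_ref = project(P, np.eye(3), np.zeros(3))   # reference image
pts_cur = project(P, R, t)                     # current (or desired) image

H, _ = cv2.findHomography(pts_ref, pts_cur)
n_sol, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
# One of the returned candidates is the true (R, t/d*); in practice the valid
# one is selected by requiring positive depth for all feature points.
print(any(np.allclose(Rs[i], R, atol=1e-4) and
          np.allclose(ts[i].ravel(), t, atol=1e-4) for i in range(n_sol)))
```

From the selected candidate, θ_c (or θ_d) is the yaw angle of R, and the scaled translation t/d* supplies the ^cT*_{z/d*}, ^cT*_{x/d*} (or ^dT*) signals used in (9); the desired-signal derivatives needed in (10) can then be approximated by finite differences of consecutive samples.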
2.2, adaptive controller design
First, the depth estimation error \tilde{d}^*(t) is defined as:
\tilde{d}^* := d^* - \hat{d}^*    (11)
where \hat{d}^*(t) is the depth estimate. For one of the translational extrinsic parameters, an auxiliary parameter ρ ∈ R^+ is defined from b and the reference depth in (12), and its estimation error is
\tilde{ρ} := ρ - \hat{ρ}    (13)
where \hat{ρ}(t) is the estimate of ρ. In addition, the estimation error of the other extrinsic parameter is
\tilde{a} := a - \hat{a}    (14)
where \hat{a}(t) is the estimate of a.
Using the adaptive control framework, the update laws for the unknown parameter estimates \hat{d}^*(t), \hat{ρ}(t), \hat{a}(t) are designed as in (15), with adaptation gains Γ_1, Γ_2, Γ_3 ∈ R^+.
To accomplish the trajectory tracking task, the motion controller (v_r, w_r) of the mobile robot is designed as in (16), where k_v, k_w ∈ R^+ are control gains and χ(t) ∈ R is an auxiliary signal defined in (17).
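Since the closed-form expressions (15)-(17) survive only as images in this text, the following Python sketch shows just the generic shape of such an adaptive tracking loop: the measured errors drive both the velocity commands and gradient-type parameter updates. Every function body below is a hypothetical placeholder standing in for the patent's actual laws, not a reproduction of them.

```python
import numpy as np

kv, kw = 0.4, 0.1                    # control gains k_v, k_w in R+
Gam = np.array([8.0, 2.0, 8.0])      # adaptation gains Gamma_1..Gamma_3 in R+

def adaptive_step(e, vel_d, est, dt):
    """One generic step: est = [d*_hat, rho_hat, a_hat] (placeholder laws)."""
    e1, e2, e3 = e
    vrd, wrd = vel_d
    chi = 0.0                        # stand-in for the auxiliary signal (17)
    vr = vrd - kv * e1               # stand-in feedback terms, cf. (16)
    wr = wrd - kw * (e3 + chi)
    est = est + dt * Gam * np.array([e1, e2, e3])   # gradient-type, cf. (15)
    return vr, wr, est

est = np.array([0.5, 0.0, 0.0])      # initial estimates d*_hat, rho_hat, a_hat
vr, wr, est = adaptive_step((0.1, -0.05, 0.02), (0.2, 0.1), est, 0.01)
```

The structural point is that the estimates enter the commanded velocities while being updated by error-driven terms scaled by Γ_i, which is what allows the Lyapunov analysis below to cancel the unknown-parameter terms.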
3rd, stability analysis
Theorem 1: The control input (16) and the parameter update laws (15) drive the wheeled mobile robot to track the desired trajectory under unknown translational external parameters, in the sense that
\lim_{t\to\infty} e_1(t) = 0,  \lim_{t\to\infty} e_2(t) = 0,  \lim_{t\to\infty} e_3(t) = 0    (18)
where the desired trajectory is assumed to satisfy the boundedness condition (19) on the desired velocities and their time derivatives.
Proof: Choose a non-negative Lyapunov function V(t) as in (20). Taking the time derivative of V(t) and substituting the open-loop dynamics (10) and the designed controller (16) gives the relation (21). Substituting the update laws (15) into (21) and cancelling common terms yields:
\dot{V} = -k_v e_1^2 - k_w (e_3 + \chi)^2    (22)
From (20) and (22), e_1(t), e_2(t), e_3(t), \tilde{d}^*(t), \tilde{ρ}(t), \tilde{a}(t) ∈ L_∞. From (11), \hat{d}^*(t) ∈ L_∞; then, from (12), (13), (14), \hat{ρ}(t), \hat{a}(t) ∈ L_∞. Combining the error definitions (9) with the assumptions, ^cT*_{z/d*}(t), ^cT*_{x/d*}(t) ∈ L_∞, and from (17), χ(t) ∈ L_∞. Since v_rd(t), w_rd(t) ∈ L_∞ by assumption, (16) gives v_r(t), w_r(t) ∈ L_∞. Based on the above analysis, (10) further gives \dot{e}_1(t), \dot{e}_2(t), \dot{e}_3(t) ∈ L_∞; (15) gives \dot{\hat{d}}^*(t), \dot{\hat{ρ}}(t), \dot{\hat{a}}(t) ∈ L_∞; and the parameter definitions (12), (13), (14) give \dot{\tilde{d}}^*(t), \dot{\tilde{ρ}}(t), \dot{\tilde{a}}(t) ∈ L_∞. From the left half of (10), \dot{χ}(t) ∈ L_∞.
To facilitate the following analysis, define f(t) := k_v e_1^2 + k_w (e_3 + \chi)^2 ≥ 0. Differentiating both sides with respect to time and considering (17) shows that \dot{f}(t) ∈ L_∞ (23), so f(t) is uniformly continuous; since (22) implies that f(t) is integrable, the corollary of Barbalat's lemma gives \lim_{t\to\infty} f(t) = 0 (25), from which
\lim_{t\to\infty} e_1(t) = 0,  \lim_{t\to\infty} (e_3(t) + \chi(t)) = 0    (26)
Substituting the control law (16) into the \dot{e}_1 term of the open-loop error equation (10) gives (27); bringing (26) into (27) together with (15) gives (28).
To treat the \dot{e}_2 part by the corollary of Barbalat's lemma, the control law (16) is substituted into the \dot{e}_2 term of the open-loop error equation (10) to obtain the closed-loop equation (29), in which an auxiliary symbol defined in (30) appears. Differentiating this auxiliary signal with respect to time, and differentiating the angular velocity w_r(t) using (16) and (17), yields (31). Since the desired velocities and their derivatives are bounded by assumption, (31) shows \dot{w}_r(t) ∈ L_∞; further, by assumption and (30), the right-hand side of (29) is uniformly continuous. Considering the convergence established above, applying the extended Barbalat's lemma to equation (29) yields (32). Since the desired trajectory satisfies the assumption, the results of (10) and (32) show that
\lim_{t\to\infty} e_2(t) = 0    (33)
On the other hand, differentiating the (e_3 + χ) part yields (34); from (28), the signals involved are bounded. To prove that the right-hand side of (34) is uniformly continuous, the closed-loop equation (29) for \dot{e}_2 is differentiated with respect to time to obtain (36); since the desired trajectory signals are bounded by assumption, (36) shows that \ddot{e}_2(t) ∈ L_∞. Taking the second time derivative of the error definition of e_1(t) in (9) gives (37), from which \ddot{e}_1(t) ∈ L_∞; further, from (35), the derivative of (e_3 + χ) is uniformly continuous.
In view of the above analysis, the extended Barbalat's lemma can be applied to (34) to obtain (38). From (33) and (38), (39) follows; substituting (39) into the definition (17) of χ(t) shows that \lim_{t\to\infty} χ(t) = 0 (40), and bringing (40) into (26) gives
\lim_{t\to\infty} e_3(t) = 0    (41)
In summary, the system errors e_1(t), e_2(t), e_3(t) all converge to zero asymptotically, which indicates that the mobile robot tracks the desired trajectory.
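The two lemmas invoked in the proof are standard; for reference, a textbook-form statement (not reproduced from the patent images) is:

```latex
\textbf{Barbalat's lemma.} If $f:\mathbb{R}_{\ge 0}\to\mathbb{R}$ is uniformly
continuous and $\lim_{t\to\infty}\int_{0}^{t} f(\tau)\,\mathrm{d}\tau$ exists
and is finite, then $\lim_{t\to\infty} f(t)=0$.

\textbf{Corollary (as used for $f(t)=k_v e_1^2+k_w(e_3+\chi)^2$).} If
$f(t)\ge 0$, $f,\dot f\in\mathcal{L}_\infty$, and
$\int_{0}^{\infty} f(\tau)\,\mathrm{d}\tau<\infty$ (here guaranteed by
$\dot V=-f$ with $V\ge 0$ bounded), then $f(t)\to 0$ as $t\to\infty$, hence
$e_1\to 0$ and $e_3+\chi\to 0$.
```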
Description of the drawings:
FIG. 1 is the coordinate system definition in the external-parameter calibration-free visual tracking control.
FIG. 2 is the system block diagram.
FIG. 3 is a simulation result: desired and current motion paths of the wheeled mobile robot in the reference frame.
FIG. 4 is a simulation result: image trajectories of the feature points.
FIG. 5 is a simulation result: evolution of the system errors.
FIG. 6 is a simulation result: velocities of the mobile robot.
FIG. 7 shows the mobile robot platform, the feature points used in the experiments, and the camera with unknown translational extrinsic parameters.
FIG. 8 is an experimental result: motion paths of the current and desired camera frames.
FIG. 9 is an experimental result: image trajectories of the feature points (dotted line: desired trajectory; solid line: current trajectory).
FIG. 10 is an experimental result: evolution of the trajectory tracking errors (dashed line: zero value).
FIG. 11 is an experimental result: evolution of the robot velocities.
FIG. 12 is an experimental result: motion paths of the current and desired camera frames.
FIG. 13 is an experimental result: image trajectories of the feature points.
FIG. 14 is an experimental result: evolution of the tracking errors.
FIG. 15 is an experimental result: evolution of the robot velocities.
The specific implementation mode is as follows: the system and kinematic model (step 1), the control development (step 2), and the stability analysis (step 3) are implemented exactly as described in Sections 1 to 3 above; the simulation and experimental validation follow.
4th, simulation results and conclusions
4.1, simulation results
In this section, simulation results are collected to verify the feasibility of the proposed strategy. Four coplanar feature points are arbitrarily arranged for the simulation, and the intrinsic parameters of the virtual camera are set as follows:
[virtual camera intrinsic parameter matrix]
The control parameters are selected as k_v = 0.2, k_w = 0.1, Γ_1 = 4, Γ_2 = 0.1, Γ_3 = 8. The external camera-to-robot parameters are set to a = 0.05 m and b = 0.08 m.
The motion paths of the wheeled mobile robot in frame F_r* are shown in FIG. 3, where the thicker line is the desired path and the thinner line is the current path, indicating that the robot successfully tracks the desired trajectory. In FIG. 4, the current image trajectories of the feature points effectively track their desired image trajectories. As can be seen from FIG. 5, all system errors converge to the desired value of zero. FIG. 6 shows the current velocities v_r(t) and w_r(t) of the robot, each of which agrees well with the desired robot velocities v_rd(t) and w_rd(t).
4.2, experimental results
This section further verifies the scheme through real experiments implemented on the mobile robot shown in FIG. 7. Each feature point is identified by the common vertex of two squares on a planar panel in the scene, also shown in FIG. 7. The digital camera is moved from roughly the midpoint of the wheel axis to a front-right position.
A reference image and a desired image sequence are prepared, and the desired trajectory is set to a serpentine curve whose final velocity becomes zero. The desired trajectory of F_cd relative to F_c* starts at (-2.0 m, 0.8 m, 11.0 deg). The algorithm is implemented in a VC++ environment using the OpenCV library. The true values of the external parameters are a = 0.05 m and b = 0.06 m. The off-line calibrated intrinsic parameters of the camera are:
[camera intrinsic parameter matrix]
Experiment 1 (validation of the proposed strategy): The initial pose of the camera frame F_c relative to the reference frame F_c* is roughly estimated as (-2.4 m, 0.3 m, 14.6 deg). The estimated parameters \hat{d}^*(0), \hat{ρ}(0), \hat{a}(0) are given initial values, and the adaptive control parameters are adjusted to:
k_v = 0.4, k_w = 0.1, Γ_1 = 8, Γ_2 = 2, Γ_3 = 8    (43)
Because of the unknown external parameters between the robot and the onboard camera, the motion path of the mobile robot system is represented by the motion path of the camera in frame F_c*. FIG. 8 shows that the current trajectory of the camera moves along the desired camera trajectory, meaning that the mobile robot successfully tracks its desired trajectory. The trajectories of the feature points are shown in FIG. 9, where the star points denote the final positions of the trajectories; the current image trajectories coincide with those of the desired image sequence. FIG. 10 shows the evolution of the tracking errors, all of which converge well. FIG. 11 shows the linear and angular velocities of the mobile robot, where the dotted lines are the velocities of the desired trajectory and the solid lines are the current robot velocities. This experiment shows that the robot can still successfully track the desired trajectory even though the camera-to-robot translation parameters are unknown.
Experiment 2 (comparison with a classical method): to further validate the proposed strategy, an experiment was performed, namely [37]]To perform the comparison. The camera configuration and required trajectory settings were the same as in experiment 1. Adjusting the control parameter to kv-0.4; kw is 0.1; gamma-shaped1=30.
Figure BSA0000167319490000192
Is at an initial value of
Figure BSA0000167319490000193
(0)=0.1m。
Fig. 12 shows the motion paths of the current and desired camera frames, and it can be seen that there is some trajectory tracking error in this process. Fig. 13 shows how the current image features track the desired image. The evolution of the systematic error is shown in fig. 14. Fig. 15 shows the speed of the robot. Comparing the results shows that the proposed method has a better performance when translational camera-to-robot parameters are present in the system.
4.3, conclusion
A new visual servo tracking scheme is proposed for a wheeled mobile robot whose translational camera-to-robot parameters are unknown. Using a homography-based algorithm, scaled relative poses are obtained and used to construct the trajectory tracking errors. Although the camera-to-robot translation parameters and the scene depth are uncalibrated, the robot can still successfully track the desired trajectory using the proposed adaptive controller, which is carefully developed to compensate for the unknown parameters and to drive the mobile robot under nonholonomic constraints. System stability is rigorously analyzed using Lyapunov techniques. Simulation and comparison experiment results demonstrate the effectiveness of the strategy.
References
[1] F. Chaumette and S. Hutchinson, "Visual servo control Part II: advanced approaches," IEEE Robot. Autom. Mag., vol. 14, no. 2, pp. 109-118, Mar. 2007.
[2] J. Su and W. Xie, "Motion planning and coordination for robot systems based on representation space," IEEE Trans. Syst. Man Cybern. Part B-Cybern., vol. 41, no. 1, pp. 248-259, Feb. 2011.
[3] X. Zhang, R. Wang, Y. Fang, B. Li, and B. Ma, "Acceleration-level pseudo-dynamic visual servoing of mobile robots with backstepping and dynamic surface control," IEEE Trans. Syst. Man Cybern.: Syst., online published, DOI: 10.1109/TSMC.2017.2777897.
[4] G. Hu, W. P. Tay, and Y. Wen, "Cloud robotics: architecture, challenges and applications," IEEE Netw., vol. 26, no. 3, pp. 21-28, May 2012.
[5] N. Sun, Y. Wu, Y. Fang, and H. Chen, "Nonlinear antiswing control for crane systems with double-pendulum swing effects and uncertain parameters: design and experiments," IEEE Trans. Autom. Sci. Eng., DOI: 10.1109/TASE.2017.2723539.
[6] H. Chen and D. Sun, "Moving groups of microparticles into array with a robot-tweezers manipulation system," IEEE Trans. Robot., vol. 28, no. 5, pp. 1069-1080, Oct. 2012.
[7] Z. Ma and J. Su, "Robust uncalibrated visual servoing control based on disturbance observer," ISA Transactions, vol. 59, pp. 193-204, Nov. 2015.
[8] N. Sun, Y. Fang, H. Chen, Y. Fu, and B. Lu, "Nonlinear stabilizing control for ship-mounted cranes with ship roll and heave movements: design, analysis, and experiments," IEEE Trans. Syst. Man Cybern.: Syst., online published, DOI: 10.1109/TSMC.2017.2700393.
[9] S. Hutchinson, G. D. Hager, and P. I. Corke, "A tutorial on visual servo control," IEEE Trans. Robot. Autom., vol. 12, no. 5, pp. 651-670, Oct. 1996.
[10] F. Janabi-Sharifi, L. Deng, and W. J. Wilson, "Comparison of basic visual servoing methods," IEEE/ASME Trans. Mechatronics, vol. 16, no. 5, pp. 967-983, Oct. 2011.
[11] F. Ke, Z. Li, H. Xiao, and X. Zhang, "Visual servoing of constrained mobile robots based on model predictive control," IEEE Trans. Syst. Man Cybern.: Syst., vol. 47, no. 7, pp. 1428-1438, 2017.
[12] A. Hajiloo, M. Keshmiri, W.-F. Xie, and T.-T. Wang, "Robust online model predictive control for a constrained image-based visual servoing," IEEE Trans. Ind. Electron., vol. 63, no. 4, pp. 2242-2250, Apr. 2016.
[13] D. Chwa, A. P. Dani, and W. E. Dixon, "Range and motion estimation of a monocular camera using static and moving objects," IEEE Trans. Control Syst. Technol., vol. 24, no. 4, pp. 1174-1183, Jul. 2016.
[14] A. P. Dani, N. R. Fischer, and W. E. Dixon, "Single camera structure and motion," IEEE Trans. Autom. Control, vol. 57, no. 1, pp. 241-246, Jan. 2012.
[15] M. Liu, C. Pradalier, and R. Siegwart, "Visual homing from scale with an uncalibrated omnidirectional camera," IEEE Trans. Robot., vol. 29, no. 6, pp. 1353-1365, Dec. 2013.
[16] H. Zhao, Y. Liu, X. Xie, Y. Liao, and X. Liu, "Filtering based adaptive visual odometry sensor framework robust to blurred images," Sensors, vol. 16, no. 7, p. 1040, Jul. 2016.
[17] H. Wang, Y.-H. Liu, and W. Chen, "Uncalibrated visual tracking control without visual velocity," IEEE Trans. Control Syst. Technol., vol. 18, no. 6, pp. 1359-1370, Nov. 2010.
[18] H. Wang, B. Yang, J. Wang, W. Chen, X. Liang, and Y. Liu, "Adaptive visual servoing of contour features," IEEE/ASME Trans. Mechatronics, vol. 23, no. 2, pp. 811-822, Apr. 2018.
[19] S. Y. Chen, J. Zhang, H. Zhang, N. M. Kwok, and Y. F. Li, "Intelligent lighting control for vision-based robotic manipulation," IEEE Trans. Ind. Electron., vol. 59, no. 8, pp. 3254-3263, Aug. 2012.
[20] G. Hu, W. MacKunis, N. Gans, W. E. Dixon, J. Chen, A. Behal, and D. Dawson, "Homography-based visual servo control with imperfect camera calibration," IEEE Trans. Autom. Control, vol. 54, no. 6, pp. 1318-1324, Jun. 2009.
[21] Y. Liu, R. Xiong, Y. Wang, H. Huang, X. Xie, X. Liu, and G. Zhang, "Stereo visual-inertial odometry with multiple Kalman filters ensemble," IEEE Trans. Ind. Electron., vol. 63, no. 10, pp. 6205-6216, Oct. 2016.
[22] H. Wang, B. Yang, Y. Liu, W. Chen, X. Liang, and R. Pfeifer, "Visual servoing of soft robot manipulator in constrained environments with an adaptive controller," IEEE/ASME Trans. Mechatronics, online published, DOI: 10.1109/TMECH.2016.2613410.
[23] H. Wang, C. Wang, W. Chen, X. Liang, and Y. Liu, "Three-dimensional dynamics for cable-driven soft manipulator," IEEE/ASME Trans. Mechatronics, vol. 22, no. 1, pp. 18-28, 2017.
[24] H. Chen, C. Wang, X. J. Li, and D. Sun, "Transportation of multiple biological cells through saturation-controlled optical tweezers in crowded microenvironments," IEEE/ASME Trans. Mechatronics, vol. 21, no. 2, pp. 888-899, Apr. 2016.
[25] N. Sun, Y. Fang, H. Chen, and B. Lu, "Amplitude-saturated nonlinear output feedback antiswing control for underactuated cranes with double-pendulum cargo dynamics," IEEE Trans. Ind. Electron., vol. 64, no. 3, pp. 2135-2146, Mar. 2017.
[26] Y. Fang, X. Liu, and X. Zhang, "Adaptive active visual servoing of nonholonomic mobile robots," IEEE Trans. Ind. Electron., vol. 59, no. 1, pp. 486-497, Jan. 2012.
[27] B. Li, X. Zhang, Y. Fang, and W. Shi, "Visual servo regulation of wheeled mobile robots with simultaneous depth identification," IEEE Trans. Ind. Electron., vol. 65, no. 1, pp. 460-469, Jan. 2018.
[28] X. Zhang, Y. Fang, and N. Sun, "Visual servoing of mobile robots for posture stabilization: from theory to experiments," Int. J. Robust Nonlinear Control, vol. 25, no. 1, pp. 1-15, Jan. 2015.
[29] H. Wang, D. Guo, H. Xu, W. Chen, T. Liu, and K. Leang, "Eye-in-hand tracking control of free-floating space manipulator," IEEE Trans. Aerosp. Electron. Syst., vol. 53, no. 4, pp. 1855-1865, 2017.
[30] X. Zhang, Y. Fang, B. Li, and J. Wang, "Visual servoing of nonholonomic mobile robots with uncalibrated camera-to-robot parameters," IEEE Trans. Ind. Electron., vol. 64, no. 1, pp. 390-400, Jan. 2017.
[31] A. Cherubini and F. Chaumette, "Visual navigation of a mobile robot with laser-based collision avoidance," Int. J. Robot. Res., vol. 32, no. 2, pp. 189-205, Feb. 2013.
[32] J. Chen, B. Jia, and K. Zhang, "Trifocal tensor-based adaptive visual trajectory tracking control of mobile robots," IEEE Trans. Cybernetics, online published, DOI: 10.1109/TCYB.2016.2582210.
[33] X. Liang, H. Wang, Y. H. Liu, and W. Chen, "Formation control of nonholonomic mobile robots without position and velocity measurements," IEEE Trans. Robot., vol. 34, no. 2, pp. 434-446, Apr. 2018.
[34] W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive tracking control of a wheeled mobile robot via an uncalibrated camera system," IEEE Trans. Syst. Man Cybern.: Cybern., vol. 31, no. 3, pp. 341-352, Jun. 2001.
[35] H. Chen, C. Wang, Z. Liang, D. Zhang, and H. Zhang, "Robust practical stabilization of nonholonomic mobile robots based on visual servoing feedback with inputs saturation," Asian Journal of Control, vol. 16, no. 3, pp. 692-702, May 2014.
[36] X. Liang, H. Wang, Y.-H. Liu, W. Chen, and J. Zhao, "A unified design method for adaptive visual tracking control of robots with eye-in-hand/fixed camera configuration," Automatica, vol. 59, pp. 97-105, Sep. 2015.
[37] J. Chen, W. Dixon, M. Dawson, and M. McIntyre, "Homography-based visual servo tracking control of a wheeled mobile robot," IEEE Trans. Robot., vol. 22, no. 2, pp. 406-415, Apr. 2006.
[38] A. De Luca, G. Oriolo, and P. R. Giordano, "Feature depth observation for image-based visual servoing: theory and experiments," Int. J. Robot. Res., vol. 27, no. 10, pp. 1093-1116, Oct. 2008.
[39] J. J. Craig, Introduction to Robotics: Mechanics and Control, 3rd ed., NJ: Prentice-Hall, 2005.

Claims (1)

1. The utility model provides a wheeled mobile robot extrinsic parameter does not have demarcation vision servo tracking system which characterized in that includes the following step:
1 st, System and kinematic model
1.1, System description
Fig. 1 is a schematic diagram into which translational external parameters of the camera relative to the robot are introduced, unlike the conventional assumption that a stationary monocular camera is directly above the center of a wheeled mobile robot. Current camera frame composed of
Figure FSA0000167319480000011
Is represented by the formula (I) in which zcThe axis is defined along the optical axis. Frame
Figure FSA0000167319480000012
Define the current frame of the wheeled mobile robot, wherein zrThe shaft is located in front of the robot. Due to the presence of translated camera-to-robot parameters in the system, thereforerTczrTcxExpressed along the z and x axes, respectivelyAnd (4) a translation parameter of (c).
Figure FSA0000167319480000013
And
Figure FSA0000167319480000014
frames of the camera and robot are defined on the desired trajectories, respectively.
Furthermore, with respect to the gesture contrast introduction,and
Figure FSA0000167319480000016
frames of the camera and robot are defined at the reference poses, respectively. Thetac(t) and θd(t) each representsAnd
Figure FSA0000167319480000018
relative to
Figure FSA0000167319480000019
Is easy to know the rotation angle of thetac(t) and θd(t) are respectively equivalent to
Figure FSA00001673194800000110
And
Figure FSA00001673194800000111
relative to
Figure FSA00001673194800000112
The angle of rotation of (c).
1.2 kinematic model
Will be provided with
Figure FSA00001673194800000113
Is regarded as a characteristic point, and is defined
Figure FSA00001673194800000114
In the robot coordinate system
Figure FSA00001673194800000115
The coordinates ofWhere σ corresponds to the height difference between the camera and the robot.
Figure FSA00001673194800000117
The following relation is satisfied with the movement speed of the robot:
Figure FSA00001673194800000118
wherein v and w are each independently
Figure FSA00001673194800000119
Linear and angular velocities around itself:
v:=[0,0,vr]T,w:=[0,-wr,0]T(2)
wherein v isr(t) and wr(t) is the linear velocity and angular velocity of the mobile robot, respectively, substituting (2) into (1) can result in
Figure FSA00001673194800000120
Kinematic equation in z, x direction:
Figure FSA00001673194800000121
for convenience, an unknown translational extrinsic parameter is defined asrTcx:=a,rTcz: will be as b
Figure FSA00001673194800000122
The coordinates of the origin point of (A) in the camera coordinate system are defined as
Figure FSA00001673194800000123
According to the rule of transformation of the coordinate system,
Figure FSA0000167319480000021
and
Figure FSA0000167319480000022
the following relationships are provided:
Figure FSA0000167319480000023
whereinAndrTcrespectively represent
Figure FSA0000167319480000025
In that
Figure FSA0000167319480000026
The lower rotation matrix and the translation vector. The form is as follows:
Figure FSA0000167319480000027
next, from (4) can be obtained:
Figure FSA0000167319480000028
whereinAndrTcrespectively represent
Figure FSA00001673194800000210
In that
Figure FSA00001673194800000211
The formula (6) is substituted into the formula
Figure FSA00001673194800000212
At the origin of
Figure FSA00001673194800000213
The following kinematic equation:
Figure FSA00001673194800000214
similarly, can obtain
Figure FSA00001673194800000215
Is in the desired camera coordinate system
Figure FSA00001673194800000216
The following kinematic equation:
Figure FSA00001673194800000217
whereindT*z(t),dT*x(t) each represents
Figure FSA00001673194800000218
At the origin of
Figure FSA00001673194800000219
Z, x coordinates of, vrd(t),wrd(t) represents the linear and angular velocities of the robot on the desired trajectory, respectively.
No. 2, control development
2.1 open-Loop error equation
Assuming that there are several coplanar feature points in space, the estimation and decomposition of the homography matrix are used to obtain proportional images from the reference image and the image on the expected track
Figure FSA00001673194800000220
Relative toThe pose of (2):cT*z/d*(t),cT*x/d*(t),θc(t) additionally, from the reference image and the current image, a ratio-wise expression can be obtained
Figure FSA00001673194800000222
Relative to
Figure FSA00001673194800000223
The pose of (2):dT*z/d*(t),dT*x/d*(t),θd(t).
defining the error of track tracking as
Figure FSA0000167319480000031
When e is known1,e2,e3When the convergence is 0, the robot tracks the expected track, the two ends of (9) are derived with respect to time, and (7) and (8) are substituted by
Figure FSA0000167319480000032
In which the angular kinematic equation of the robot is used
Figure FSA0000167319480000033
In addition, the method can be used for producing a composite materialAndcan be obtained by a front-back time difference mode.
2.2 adaptive controller design
First, depth estimation error
Figure FSA0000167319480000036
The definition is as follows:
Figure FSA0000167319480000037
wherein
Figure FSA0000167319480000038
Is a depth estimate. For one of the translation extrinsic parameters, the auxiliary parameter ρ ∈ R+Is defined as
Figure FSA0000167319480000039
And the estimation error is
Figure FSA00001673194800000310
Wherein
Figure FSA00001673194800000311
Is an estimate of p. In addition, another extrinsic parameter estimation error
Figure FSA00001673194800000312
Is that
Figure FSA00001673194800000313
WhereinIs an estimate of a.
By using the adaptive control framework, the update rule of the unknown parameters is as follows:
Figure FSA00001673194800000315
wherein gamma is1,Γ2,Γ3∈R+
In order to realize the track tracking task, the motion controller of the mobile robot is designed as follows:
Figure FSA0000167319480000041
wherein k isv;kw∈R+Is a control gain, and χ (t) ∈ R is expressed as
3 rd, stability analysis
Theorem 1: control input and parameter update laws (15) designed in (16) drive wheeled mobile robots to track desired trajectories with unknown translational external parameters in a sense
Assume that the desired trajectory satisfies the following condition:
Figure FSA0000167319480000044
and (3) proving that: the non-negative Lyapunov function V (t) is chosen as follows:
Figure FSA0000167319480000045
after taking the time derivative of v (t) and substituting the open loop kinetics (10) and the designed controller (16), we have the following relationship:
Figure FSA0000167319480000046
by substituting (15) into (21), the following expression is obtained by eliminating the commonly used terms:
V=-kve1 2-kw(e3+χ)2(22)
according to the formulae (15) and (21), the compounds
Figure FSA0000167319480000051
According to (11)Then, according to (12), (13), (14), the compounds are obtained
Figure FSA0000167319480000053
Again, the definition (7) and assumptions of error
Figure FSA0000167319480000054
Can obtain the product
Figure FSA0000167319480000055
Further, it is clear from (17)
Figure FSA0000167319480000056
Due to the assumption that
Figure FSA0000167319480000057
According to (16) canTo know
According to the above analysis, further, according to (10), a
Figure FSA0000167319480000059
According to (15), a
Figure FSA00001673194800000510
Further, the parameters (12), (13) and (14) can be defined
Figure FSA00001673194800000511
From the left half of (10), the results are obtained
Figure FSA00001673194800000512
To facilitate the following analysis, the function f (t) is defined as f (t): k ═ kve1 2+kw(e3+χ)2≧ 0, for both ends of which a derivation with respect to time is taken into account (17) are:
Figure FSA00001673194800000513
then by
Figure FSA00001673194800000514
Corollary to the guava theorem
Figure FSA00001673194800000515
Can then be pushed out
Figure FSA00001673194800000516
Substituting the control law (16) into an open-loop error equation(10) Is/are as followsThe terms may be obtained:
Figure FSA00001673194800000518
further bringing (26) into (27) and (15) to obtain
Figure FSA00001673194800000519
To be aligned withIn (1)
Figure FSA0000167319480000062
Partly by inference from the Bobara's theorem, it is necessary to bring the law of control (16) into the open-loop error equation (10)
Figure FSA0000167319480000063
Item, can obtain
Wherein the auxiliary symbol
Figure FSA00001673194800000622
The definition of (A) is shown in the formula. Is obtained by
Figure FSA00001673194800000623
After the time derivative of (2), can be deduced
Figure FSA0000167319480000065
Diagonal velocity wr(t) derivation, and the use of (16) (17) having:
Figure FSA0000167319480000066
due to the assumption that
Figure FSA0000167319480000067
Thus, the following (31) shows
Figure FSA0000167319480000068
Further, due to the assumptionThus, according to (30), it is understood that
Figure FSA00001673194800000610
Is ready to obtain
Figure FSA00001673194800000624
Is consistently continuous, considering
Figure FSA00001673194800000611
Thus using the extended ballad theorem for equation (29) we can:
due to the assumption that
Figure FSA00001673194800000613
The results of (10) and (32) are shown
Figure FSA00001673194800000614
On the other hand, toIn (e)3+ X partThe derivative is divided to obtain:
Figure FSA00001673194800000616
from (28) to
Figure FSA00001673194800000617
Taking the time derivative of this quantity gives
[formula image]
To prove the boundedness of the relevant signal, the closed-loop error equation
[formula image]
in (29) is differentiated with respect to time, giving:
[formula image]
By the assumption
[formula image]
it follows from (36) that
[formula image]
Taking the second derivative of the error e_1(t) defined in (9) gives
[formula image]
Further, it can be seen that
[formula image]
and from (35),
[formula image]
namely,
[formula image]
is uniformly continuous.
In view of the foregoing analysis, the extended Barbalat lemma can be applied to (34) to obtain
[formula image]
From (33) and (38), one obtains
[formula image]
Substituting (39) into the definition (17) of \chi(t) gives
[formula image]
Substituting (40) into (26) yields
[formula image]
In summary, the system errors e_1(t), e_2(t), e_3(t) all converge to zero asymptotically, which shows that the mobile robot tracks the desired trajectory.
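Since most intermediate expressions survive only as formula images, the chain of the argument, as recoverable from the surrounding prose, can be summarized as follows:

1. From (22), V is non-increasing, so all signals entering V are bounded and e_1, (e_3 + \chi) \in L_2.
2. The auxiliary function f(t) = k_v e_1^2 + k_w (e_3 + \chi)^2 has a bounded derivative, so Barbalat's lemma gives e_1(t) \to 0 and e_3(t) + \chi(t) \to 0 (equation (26)).
3. The extended Barbalat lemma applied to (29) and (34), under the stated assumptions on the desired trajectory, drives the remaining error term, giving e_2(t) \to 0.
4. Combining (33) and (38) yields (39); substituting (39) into the definition (17) of \chi(t) gives (40), i.e. \chi(t) \to 0, and substituting (40) into (26) gives (41), so e_3(t) \to 0.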
CN201810787728.1A 2018-07-17 2018-07-17 External parameter calibration-free visual servo tracking of wheeled mobile robot Expired - Fee Related CN110722533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810787728.1A CN110722533B (en) 2018-07-17 2018-07-17 External parameter calibration-free visual servo tracking of wheeled mobile robot

Publications (2)

Publication Number Publication Date
CN110722533A true CN110722533A (en) 2020-01-24
CN110722533B CN110722533B (en) 2022-12-06

Family

ID=69217618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810787728.1A Expired - Fee Related CN110722533B (en) 2018-07-17 2018-07-17 External parameter calibration-free visual servo tracking of wheeled mobile robot

Country Status (1)

Country Link
CN (1) CN110722533B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009059323A1 (en) * 2007-11-01 2009-05-07 Rimrock Automation, Inc. Dba Wolf Robotics A method and system for finding a tool center point for a robot using an external camera
US20110320039A1 (en) * 2010-06-25 2011-12-29 Hon Hai Precision Industry Co., Ltd. Robot calibration system and calibrating method thereof
CN102736626A (en) * 2012-05-11 2012-10-17 北京化工大学 Vision-based pose stabilization control method of moving trolley
CN106457562A (en) * 2014-06-23 2017-02-22 Abb瑞士股份有限公司 Method for calibrating a robot and a robot system
CN104950893A (en) * 2015-06-26 2015-09-30 浙江大学 Homography matrix based visual servo control method for shortest path
CN106774309A (en) * 2016-12-01 2017-05-31 天津工业大学 Simultaneous visual servoing and adaptive depth identification method for a mobile robot

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JIAN CHEN: "Homography-Based Visual Servo Tracking Control of a Wheeled Mobile Robot", IEEE Transactions on Robotics *
ZHANG Liyang et al.: "Research on trajectory tracking and obstacle avoidance of wheeled mobile robots", Automation & Instrumentation *
LI Baoquan: "Research on visual servoing strategies for wheeled mobile robots", China Doctoral Dissertations Full-text Database (Information Science and Technology) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111283683A (en) * 2020-03-04 2020-06-16 湖南师范大学 Servo tracking accelerated convergence method for robot visual feature planning track
CN111546344A (en) * 2020-05-18 2020-08-18 北京邮电大学 Mechanical arm control method for alignment
CN112083652A (en) * 2020-08-27 2020-12-15 东南大学 Track tracking control method for multipurpose wheeled mobile robot
CN112083652B (en) * 2020-08-27 2022-06-14 东南大学 Track tracking control method for multipurpose wheeled mobile robot
CN113031590A (en) * 2021-02-06 2021-06-25 浙江同筑科技有限公司 Mobile robot vision servo control method based on Lyapunov function
CN113051767A (en) * 2021-04-07 2021-06-29 绍兴敏动科技有限公司 AGV sliding mode control method based on visual servo

Also Published As

Publication number Publication date
CN110722533B (en) 2022-12-06

Similar Documents

Publication Publication Date Title
Qiu et al. Visual servo tracking of wheeled mobile robots with unknown extrinsic parameters
CN110722533B (en) External parameter calibration-free visual servo tracking of wheeled mobile robot
Qi et al. Contour moments based manipulation of composite rigid-deformable objects with finite time model estimation and shape/position control
CN106774309B (en) Simultaneous visual servoing and adaptive depth identification method for a mobile robot
Li et al. Visual servo regulation of wheeled mobile robots with simultaneous depth identification
Sun et al. A review of robot control with visual servoing
Siradjuddin et al. A position based visual tracking system for a 7 DOF robot manipulator using a Kinect camera
Wang et al. A modified image-based visual servo controller with hybrid camera configuration for robust robotic grasping
Li et al. Visual servoing of wheeled mobile robots without desired images
CN109960145B (en) Mobile robot mixed vision trajectory tracking strategy
CN115480583B (en) Visual servo tracking and impedance control method for flying operation robot
CN115122325A (en) Robust visual servo control method for anthropomorphic manipulator with view field constraint
Miao et al. Low-complexity leader-following formation control of mobile robots using only FOV-constrained visual feedback
Lai et al. Image dynamics-based visual servo control for unmanned aerial manipulator with a virtual camera
De Farias et al. Dual quaternion-based visual servoing for grasping moving objects
Zhang et al. Recent advances on robot visual servo control methods
Kanellakis et al. On vision enabled aerial manipulation for multirotors
Miranda-Moya et al. Ibvs based on adaptive sliding mode control for a quadrotor target tracking under perturbations
Toro-Arcila et al. Visual path following with obstacle avoidance for quadcopters in indoor environments
Abou Moughlbay et al. Error regulation strategies for model based visual servoing tasks: Application to autonomous object grasping with nao robot
CN109542094B (en) Mobile robot vision stabilization control without desired images
Liu et al. Visual servoing with deep learning and data augmentation for robotic manipulation
Cao et al. Adaptive dynamic surface control for vision-based stabilization of an uncertain electrically driven nonholonomic mobile robot
Aflakian et al. Boosting performance of visual servoing using deep reinforcement learning from multiple demonstrations
Pérez-Alcocer et al. Saturated visual-servoing control strategy for nonholonomic mobile robots with experimental evaluations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20221206