CN110722533A - External parameter calibration-free visual servo tracking of wheeled mobile robot - Google Patents
- Publication number: CN110722533A
- Application number: CN201810787728.1A
- Authority
- CN
- China
- Legal status: Granted (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- B25J5/007 — Manipulators mounted on wheels or on carriages, mounted on wheels
- B25J9/1605 — Simulation of manipulator lay-out, design, modelling of manipulator
- B25J9/1664 — Programme controls characterised by programming, planning systems for manipulators; motion, path, trajectory planning
- B25J9/1694 — Programme controls characterised by use of sensors other than normal servo-feedback; perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697 — Vision controlled systems
Abstract
A visual servo trajectory tracking method is designed for the case in which the translational extrinsic parameters of the camera are unknown. The relative pose between the current pose and the desired trajectory is obtained by decomposing a homography matrix, trajectory tracking errors are defined from this relative pose, the open-loop error equation is derived, and an adaptive visual servo trajectory tracking control law is designed. Using Lyapunov techniques and Barbalat's lemma, it is rigorously proven that the tracking errors converge asymptotically to zero even though the translational camera parameters are unknown. Simulations and comparative experiments demonstrate that the proposed strategy can drive the robot to track the desired trajectory effectively.
Description
Technical Field
The invention belongs to the technical field of computer vision and mobile robots, and particularly relates to an extrinsic-parameter calibration-free visual servo tracking control method for a wheeled mobile robot.
Background
Visual sensors are widely applied to intelligent agents such as wheeled mobile robots, offering a large amount of information and high reliability, among other advantages. In the past decades, robotic systems have had a tremendous socio-economic impact on human life, for example in social life, industrial engineering, and bioengineering. Combining a mobile robot with a vision sensor enhances its perception of the external environment and enables the robot to perform otherwise difficult tasks. The similarity to human visual perception and the ability to measure the environment without contact make visual sensors a very useful component of various types of robots. A robot visual servo system integrates visual data into the robot controller, improving control performance and making the robot more flexible. In general, visual servoing frameworks can be divided into three categories: position-based visual servoing, image-based visual servoing, and hybrid visual servoing. Many studies address parameter estimation for uncertain parameters in the system, as well as handling environmental disturbances; for example, Zhao et al. address the blurred-image problem.
In classical visual servoing, calibrating the intrinsic and extrinsic parameters of the vision system has been a tedious but necessary process. In a typical visual servoing framework for a mobile robot system, the camera usually must be placed at a suitable position on the robot to obtain a good field of view; extrinsic parameters therefore exist between the camera and the robot. Furthermore, inevitable mounting errors when placing the camera on the robot platform introduce unknown extrinsic parameters. Handling uncalibrated extrinsic camera-to-robot parameters in the visual servo tracking problem of a wheeled mobile robot is therefore both interesting and challenging.
Visual servoing strategies have been widely used for robotic manipulators, which typically must deal with unknown factors in the environment and the control system. Wang et al. devised an adaptive rule to estimate the curve parameters of an online object in a vision-based robotic manipulation system. In [19], the authors use active lighting control on robotic manipulators to guarantee the lighting quality of the environment. In [20], a robust adaptive uncalibrated visual servo controller is designed for asymptotic regulation of a robotic end-effector. Reference [21] proposes a stereo visual-inertial odometry algorithm assembled with different separate Kalman filters. Wang et al. designed a new mechanism for a silicone-rubber soft manipulator using Cosserat rod theory and the Kelvin model. Reference [24] proposes a new strategy for transporting multiple biological cells, which includes an automatic online calibration algorithm for identifying system parameters. Compared with manipulators, owing to nonholonomic constraints and underactuation, unknown parameters affect wheeled mobile robots more severely, bringing great difficulty to controller design and stability analysis.
For visual regulation of wheeled mobile robots, nonholonomic constraints and visual uncertainties should be considered, as in the work of [26]. Li et al. use a concurrent learning strategy and homography decomposition techniques to regulate the mobile robot to the required pose. In [28], an adaptive visual servoing strategy is designed to drive a mobile robot with natural behavior. There is also work on uncalibrated visual regulation of mobile robots, such as [29]. In [30], a novel two-stage controller is elaborately designed with an adaptive controller and a backstepping method to complete the robot regulation task in the presence of unknown extrinsic camera-robot parameters and unknown depth.
Besides visual servo regulation, visual servo tracking control, another branch of visual servoing, is becoming increasingly important. Compared with pose regulation of a mobile robot, a trajectory tracking scheme can be combined with motion planning and multiple system constraints, making it more suitable for complex tasks. For example, Chen et al. successfully completed the trajectory tracking task of a vision-equipped mobile robot in [32]. Some work on the robot visual servo tracking task has addressed the difficulties caused by unknown parameters. In [33], a depth-independent image Jacobian framework is developed for a robot tracking control scheme without position and velocity measurements. In [34] and [35], adaptive controllers are designed to compensate for unknown camera-system and mechanical parameters, respectively, in the visual tracking process. Liang et al. designed an adaptive trajectory tracking controller for a wheeled mobile robot in an environment where the camera parameters and precise feature positions are unknown. Unfortunately, existing methods rarely take uncertain camera-to-robot parameters in the system model into account for visual servo tracking tasks.
Inspired by the trajectory tracking method in [37], a new adaptive tracking controller is proposed here for a wheeled mobile robot with uncalibrated translational camera-to-robot parameters. After analyzing the kinematic model of the robot system and applying homography decomposition, measurable signals are obtained to construct the system tracking errors, and the open-loop dynamics are derived. Subsequently, a motion controller is developed for the trajectory tracking objective under nonholonomic constraints, while an adaptive update law is carefully designed to compensate for the unknown translational parameters. The stability analysis is carried out rigorously using Lyapunov techniques and the extended Barbalat lemma. Simulation and comparative experimental results are collected to test the effectiveness of the proposed strategy. The main contribution is that the wheeled mobile robot successfully completes the visual servo tracking task even though the translational extrinsic camera parameters and the scene depth are unknown.
Disclosure of Invention
The invention aims to provide a visual servo tracking method for a wheeled mobile robot that does not require calibration of the extrinsic parameters.
A new adaptive tracking controller is presented for a wheeled mobile robot whose translational camera-to-robot parameters are uncalibrated. After analyzing the kinematic model of the robot system and applying homography decomposition, measurable signals are obtained to construct the system tracking errors, and the open-loop dynamics are derived. Subsequently, a motion controller is developed for the trajectory tracking objective under nonholonomic constraints, while an adaptive update law is carefully designed to compensate for the unknown translational parameters. The stability analysis is carried out rigorously using Lyapunov techniques and the extended Barbalat lemma. Simulation and comparative experimental results are collected to test the effectiveness of the proposed strategy. The main contribution is that the wheeled mobile robot successfully completes the visual servo tracking task even though the translational extrinsic camera parameters and the scene depth are unknown.
The extrinsic-parameter calibration-free visual servo tracking of the wheeled mobile robot comprises the following steps:
1st, System and kinematic model
1.1, System description
Fig. 1 is a schematic diagram in which translational extrinsic parameters of the camera relative to the robot are introduced; unlike the conventional assumption, the stationary monocular camera is not mounted directly above the center of the wheeled mobile robot. The current camera frame is denoted F_c, where the z_c axis is defined along the optical axis. Frame F_r denotes the current frame of the wheeled mobile robot, where the z_r axis points toward the front of the robot. Because translational camera-to-robot parameters exist in the system, rT_cz and rT_cx denote the translational parameters of F_c with respect to F_r along the z and x axes, respectively. F_cd and F_rd denote the frames of the camera and the robot on the desired trajectory, respectively.
Furthermore, for pose comparison, F_c* and F_r* denote the frames of the camera and the robot at the reference pose, respectively. θ_c(t) and θ_d(t) denote the rotation angles of F_c and F_cd relative to F_c*; it is easy to see that θ_c(t) and θ_d(t) are also, respectively, the rotation angles of F_r and F_rd relative to F_r*.
1.2 kinematic model
The origin of F_c is regarded as a feature point, and its coordinates in the robot coordinate system F_r are defined as [rT_cx, σ, rT_cz]^T, where σ corresponds to the height difference between the camera and the robot. Its velocity satisfies the following relation with the motion velocity of the robot:
v := [0, 0, v_r]^T,  w := [0, -w_r, 0]^T  (5)
where v_r(t) and w_r(t) are, respectively, the linear and angular velocities of the mobile robot. Substituting (2) into (1) yields the kinematic equation of the feature point in the z and x directions:
for convenience, an unknown translational extrinsic parameter is defined asrTcx:=a,rTcz: will be as bThe coordinates of the origin point of (A) in the camera coordinate system are defined asAccording to the rule of transformation of the coordinate system,andthe following relationships are provided:
where rR_c and rT_c denote, respectively, the rotation matrix and the translation vector of F_c expressed in F_r, of the form:
next, from (4) can be obtained:
where the rotation matrix and translation vector on the desired trajectory are defined analogously. Substituting (6) yields the following kinematic equation for the origin of the desired camera frame F_cd:
where dT_*z(t) and dT_*x(t) denote the z and x coordinates of the origin of F_c* in F_cd, and v_rd(t) and w_rd(t) denote, respectively, the linear and angular velocities of the robot on the desired trajectory.
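Because the kinematic equations above were rendered as images in the original, the following sketch (hypothetical variable names, planar unicycle model) illustrates the relation being derived: a camera mounted b forward and a laterally from the robot center moves with velocity (v_r - a·w_r) along the heading and b·w_r laterally, which a finite-difference check confirms.

```python
import math

def robot_step(x, y, th, v, w, dt):
    """Euler step of the unicycle (wheeled mobile robot) kinematics."""
    return x + v * math.cos(th) * dt, y + v * math.sin(th) * dt, th + w * dt

def camera_pos(x, y, th, a, b):
    """World position of a camera offset b forward and a laterally from the robot center."""
    hx, hy = math.cos(th), math.sin(th)      # heading unit vector
    lx, ly = -math.sin(th), math.cos(th)     # lateral unit vector
    return x + b * hx + a * lx, y + b * hy + a * ly

# Analytic camera velocity in the robot frame: (v - a*w) along heading, b*w laterally.
a, b = 0.05, 0.08          # example translational extrinsic parameters (as in the simulation)
v, w = 0.3, 0.4            # constant robot velocities
x, y, th, dt = 0.0, 0.0, 0.2, 1e-6

cx0, cy0 = camera_pos(x, y, th, a, b)
x1, y1, th1 = robot_step(x, y, th, v, w, dt)
cx1, cy1 = camera_pos(x1, y1, th1, a, b)

# Project the numeric camera velocity onto the heading/lateral axes and compare.
vx, vy = (cx1 - cx0) / dt, (cy1 - cy0) / dt
v_head = vx * math.cos(th) + vy * math.sin(th)
v_lat = -vx * math.sin(th) + vy * math.cos(th)
print(round(v_head, 4), round(v_lat, 4))  # ≈ v - a*w = 0.28 and b*w = 0.032
```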
2nd, Control development
2.1 open-Loop error equation
Assuming that there are several coplanar feature points in space, estimation and decomposition of the homography matrix are used to obtain, from the reference image and the current image, the scaled pose of F_c relative to F_c*: cT_*z/d*(t), cT_*x/d*(t), θ_c(t). Similarly, from the reference image and the image sequence on the desired trajectory, the scaled pose of F_cd relative to F_c* is obtained: dT_*z/d*(t), dT_*x/d*(t), θ_d(t).
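The homography decomposition itself is standard; since the original formulas are not reproduced, the sketch below (hypothetical, pure Python) shows how a scaled planar pose (θ, t_x/d*, t_z/d*) can be read off a Euclidean homography H = R + (t/d*)·n*^T when the motion is planar (rotation about the camera y-axis) and the plane normal is n* = [0, 0, 1]^T:

```python
import math

def planar_homography(theta, tx_d, tz_d):
    """Euclidean homography H = R + (t/d*) n*^T for a y-axis rotation and n* = [0,0,1]^T."""
    c, s = math.cos(theta), math.sin(theta)
    R = [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
    t_over_d = [tx_d, 0.0, tz_d]          # translation scaled by the unknown depth d*
    return [[R[i][j] + (t_over_d[i] if j == 2 else 0.0) for j in range(3)]
            for i in range(3)]

def decompose(H):
    """Recover the scaled pose (theta, tx/d*, tz/d*) under the planar assumptions."""
    theta = math.atan2(-H[2][0], H[0][0])   # R[2][0] = -sin(theta), R[0][0] = cos(theta)
    tx_d = H[0][2] - math.sin(theta)        # subtract R[0][2] = sin(theta)
    tz_d = H[2][2] - math.cos(theta)        # subtract R[2][2] = cos(theta)
    return theta, tx_d, tz_d

H = planar_homography(0.3, 0.1, -0.2)
print(decompose(H))  # ≈ (0.3, 0.1, -0.2)
```

In practice H would first be estimated from feature correspondences (e.g. with a homography-estimation routine) before this read-off.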
The trajectory tracking errors are defined as
It is known that when e_1, e_2, e_3 converge to 0, the robot tracks the desired trajectory. Differentiating both sides of (9) with respect to time and substituting (7) and (8) gives
in which the angular kinematic equation of the robot has been used.
In addition, the required derivative signals of the scaled pose can be obtained by finite differences between adjacent time instants.
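A minimal sketch of the error construction and the finite-difference signals (the exact error definition (9) is not recoverable from the text, so a representative componentwise pose-difference form with hypothetical names is used):

```python
# Representative trajectory-tracking errors built from scaled poses, plus
# backward-difference derivatives (standing in for the "front-back time difference").

def errors(cur, des):
    """cur/des = (z/d*, x/d*, theta); a representative componentwise error."""
    return tuple(c - d for c, d in zip(cur, des))

def backward_diff(samples, dt):
    """Finite-difference derivative of a sampled scalar signal."""
    return [(samples[k] - samples[k - 1]) / dt for k in range(1, len(samples))]

dt = 0.1
ts = [k * dt for k in range(6)]
theta_d = [0.05 * t for t in ts]          # hypothetical desired rotation signal
dtheta_d = backward_diff(theta_d, dt)     # ≈ 0.05 everywhere

e = errors((1.2, 0.4, 0.30), (1.0, 0.5, 0.25))
print(e, dtheta_d[0])
```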
2.2 adaptive controller design
where the hatted quantity is a depth estimate. For one of the translational extrinsic parameters, an auxiliary parameter ρ ∈ R+ is defined, and the estimation error is defined as the difference between ρ and its estimate.
Using the adaptive control framework, the update laws for the unknown parameters are designed as follows:
where Γ_1, Γ_2, Γ_3 ∈ R+ are update gains.
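The concrete update laws (15) appear only as images in the original; as a hedged illustration of the gradient-type structure Γ·(regressor)·(error) that such laws take, the scalar sketch below estimates an unknown parameter ρ from measurable signals:

```python
import math

# Gradient-type adaptive update (illustrative only; not the patent's exact law (15)):
# measured output y = rho * u with rho unknown; rho_hat_dot = Gamma * u * (y - rho_hat * u).
rho_true, rho_hat = 0.8, 0.0
Gamma, dt = 5.0, 0.01

for k in range(2000):
    u = math.sin(k * dt)                  # persistently exciting input
    y = rho_true * u                      # measurable signal
    e = y - rho_hat * u                   # prediction error
    rho_hat += dt * Gamma * u * e         # Euler-integrated update law

print(round(rho_hat, 3))  # → approaches 0.8
```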
To accomplish the trajectory tracking task, the motion controller of the mobile robot is designed as follows:
where k_v, k_w ∈ R+ are control gains, and χ(t) ∈ R is expressed as
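Since the actual controller (16) is not reproduced in the text, the following closed-loop sketch uses a standard kinematic trajectory-tracking law of the same spirit (a Kanayama-type controller; gains and trajectory are hypothetical) to show how a motion controller drives the unicycle's tracking errors toward zero:

```python
import math

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def pos_err(x, y, th, xd, yd, thd):
    """Tracking errors expressed in the robot frame."""
    dx, dy = xd - x, yd - y
    xe = math.cos(th) * dx + math.sin(th) * dy    # along-heading error
    ye = -math.sin(th) * dx + math.cos(th) * dy   # lateral error
    return xe, ye, wrap(thd - th)

# Reference: a circle driven by constant desired velocities.
v_d, w_d = 0.5, 0.2
kx, ky, kth = 1.0, 4.0, 3.0               # hypothetical gains
dt, steps = 0.01, 3000

xd, yd, thd = 0.0, 0.0, 0.0               # desired pose
x, y, th = 0.3, -0.2, 0.5                 # current pose, started off the trajectory

e0 = pos_err(x, y, th, xd, yd, thd)
for _ in range(steps):
    xe, ye, the = pos_err(x, y, th, xd, yd, thd)
    v = v_d * math.cos(the) + kx * xe             # Kanayama-type tracking law
    w = w_d + v_d * (ky * ye + kth * math.sin(the))
    x += v * math.cos(th) * dt; y += v * math.sin(th) * dt; th += w * dt
    xd += v_d * math.cos(thd) * dt; yd += v_d * math.sin(thd) * dt; thd += w_d * dt

ef = pos_err(x, y, th, xd, yd, thd)
print([round(c, 3) for c in e0], "->", [round(c, 3) for c in ef])  # errors shrink toward zero
```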
3rd, Stability analysis
Theorem 1: control input and parameter update laws (15) designed in (16) drive wheeled mobile robots to track desired trajectories with unknown translational external parameters in a sense
It is assumed that the desired trajectory satisfies the following conditions:
Proof: a non-negative Lyapunov function candidate V(t) is chosen as follows:
Taking the time derivative of V(t) and substituting the open-loop dynamics (10) and the designed controller (16) gives:
Substituting (15) into (21) and cancelling the common terms yields

V̇ = -k_v·e_1^2 - k_w·(e_3 + χ)^2  (22)
From (15) and (21), e_1(t), e_2(t), e_3(t) and the estimated parameters are bounded; from (11), the corresponding angular signal is bounded; then, from (12), (13), (14), the estimation errors are bounded. Again, from the error definitions and the assumptions on the desired trajectory, the scaled pose signals are bounded. Further, from (17), χ(t) is bounded; since the desired velocities are bounded by assumption, from (16) the control inputs v_r(t) and w_r(t) are bounded.
Based on the above analysis: from (10), the error derivatives are bounded; from (15), the derivatives of the parameter estimates are bounded; further, combining the definitions in (12), (13), (14), the remaining auxiliary signals are bounded. From the first half of (10), the corresponding result is obtained.
To facilitate the following analysis, the function f(t) is defined as f(t) := k_v·e_1^2 + k_w·(e_3 + χ)^2 ≥ 0. Differentiating both sides with respect to time and considering (17) gives:
It can then be deduced that:
Substituting the control law (16) into the corresponding term of the open-loop error equation (10) gives:
Further substituting (26) into (27) and (15) yields
To apply the corollary of Barbalat's lemma to the relevant part, it is necessary to substitute the control law (16) into the corresponding term of the open-loop error equation (10), which gives
where the auxiliary symbol is defined as shown in the formula. After taking its time derivative, it can be deduced that
Differentiating the angular velocity w_r(t) and using (16) and (17) gives:
Since the desired trajectory is bounded by assumption, (31) shows that the derivative is bounded; further, by the same assumption and (30), the relevant signal is bounded, and hence uniformly continuous. Considering this, applying the extended Barbalat lemma to equation (29) gives:
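For reference, the extended Barbalat lemma invoked in this step admits the following standard statement (one common form; the exact variant cited in the original is not recoverable from the text):

```latex
% Extended Barbalat lemma (a standard form).
% If f : [0,\infty) \to \mathbb{R} is differentiable, \lim_{t\to\infty} f(t)
% exists and is finite, and \dot f = g_1 + g_2 with g_1 uniformly continuous
% and g_2 \to 0 as t \to \infty, then \dot f(t) \to 0.
\lim_{t\to\infty} f(t)\ \text{finite},\quad
\dot f = g_1 + g_2,\ g_1\ \text{unif.\ cont.},\ g_2 \to 0
\ \Longrightarrow\ \dot f(t) \to 0 .
```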
To prove boundedness, the closed-loop error equation in (29) is differentiated with respect to time:
By assumption, (36) shows that the corresponding signal is bounded. Taking the second derivative of the error e_1(t) defined in (9) gives
In view of the above analysis, the extended Barbalat lemma can be applied to (34) to obtain
From (33) and (38),
Substituting (39) into the definition (17) of χ(t) gives
Substituting (40) into (26) yields
In summary, the system errors e_1(t), e_2(t), e_3(t) all converge asymptotically to 0, which indicates that the mobile robot tracks the desired trajectory.
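The Lyapunov argument above can be illustrated numerically; the sketch below (a scalar stand-in, not the patent's actual closed loop) integrates error/estimation dynamics of the form ė = -k·e + ρ̃·φ, ρ̃̇ = -Γ·φ·e, for which V = e²/2 + ρ̃²/(2Γ) gives V̇ = -k·e² ≤ 0, and checks that V decreases:

```python
import math

# Scalar illustration of the Lyapunov-based argument (hypothetical dynamics):
#   e_dot = -k*e + rho_tilde*phi,   rho_tilde_dot = -Gamma*phi*e
# => V = e^2/2 + rho_tilde^2/(2*Gamma) has V_dot = -k*e^2 <= 0.
k, Gamma, dt = 1.0, 2.0, 0.001
e, rho_tilde = 1.0, 0.5

def V(e, rho_tilde):
    return 0.5 * e * e + rho_tilde * rho_tilde / (2.0 * Gamma)

V0 = V(e, rho_tilde)
for i in range(10000):                    # integrate for 10 s
    phi = math.sin(i * dt)                # persistently exciting regressor
    de = -k * e + rho_tilde * phi
    drho = -Gamma * phi * e
    e += dt * de; rho_tilde += dt * drho

print(round(V0, 4), round(V(e, rho_tilde), 4))  # V decreases along the trajectory
```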
Description of the drawings:
FIG. 1 is a coordinate system definition in the uncalibrated visual tracking control of extrinsic parameters
FIG. 2 is a system block diagram
FIG. 3 is a simulation: desired and current motion paths of the wheeled mobile robot in the reference frame
FIG. 4 is a simulation: simulation results
FIG. 5 is a simulation: evolution of systematic errors
FIG. 6 is a simulation: the speed of the mobile robot.
Fig. 7 shows the mobile robot platform and the feature points used in the experiments, and the camera with unknown translational extrinsic parameters.
FIG. 8 is an experimental diagram: motion paths of current and desired camera frames.
FIG. 9 is an experimental diagram: image trajectories of feature points (dotted line: desired trajectory; solid line: current trajectory).
FIG. 10 is an experimental diagram: the evolution of the trajectory tracking error (dashed line: zero value).
FIG. 11 is an experimental diagram: evolution of robot velocity
FIG. 12 is an experimental diagram: motion paths of current and desired camera frames.
FIG. 13 is an experimental diagram: image trajectories of feature points
FIG. 14 is an experimental chart: tracking evolution of errors
FIG. 15 is an experimental diagram: evolution of robot velocity
The specific implementation mode is as follows:
1. The extrinsic-parameter calibration-free visual servo tracking system for a wheeled mobile robot comprises the steps described above: the system and kinematic model (Step 1), the control development (Step 2), and the stability analysis (Step 3), followed by the simulations and experiments below.
4th, Simulation results and conclusions
4.1, simulation results
In this section, simulation results are collected to verify the feasibility of the proposed strategy. Four coplanar feature points are arbitrarily arranged for the simulation, and the intrinsic parameters of the virtual camera are set as follows:
the control parameter is selected to be kv-0.2; kw is 0.1; gamma-shaped1=4;Γ2=0:1;Γ38. The external camera-to-robot parameter is set to a ═ 0: 05 m; b is 0: 08 m.
The motion path of the wheeled mobile robot is shown in Fig. 3, where the thicker curve is the desired path and the thinner one is the current path, indicating that the robot successfully tracks the desired trajectory. In Fig. 4, the current image trajectories of the feature points effectively track their desired image trajectories. As can be seen from Fig. 5, all system errors converge to the desired value of zero. Fig. 6 shows the current robot velocities v_r(t) and w_r(t), which agree well with the desired robot velocities v_rd(t) and w_rd(t), respectively.

4.2, Experimental results

This section further verifies the scheme through real experiments implemented on the mobile robot shown in Fig. 7. Each feature point is identified by the common vertex of two squares on a planar panel in the scene, also shown in Fig. 7. The digital camera is moved from roughly the midpoint of the wheel axis to a front-right position.
A reference image and a desired image sequence are prepared, and the desired trajectory is set to a serpentine curve whose final velocity becomes zero. The initial pose of the desired trajectory relative to the reference frame is (-2.0 m, 0.8 m, 11.0°). The algorithm is implemented in a VC++ environment using the OpenCV library. The true values of the extrinsic parameters are a = 0.05 m and b = 0.06 m. The intrinsic parameters of the camera, calibrated offline, are as follows:
experiment 1 (validation of proposed strategy): camera frameRelative to a reference frameIs estimated roughly. Is (-2.4 m; 0.3 m; 14.6 deg.). The initial value of each estimated parameter being ^
kv=0.4;kw=0.1;Γ1=8;Γ2=2;Γ3=8
(43)
Due to the unknown extrinsic parameters between the robot and the onboard camera, the motion path of the mobile robot system is represented by the motion path of the camera frame. Fig. 8 shows that the current camera trajectory moves along the desired camera trajectory, meaning that the mobile robot successfully tracks its desired trajectory. The trajectories of the feature points are shown in Fig. 9, where the star points denote the final positions. It can be seen that the current image trajectory coincides with the trajectory of the desired image sequence. Fig. 10 shows the evolution of the tracking errors; all errors converge well. Fig. 11 shows the linear and angular velocities of the mobile robot, where the dotted lines are the velocities of the desired trajectory and the solid lines are the current robot velocities. This experiment shows that the robot can still successfully track the desired trajectory even when the camera-to-robot translational parameters are unknown.
Experiment 2 (comparison with a classical method): To further validate the proposed strategy, a comparison experiment against the method of [37] was performed. The camera configuration and desired-trajectory settings were the same as in Experiment 1. The control parameters were adjusted to k_v = 0.4, k_w = 0.1, Γ_1 = 30, and the initial value of the depth estimate was set to 0.1 m.
Fig. 12 shows the motion paths of the current and desired camera frames; some trajectory tracking error remains throughout the process. Fig. 13 shows how the current image features track the desired image. The evolution of the system errors is shown in Fig. 14, and Fig. 15 shows the velocities of the robot. Comparing the results shows that the proposed method performs better when translational camera-to-robot parameters are present in the system.
4.3 Conclusion
For a wheeled mobile robot whose camera-to-robot translation parameters are unknown, a new visual servo tracking scheme is proposed. A homography-based algorithm provides scaled relative poses, which are used to construct the trajectory tracking errors. Although the camera-to-robot translation parameters and the scene depth are uncalibrated, the robot can still successfully track the desired trajectory with the proposed adaptive controller, which is carefully developed to compensate for those unknown parameters and to drive the mobile robot under its nonholonomic constraints. System stability is rigorously analyzed with Lyapunov techniques. Simulation and comparison experiment results demonstrate the effectiveness of the strategy.
References
[1] F. Chaumette and S. Hutchinson, "Visual servo control Part II: advanced approaches," IEEE Robot. Autom. Mag., vol. 14, no. 2, pp. 109-118, Mar. 2007.
[2] J. Su and W. Xie, "Motion planning and coordination for robot systems based on representation space," IEEE Trans. Syst. Man Cybern. Part B-Cybern., vol. 41, no. 1, pp. 248-259, Feb. 2011.
[3] X. Zhang, R. Wang, Y. Fang, B. Li, and B. Ma, "Acceleration-level pseudo-dynamic visual servoing of mobile robots with backstepping and dynamic surface control," IEEE Trans. Syst. Man Cybern.: Syst., online published, DOI: 10.1109/TSMC.2017.2777897.
[4] G. Hu, W. P. Tay, and Y. Wen, "Cloud robotics: architecture, challenges and applications," IEEE Netw., vol. 26, no. 3, pp. 21-28, May 2012.
[5] N. Sun, Y. Wu, Y. Fang, and H. Chen, "Nonlinear antiswing control for crane systems with double-pendulum swing effects and uncertain parameters: design and experiments," IEEE Trans. Autom. Sci. Eng., DOI: 10.1109/TASE.2017.2723539.
[6] H. Chen and D. Sun, "Moving groups of microparticles into array with a robot-tweezers manipulation system," IEEE Trans. Robot., vol. 28, no. 5, pp. 1069-1080, Oct. 2012.
[7] Z. Ma and J. Su, "Robust uncalibrated visual servoing control based on disturbance observer," ISA Transactions, vol. 59, pp. 193-204, Nov. 2015.
[8] N. Sun, Y. Fang, H. Chen, Y. Fu, and B. Lu, "Nonlinear stabilizing control for ship-mounted cranes with ship roll and heave movements: design, analysis, and experiments," IEEE Trans. Syst. Man Cybern.: Syst., online published, DOI: 10.1109/TSMC.2017.2700393.
[9] S. Hutchinson, G. D. Hager, and P. I. Corke, "A tutorial on visual servo control," IEEE Trans. Robot. Autom., vol. 12, no. 5, pp. 651-670, Oct. 1996.
[10] F. Janabi-Sharifi, L. Deng, and W. J. Wilson, "Comparison of basic visual servoing methods," IEEE/ASME Trans. Mechatronics, vol. 16, no. 5, pp. 967-983, Oct. 2011.
[11] F. Ke, Z. Li, H. Xiao, and X. Zhang, "Visual servoing of constrained mobile robots based on model predictive control," IEEE Trans. Syst. Man Cybern.: Syst., vol. 47, no. 7, pp. 1428-1438, 2017.
[12] A. Hajiloo, M. Keshmiri, W.-F. Xie, and T.-T. Wang, "Robust online model predictive control for a constrained image-based visual servoing," IEEE Trans. Ind. Electron., vol. 63, no. 4, pp. 2242-2250, Apr. 2016.
[13] D. Chwa, A. P. Dani, and W. E. Dixon, "Range and motion estimation of a monocular camera using static and moving objects," IEEE Trans. Control Syst. Technol., vol. 24, no. 4, pp. 1174-1183, Jul. 2016.
[14] A. P. Dani, N. R. Fischer, and W. E. Dixon, "Single camera structure and motion," IEEE Trans. Autom. Control, vol. 57, no. 1, pp. 241-246, Jan. 2012.
[15] M. Liu, C. Pradalier, and R. Siegwart, "Visual homing from scale with an uncalibrated omnidirectional camera," IEEE Trans. Robot., vol. 29, no. 6, pp. 1353-1365, Dec. 2013.
[16] H. Zhao, Y. Liu, X. Xie, Y. Liao, and X. Liu, "Filtering based adaptive visual odometry sensor framework robust to blurred images," Sensors, vol. 16, no. 7, p. 1040, Jul. 2016.
[17] H. Wang, Y.-H. Liu, and W. Chen, "Uncalibrated visual tracking control without visual velocity," IEEE Trans. Control Syst. Technol., vol. 18, no. 6, pp. 1359-1370, Nov. 2010.
[18] H. Wang, B. Yang, J. Wang, W. Chen, X. Liang, and Y. Liu, "Adaptive visual servoing of contour features," IEEE/ASME Trans. Mechatronics, vol. 23, no. 2, pp. 811-822, Apr. 2018.
[19] S. Y. Chen, J. Zhang, H. Zhang, N. M. Kowk, and Y. F. Li, "Intelligent lighting control for vision-based robotic manipulation," IEEE Trans. Ind. Electron., vol. 59, no. 8, pp. 3254-3263, Aug. 2012.
[20] G. Hu, W. MacKunis, N. Gans, W. E. Dixon, J. Chen, A. Behal, and D. Dawson, "Homography-based visual servo control with imperfect camera calibration," IEEE Trans. Autom. Control, vol. 54, no. 6, pp. 1318-1324, Jun. 2009.
[21] Y. Liu, R. Xiong, Y. Wang, H. Huang, X. Xie, X. Liu, and G. Zhang, "Stereo visual-inertial odometry with multiple Kalman filters ensemble," IEEE Trans. Ind. Electron., vol. 63, no. 10, pp. 6205-6216, Oct. 2016.
[22] H. Wang, B. Yang, Y. Liu, W. Chen, X. Liang, and R. Pfeifer, "Visual servoing of soft robot manipulator in constrained environments with an adaptive controller," IEEE/ASME Trans. Mechatronics, online published, DOI: 10.1109/TMECH.2016.2613410.
[23] H. Wang, C. Wang, W. Chen, X. Liang, and Y. Liu, "Three dimensional dynamics for cable-driven soft manipulator," IEEE/ASME Trans. Mechatronics, vol. 22, no. 1, pp. 18-28, 2017.
[24] H. Chen, C. Wang, X. J. Li, and D. Sun, "Transportation of multiple biological cells through saturation-controlled optical tweezers in crowded microenvironments," IEEE/ASME Trans. Mechatronics, vol. 21, no. 2, pp. 888-899, Apr. 2016.
[25] N. Sun, Y. Fang, H. Chen, and B. Lu, "Amplitude-saturated nonlinear output feedback antiswing control for underactuated cranes with double-pendulum cargo dynamics," IEEE Trans. Ind. Electron., vol. 64, no. 3, pp. 2135-2146, Mar. 2017.
[26] Y. Fang, X. Liu, and X. Zhang, "Adaptive active visual servoing of nonholonomic mobile robots," IEEE Trans. Ind. Electron., vol. 59, no. 1, pp. 486-497, Jan. 2012.
[27] B. Li, X. Zhang, Y. Fang, and W. Shi, "Visual servo regulation of wheeled mobile robots with simultaneous depth identification," IEEE Trans. Ind. Electron., vol. 65, no. 1, pp. 460-469, Jan. 2018.
[28] X. Zhang, Y. Fang, and N. Sun, "Visual servoing of mobile robots for posture stabilization: from theory to experiments," Int. J. Robust Nonlinear Control, vol. 25, no. 1, pp. 1-15, Jan. 2015.
[29] H. Wang, D. Guo, H. Xu, W. Chen, T. Liu, and K. Leang, "Eye-in-hand tracking control of free-floating space manipulator," IEEE Trans. Aerosp. Electron. Syst., vol. 53, no. 4, pp. 1855-1865, 2017.
[30] X. Zhang, Y. Fang, B. Li, and J. Wang, "Visual servoing of nonholonomic mobile robots with uncalibrated camera-to-robot parameters," IEEE Trans. Ind. Electron., vol. 64, no. 1, pp. 390-400, Jan. 2017.
[31] A. Cherubini and F. Chaumette, "Visual navigation of a mobile robot with laser-based collision avoidance," Int. J. Robot. Res., vol. 32, no. 2, pp. 189-205, Feb. 2013.
[32] J. Chen, B. Jia, and K. Zhang, "Trifocal tensor-based adaptive visual trajectory tracking control of mobile robots," IEEE Trans. Cybernetics, online published, DOI: 10.1109/TCYB.2016.2582210.
[33] X. Liang, H. Wang, Y. H. Liu, and W. Chen, "Formation control of nonholonomic mobile robots without position and velocity measurements," IEEE Trans. Robot., vol. 34, no. 2, pp. 434-446, Apr. 2018.
[34] W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive tracking control of a wheeled mobile robot via an uncalibrated camera system," IEEE Trans. Syst. Man Cybern.: Cybern., vol. 31, no. 3, pp. 341-352, Jun. 2001.
[35] H. Chen, C. Wang, Z. Liang, D. Zhang, and H. Zhang, "Robust practical stabilization of nonholonomic mobile robots based on visual servoing feedback with inputs saturation," Asian Journal of Control, vol. 16, no. 3, pp. 692-702, May 2014.
[36] X. Liang, H. Wang, Y.-H. Liu, W. Chen, and J. Zhao, "A unified design method for adaptive visual tracking control of robots with eye-in-hand/fixed camera configuration," Automatica, vol. 59, pp. 97-105, Sep. 2015.
[37] J. Chen, W. Dixon, M. Dawson, and M. McIntyre, "Homography-based visual servo tracking control of a wheeled mobile robot," IEEE Trans. Robot., vol. 22, no. 2, pp. 406-415, Apr. 2006.
[38] A. D. Luca, G. Oriolo, and P. R. Giordano, "Feature depth observation for image-based visual servoing: theory and experiments," Int. J. Robot. Res., vol. 27, no. 10, pp. 1093-1116, Oct. 2008.
[39] J. J. Craig, Introduction to Robotics: Mechanics and Control, 3rd ed., NJ: Prentice-Hall, 2005.
Claims (1)
1. An external-parameter calibration-free visual servo tracking system for a wheeled mobile robot, characterized in that it comprises the following steps:
1. System and kinematic model
1.1 System description
Fig. 1 is a schematic diagram of the system, into which the translational external parameters of the camera relative to the robot are introduced; this differs from the conventional assumption that a stationary monocular camera sits directly above the center of the wheeled mobile robot. The current camera frame is denoted F_c, whose z_c axis is defined along the optical axis. The frame F_r is the current frame of the wheeled mobile robot, whose z_r axis points toward the front of the robot. Because translational camera-to-robot parameters are present in the system, rT_cz and rT_cx denote the translation parameters of F_c relative to F_r along the z and x axes, respectively. F_cd and F_rd denote the frames of the camera and the robot on the desired trajectory, respectively.
Furthermore, for the pose comparison introduced below, F*_c and F*_r denote the frames of the camera and the robot at the reference pose, respectively. θ_c(t) and θ_d(t) denote the rotation angles of F_c and F_cd relative to F*_c; it is easy to see that θ_c(t) and θ_d(t) also equal the rotation angles of F_r and F_rd relative to F*_r, respectively.
1.2 Kinematic model
The origin of the camera frame F_c is regarded as a feature point, and its coordinates in the robot coordinate system F_r are defined accordingly, where σ corresponds to the height difference between the camera and the robot. These coordinates satisfy the following relation (1) with the motion velocity of the robot:
v := [0, 0, v_r]^T, w := [0, -w_r, 0]^T (2)
where v_r(t) and w_r(t) are the linear velocity and the angular velocity of the mobile robot, respectively. Substituting (2) into (1) yields the kinematic equations of the camera origin in the z and x directions:
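As a concrete illustration of this kind of unicycle kinematics, the sketch below forward-Euler-integrates a planar robot with a forward z axis and a lateral x axis. This is a hedged sketch: equation (3) itself is not legible in this text, so the sign conventions are the standard ones, not necessarily the patent's.

```python
import math

def step_unicycle(z, x, theta, v_r, w_r, dt):
    """One Euler step of planar unicycle kinematics.

    z: forward-axis coordinate, x: lateral coordinate, theta: heading.
    Sign conventions are illustrative; the patent's equation (3) may differ.
    """
    z += v_r * math.cos(theta) * dt
    x += v_r * math.sin(theta) * dt
    theta += w_r * dt
    return z, x, theta

# drive straight ahead for 1 s at 1 m/s: only z should change
z, x, th = 0.0, 0.0, 0.0
for _ in range(100):
    z, x, th = step_unicycle(z, x, th, 1.0, 0.0, 0.01)
```

With zero angular velocity and zero initial heading, the robot advances about 1 m along z while x and θ stay at zero.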
for convenience, an unknown translational extrinsic parameter is defined asrTcx:=a,rTcz: will be as bThe coordinates of the origin point of (A) in the camera coordinate system are defined asAccording to the rule of transformation of the coordinate system,andthe following relationships are provided:
whereinAndrTcrespectively representIn thatThe lower rotation matrix and the translation vector. The form is as follows:
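For the purely translational extrinsic offset (a, b) introduced above, the frame change reduces to subtracting the offset when the two frames are aligned. A minimal sketch follows; the height offset σ and the alignment assumption are taken from the surrounding text, and the function name is illustrative:

```python
def robot_to_camera(p_r, a, sigma, b):
    """Express a point given in the robot frame in the camera frame,
    assuming a pure translational offset (a, sigma, b) along the
    robot's (x, y, z) axes, as in the text's camera-to-robot model."""
    x, y, z = p_r
    return (x - a, y - sigma, z - b)

# offsets match the experiment's true values a = 0.05 m, b = 0.06 m;
# sigma = 0.4 m is a hypothetical camera height difference
p_c = robot_to_camera((1.0, 0.5, 2.0), a=0.05, sigma=0.4, b=0.06)
```

When a rotation between robot and camera is also present, the offset subtraction would be preceded by applying the rotation matrix, as in relation (4).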
Next, from (4) one obtains:
where the corresponding rotation matrix and translation vector are defined analogously. Substituting (6) into the kinematics of the origin of the desired frame yields the following kinematic equation:
where dT*_z(t) and dT*_x(t) denote the z and x coordinates of the origin of the reference frame expressed in the desired frame, and v_rd(t) and w_rd(t) denote the linear and angular velocities of the robot on the desired trajectory, respectively.
2. Control development
2.1 Open-loop error equation
Assuming that several coplanar feature points exist in space, estimation and decomposition of the homography matrix are used to obtain, from the reference image and the current image, the scaled pose of the current frame relative to the reference frame: cT*_z/d*(t), cT*_x/d*(t), θ_c(t). Likewise, from the reference image and the images on the desired trajectory, the scaled pose of the desired frame relative to the reference frame is obtained: dT*_z/d*(t), dT*_x/d*(t), θ_d(t).
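For planar robot motion, this scaled-pose recovery can be illustrated in closed form. This is a hedged sketch, not the patent's actual decomposition routine: it assumes rotation only about the camera y-axis and a feature-plane normal n = (0, 1, 0), so that in the Euclidean homography H = R + (t/d*) n^T the first and third columns of H equal those of R, and the scaled translation sits in the middle column.

```python
import math

def build_H(theta, tx_over_d, tz_over_d):
    """Euclidean homography H = R + (t/d*) n^T for a rotation about the
    y-axis and plane normal n = (0, 1, 0): only H's middle column mixes
    in the scaled translation."""
    c, s = math.cos(theta), math.sin(theta)
    R = [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
    t = [tx_over_d, 0.0, tz_over_d]
    n = [0.0, 1.0, 0.0]
    return [[R[i][j] + t[i] * n[j] for j in range(3)] for i in range(3)]

def decompose_H(H):
    """Recover theta and the scaled translation in the planar case."""
    theta = math.atan2(H[0][2], H[0][0])  # columns 0 and 2 are pure rotation
    tx_over_d = H[0][1]                   # R's middle column is (0, 1, 0)
    tz_over_d = H[2][1]
    return theta, tx_over_d, tz_over_d

theta, tx, tz = decompose_H(build_H(0.3, -0.2, 0.5))
```

In the patent's setup the homography would first be estimated from matched feature points (e.g. with OpenCV) before this kind of decomposition is applied.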
The trajectory tracking errors are defined as:
When e_1, e_2, e_3 converge to 0, the robot tracks the desired trajectory. Differentiating both sides of (9) with respect to time and substituting (7) and (8) gives:
where the angular kinematic equation of the robot is used.
In addition, the required desired-trajectory signals can be obtained by forward-backward (central) time differences.
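The forward-backward time difference mentioned above is a central difference over the samples adjacent to the current instant. A minimal sketch (the sampled signal and step size are illustrative):

```python
def central_diff(f_prev, f_next, dt):
    """Estimate a time derivative from the samples one step before and
    one step after the current instant (forward-backward difference)."""
    return (f_next - f_prev) / (2.0 * dt)

# example: samples of f(t) = t**2 around t = 1.0, where f'(1.0) = 2.0
dt = 0.01
deriv = central_diff((1.0 - dt) ** 2, (1.0 + dt) ** 2, dt)
```

The central form has second-order accuracy in dt, which is why it is preferred over a one-sided difference when both neighboring samples are available.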
2.2 Adaptive controller design
where d̂*(t) is the depth estimate. For one of the translational extrinsic parameters, the auxiliary parameter ρ ∈ R+ is defined accordingly, and its estimation error is:
where â(t) is the estimate of a.
Within the adaptive control framework, the update laws of the unknown parameters are designed as follows:
where Γ_1, Γ_2, Γ_3 ∈ R+ are update gains.
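The exact regressor terms of the update laws (15) are not legible in this text, so the sketch below only illustrates the general gradient-type structure that such adaptive laws share, namely estimate_rate = gain * regressor * error; the gain value and the signals are hypothetical:

```python
def adapt_step(theta_hat, gamma, regressor, error, dt):
    """One Euler step of a gradient adaptation law:
    theta_hat_dot = gamma * regressor * error (illustrative form)."""
    return theta_hat + gamma * regressor * error * dt

# drive an estimate of a constant unknown parameter with a persistent
# (here constant) regressor; theta is the true value
theta, theta_hat, gamma = 2.0, 0.0, 5.0
for _ in range(5000):            # 5 s at dt = 0.001
    e = theta - theta_hat        # measurable error signal
    theta_hat = adapt_step(theta_hat, gamma, 1.0, e, 0.001)
```

With a persistently exciting regressor the estimate converges to the true parameter; in the patent the regressor is built from the tracking errors and desired-trajectory signals instead of this constant placeholder.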
To accomplish the trajectory tracking task, the motion controller of the mobile robot is designed as follows:
where k_v, k_w ∈ R+ are control gains, and χ(t) ∈ R is expressed as
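The patent's controller (16) is only partially legible here, so as a structural illustration the loop below uses a standard kinematic tracking controller for a unicycle (the gains k_x, k_y, k_theta are hypothetical, not the patent's k_v, k_w). It shows how tracking errors expressed in the robot frame drive (v_r, w_r) so that the robot converges to a desired trajectory:

```python
import math

def tracking_step(pose, pose_d, v_d, w_d, gains, dt):
    """One control-plus-integration step of a standard kinematic
    tracking controller (illustrative, not the patent's law (16))."""
    x, y, th = pose
    xd, yd, thd = pose_d
    kx, ky, kth = gains
    # tracking error expressed in the robot frame
    ex = math.cos(th) * (xd - x) + math.sin(th) * (yd - y)
    ey = -math.sin(th) * (xd - x) + math.cos(th) * (yd - y)
    eth = thd - th
    # kinematic tracking control law
    v = v_d * math.cos(eth) + kx * ex
    w = w_d + v_d * (ky * ey + kth * math.sin(eth))
    # unicycle kinematics, Euler-integrated
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += w * dt
    return (x, y, th), (ex, ey, eth)

dt, v_d, w_d = 0.01, 0.5, 0.2
pose, pose_d = (-0.5, 0.3, 0.4), (0.0, 0.0, 0.0)
for _ in range(3000):  # 30 s of simulated tracking of a circular trajectory
    pose, err = tracking_step(pose, pose_d, v_d, w_d, (1.0, 5.0, 3.0), dt)
    xd, yd, thd = pose_d  # propagate the desired trajectory
    xd += v_d * math.cos(thd) * dt
    yd += v_d * math.sin(thd) * dt
    thd += w_d * dt
    pose_d = (xd, yd, thd)
```

After the transient, all three error components decay toward zero, mirroring the convergence behavior claimed for the patent's adaptive controller (which additionally compensates for the unknown parameters a, b, and d*).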
3. Stability analysis
Theorem 1: the control input designed in (16), together with the parameter update laws (15), drives the wheeled mobile robot to track the desired trajectory despite the unknown translational external parameters, in the sense that the tracking errors converge to zero asymptotically.
Assume that the desired trajectory satisfies the following conditions:
Proof: a non-negative Lyapunov function V(t) is chosen as follows:
Taking the time derivative of V(t) and substituting the open-loop kinematics (10) and the designed controller (16) gives the following relation:
Substituting (15) into (21) and canceling common terms yields the following expression:
V̇ = -k_v e_1^2 - k_w (e_3 + χ)^2 (22)
according to the formulae (15) and (21), the compoundsAccording to (11)Then, according to (12), (13), (14), the compounds are obtainedAgain, the definition (7) and assumptions of errorCan obtain the productFurther, it is clear from (17)Due to the assumption thatAccording to (16) canTo know
According to the above analysis, further, according to (10), aAccording to (15), aFurther, the parameters (12), (13) and (14) can be definedFrom the left half of (10), the results are obtained
To facilitate the following analysis, the function f(t) is defined as f(t) := k_v e_1^2 + k_w (e_3 + χ)^2 ≥ 0. Differentiating both sides with respect to time and taking (17) into account gives:
It can then be deduced that:
Substituting the control law (16) into the corresponding term of the open-loop error equation (10) gives:
Further substituting (26) into (27) and using (15) gives:
To handle the remaining part by Barbalat's lemma, the control law (16) is substituted into the corresponding term of the open-loop error equation (10), which yields:
where the auxiliary symbol is defined in the corresponding formula. Taking its time derivative, it can be deduced that:
Differentiating the angular velocity w_r(t) and using (16) and (17) gives:
By the assumptions on the desired trajectory, (31) shows that the derivative of w_r(t) is bounded; further, by the same assumptions and (30), the derivative of e_3 + χ is bounded, so e_3 + χ is uniformly continuous. Considering this, applying the extended Barbalat lemma to equation (29) gives:
On the other hand, taking the derivative of the e_3 + χ part gives:
To prove that this derivative is bounded, the closed-loop error equation in (29) is differentiated with respect to time:
By the assumptions on the desired trajectory, (36) shows that the result is bounded. Taking the second derivative of the error definition e_1(t) in (9) gives:
In view of the above analysis, the extended Barbalat lemma can be applied to (34), which gives:
From (33) and (38), it follows that:
Substituting (39) into the definition of χ(t) in (17) gives:
In summary, the system errors e_1(t), e_2(t), e_3(t) all converge to zero asymptotically, which indicates that the mobile robot tracks the desired trajectory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810787728.1A CN110722533B (en) | 2018-07-17 | 2018-07-17 | External parameter calibration-free visual servo tracking of wheeled mobile robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110722533A true CN110722533A (en) | 2020-01-24 |
CN110722533B CN110722533B (en) | 2022-12-06 |
Family
ID=69217618
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810787728.1A Expired - Fee Related CN110722533B (en) | 2018-07-17 | 2018-07-17 | External parameter calibration-free visual servo tracking of wheeled mobile robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110722533B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111283683A (en) * | 2020-03-04 | 2020-06-16 | 湖南师范大学 | Servo tracking accelerated convergence method for robot visual feature planning track |
CN111546344A (en) * | 2020-05-18 | 2020-08-18 | 北京邮电大学 | Mechanical arm control method for alignment |
CN112083652A (en) * | 2020-08-27 | 2020-12-15 | 东南大学 | Track tracking control method for multipurpose wheeled mobile robot |
CN113031590A (en) * | 2021-02-06 | 2021-06-25 | 浙江同筑科技有限公司 | Mobile robot vision servo control method based on Lyapunov function |
CN113051767A (en) * | 2021-04-07 | 2021-06-29 | 绍兴敏动科技有限公司 | AGV sliding mode control method based on visual servo |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009059323A1 (en) * | 2007-11-01 | 2009-05-07 | Rimrock Automation, Inc. Dba Wolf Robotics | A method and system for finding a tool center point for a robot using an external camera |
US20110320039A1 (en) * | 2010-06-25 | 2011-12-29 | Hon Hai Precision Industry Co., Ltd. | Robot calibration system and calibrating method thereof |
CN102736626A (en) * | 2012-05-11 | 2012-10-17 | 北京化工大学 | Vision-based pose stabilization control method of moving trolley |
CN104950893A (en) * | 2015-06-26 | 2015-09-30 | 浙江大学 | Homography matrix based visual servo control method for shortest path |
CN106457562A (en) * | 2014-06-23 | 2017-02-22 | Abb瑞士股份有限公司 | Method for calibrating a robot and a robot system |
CN106774309A (en) * | 2016-12-01 | 2017-05-31 | 天津工业大学 | A kind of mobile robot is while visual servo and self adaptation depth discrimination method |
Non-Patent Citations (3)
Title |
---|
JIAN CHEN: "Homography-Based Visual Servo Tracking Control of a Wheeled Mobile Robot", 《IEEE TRANSACTIONS ON ROBOTICS》 * |
ZHANG LIYANG et al.: "Research on trajectory tracking and obstacle avoidance of wheeled mobile robots", 《Automation & Instrumentation》 *
LI BAOQUAN: "Research on visual servo strategies for wheeled mobile robots", 《China Doctoral Dissertations Full-text Database (Information Science & Technology)》 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Qiu et al. | Visual servo tracking of wheeled mobile robots with unknown extrinsic parameters | |
CN110722533B (en) | External parameter calibration-free visual servo tracking of wheeled mobile robot | |
Qi et al. | Contour moments based manipulation of composite rigid-deformable objects with finite time model estimation and shape/position control | |
CN106774309B (en) | A kind of mobile robot visual servo and adaptive depth discrimination method simultaneously | |
Li et al. | Visual servo regulation of wheeled mobile robots with simultaneous depth identification | |
Sun et al. | A review of robot control with visual servoing | |
Siradjuddin et al. | A position based visual tracking system for a 7 DOF robot manipulator using a Kinect camera | |
Wang et al. | A modified image-based visual servo controller with hybrid camera configuration for robust robotic grasping | |
Li et al. | Visual servoing of wheeled mobile robots without desired images | |
CN109960145B (en) | Mobile robot mixed vision trajectory tracking strategy | |
CN115480583B (en) | Visual servo tracking and impedance control method for flying operation robot | |
CN115122325A (en) | Robust visual servo control method for anthropomorphic manipulator with view field constraint | |
Miao et al. | Low-complexity leader-following formation control of mobile robots using only FOV-constrained visual feedback | |
Lai et al. | Image dynamics-based visual servo control for unmanned aerial manipulator with a virtual camera | |
De Farias et al. | Dual quaternion-based visual servoing for grasping moving objects | |
Zhang et al. | Recent advances on robot visual servo control methods | |
Kanellakis et al. | On vision enabled aerial manipulation for multirotors | |
Miranda-Moya et al. | Ibvs based on adaptive sliding mode control for a quadrotor target tracking under perturbations | |
Toro-Arcila et al. | Visual path following with obstacle avoidance for quadcopters in indoor environments | |
Abou Moughlbay et al. | Error regulation strategies for model based visual servoing tasks: Application to autonomous object grasping with nao robot | |
CN109542094B (en) | Mobile robot vision stabilization control without desired images | |
Liu et al. | Visual servoing with deep learning and data augmentation for robotic manipulation | |
Cao et al. | Adaptive dynamic surface control for vision-based stabilization of an uncertain electrically driven nonholonomic mobile robot | |
Aflakian et al. | Boosting performance of visual servoing using deep reinforcement learning from multiple demonstrations | |
Pérez-Alcocer et al. | Saturated visual-servoing control strategy for nonholonomic mobile robots with experimental evaluations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20221206 |