CN111612843A - Mobile robot vision stabilization control without expected image - Google Patents

Mobile robot vision stabilization control without expected image Download PDF

Info

Publication number
CN111612843A
CN111612843A CN201910154546.5A CN201910154546A CN111612843A CN 111612843 A CN111612843 A CN 111612843A CN 201910154546 A CN201910154546 A CN 201910154546A CN 111612843 A CN111612843 A CN 111612843A
Authority
CN
China
Prior art keywords
coordinate system
robot
mobile robot
follows
obj
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910154546.5A
Other languages
Chinese (zh)
Inventor
李宝全
宋志伟
高喜天
师五喜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Polytechnic University
Original Assignee
Tianjin Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Polytechnic University filed Critical Tianjin Polytechnic University
Priority to CN201910154546.5A priority Critical patent/CN111612843A/en
Publication of CN111612843A publication Critical patent/CN111612843A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The mobile robot has no expected image vision stabilization control system. A novel monocular vision servo strategy is proposed herein to drive a wheeled mobile robot to a desired pose without a desired image. Firstly, according to the feature points with known 3D coordinates, the transformation relation between the feature point coordinate system and the vehicle-mounted camera coordinate system is estimated. Secondly, a reference coordinate system is defined from the projection of the feature point coordinate system. Then, the relation between the reference coordinate system and the current coordinate system of the robot is obtained. And finally, driving the robot to a desired pose by using a polar coordinate control law. Compared with the conventional visual servo mode, the method can work normally without obtaining the expected image in advance. The simulation result proves the effectiveness of the method.

Description

Mobile robot vision stabilization control without expected image
Technical Field
The invention belongs to the technical field of computer vision and mobile robots, and can complete the stabilization control of vision under the condition of no expected image.
Background
With the rapid development of artificial intelligence, it becomes more and more important to control intelligent equipment, such as mobile robots and mechanical arms, through visual information feedback. With the development of society, intelligent robots are applied in numerous fields. Among them, the processing of visual information is one of the most important functions of a wheeled mobile robot and a robot arm. However, there are still many problems to be solved in the practical application of visual servo control. Generally, it is expected that images play an important role in visual servoing of a mobile robot. Without the desired image, the desired pose of the robotic system cannot be defined. However, in many existing approaches, the mobile robot cannot reach the desired pose without knowing the desired image. For a monocular mobile robot, during experiments or work, the external 3D scene becomes a 2D image plane when mapped to the camera due to the imaging principle of the camera. Thus resulting in the loss of depth information. While there are many strategies to address the problem of unknown depth information, it also increases the complexity of the robotic system. Therefore, in order to drive the mobile robot to a desired pose without a desired image, it is necessary to identify a scene model by making full use of image information. Therefore, the visual servoing process without the desired image becomes more meaningful.
Disclosure of Invention
A strategy for accomplishing visual stabilization control without a desired image is designed based on a monocular wheeled mobile robot.
A novel monocular vision servo strategy is proposed herein to drive a wheeled mobile robot to a desired pose without a desired image. Firstly, according to the feature points with known 3D coordinates, the transformation relation between the feature point coordinate system and the vehicle-mounted camera coordinate system is estimated. Secondly, a reference coordinate system is defined from the projection of the feature point coordinate system. Then, the relation between the reference coordinate system and the current coordinate system of the robot is obtained. And finally, driving the robot to a desired pose by using a polar coordinate control law. Compared with the conventional visual servo mode, the method can work normally without obtaining the expected image in advance. The simulation result proves the effectiveness of the method. The vision stabilization system method of the mobile robot provided by the invention comprises the following steps:
the vision stabilization control system for the mobile robot without the expected image is characterized by comprising the following steps of:
1, description of problems
1.1, System description
And setting the coordinate systems of the vehicle-mounted camera and the mobile robot to coincide. The coordinate system of the camera at the current pose is defined as
Figure BSA0000179553560000011
Taking the optical center of the camera as
Figure BSA0000179553560000012
Of the origin.
Figure BSA0000179553560000013
Z of (a)cxcRepresenting the plane of motion of the mobile robot, wherein zcIn the same direction as the optical axis of the camera. y iscPerpendicular to the plane of motion z of the mobile robotcxcAnd meets the right-hand rule. A visual servo coordinate system without a desired image is shown in fig. 1. Randomly arranging four coplanar feature points for defining
Figure BSA0000179553560000014
The characteristic point with the serial number of 1 is used as
Figure BSA0000179553560000015
The vector between sequence number 1 and sequence number 2 as
Figure BSA0000179553560000016
X ofobjAxis, normal vector of plane of feature point being zobjA shaft. y isobjDetermined according to the right-hand rule.
Figure BSA0000179553560000017
The 3D coordinates of the following feature points are obtained by measuring with a ruler. In addition, the coordinate system
Figure BSA0000179553560000018
Is projected onto the robot motion plane as the robot reference pose, i.e.
Figure BSA0000179553560000019
Of the origin. Will zobjThe coordinate axis is projected on the motion plane and then scaled to a unit vector as
Figure BSA00001795535600000110
Z coordinate axis ofp。ypVertical robot plane of motion down, xpSatisfying the right-hand rule. Has been defined completely
Figure BSA00001795535600000111
Thereafter, the expected pose is defined under the reference coordinate system
Figure BSA00001795535600000112
Will be provided with
Figure BSA00001795535600000113
Is corresponding to
Figure BSA00001795535600000114
Is defined as thetad. In the same way, the desired coordinate system
Figure BSA0000179553560000021
Corresponding to a reference coordinate system
Figure BSA0000179553560000022
Are set to x-coordinate and z-coordinate, respectivelypTdxAndpTdz
1.2 control scheme
The vision stabilization control system for the mobile robot without the expected image consists of three stages: a first stage of defining a feature point coordinate system as a feature point coordinate system based on the 3D coordinates of the known feature points
Figure BSA0000179553560000023
Then estimating a transformation relation between a feature point coordinate system and a vehicle-mounted camera coordinate system; a second stage of defining a reference coordinate system based on the projection of the feature point coordinate system
Figure BSA0000179553560000024
Obtaining the relation between the reference coordinate system of the robot and the current coordinate system, and defining the expected pose by the reference coordinate system
Figure BSA0000179553560000025
Finally, obtaining a relative posture relation between the current posture and the expected posture; in the third stage, the robot is driven to a desired pose by utilizing a polar coordinate control law
Figure BSA0000179553560000026
To (3).
2, coordinate system relationship
In the case of the (2.1) th,
Figure BSA0000179553560000027
and
Figure BSA0000179553560000028
in relation to (2)
Characteristic image point coordinates and
Figure BSA0000179553560000029
the coordinates of the feature points under the coordinate system are respectively defined as follows:
Figure BSA00001795535600000210
the rotation matrix and the translation vector are respectively defined as
Figure BSA00001795535600000211
AndcTobj. From the feature point imaging model, the following relationship can be obtained:
Figure BSA00001795535600000212
where K is the intrinsic parameter matrix of the camera.
From the definition of the coordinate system of the characteristic points, Z is knownobj0. Thus, the rotation matrix can be written as
Figure BSA00001795535600000213
Is represented as follows:
Figure BSA00001795535600000214
Figure BSA00001795535600000215
to
Figure BSA00001795535600000216
The homography matrix of (a) is defined as H ═ K [ r ═ r1,r2cTobj]Taking into account the presence of the scaling factor, using h9H is normalized as follows:
Figure BSA00001795535600000217
the following relationship can be obtained by substituting (4) into (3):
Figure BSA00001795535600000218
the above formula can be written as:
Ax=b (24)
the least squares solution for x is:
x=(ATA)-1ATb (25)
furthermore, according to K [ r ]1,r2cTobj]H may be given by the following relationship:
Figure BSA0000179553560000031
then obtain
Figure BSA0000179553560000032
AndcTobj. Thus, it is possible to obtain
Figure BSA0000179553560000033
Relative to
Figure BSA0000179553560000034
The transformation matrix of (a) is as follows:
Figure BSA0000179553560000035
2.2 determination of the desired coordinates
Figure BSA0000179553560000036
Correspond to
Figure BSA0000179553560000037
Is defined as thetad
Figure BSA0000179553560000038
In that
Figure BSA0000179553560000039
Lower z coordinate set topTdz
Figure BSA00001795535600000310
In that
Figure BSA00001795535600000311
X coordinate of lower is set aspTdx
Figure BSA00001795535600000312
In that
Figure BSA00001795535600000313
The following rotation matrix and translation vector are respectively expressed as follows:
Figure BSA00001795535600000314
then
Figure BSA00001795535600000315
In that
Figure BSA00001795535600000316
The transform matrix below is:
Figure BSA00001795535600000317
2.3, determination of
Figure BSA00001795535600000318
And
Figure BSA00001795535600000319
position and attitude relationship therebetween
It is known that
Figure BSA00001795535600000320
Relative to
Figure BSA00001795535600000321
Is a transition matrix of
Figure BSA00001795535600000322
Wherein the content of the first and second substances,
Figure BSA00001795535600000323
relative to
Figure BSA00001795535600000324
Rotation matrix and translational vector quantity:
Figure BSA00001795535600000325
according to the projection principle, can obtain
Figure BSA00001795535600000326
In that
Figure BSA00001795535600000327
The following rotation matrix and translation vector are respectively as follows:
Figure BSA00001795535600000328
therefore, the first and second electrodes are formed on the substrate,
Figure BSA0000179553560000041
in that
Figure BSA0000179553560000042
The following transition matrix is expressed as follows:
Figure BSA0000179553560000043
therefore, it can be calculated by the following formula
Figure BSA0000179553560000044
In that
Figure BSA0000179553560000045
The following transition matrix:
Figure BSA0000179553560000046
2.4 kinematics of the robot
Fig. 2 shows polar control of the mobile robot, using polar coordinates to define the current pose of the mobile robot in a desired coordinate system. The moving speed and linear velocity of the mobile robot are set to v and w respectively,
Figure BSA0000179553560000047
of origin and
Figure BSA0000179553560000048
is denoted as e and the distance between the origins,
Figure BSA0000179553560000049
in that
Figure BSA00001795535600000410
The lower rotation angle is denoted as phi. And the number of the first and second electrodes,
Figure BSA00001795535600000411
in that
Figure BSA00001795535600000412
The lower direction angle is defined as theta, zcThe angle between e and e is defined as α, and it can be known that α ═ θ - Φ.
Figure BSA00001795535600000413
The design linear velocity control rate is as follows:
v=(γcosα)e (35)
the adopted angular speed controller is as follows:
Figure BSA00001795535600000414
description of the drawings:
FIG. 1 is a visual servo coordinate system without a desired image
FIG. 2 is a polar coordinate system diagram of a mobile robot
FIG. 3 is a movement trace of a mobile robot
FIG. 4 is a track of the mobile robot reaching different phase poses under the same initial pose
FIG. 5 shows the change of the robot state
FIG. 6 is a two-dimensional trajectory of feature points in an image
FIG. 7 is a graph showing linear and angular velocities of a mobile robot
The specific implementation mode is as follows:
1. the vision stabilization control system for the mobile robot without the expected image is characterized by comprising the following steps of:
1, description of problems
1.1, System description
And setting the coordinate systems of the vehicle-mounted camera and the mobile robot to coincide. The coordinate system of the camera at the current pose is defined as
Figure BSA00001795535600000415
Taking the optical center of the camera as
Figure BSA00001795535600000416
Of the origin.
Figure BSA00001795535600000417
Z of (a)cxcRepresenting the plane of motion of the mobile robot, wherein zcIn the same direction as the optical axis of the camera. y iscPerpendicular to the plane of motion z of the mobile robotcxcAnd meets the right-hand rule. A visual servo coordinate system without a desired image is shown in fig. 1. Randomly arranging four coplanar feature points for defining
Figure BSA0000179553560000051
The characteristic point with the serial number of 1 is used as
Figure BSA0000179553560000052
The vector between sequence number 1 and sequence number 2 as
Figure BSA0000179553560000053
X ofobjAxis, normal vector of plane of feature point being zobjA shaft. y isobjDetermined according to the right-hand rule.
Figure BSA0000179553560000054
The 3D coordinates of the following feature points are obtained by measuring with a ruler. In addition, the coordinate system
Figure BSA0000179553560000055
Is projected onto the robot motion plane as the robot reference pose, i.e.
Figure BSA0000179553560000056
Of the origin. Will zobjThe coordinate axis is projected on the motion plane and then scaled to a unit vector as
Figure BSA0000179553560000057
Z coordinate axis ofp。ypVertical robot plane of motion down, xpSatisfying the right-hand rule. Has been defined completely
Figure BSA0000179553560000058
Thereafter, the expected pose is defined under the reference coordinate system
Figure BSA0000179553560000059
Will be provided with
Figure BSA00001795535600000510
Is corresponding to
Figure BSA00001795535600000511
Is defined as thetad. In the same way, the desired coordinate system
Figure BSA00001795535600000512
Corresponding to a reference coordinate system
Figure BSA00001795535600000513
Are set to x-coordinate and z-coordinate, respectivelypTdxAndpTdz
1.2 control scheme
The vision stabilization control system for the mobile robot without the expected image consists of three stages: a first stage of defining a feature point coordinate system as a feature point coordinate system based on the 3D coordinates of the known feature points
Figure BSA00001795535600000514
Then estimating a transformation relation between a feature point coordinate system and a vehicle-mounted camera coordinate system; a second stage of defining a reference coordinate system based on the projection of the feature point coordinate system
Figure BSA00001795535600000515
Obtaining the relation between the reference coordinate system of the robot and the current coordinate system, and defining the expected pose by the reference coordinate system
Figure BSA00001795535600000516
Finally, obtaining a relative posture relation between the current posture and the expected posture; in the third stage, the robot is driven to a desired pose by utilizing a polar coordinate control law
Figure BSA00001795535600000517
To (3).
2, coordinate system relationship
In the case of the (2.1) th,
Figure BSA00001795535600000518
and
Figure BSA00001795535600000519
in relation to (2)
Characteristic image point coordinates and
Figure BSA00001795535600000520
the coordinates of the feature points under the coordinate system are respectively defined as follows:
Figure BSA00001795535600000521
the rotation matrix and the translation vector are respectively defined as
Figure BSA00001795535600000522
AndcTobj. From the feature point imaging model, the following relationship can be obtained:
Figure BSA00001795535600000523
where K is the intrinsic parameter matrix of the camera.
From the definition of the coordinate system of the characteristic points, Z is knownobj0. Thus, the rotation matrix can be written as
Figure BSA00001795535600000524
Is represented as follows:
Figure BSA00001795535600000525
Figure BSA00001795535600000526
to
Figure BSA00001795535600000527
The homography matrix of (a) is defined as H ═ K [ r ═ r1,r2cTobj]Taking into account the presence of the scaling factor, using h9H is normalized as follows:
Figure BSA0000179553560000061
the following relationship can be obtained by substituting (4) into (3):
Figure BSA0000179553560000062
the above formula can be written as:
Ax=b (42)
the least squares solution for x is:
x=(ATA)-1ATb (43)
furthermore, according to K [ r ]1,r2cTobj]H may be given by the following relationship:
Figure BSA0000179553560000064
then obtain
Figure BSA0000179553560000065
AndcTobj. Thus, it is possible to obtain
Figure BSA0000179553560000066
Relative to
Figure BSA0000179553560000067
The transformation matrix of (a) is as follows:
Figure BSA0000179553560000068
2.2 determination of the desired coordinates
Figure BSA0000179553560000069
Correspond to
Figure BSA00001795535600000610
Is defined as thetad
Figure BSA00001795535600000611
In that
Figure BSA00001795535600000612
Lower z coordinate set topTdz
Figure BSA00001795535600000613
In that
Figure BSA00001795535600000614
X coordinate of lower is set aspTdx
Figure BSA00001795535600000615
In that
Figure BSA00001795535600000616
The following rotation matrix and translation vector are respectively expressed as follows:
Figure BSA00001795535600000617
then
Figure BSA00001795535600000618
In that
Figure BSA00001795535600000619
The transform matrix below is:
Figure BSA00001795535600000620
2.3, determination of
Figure BSA00001795535600000621
And
Figure BSA00001795535600000622
position and attitude relationship therebetween
It is known that
Figure BSA00001795535600000623
Relative to
Figure BSA00001795535600000624
Is a transition matrix of
Figure BSA00001795535600000625
Wherein the content of the first and second substances,
Figure BSA00001795535600000626
relative to
Figure BSA00001795535600000627
Rotation matrix and translational vector quantity:
Figure BSA0000179553560000071
according to the projection principle, can obtain
Figure BSA0000179553560000072
In that
Figure BSA0000179553560000073
The following rotation matrix and translation vector are respectively as follows:
Figure BSA0000179553560000074
therefore, the first and second electrodes are formed on the substrate,
Figure BSA0000179553560000075
in that
Figure BSA0000179553560000076
The following transition matrix is expressed as follows:
Figure BSA0000179553560000077
therefore, it can be calculated by the following formula
Figure BSA0000179553560000078
In that
Figure BSA0000179553560000079
The following transition matrix:
Figure BSA00001795535600000710
2.4 kinematics of the robot
Fig. 2 shows polar control of the mobile robot, using polar coordinates to define the current pose of the mobile robot in a desired coordinate system. The moving speed and linear velocity of the mobile robot are set to v and w respectively,
Figure BSA00001795535600000711
of origin and
Figure BSA00001795535600000712
is denoted as e and the distance between the origins,
Figure BSA00001795535600000713
in that
Figure BSA00001795535600000714
The lower rotation angle is denoted as phi. And the number of the first and second electrodes,
Figure BSA00001795535600000715
in that
Figure BSA00001795535600000716
The lower direction angle is defined as theta, zcThe angle between e and e is defined as α, and α ═ θ - ΦThe following:
Figure BSA00001795535600000717
the design linear velocity control rate is as follows:
v=(γcosα)e (53)
the adopted angular speed controller is as follows:
Figure BSA00001795535600000718
no. 3 simulation results
3.1, simulation results
The effectiveness of the method is proved through simulation. 4 coplanar feature points are set in the simulation scene, and moved feature points are set.
The internal parameter settings of the virtual camera in the simulation are as follows:
Figure BSA0000179553560000081
in a world coordinate system, setting the initial pose of the mobile robot as follows:
(WT0zWT0x,θ0)=(-6.5m,1m,30°). (56)
in the reference coordinate system, the expected pose is set to (pTdzpTdx,θp) (-0.8m, 0.0m, 0 °). It is then converted into the desired pose in the world coordinate system as:
(WT0zWT0x,θd)=(-0.7m,0.0m,0°). (57)
in addition, random noise with a standard deviation of σ ═ 0.1 pixels was added to the pixel coordinates to test the robustness of the method. The control gain and other parameters are chosen as follows:
γ=0.5,k=1.5,h=0.8. (58)
referring to the drawings, FIG. 3 shows a mobile robotAnd (4) moving tracks. The robot can reach the expected pose efficiently according to the graph. Fig. 4 is a trajectory of the mobile robot to reach different phase poses in the same initial pose. Fig. 6 is a two-dimensional trajectory of feature points in an image, the circular and square points representing initial and desired image feature points, respectively. It can also be seen from the figure that the robot successfully reaches the desired pose. Fig. 7 is a linear velocity and an angular velocity of the mobile robot. FIG. 5 shows the robot states (WTcz(t),WTcx(t),qc(t)), where the dashed line represents the desired pose in the world coordinate system. The results show that the steady state error of the method is small enough, which indicates the effectiveness of the method.
3.2, conclusion
In this context, a novel mobile robot vision servo strategy is proposed, i.e. driving a wheeled mobile robot to a desired pose without a desired image. The method comprises the steps of firstly estimating a transformation relation between a feature point coordinate system and a vehicle-mounted camera coordinate system according to feature points of known 3D coordinates. Secondly, defining a reference coordinate system according to the projection of the feature point coordinate system, and obtaining the relation between the reference coordinate system and the current coordinate system. And then, defining an expected pose according to the reference coordinate system, and obtaining the relation between the current pose and the expected pose. And finally, stabilizing the robot to a desired pose by utilizing a polar coordinate control law.

Claims (1)

1. The vision stabilization control system for the mobile robot without the expected image is characterized by comprising the following steps of:
1, description of problems
1.1, System description
Setting a coordinate system of the vehicle-mounted camera to coincide with a coordinate system of the mobile robot; the coordinate system of the camera at the current pose is defined as
Figure FSA0000179553550000011
Taking the optical center of the camera as
Figure FSA0000179553550000012
The origin of (a);
Figure FSA0000179553550000013
z of (a)cxcRepresenting the plane of motion of the mobile robot, wherein zcIn the same direction as the optical axis of the camera; y iscPerpendicular to the plane of motion z of the mobile robotcxcAnd meets the right-hand rule; the visual servo coordinate system without the expected image is shown in the attached figure 1; randomly arranging four coplanar feature points for defining
Figure FSA0000179553550000014
The characteristic point with the serial number of 1 is used as
Figure FSA0000179553550000015
The vector between sequence number 1 and sequence number 2 as
Figure FSA0000179553550000016
X ofobjAxis, normal vector of plane of feature point being zobjA shaft; y isobjDetermining according to a right-hand rule;
Figure FSA0000179553550000017
measuring the 3D coordinates of the lower characteristic points by a ruler; in addition, the coordinate system
Figure FSA0000179553550000018
Is projected onto the robot motion plane as the robot reference pose, i.e.
Figure FSA0000179553550000019
The origin of (a); will zobjThe coordinate axis is projected on the motion plane and then scaled to a unit vector as
Figure FSA00001795535500000110
Z coordinate axis ofp;ypVertical robot plane of motion down, xpMeets the right-hand rule; has been defined completely
Figure FSA00001795535500000111
Thereafter, the expected pose is defined under the reference coordinate system
Figure FSA00001795535500000112
Will be provided with
Figure FSA00001795535500000113
Is corresponding to
Figure FSA00001795535500000114
Is defined as thetad(ii) a In the same way, the desired coordinate system
Figure FSA00001795535500000115
Corresponding to a reference coordinate system
Figure FSA00001795535500000116
Are set to x-coordinate and z-coordinate, respectivelypTdxAndpTdz
1.2 control scheme
The vision stabilization control system for the mobile robot without the expected image consists of three stages: a first stage of defining a feature point coordinate system as a feature point coordinate system based on the 3D coordinates of the known feature points
Figure FSA00001795535500000117
Then estimating a transformation relation between a feature point coordinate system and a vehicle-mounted camera coordinate system; a second stage of defining a reference coordinate system based on the projection of the feature point coordinate system
Figure FSA00001795535500000118
Obtaining the relation between the reference coordinate system of the robot and the current coordinate system, and defining the expected pose by the reference coordinate system
Figure FSA00001795535500000119
Finally, obtaining a relative posture relation between the current posture and the expected posture; in the third stage, the robot is driven to a desired pose by utilizing a polar coordinate control law
Figure FSA00001795535500000120
At least one of (1) and (b);
2, coordinate system relationship
In the case of the (2.1) th,
Figure FSA00001795535500000121
and
Figure FSA00001795535500000122
in relation to (2)
Characteristic image point coordinates andthe coordinates of the feature points under the coordinate system are respectively defined as follows:
Figure FSA00001795535500000124
the rotation matrix and the translation vector are respectively defined as
Figure FSA00001795535500000126
AndcTobj(ii) a From the feature point imaging model, the following relationship can be obtained:
Figure FSA00001795535500000125
wherein K is an internal parameter matrix of the camera;
from the definition of the coordinate system of the characteristic points, Z is knownobj0; thus, the rotation matrix can be written as
Figure FSA00001795535500000127
Is represented as follows:
Figure FSA0000179553550000021
Figure FSA0000179553550000022
to
Figure FSA0000179553550000023
The homography matrix of (a) is defined as H ═ K [ r ═ r1,r2cTobj]Taking into account the presence of the scaling factor, using h9H is normalized as follows:
Figure FSA0000179553550000024
the following relationship can be obtained by substituting (4) into (3):
Figure FSA0000179553550000025
the above formula can be written as:
Ax=b (6)
the least squares solution for x is:
x=(ATA)-1ATb (7)
furthermore, according to K [ r ]1,r2cTobj]H may be given by the following relationship:
Figure FSA0000179553550000026
then obtain
Figure FSA00001795535500000224
AndcTobj(ii) a Thus, it is possible to obtain
Figure FSA0000179553550000027
Relative to
Figure FSA0000179553550000028
The transformation matrix of (a) is as follows:
Figure FSA0000179553550000029
2.2 determination of the desired coordinates
Figure FSA00001795535500000210
Correspond to
Figure FSA00001795535500000211
Is defined as thetad
Figure FSA00001795535500000212
In that
Figure FSA00001795535500000213
Lower z coordinate set topTdz
Figure FSA00001795535500000214
In that
Figure FSA00001795535500000215
X coordinate of lower is set aspTdx
Figure FSA00001795535500000216
In that
Figure FSA00001795535500000217
The following rotation matrix and translation vector are respectively expressed as follows:
Figure FSA00001795535500000218
then
Figure FSA00001795535500000219
In that
Figure FSA00001795535500000220
The transform matrix below is:
Figure FSA00001795535500000221
2.3, determination of
Figure FSA00001795535500000222
And
Figure FSA00001795535500000223
position and attitude relationship therebetween
It is known that
Figure FSA0000179553550000031
Relative to
Figure FSA0000179553550000032
Is a transition matrix of
Figure FSA0000179553550000033
Wherein the content of the first and second substances,
Figure FSA0000179553550000034
relative to
Figure FSA0000179553550000035
Rotation matrix and translational vector quantity:
Figure FSA0000179553550000036
according to the projection principle, can obtain
Figure FSA0000179553550000037
In that
Figure FSA0000179553550000038
The following rotation matrix and translation vector are respectively as follows:
Figure FSA0000179553550000039
therefore, the first and second electrodes are formed on the substrate,
Figure FSA00001795535500000310
in that
Figure FSA00001795535500000311
The following transition matrix is expressed as follows:
Figure FSA00001795535500000312
therefore, it can be calculated by the following formula
Figure FSA00001795535500000313
In that
Figure FSA00001795535500000314
The following transition matrix:
Figure FSA00001795535500000315
2.4 kinematics of the robot
FIG. 2 illustrates polar control of a mobile robot by using polar coordinates to define a current pose of the mobile robot in a desired coordinate system; the moving speed and linear velocity of the mobile robot are set to v and w respectively,
Figure FSA00001795535500000316
of origin and
Figure FSA00001795535500000317
of originThe distance between them is noted as e,
Figure FSA00001795535500000318
in that
Figure FSA00001795535500000319
The lower rotation angle is denoted as φ; and the number of the first and second electrodes,
Figure FSA00001795535500000320
in that
Figure FSA00001795535500000321
The lower direction angle is defined as theta, zcThe included angle between e and e is defined as α, and α ═ theta-phi can be known, and the polar coordinate kinematic equation of the robot can be expressed as follows:
Figure FSA00001795535500000322
the design linear velocity control rate is as follows:
v=(γcosα)e (17)
the adopted angular speed controller is as follows:
Figure FSA0000179553550000041
CN201910154546.5A 2019-02-26 2019-02-26 Mobile robot vision stabilization control without expected image Pending CN111612843A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910154546.5A CN111612843A (en) 2019-02-26 2019-02-26 Mobile robot vision stabilization control without expected image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910154546.5A CN111612843A (en) 2019-02-26 2019-02-26 Mobile robot vision stabilization control without expected image

Publications (1)

Publication Number Publication Date
CN111612843A true CN111612843A (en) 2020-09-01

Family

ID=72205299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910154546.5A Pending CN111612843A (en) 2019-02-26 2019-02-26 Mobile robot vision stabilization control without expected image

Country Status (1)

Country Link
CN (1) CN111612843A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022143626A1 (en) * 2020-12-31 2022-07-07 深圳市优必选科技股份有限公司 Method for controlling mobile robot, computer-implemented storage medium, and mobile robot

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106774309A (en) * 2016-12-01 2017-05-31 天津工业大学 A kind of mobile robot is while visual servo and self adaptation depth discrimination method
CN106737774A (en) * 2017-02-23 2017-05-31 天津商业大学 One kind is without demarcation mechanical arm Visual servoing control device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106774309A (en) * 2016-12-01 2017-05-31 天津工业大学 A kind of mobile robot is while visual servo and self adaptation depth discrimination method
CN106737774A (en) * 2017-02-23 2017-05-31 天津商业大学 One kind is without demarcation mechanical arm Visual servoing control device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHIWEI SONG, BAOQUAN LI AND WUXI SHI: "Visual Servoing of Mobile Robot with Setting Desired Pose Arbitrarily", 2018 IEEE 8TH ANNUAL INTERNATIONAL CONFERENCE ON CYBER TECHNOLOGY IN AUTOMATION, CONTROL, AND INTELLIGENT SYSTEMS, pages 1089 - 1093 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022143626A1 (en) * 2020-12-31 2022-07-07 深圳市优必选科技股份有限公司 Method for controlling mobile robot, computer-implemented storage medium, and mobile robot

Similar Documents

Publication Publication Date Title
CN109397249B (en) Method for positioning and grabbing robot system by two-dimensional code based on visual identification
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN109102525B (en) Mobile robot following control method based on self-adaptive posture estimation
Liu et al. Target tracking of moving and rotating object by high-speed monocular active vision
CN113706621B (en) Mark point positioning and posture obtaining method and system based on marked image
Alizadeh Object distance measurement using a single camera for robotic applications
CN110928311B (en) Indoor mobile robot navigation method based on linear features under panoramic camera
CN111993422A (en) Robot axis and hole alignment control method based on uncalibrated vision
Martinet et al. Stacking jacobians properly in stereo visual servoing
CN109693235B (en) Human eye vision-imitating tracking device and control method thereof
CN110722547B (en) Vision stabilization of mobile robot under model unknown dynamic scene
CN111612843A (en) Mobile robot vision stabilization control without expected image
CN116872216B (en) Robot vision servo operation method based on finite time control
WO2020179416A1 (en) Robot control device, robot control method, and robot control program
CN109816717A (en) The vision point stabilization of wheeled mobile robot in dynamic scene
CN116749233A (en) Mechanical arm grabbing system and method based on visual servoing
CN109542094B (en) Mobile robot vision stabilization control without desired images
Cong Combination of two visual servoing techniques in contour following task
Srikaew et al. Humanoid drawing robot
CN107363831B (en) Teleoperation robot control system and method based on vision
CN111353941A (en) Space coordinate conversion method
Torkaman et al. Real-time visual tracking of a moving object using pan and tilt platform: A Kalman filter approach
Nielsen et al. Learning mobile robot navigation: A behavior-based approach
CN112123370B (en) Mobile robot vision stabilization control with desired pose change
CN112975988A (en) Live working robot control system based on VR technique

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination