CN111612843A - Mobile robot vision stabilization control without expected image - Google Patents
Mobile robot vision stabilization control without expected image
- Publication number
- CN111612843A (application CN201910154546.5A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- robot
- mobile robot
- follows
- obj
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A vision stabilization control system for a mobile robot without an expected image. A novel monocular visual servoing strategy is proposed to drive a wheeled mobile robot to a desired pose without a desired image. First, the transformation between the feature-point coordinate system and the onboard camera coordinate system is estimated from feature points with known 3D coordinates. Second, a reference coordinate system is defined from the projection of the feature-point coordinate system. Then, the relation between the reference coordinate system and the current coordinate system of the robot is obtained. Finally, a polar-coordinate control law drives the robot to the desired pose. Compared with conventional visual servoing, the method works normally without acquiring the desired image in advance. Simulation results demonstrate the effectiveness of the method.
Description
Technical Field
The invention belongs to the technical fields of computer vision and mobile robotics, and accomplishes vision-based stabilization control when no desired image is available.
Background
With the rapid development of artificial intelligence, controlling intelligent equipment such as mobile robots and manipulators through visual feedback is becoming increasingly important, and intelligent robots are now applied in numerous fields. Among their capabilities, processing visual information is one of the most important functions of wheeled mobile robots and robot arms. However, many problems remain to be solved in the practical application of visual servo control. Generally, the desired image plays an important role in the visual servoing of a mobile robot: without it, the desired pose of the robotic system cannot be defined, and in many existing approaches the robot cannot reach the desired pose. Moreover, for a monocular mobile robot, the imaging principle of the camera maps the external 3D scene to a 2D image plane, so depth information is lost. Although many strategies address unknown depth, they also increase the complexity of the robotic system. Therefore, to drive the mobile robot to a desired pose without a desired image, the scene model must be identified by making full use of the image information, which makes visual servoing without a desired image all the more meaningful.
Disclosure of Invention
A strategy for accomplishing visual stabilization control without a desired image is designed based on a monocular wheeled mobile robot.
A novel monocular visual servoing strategy is proposed to drive a wheeled mobile robot to a desired pose without a desired image. First, the transformation between the feature-point coordinate system and the onboard camera coordinate system is estimated from feature points with known 3D coordinates. Second, a reference coordinate system is defined from the projection of the feature-point coordinate system. Then, the relation between the reference coordinate system and the current coordinate system of the robot is obtained. Finally, a polar-coordinate control law drives the robot to the desired pose. Compared with conventional visual servoing, the method works normally without acquiring the desired image in advance, and simulation results demonstrate its effectiveness. The mobile robot vision stabilization method provided by the invention comprises the following steps:
The vision stabilization control system for the mobile robot without the expected image is characterized by comprising the following steps:
1, description of problems
1.1, System description
The coordinate systems of the onboard camera and the mobile robot are set to coincide. The camera coordinate system at the current pose is defined as F_c, taking the optical center of the camera as the origin of F_c. The z_c x_c plane of F_c represents the motion plane of the mobile robot, where z_c is in the same direction as the optical axis of the camera. y_c is perpendicular to the motion plane z_c x_c and satisfies the right-hand rule. The visual servo coordinate systems without a desired image are shown in Fig. 1. Four coplanar feature points are randomly arranged to define F_obj: the feature point with serial number 1 is taken as the origin of F_obj, the vector from serial number 1 to serial number 2 as the x_obj axis of F_obj, and the normal vector of the feature-point plane as the z_obj axis. y_obj is determined according to the right-hand rule. The 3D coordinates of the feature points in F_obj are obtained by measuring with a ruler. In addition, the projection of the origin of F_obj onto the robot motion plane serves as the robot reference pose, i.e., the origin of F_p. The z_obj axis is projected onto the motion plane and scaled to a unit vector as the z_p axis of F_p; y_p points downward perpendicular to the robot motion plane, and x_p satisfies the right-hand rule. After F_p has been defined, the expected pose F_d is defined in the reference coordinate system: the rotation angle of F_d corresponding to F_p is defined as θd. In the same way, the x and z coordinates of the expected coordinate system F_d corresponding to the reference coordinate system F_p are set to pTdx and pTdz, respectively.
1.2 control scheme
The vision stabilization control system for the mobile robot without the expected image consists of three stages. In the first stage, the feature-point coordinate system F_obj is defined from the known 3D coordinates of the feature points, and the transformation between F_obj and the onboard camera coordinate system is estimated. In the second stage, the reference coordinate system F_p is defined from the projection of the feature-point coordinate system; the relation between the robot's reference coordinate system and its current coordinate system is obtained, the expected pose F_d is defined in the reference coordinate system, and finally the relative pose between the current pose and the expected pose is obtained. In the third stage, the robot is driven to the expected pose F_d by a polar-coordinate control law.
2, coordinate system relationship
The image coordinates of the feature points and their coordinates in the F_obj coordinate system are defined respectively as:

p_i = [u_i, v_i, 1]^T (1), P_i = [x_i, y_i, z_i, 1]^T (2), i = 1, ..., 4

The rotation matrix and translation vector of F_obj with respect to F_c are defined as cR_obj and cT_obj, respectively. From the imaging model of the feature points, the following relationship can be obtained:

s_i p_i = K [cR_obj cT_obj] P_i (3)

where K is the intrinsic parameter matrix of the camera and s_i is an unknown scale factor.

From the definition of the feature-point coordinate system, z_i = 0 for every feature point. Thus, writing the rotation matrix column-wise as cR_obj = [r_1, r_2, r_3], the relation (3) is represented as follows:

s_i p_i = K [r_1, r_2, cT_obj] [x_i, y_i, 1]^T

The homography matrix from F_obj to F_c is defined as H = K [r_1, r_2, cT_obj]. Taking the unknown scale factor into account, H is normalized by its last element h_9 as follows:

H_n = H / h_9 = [h_1, h_2, h_3; h_4, h_5, h_6; h_7, h_8, 1] (4)

Substituting (4) into (3), the following relationship can be obtained for each feature point:

u_i = (h_1 x_i + h_2 y_i + h_3) / (h_7 x_i + h_8 y_i + 1)
v_i = (h_4 x_i + h_5 y_i + h_6) / (h_7 x_i + h_8 y_i + 1) (5)

Stacking (5) for the four feature points, the above formula can be written as:

Ax = b (6)

where x = [h_1, ..., h_8]^T, A is the 8x8 coefficient matrix and b stacks the pixel coordinates. The least-squares solution for x is:

x = (A^T A)^(-1) A^T b (7)
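The stacked linear system Ax = b and its least-squares solution x = (A^T A)^(-1) A^T b above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the test homography values are assumed.

```python
import numpy as np

def estimate_homography(obj_xy, img_uv):
    """obj_xy: (N,2) planar feature coordinates in F_obj (z = 0);
    img_uv: (N,2) pixel coordinates. Returns 3x3 H with H[2,2] = 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(obj_xy, img_uv):
        # From u*(h7 x + h8 y + 1) = h1 x + h2 y + h3: two rows per point
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    A, b = np.asarray(A, float), np.asarray(b, float)
    h = np.linalg.solve(A.T @ A, A.T @ b)   # x = (A^T A)^-1 A^T b
    return np.append(h, 1.0).reshape(3, 3)  # append h9 = 1

# Self-check: project 4 coplanar points through a known H and recover it.
H_true = np.array([[520.0, 10.0, 320.0],
                   [5.0, 510.0, 240.0],
                   [0.1, 0.05, 1.0]])
pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
uv = []
for x, y in pts:
    w = H_true @ np.array([x, y, 1.0])
    uv.append(w[:2] / w[2])
H_est = estimate_homography(pts, np.array(uv))
print(np.allclose(H_est, H_true, atol=1e-6))  # True
```

With four coplanar points in general position the 8x8 system is square and invertible, so the least-squares solution coincides with the exact solution; with more points or pixel noise the same formula gives the minimum-residual fit.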
Furthermore, according to K [r_1, r_2, cT_obj] = λ H_n, where λ is the scale factor discarded by the normalization, r_1, r_2 and cT_obj may be given by the following relationships:

r_1 = λ K^(-1) h_c1, r_2 = λ K^(-1) h_c2, cT_obj = λ K^(-1) h_c3, λ = 1 / ||K^(-1) h_c1|| (8)

where h_c1, h_c2, h_c3 denote the columns of H_n, and r_3 = r_1 × r_2 completes the rotation matrix. Then cR_obj = [r_1, r_2, r_3] and cT_obj are obtained. Thus, the transformation matrix of F_obj relative to F_c can be obtained as follows:

cM_obj = [cR_obj, cT_obj; 0, 0, 0, 1] (9)
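The recovery of r_1, r_2 and cT_obj from the normalized homography above can be sketched as follows. This is an illustrative decomposition under the z_obj = 0 plane assumption; the function name and test values are not from the patent.

```python
import numpy as np

def decompose_homography(H, K):
    """Recover rotation cR_obj and translation cT_obj from a plane-induced
    homography H ~ K [r1, r2, t] (sketch; assumes the positive-scale
    solution and the feature plane z_obj = 0)."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])  # scale from ||r1|| = 1
    r1 = lam * M[:, 0]
    r2 = lam * M[:, 1]
    r3 = np.cross(r1, r2)                # complete the right-handed frame
    t = lam * M[:, 2]
    R = np.column_stack([r1, r2, r3])
    # Re-orthonormalize R via SVD to absorb noise in r1, r2
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t

# Self-check: build H (up to scale) from a known pose and recover it.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
th = 0.3
R_true = np.array([[np.cos(th), 0.0, np.sin(th)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(th), 0.0, np.cos(th)]])
t_true = np.array([0.2, 0.1, 2.0])
H = 3.0 * K @ np.column_stack([R_true[:, 0], R_true[:, 1], t_true])
R_est, t_est = decompose_homography(H, K)
print(np.allclose(R_est, R_true) and np.allclose(t_est, t_true))  # True
```

The factor of 3.0 in the test shows that the normalization by the first column's norm makes the recovered pose independent of the overall homography scale.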
2.2 determination of the desired coordinates
The rotation angle of F_d corresponding to F_p is defined as θd; the z coordinate of F_d in F_p is set to pTdz, and the x coordinate of F_d in F_p is set to pTdx. Since the rotation is about the y_p axis, the rotation matrix and translation vector of F_d in F_p are expressed respectively as follows:

pR_d = [cos θd, 0, sin θd; 0, 1, 0; -sin θd, 0, cos θd], pT_d = [pTdx, 0, pTdz]^T (10)

It is known that the transition matrix of F_obj relative to F_c is cM_obj, wherein cR_obj and cT_obj are the rotation matrix and translation vector of F_obj relative to F_c. According to the projection principle, the rotation matrix cR_p and translation vector cT_p of F_p in F_c are obtained from cR_obj and cT_obj by projecting the z_obj axis and the origin of F_obj onto the robot motion plane and normalizing.

Therefore, the transition matrix of F_d in F_c is expressed as follows:

cM_d = cM_p pM_d (12)

where cM_p and pM_d are the homogeneous transition matrices formed from (cR_p, cT_p) and (pR_d, pT_d), respectively.
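The composition of transition matrices described above can be sketched with homogeneous 4x4 transforms. The frame values below are assumed for illustration only.

```python
import numpy as np

def make_M(R, t):
    """Homogeneous 4x4 transition matrix from rotation R and translation t."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

def inv_M(M):
    """Closed-form inverse of a rigid transform."""
    R, t = M[:3, :3], M[:3, 3]
    return make_M(R.T, -R.T @ t)

def rot_y(th):
    """Rotation about the y axis (robot frames rotate in the z-x plane)."""
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

# Chain: pose of F_d in F_c from F_p-in-F_c and F_d-in-F_p (values assumed).
cM_p = make_M(rot_y(0.4), np.array([0.5, 0.0, 3.0]))
pM_d = make_M(rot_y(0.1), np.array([0.0, 0.0, -0.8]))   # (pTdx, 0, pTdz)
cM_d = cM_p @ pM_d
# Relative pose of the current frame seen from the desired frame:
dM_c = inv_M(cM_d)
print(np.allclose(dM_c @ cM_d, np.eye(4)))  # True
```

The inverse chain dM_c is what the polar-coordinate controller of the next section consumes: the distance e and the angles follow from its translation and rotation parts.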
2.4 kinematics of the robot
Fig. 2 shows the polar-coordinate control of the mobile robot; polar coordinates are used to define the current pose of the mobile robot in the desired coordinate system. The linear velocity and angular velocity of the mobile robot are set to v and w, respectively. The distance between the origin of F_c and the origin of F_d is denoted as e, and the rotation angle of F_c in F_d is denoted as φ. In addition, the direction angle of e in F_d is defined as θ, and the angle between z_c and e is defined as α, so that α = θ - φ. The polar-coordinate kinematic equations of the robot can be expressed as follows:

de/dt = -v cos α
dα/dt = (v sin α) / e - w (16)
dθ/dt = (v sin α) / e

The designed linear-velocity control law is as follows:

v = (γ cos α) e (17)

The adopted angular-velocity controller is as follows:

w = k α + γ (cos α sin α / α)(α + h θ) (18)

where γ, k and h are positive control gains.
description of the drawings:
FIG. 1 is a visual servo coordinate system without a desired image
FIG. 2 is a polar coordinate system diagram of a mobile robot
FIG. 3 is a movement trace of a mobile robot
FIG. 4 shows the trajectories of the mobile robot reaching different desired poses from the same initial pose
FIG. 5 shows the change of the robot state
FIG. 6 is a two-dimensional trajectory of feature points in an image
FIG. 7 is a graph showing linear and angular velocities of a mobile robot
The specific implementation mode is as follows:
no. 3 simulation results
3.1, simulation results
The effectiveness of the method is verified through simulation. Four coplanar feature points are set in the simulation scene.
The internal parameter settings of the virtual camera in the simulation are as follows:
In the world coordinate system, the initial pose of the mobile robot is set as follows:

(WT0z, WT0x, θ0) = (-6.5 m, 1 m, 30°). (56)
In the reference coordinate system, the expected pose is set to (pTdz, pTdx, θd) = (-0.8 m, 0.0 m, 0°). It is then converted into the expected pose in the world coordinate system as:

(WTdz, WTdx, θd) = (-0.7 m, 0.0 m, 0°). (57)
In addition, random noise with a standard deviation of σ = 0.1 pixels is added to the pixel coordinates to test the robustness of the method. The control gains and other parameters are chosen as follows:
γ=0.5,k=1.5,h=0.8. (58)
referring to the drawings, FIG. 3 shows a mobile robotAnd (4) moving tracks. The robot can reach the expected pose efficiently according to the graph. Fig. 4 is a trajectory of the mobile robot to reach different phase poses in the same initial pose. Fig. 6 is a two-dimensional trajectory of feature points in an image, the circular and square points representing initial and desired image feature points, respectively. It can also be seen from the figure that the robot successfully reaches the desired pose. Fig. 7 is a linear velocity and an angular velocity of the mobile robot. FIG. 5 shows the robot states (WTcz(t),WTcx(t),qc(t)), where the dashed line represents the desired pose in the world coordinate system. The results show that the steady state error of the method is small enough, which indicates the effectiveness of the method.
3.2, conclusion
In this work, a novel mobile robot visual servoing strategy is proposed that drives a wheeled mobile robot to a desired pose without a desired image. The transformation between the feature-point coordinate system and the onboard camera coordinate system is first estimated from feature points with known 3D coordinates. Second, a reference coordinate system is defined from the projection of the feature-point coordinate system, and the relation between the reference coordinate system and the current coordinate system is obtained. Then, the expected pose is defined in the reference coordinate system, and the relation between the current pose and the expected pose is obtained. Finally, a polar-coordinate control law stabilizes the robot to the desired pose.
Claims (1)
1. The vision stabilization control system for the mobile robot without the expected image is characterized by comprising the following steps of:
1, description of problems
1.1, System description
Setting the coordinate system of the onboard camera to coincide with the coordinate system of the mobile robot; the camera coordinate system at the current pose is defined as F_c, taking the optical center of the camera as the origin of F_c; the z_c x_c plane of F_c represents the motion plane of the mobile robot, wherein z_c is in the same direction as the optical axis of the camera; y_c is perpendicular to the motion plane z_c x_c and satisfies the right-hand rule; the visual servo coordinate systems without the expected image are shown in FIG. 1; four coplanar feature points are randomly arranged to define F_obj: the feature point with serial number 1 is taken as the origin of F_obj, the vector from serial number 1 to serial number 2 as the x_obj axis of F_obj, and the normal vector of the feature-point plane as the z_obj axis; y_obj is determined according to the right-hand rule; the 3D coordinates of the feature points in F_obj are measured with a ruler; in addition, the projection of the origin of F_obj onto the robot motion plane serves as the robot reference pose, i.e., the origin of F_p; the z_obj axis is projected onto the motion plane and scaled to a unit vector as the z_p axis of F_p; y_p points downward perpendicular to the robot motion plane, and x_p satisfies the right-hand rule; after F_p has been defined, the expected pose F_d is defined in the reference coordinate system: the rotation angle of F_d corresponding to F_p is defined as θd; in the same way, the x and z coordinates of the expected coordinate system F_d corresponding to the reference coordinate system F_p are set to pTdx and pTdz, respectively;
1.2 control scheme
The vision stabilization control system for the mobile robot without the expected image consists of three stages: in the first stage, the feature-point coordinate system F_obj is defined from the known 3D coordinates of the feature points, and the transformation between F_obj and the onboard camera coordinate system is estimated; in the second stage, the reference coordinate system F_p is defined from the projection of the feature-point coordinate system, the relation between the robot's reference coordinate system and its current coordinate system is obtained, the expected pose F_d is defined in the reference coordinate system, and finally the relative pose between the current pose and the expected pose is obtained; in the third stage, the robot is driven to the expected pose F_d by a polar-coordinate control law;
2, coordinate system relationship
The image coordinates of the feature points and their coordinates in the F_obj coordinate system are defined respectively as:

p_i = [u_i, v_i, 1]^T (1), P_i = [x_i, y_i, z_i, 1]^T (2), i = 1, ..., 4

The rotation matrix and translation vector of F_obj with respect to F_c are defined as cR_obj and cT_obj, respectively; from the imaging model of the feature points, the following relationship can be obtained:

s_i p_i = K [cR_obj cT_obj] P_i (3)

wherein K is the intrinsic parameter matrix of the camera and s_i is an unknown scale factor;

from the definition of the feature-point coordinate system, z_i = 0 for every feature point; thus, writing the rotation matrix column-wise as cR_obj = [r_1, r_2, r_3], the relation (3) is represented as follows:

s_i p_i = K [r_1, r_2, cT_obj] [x_i, y_i, 1]^T

the homography matrix from F_obj to F_c is defined as H = K [r_1, r_2, cT_obj]; taking the unknown scale factor into account, H is normalized by its last element h_9 as follows:

H_n = H / h_9 = [h_1, h_2, h_3; h_4, h_5, h_6; h_7, h_8, 1] (4)

substituting (4) into (3), the following relationship can be obtained for each feature point:

u_i = (h_1 x_i + h_2 y_i + h_3) / (h_7 x_i + h_8 y_i + 1)
v_i = (h_4 x_i + h_5 y_i + h_6) / (h_7 x_i + h_8 y_i + 1) (5)

stacking (5) for the four feature points, the above formula can be written as:

Ax = b (6)

wherein x = [h_1, ..., h_8]^T; the least-squares solution for x is:

x = (A^T A)^(-1) A^T b (7)
furthermore, according to K [r_1, r_2, cT_obj] = λ H_n, where λ is the scale factor discarded by the normalization, r_1, r_2 and cT_obj may be given by the following relationships:

r_1 = λ K^(-1) h_c1, r_2 = λ K^(-1) h_c2, cT_obj = λ K^(-1) h_c3, λ = 1 / ||K^(-1) h_c1|| (8)

wherein h_c1, h_c2, h_c3 denote the columns of H_n, and r_3 = r_1 × r_2 completes the rotation matrix; then cR_obj = [r_1, r_2, r_3] and cT_obj are obtained; thus, the transformation matrix of F_obj relative to F_c can be obtained as follows:

cM_obj = [cR_obj, cT_obj; 0, 0, 0, 1] (9)
2.2 determination of the desired coordinates
The rotation angle of F_d corresponding to F_p is defined as θd; the z coordinate of F_d in F_p is set to pTdz, and the x coordinate of F_d in F_p is set to pTdx; since the rotation is about the y_p axis, the rotation matrix and translation vector of F_d in F_p are expressed respectively as follows:

pR_d = [cos θd, 0, sin θd; 0, 1, 0; -sin θd, 0, cos θd], pT_d = [pTdx, 0, pTdz]^T (10)

it is known that the transition matrix of F_obj relative to F_c is cM_obj, wherein cR_obj and cT_obj are the rotation matrix and translation vector of F_obj relative to F_c; according to the projection principle, the rotation matrix cR_p and translation vector cT_p of F_p in F_c are obtained from cR_obj and cT_obj by projecting the z_obj axis and the origin of F_obj onto the robot motion plane and normalizing;

therefore, the transition matrix of F_d in F_c is expressed as follows:

cM_d = cM_p pM_d (12)

wherein cM_p and pM_d are the homogeneous transition matrices formed from (cR_p, cT_p) and (pR_d, pT_d), respectively;
2.4 kinematics of the robot
FIG. 2 illustrates the polar-coordinate control of the mobile robot; polar coordinates are used to define the current pose of the mobile robot in the desired coordinate system; the linear velocity and angular velocity of the mobile robot are set to v and w, respectively; the distance between the origin of F_c and the origin of F_d is denoted as e, and the rotation angle of F_c in F_d is denoted as φ; in addition, the direction angle of e in F_d is defined as θ, and the angle between z_c and e is defined as α, so that α = θ - φ; the polar-coordinate kinematic equations of the robot can be expressed as follows:

de/dt = -v cos α
dα/dt = (v sin α) / e - w (16)
dθ/dt = (v sin α) / e

the designed linear-velocity control law is as follows:

v = (γ cos α) e (17)

the adopted angular-velocity controller is as follows:

w = k α + γ (cos α sin α / α)(α + h θ) (18)

wherein γ, k and h are positive control gains.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910154546.5A CN111612843A (en) | 2019-02-26 | 2019-02-26 | Mobile robot vision stabilization control without expected image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910154546.5A CN111612843A (en) | 2019-02-26 | 2019-02-26 | Mobile robot vision stabilization control without expected image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111612843A true CN111612843A (en) | 2020-09-01 |
Family
ID=72205299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910154546.5A Pending CN111612843A (en) | 2019-02-26 | 2019-02-26 | Mobile robot vision stabilization control without expected image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111612843A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022143626A1 (en) * | 2020-12-31 | 2022-07-07 | 深圳市优必选科技股份有限公司 | Method for controlling mobile robot, computer-implemented storage medium, and mobile robot |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106774309A (en) * | 2016-12-01 | 2017-05-31 | 天津工业大学 | A kind of mobile robot is while visual servo and self adaptation depth discrimination method |
CN106737774A (en) * | 2017-02-23 | 2017-05-31 | 天津商业大学 | One kind is without demarcation mechanical arm Visual servoing control device |
- 2019-02-26 CN CN201910154546.5A patent/CN111612843A/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106774309A (en) * | 2016-12-01 | 2017-05-31 | 天津工业大学 | A kind of mobile robot is while visual servo and self adaptation depth discrimination method |
CN106737774A (en) * | 2017-02-23 | 2017-05-31 | 天津商业大学 | One kind is without demarcation mechanical arm Visual servoing control device |
Non-Patent Citations (1)
Title |
---|
ZHIWEI SONG, BAOQUAN LI AND WUXI SHI: "Visual Servoing of Mobile Robot with Setting Desired Pose Arbitrarily", 2018 IEEE 8TH ANNUAL INTERNATIONAL CONFERENCE ON CYBER TECHNOLOGY IN AUTOMATION, CONTROL, AND INTELLIGENT SYSTEMS, pages 1089 - 1093 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022143626A1 (en) * | 2020-12-31 | 2022-07-07 | 深圳市优必选科技股份有限公司 | Method for controlling mobile robot, computer-implemented storage medium, and mobile robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109397249B (en) | Method for positioning and grabbing robot system by two-dimensional code based on visual identification | |
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium | |
CN109102525B (en) | Mobile robot following control method based on self-adaptive posture estimation | |
Liu et al. | Target tracking of moving and rotating object by high-speed monocular active vision | |
CN113706621B (en) | Mark point positioning and posture obtaining method and system based on marked image | |
Alizadeh | Object distance measurement using a single camera for robotic applications | |
CN110928311B (en) | Indoor mobile robot navigation method based on linear features under panoramic camera | |
CN111993422A (en) | Robot axis and hole alignment control method based on uncalibrated vision | |
Martinet et al. | Stacking jacobians properly in stereo visual servoing | |
CN109693235B (en) | Human eye vision-imitating tracking device and control method thereof | |
CN110722547B (en) | Vision stabilization of mobile robot under model unknown dynamic scene | |
CN111612843A (en) | Mobile robot vision stabilization control without expected image | |
CN116872216B (en) | Robot vision servo operation method based on finite time control | |
WO2020179416A1 (en) | Robot control device, robot control method, and robot control program | |
CN109816717A (en) | The vision point stabilization of wheeled mobile robot in dynamic scene | |
CN116749233A (en) | Mechanical arm grabbing system and method based on visual servoing | |
CN109542094B (en) | Mobile robot vision stabilization control without desired images | |
Cong | Combination of two visual servoing techniques in contour following task | |
Srikaew et al. | Humanoid drawing robot | |
CN107363831B (en) | Teleoperation robot control system and method based on vision | |
CN111353941A (en) | Space coordinate conversion method | |
Torkaman et al. | Real-time visual tracking of a moving object using pan and tilt platform: A Kalman filter approach | |
Nielsen et al. | Learning mobile robot navigation: A behavior-based approach | |
CN112123370B (en) | Mobile robot vision stabilization control with desired pose change | |
CN112975988A (en) | Live working robot control system based on VR technique |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||