CN113674353A - Method for measuring accurate pose of space non-cooperative target - Google Patents


Info

Publication number
CN113674353A
CN113674353A
Authority
CN
China
Prior art keywords
three-dimensional
space non-cooperative target
straight line
pose
Prior art date
Legal status
Granted
Application number
CN202110948038.1A
Other languages
Chinese (zh)
Other versions
CN113674353B
Inventor
刘海波
刘子宾
宋俊尧
张进
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN202110948038.1A
Publication of CN113674353A
Application granted
Publication of CN113674353B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 — Geometric image transformations in the plane of the image
    • G06T3/06 — Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • G06T7/00 — Image analysis
    • G06T7/60 — Analysis of geometric attributes
    • G06T7/70 — Determining position or orientation of objects or cameras
    • G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10028 — Range image; depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for measuring the accurate pose of a space non-cooperative target, which uses a TOF camera and a color camera to estimate the accurate relative pose of the target: acquire three-dimensional point clouds of the space non-cooperative target with the TOF (time-of-flight) camera and stitch them with the ICP (Iterative Closest Point) algorithm to obtain a complete three-dimensional point cloud of the target; extract three-dimensional feature points and three-dimensional straight lines from the complete point cloud; acquire a sequence of two-dimensional images of the target with the color camera, extract two-dimensional feature points and two-dimensional straight lines from them, and solve the relative pose of the target from the 2D-3D correspondences of feature points and straight lines. The invention combines the imaging advantages of a TOF camera and a color camera, can accurately solve the pose of a space non-cooperative target, and can be applied to space tasks such as deep space exploration and situational awareness.

Description

Method for measuring accurate pose of space non-cooperative target
Technical Field
The invention relates to the field of image measurement, in particular to a method for measuring the accurate pose of a space non-cooperative target.
Background
With advances in science and technology and the development of the aerospace industry, deep space exploration and situational awareness have become important parts of humanity's exploration of space. In space exploration, more and more task objects are space non-cooperative targets, which lack cooperative markers and cannot provide effective prior information. Therefore, in complex space environments, accurate pose estimation of a completely unknown space target has attracted wide attention from scholars and has important research value and engineering significance.
Common space-target pose measurement devices include color cameras, binocular cameras, lidar, and the like. A color camera cannot directly acquire target depth information; the measurement accuracy of a binocular camera is limited by its baseline; laser three-dimensional imaging is constrained by manufacturing processes and offers low resolution. A multi-sensor fusion measurement approach can therefore combine the imaging advantages of the individual sensors and compensate for the inherent shortcomings of any single sensor. The invention provides a space non-cooperative target accurate pose measurement scheme based on a TOF (time-of-flight) camera and a color camera, which fully exploits the imaging advantages of both cameras to achieve accurate pose measurement of the space non-cooperative target.
Disclosure of Invention
Aiming at the problems, the invention provides a method for measuring the accurate pose of a space non-cooperative target.
The technical scheme adopted by the invention for solving the technical problems is as follows: a method for measuring the accurate pose of a spatial non-cooperative target comprises the following steps:
step 1, a TOF camera is used for obtaining complete three-dimensional point cloud of a space non-cooperative target;
step 2, extracting three-dimensional characteristic points and straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, carrying out image data acquisition on the space non-cooperative target in motion by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional characteristic points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional image;
and 5, projecting the three-dimensional characteristic points and the three-dimensional straight lines onto a two-dimensional plane, and solving the pose parameters of the spatial non-cooperative target according to the corresponding relation of the characteristic points and the straight lines.
Preferably, the step 5 specifically comprises:
step 5.1, calibrating the color camera to obtain its equivalent focal lengths $f_x, f_y$;
And 5.2, assuming that the initial value of the pose of the space non-cooperative target is known, projecting each three-dimensional straight line onto the two-dimensional plane according to the pose initial value and matching it with the extracted two-dimensional straight lines of the space non-cooperative target; the relationship between a point $P$ on the three-dimensional line and its projection $p=(p_x, p_y)$ can be described by a pinhole camera model:

$$Z_c \begin{bmatrix} p_x \\ p_y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & 0 \\ 0 & f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} (RP + t) \qquad (8)$$
where the rotation matrix R and the translation vector t describe a rigid transformation from the world coordinate system to the camera coordinate system and $Z_c$ is the depth of the point in the camera frame; the two-dimensional line can be expressed in polar form as:

$$p_x \cos\theta + p_y \sin\theta - \rho_d = 0 \qquad (9)$$
and 5.3, calculating the distance from the projected three-dimensional line endpoints to the two-dimensional line; combining equations (8) and (9) gives:

$$p_x \cos\theta + p_y \sin\theta - \rho_d = \frac{N^{T}(RP + t)}{Z_c} \qquad (10)$$

$$N = (f_x \cos\theta,\ f_y \sin\theta,\ -\rho_d)^{T} \qquad (11)$$
wherein N is the normal vector of the projection plane, and the distance d from the projection of a three-dimensional line endpoint to the two-dimensional line is expressed as:

$$d = \frac{\left|N^{T}(RP + t)\right|}{Z_c} \qquad (12)$$
step 5.4, the correspondence distance of a 2D-3D line pair is expressed as

$$d_l = d_{l1} + d_{l2} \qquad (13)$$

wherein $d_{l1}, d_{l2}$ respectively represent the distances from the two projected endpoints $p_1, p_2$ of the three-dimensional line to the two-dimensional line;
and 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the pose initial value; denoting a three-dimensional feature point by $Q$ and its projection onto the plane by $q=[q_x, q_y]$, the projection relationship between the two can be expressed as:

$$Z_c \begin{bmatrix} q_x \\ q_y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & 0 \\ 0 & f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} (RQ + t) \qquad (14)$$
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with the corresponding two-dimensional feature point $q'=[q'_x, q'_y]$ in the image; the distance between the projected point and the two-dimensional feature point is denoted $d_q$:

$$d_q = \sqrt{(q_x - q'_x)^2 + (q_y - q'_y)^2} \qquad (15)$$
Step 5.7, projecting all feature points and straight-line features of the space non-cooperative target onto the two-dimensional plane; letting $d_{q_i}$ be the distance of the $i$-th 2D-3D corresponding point pair, with N point pairs in total, and $d_{l_j}$ be the distance of the $j$-th 2D-3D corresponding line pair, with M line pairs in total, the target pose is then solved by minimizing equation (16):

$$\min_{R,\,t}\ \sum_{i=1}^{N} w_p\, d_{q_i}^{2} + \sum_{j=1}^{M} w_l\, d_{l_j}^{2} \qquad (16)$$

where $w_p$ and $w_l$ are the weights assigned to point and line features, respectively.
Equation (16) is solved by the least squares method to obtain the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
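The projection and line-distance computations of steps 5.2-5.4 can be sketched numerically. This is a minimal illustration, not the patented implementation: the focal lengths, the test pose, and the helper names (`project`, `line_distance`) are assumptions, and image coordinates are taken relative to the principal point.

```python
import numpy as np

fx, fy = 800.0, 800.0  # equivalent focal lengths (assumed values)

def project(P, R, t):
    """Pinhole projection of a 3D point P under pose (R, t), eq. (8);
    image coordinates are relative to the principal point."""
    Pc = R @ P + t
    return np.array([fx * Pc[0] / Pc[2], fy * Pc[1] / Pc[2]])

def line_distance(P, R, t, theta, rho):
    """Pixel distance from the projection of 3D point P to the 2D line
    x*cos(theta) + y*sin(theta) - rho = 0, via the plane normal N."""
    N = np.array([fx * np.cos(theta), fy * np.sin(theta), -rho])  # eq. (11)
    Pc = R @ P + t
    return abs(N @ Pc) / Pc[2]                                    # eq. (12)

# Consistency check: a line drawn through a projected point is at distance 0.
R, t = np.eye(3), np.array([0.0, 0.0, 5.0])
P = np.array([0.5, 0.2, 1.0])
p = project(P, R, t)
theta = 0.3
rho = p[0] * np.cos(theta) + p[1] * np.sin(theta)  # line passing through p
print(round(line_distance(P, R, t, theta, rho), 6))  # -> 0.0
```

Because $(\cos\theta, \sin\theta)$ is a unit vector, shifting the offset $\rho_d$ by some amount changes the reported distance by exactly that many pixels, which is what the normalized polar form of equation (9) guarantees.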
Compared with the prior art, the invention has the following beneficial effects:
1. The method combines a TOF camera and a color camera: the color camera acquires clear sequential two-dimensional images, while the TOF camera directly acquires target depth information, enabling accurate pose estimation of a space non-cooperative target. Specifically, the TOF camera acquires the target point cloud for three-dimensional reconstruction, providing the three-dimensional structure information of the space non-cooperative target; the color camera acquires high-resolution sequential two-dimensional images of the target in motion, from which two-dimensional feature points and two-dimensional straight-line information are extracted and combined with the three-dimensional feature points and three-dimensional straight-line information to jointly solve the motion pose of the space non-cooperative target;
2. the method aims at the completely unknown space non-cooperative target, realizes the accurate pose estimation of the space non-cooperative target, can be applied to space tasks such as deep space exploration, situation perception and the like, and can provide effective information for subsequent space tasks such as capture, attack and defense and the like.
3. Compared with the paper "Relative pose estimation of uncooperative spacecraft using 2D-3D line correspondences", the invention uses not only the straight-line features but also the point features of the space non-cooperative target. Making full use of both the point and line features of the target structure and edges is advantageous because straight-line features are difficult to extract stably under occlusion, background interference, and similar conditions; when line-based pose solution is difficult, the pose can still be solved from the corresponding key feature points. The invention also optimizes the objective function, assigning different weights to key points and straight lines, making the pose measurement more accurate.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The invention will now be described in detail with reference to fig. 1, wherein exemplary embodiments and descriptions of the invention are provided to explain the invention, but not to limit the invention.
A method for measuring the accurate pose of a spatial non-cooperative target comprises the following steps:
step 1, acquiring a complete three-dimensional point cloud of a spatial non-cooperative target by using a TOF camera, and specifically comprising the following steps:
step 1.1, calibrating the TOF camera using the Zhang Zhengyou calibration method ("A flexible new technique for camera calibration", published in IEEE Transactions on Pattern Analysis and Machine Intelligence in 2000) with a chessboard calibration board to obtain the intrinsic parameters K of the TOF camera;
step 1.2, shooting the space non-cooperative target from all around with the TOF camera to obtain a sequence of depth maps;
step 1.3, mapping the depth map to a space according to the internal parameters of the camera to obtain a local point cloud of a space non-cooperative target;
step 1.4, denoising the spatial non-cooperative target point cloud, and removing noise points to obtain a fine local point cloud of the spatial non-cooperative target;
step 1.5, providing an initial value for local point cloud registration by using a Fast Point Feature Histogram (FPFH) algorithm;
step 1.6, after the initial value is obtained, point clouds of two adjacent frames of the space non-cooperative target are matched by utilizing an ICP algorithm, and it is assumed that adjacent point sets to be matched are respectively represented as A and B:
$$A = \{a_1, \dots, a_i, \dots, a_n\},\qquad B = \{b_1, \dots, b_j, \dots, b_m\}$$

wherein $a_i, b_j$ respectively represent three-dimensional points in the point sets A and B, and n and m are the numbers of points in A and B. The ICP algorithm performs point cloud matching by minimizing the distance between the two point sets; the rigid transformation from point set A to point set B is described by a rotation matrix $R_t$ and a translation vector $T_t$. The specific steps are as follows:
step 1.6.1, finding the correspondence between three-dimensional points in the adjacent point sets A and B: the ICP method takes the two three-dimensional points with the shortest Euclidean distance as corresponding points, denoted $a_i, b_i$;
Step 1.6.2, solving the rotation matrix $R_t$ and translation vector $T_t$ by minimizing the distances of the corresponding point pairs:

$$\min_{R_t,\,T_t}\ \sum_{i=1}^{N} \left\| b_i - (R_t a_i + T_t) \right\|^{2} \qquad (1)$$
Step 1.6.3, calculating the centroid coordinates of the point sets A and B over the N corresponding pairs:

$$\mu_a = \frac{1}{N}\sum_{i=1}^{N} a_i,\qquad \mu_b = \frac{1}{N}\sum_{i=1}^{N} b_i \qquad (2)$$
step 1.6.4, calculating the centroid-removed coordinates of each point in the point sets A and B:

$$a_i' = a_i - \mu_a,\qquad b_i' = b_i - \mu_b \qquad (3)$$
1.6.5, solving the rotation matrix $R_t$ by the SVD method:

$$W = \sum_{i=1}^{N} b_i'\, a_i'^{T} \qquad (4)$$

$$W = U \Sigma V^{T} \qquad (5)$$

$$R_t = U V^{T} \qquad (6)$$
Step 1.6.6, solving the translation vector $T_t$:

$$T_t = \mu_b - R_t\, \mu_a \qquad (7)$$
Repeating the steps 1.6.1-1.6.6 until the distance meets the threshold requirement, obtaining the optimal rotation matrix $R_t$ and translation vector $T_t$ and stitching the two frames of point clouds;
step 1.7, repeating the step 1.6, and splicing all the point clouds to obtain a complete three-dimensional point cloud of a space non-cooperative target;
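The ICP iteration of steps 1.6.1-1.6.6 admits a compact sketch. This is an illustrative point-to-point ICP with the closed-form SVD solution, under assumptions not stated in the patent: a brute-force nearest-neighbour search, a fixed iteration count in place of the distance threshold, and no FPFH initialization (step 1.5).

```python
import numpy as np

def icp(A, B, iters=20):
    """One ICP loop (steps 1.6.1-1.6.6): nearest-neighbour correspondences,
    then the closed-form SVD (Kabsch) solution for R_t, T_t.
    A is n x 3, B is m x 3; returns the rigid transform mapping A onto B."""
    R, T = np.eye(3), np.zeros(3)
    for _ in range(iters):
        A2 = A @ R.T + T
        # step 1.6.1: closest point of B for each transformed point of A
        idx = np.argmin(((A2[:, None] - B[None]) ** 2).sum(-1), axis=1)
        Bc = B[idx]
        mu_a, mu_b = A.mean(0), Bc.mean(0)      # eq. (2): centroids
        W = (Bc - mu_b).T @ (A - mu_a)          # eq. (4): cross-covariance
        U, _, Vt = np.linalg.svd(W)             # eq. (5)
        R = U @ Vt                              # eq. (6)
        if np.linalg.det(R) < 0:                # guard against a reflection
            U[:, -1] *= -1
            R = U @ Vt
        T = mu_b - R @ mu_a                     # eq. (7)
    return R, T
```

With a good initial value (here the identity pose, standing in for the FPFH estimate of step 1.5), a few iterations recover the rigid transform between two overlapping scans of adjacent frames.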
step 2, extracting three-dimensional feature points and three-dimensional straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, carrying out image data acquisition on the space non-cooperative target in motion by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional feature points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional image by using an EDlines two-dimensional straight line detection algorithm and an SIFT feature point extraction algorithm;
step 5, matching the extracted two-dimensional feature points and two-dimensional straight lines with the three-dimensional feature points and three-dimensional straight lines, and solving the pose parameters of the space non-cooperative target from the 2D-3D point and line correspondences, specifically comprising the following steps:
step 5.1, calibrating the color camera using the Zhang Zhengyou calibration method to obtain its equivalent focal lengths $f_x, f_y$;
And 5.2, assuming that the initial value of the pose of the space non-cooperative target is known, projecting each three-dimensional straight line onto the two-dimensional plane according to the pose initial value and matching it with the extracted two-dimensional straight lines of the space non-cooperative target; the relationship between a point $P$ on the three-dimensional line and its projection $p=(p_x, p_y)$ can be described by a pinhole camera model:

$$Z_c \begin{bmatrix} p_x \\ p_y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & 0 \\ 0 & f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} (RP + t) \qquad (8)$$
where the rotation matrix R and the translation vector t describe a rigid transformation from the world coordinate system to the camera coordinate system and $Z_c$ is the depth of the point in the camera frame; the two-dimensional line can be expressed in polar form as:

$$p_x \cos\theta + p_y \sin\theta - \rho_d = 0 \qquad (9)$$
and 5.3, calculating the distance from the projected three-dimensional line endpoints to the two-dimensional line; combining equations (8) and (9) gives:

$$p_x \cos\theta + p_y \sin\theta - \rho_d = \frac{N^{T}(RP + t)}{Z_c} \qquad (10)$$

$$N = (f_x \cos\theta,\ f_y \sin\theta,\ -\rho_d)^{T} \qquad (11)$$
wherein N is the normal vector of the projection plane, and the distance d from the projection of a three-dimensional line endpoint to the two-dimensional line is expressed as:

$$d = \frac{\left|N^{T}(RP + t)\right|}{Z_c} \qquad (12)$$
step 5.4, the correspondence distance of a 2D-3D line pair is expressed as

$$d_l = d_{l1} + d_{l2} \qquad (13)$$

wherein $d_{l1}, d_{l2}$ respectively represent the distances from the two projected endpoints $p_1, p_2$ of the three-dimensional line to the two-dimensional line;
and 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the pose initial value; denoting a three-dimensional feature point by $Q$ and its projection onto the plane by $q=[q_x, q_y]$, the projection relationship between the two can be expressed as:

$$Z_c \begin{bmatrix} q_x \\ q_y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & 0 \\ 0 & f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} (RQ + t) \qquad (14)$$
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with the corresponding two-dimensional feature point $q'=[q'_x, q'_y]$ in the image; the distance between the projected point and the two-dimensional feature point is denoted $d_q$:

$$d_q = \sqrt{(q_x - q'_x)^2 + (q_y - q'_y)^2} \qquad (15)$$
Step 5.7, projecting all feature points and straight-line features of the space non-cooperative target onto the two-dimensional plane; letting $d_{q_i}$ be the distance of the $i$-th 2D-3D corresponding point pair, with N point pairs in total, and $d_{l_j}$ be the distance of the $j$-th 2D-3D corresponding line pair, with M line pairs in total, the target pose is then solved by minimizing equation (16):

$$\min_{R,\,t}\ \sum_{i=1}^{N} w_p\, d_{q_i}^{2} + \sum_{j=1}^{M} w_l\, d_{l_j}^{2} \qquad (16)$$

where $w_p$ and $w_l$ are the weights assigned to point and line features, respectively.
Equation (16) is solved by the least squares method to obtain the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
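Steps 5.2-5.7 together define a least-squares pose refinement. The sketch below minimizes the weighted point and line cost of equation (16) by Gauss-Newton with a numeric Jacobian; the axis-angle parameterization, the weights, and all helper names are assumptions for illustration, and image coordinates are again taken relative to the principal point.

```python
import numpy as np

fx, fy = 800.0, 800.0  # equivalent focal lengths (assumed values)

def rodrigues(w):
    """Rotation matrix from an axis-angle vector w (assumed parameterization)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

def residuals(x, pts3, pts2, ends3, lines2, wp=1.0, wl=1.0):
    """Stacked weighted residuals: point reprojection errors (eq. 15)
    and signed line distances for both endpoints (eqs. 12-13)."""
    R, t = rodrigues(x[:3]), x[3:]
    r = []
    for Q, q in zip(pts3, pts2):
        Qc = R @ Q + t
        r += [wp * (fx * Qc[0] / Qc[2] - q[0]), wp * (fy * Qc[1] / Qc[2] - q[1])]
    for (P1, P2), (theta, rho) in zip(ends3, lines2):
        N = np.array([fx * np.cos(theta), fy * np.sin(theta), -rho])  # eq. (11)
        for P in (P1, P2):
            Pc = R @ P + t
            r.append(wl * (N @ Pc) / Pc[2])                           # eq. (12)
    return np.array(r)

def solve_pose(x0, *data, iters=15):
    """Gauss-Newton least-squares solution of eq. (16), numeric Jacobian."""
    x = np.asarray(x0, float)
    for _ in range(iters):
        r = residuals(x, *data)
        J = np.empty((len(r), 6))
        for k in range(6):
            dx = np.zeros(6); dx[k] = 1e-6
            J[:, k] = (residuals(x + dx, *data) - r) / 1e-6
        x = x - np.linalg.lstsq(J, r, rcond=None)[0]
    return rodrigues(x[:3]), x[3:]
```

Given synthetic 2D-3D point and line correspondences generated from a known pose, the solver recovers R and t from a rough initial guess; in practice robust weighting would guard against mismatched features.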
The technical solutions provided by the embodiments of the present invention are described in detail above. Specific examples are used herein to explain the principles and implementations of the invention, and the description of the embodiments is only intended to help in understanding them. A person skilled in the art may vary the specific implementation and the scope of application in accordance with the embodiments; in summary, the content of this description should not be construed as limiting the invention.

Claims (2)

1. A method for measuring the accurate pose of a spatial non-cooperative target is characterized by comprising the following steps:
step 1, a TOF camera is used for obtaining complete three-dimensional point cloud of a space non-cooperative target;
step 2, extracting three-dimensional feature points and three-dimensional straight lines from the acquired complete three-dimensional point cloud of the space non-cooperative target;
step 3, carrying out image data acquisition on the space non-cooperative target in motion by using a color camera to obtain a sequence two-dimensional image of the space non-cooperative target;
step 4, extracting two-dimensional characteristic points and two-dimensional straight lines of the space non-cooperative target from the obtained sequence two-dimensional image;
and 5, matching the solved two-dimensional feature points and straight lines with the three-dimensional feature points and straight lines respectively, and solving the pose parameters of the space non-cooperative target by using the correspondences.
2. The method for measuring the accurate pose of the spatial non-cooperative target according to claim 1, wherein the step 5 specifically comprises:
step 5.1, calibrating the color camera to obtain its equivalent focal lengths $f_x, f_y$;
And 5.2, assuming that the initial value of the pose of the space non-cooperative target is known, projecting each three-dimensional straight line onto the two-dimensional plane according to the pose initial value and matching it with the extracted two-dimensional straight lines of the space non-cooperative target, wherein the relationship between a point $P$ on the three-dimensional line and its projection $p=(p_x, p_y)$ can be described as:

$$Z_c \begin{bmatrix} p_x \\ p_y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & 0 \\ 0 & f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} (RP + t) \qquad (1)$$
wherein the pose of the target is described by the rotation matrix R and the translation vector t, i.e. the rigid transformation from the world coordinate system to the camera coordinate system, and $Z_c$ is the depth of the point in the camera frame; the two-dimensional line can be expressed in polar form as:

$$p_x \cos\theta + p_y \sin\theta - \rho_d = 0 \qquad (2)$$
and 5.3, calculating the distance from the projected three-dimensional line endpoints to the two-dimensional line; combining the projection model with the two-dimensional line equation gives:

$$p_x \cos\theta + p_y \sin\theta - \rho_d = \frac{N^{T}(RP + t)}{Z_c} \qquad (3)$$

$$N = (f_x \cos\theta,\ f_y \sin\theta,\ -\rho_d)^{T} \qquad (4)$$
wherein N is the normal vector of the projection plane, and the distance d from the projection of a three-dimensional line endpoint to the two-dimensional line is expressed as:

$$d = \frac{\left|N^{T}(RP + t)\right|}{Z_c} \qquad (5)$$
step 5.4, the correspondence distance of a 2D-3D line pair is expressed as

$$d_l = d_{l1} + d_{l2} \qquad (6)$$

wherein $d_{l1}, d_{l2}$ respectively represent the distances from the two projected endpoints $p_1, p_2$ of the three-dimensional line to the two-dimensional line;
and 5.5, projecting the three-dimensional feature points onto the two-dimensional plane according to the pose initial value, wherein a three-dimensional feature point $Q$ and its projection onto the plane $q=[q_x, q_y]$ satisfy the projection relationship:

$$Z_c \begin{bmatrix} q_x \\ q_y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & 0 \\ 0 & f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} (RQ + t) \qquad (7)$$
step 5.6, matching each three-dimensional feature point projected onto the two-dimensional plane with the corresponding two-dimensional feature point $q'=[q'_x, q'_y]$ in the image, wherein the distance between the projected point and the two-dimensional feature point is denoted $d_q$:

$$d_q = \sqrt{(q_x - q'_x)^2 + (q_y - q'_y)^2} \qquad (8)$$
Step 5.7, projecting all feature points and straight-line features of the space non-cooperative target onto the two-dimensional plane; letting $d_{q_i}$ be the distance of the $i$-th 2D-3D corresponding point pair, with N point pairs in total, and $d_{l_j}$ be the distance of the $j$-th 2D-3D corresponding line pair, with M line pairs in total, the target pose is solved by minimizing:

$$\min_{R,\,t}\ \sum_{i=1}^{N} w_p\, d_{q_i}^{2} + \sum_{j=1}^{M} w_l\, d_{l_j}^{2} \qquad (9)$$

wherein $w_p$ and $w_l$ are the weights assigned to point and line features, respectively;
the minimization is solved by the least squares method to obtain the pose of the space non-cooperative target: the rotation matrix R and the translation vector t.
CN202110948038.1A 2021-08-18 2021-08-18 Accurate pose measurement method for space non-cooperative target Active CN113674353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110948038.1A CN113674353B (en) 2021-08-18 2021-08-18 Accurate pose measurement method for space non-cooperative target


Publications (2)

Publication Number Publication Date
CN113674353A true CN113674353A (en) 2021-11-19
CN113674353B CN113674353B (en) 2023-05-16

Family

ID=78543640


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115661493A (en) * 2022-12-28 2023-01-31 航天云机(北京)科技有限公司 Object pose determination method and device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120268567A1 (en) * 2010-02-24 2012-10-25 Canon Kabushiki Kaisha Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
CN109242873A (en) * 2018-08-22 2019-01-18 浙江大学 A method of 360 degree of real-time three-dimensionals are carried out to object based on consumer level color depth camera and are rebuild
CN111243002A (en) * 2020-01-15 2020-06-05 中国人民解放军国防科技大学 Monocular laser speckle projection system calibration and depth estimation method applied to high-precision three-dimensional measurement
CN112179357A (en) * 2020-09-25 2021-01-05 中国人民解放军国防科技大学 Monocular camera-based visual navigation method and system for plane moving target
CN112284293A (en) * 2020-12-24 2021-01-29 中国人民解放军国防科技大学 Method for measuring space non-cooperative target fine three-dimensional morphology


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
孙聪, 刘海波, 陈圣义, 尚洋: "Scheimpflug camera calibration method based on a generalized imaging model", Acta Optica Sinica
张跃强, 苏昂, 刘海波, 尚洋, 于起峰: "3D target pose tracking and optimization algorithm based on multi-level line representation and M-estimation", Acta Optica Sinica
张雄锋, 刘海波, 尚洋: "Robust orthogonal iteration method for monocular camera pose estimation", Acta Optica Sinica
徐影, 张进, 于沫尧, 许丹丹: "Research on coupled attitude-orbit control for multi-satellite close-range fly-around observation missions", Chinese Space Science and Technology



Similar Documents

Publication Publication Date Title
CN110443836B (en) Point cloud data automatic registration method and device based on plane features
RU2609434C2 (en) Detection of objects arrangement and location
CN103971378B (en) A kind of mix the three-dimensional rebuilding method of panoramic picture in visual system
CN107560592B (en) Precise distance measurement method for photoelectric tracker linkage target
CN111325801B (en) Combined calibration method for laser radar and camera
CN106826815A (en) Target object method of the identification with positioning based on coloured image and depth image
CN107729893B (en) Visual positioning method and system of die spotting machine and storage medium
CN105976353A (en) Spatial non-cooperative target pose estimation method based on model and point cloud global matching
CN108177143A (en) A kind of robot localization grasping means and system based on laser vision guiding
CN104880176A (en) Moving object posture measurement method based on prior knowledge model optimization
Ellmauthaler et al. A novel iterative calibration approach for thermal infrared cameras
CN111220126A (en) Space object pose measurement method based on point features and monocular camera
CN108362205B (en) Space distance measuring method based on fringe projection
CN110147162B (en) Fingertip characteristic-based enhanced assembly teaching system and control method thereof
CN102567991B (en) A kind of binocular vision calibration method based on concentric circle composite image matching and system
CN110060304B (en) Method for acquiring three-dimensional information of organism
CN110532865B (en) Spacecraft structure identification method based on fusion of visible light and laser
CN113050074B (en) Camera and laser radar calibration system and calibration method in unmanned environment perception
CN115201883B (en) Moving target video positioning and speed measuring system and method
CN112362034B (en) Solid engine multi-cylinder section butt joint guiding measurement method based on binocular vision
CN112365545B (en) Calibration method of laser radar and visible light camera based on large-plane composite target
CN111524174A (en) Binocular vision three-dimensional construction method for moving target of moving platform
CN113674353B (en) Accurate pose measurement method for space non-cooperative target
Han et al. Target positioning method in binocular vision manipulator control based on improved canny operator
Sun et al. Automatic targetless calibration for LiDAR and camera based on instance segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant