CN111649746B - Positioning and navigation method integrating inertial navigation measurement and ArUco marker - Google Patents

Positioning and navigation method integrating inertial navigation measurement and ArUco marker Download PDF

Info

Publication number
CN111649746B
CN111649746B (application CN202010526692.9A)
Authority
CN
China
Prior art keywords
coordinate system
inertial navigation
camera
moment
world coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010526692.9A
Other languages
Chinese (zh)
Other versions
CN111649746A (en)
Inventor
王海霞
宋诗斌
李嘉怡
张治国
***
梁慧斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Zhuo Xintong Intelligent Technology Co ltd
Original Assignee
Qingdao Zhuo Xintong Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Zhuo Xintong Intelligent Technology Co ltd filed Critical Qingdao Zhuo Xintong Intelligent Technology Co ltd
Priority to CN202010526692.9A priority Critical patent/CN111649746B/en
Publication of CN111649746A publication Critical patent/CN111649746A/en
Application granted granted Critical
Publication of CN111649746B publication Critical patent/CN111649746B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a positioning and navigation method integrating inertial navigation measurement and ArUco markers, belonging to the technical fields of computer vision positioning and mobile robot navigation. Within an ArUco map environment, the coordinate systems of the inertial navigation element and the camera are unified, which overcomes the loss of the mobile robot's current attitude caused by misrecognition or missed recognition of an ArUco marker. The method also overcomes the limited field of view of the camera to a certain extent, improves the attitude-sensing capability of the robot, and enables the robot to complete more complex attitude adjustments during navigation.

Description

Positioning and navigation method integrating inertial navigation measurement and ArUco marker
Technical Field
The invention belongs to the technical fields of computer vision positioning and mobile robot navigation. It relates to an indoor positioning and navigation method, and more particularly to a positioning and navigation method integrating inertial navigation measurement and ArUco (Augmented Reality University of Córdoba) markers, which can be used for autonomous positioning and navigation of intelligent mobile devices such as mobile robots and unmanned aerial vehicles in indoor environments.
Background
Driven by advances in computing, sensing, and related fields, mobile robot technology has developed rapidly and is widely applied in national defense and in economic life, including the medical, military, industrial, and aerospace fields. Positioning and navigation is a foundational technology for robot motion control and task execution; indoors in particular, where satellite navigation signals are unavailable, positioning the robot from environmental information has become a common approach.
Indoor positioning and autonomous navigation methods can be divided into relative-estimation methods and track-following methods. A relative-estimation method uses an inertial navigation element or a ranging element to compute the current position and relative displacement; its drawbacks are that the element's measurements drift and that precision instruments are expensive. A track-following method requires a track or landmarks to be laid out, which the robot identifies with an onboard electromagnetic or image sensor; whenever the robot's task changes, the path must be laid out again. More recently, conventional two-dimensional codes have been used to provide navigation and positioning information: the codes are placed on the ground, and the robot identifies them to determine its own pose and guide its operation. However, conventional two-dimensional codes can only be scanned at short range, and in practice the robot often fails to identify them reliably, which is a significant limitation. Moreover, the codes generally have to be arranged in a checkerboard pattern whose positions must be surveyed beforehand, so the deployment cost is high. In addition, the robot can only travel on paths covered by two-dimensional codes, which restricts its range of motion and reduces flexibility.
To overcome the high cost and limited flexibility of such artificial-landmark arrangements, the invention provides an indoor positioning and navigation method integrating inertial navigation measurement and ArUco markers. The method builds an indoor map from ArUco markers and uses the markers as navigation beacons for positioning; inertial navigation measurements are fused into the ArUco-based positioning, and the autonomy and stability of the inertial measurement unit provide attitude output in situations such as right-angle turns or loss of the ArUco marker. This ensures stable acquisition of the robot's position and attitude and improves navigation robustness.
The method realizes indoor positioning and navigation of the robot using an ArUco marker map while fusing inertial navigation measurement information, which resolves the attitude errors that arise when an ArUco marker cannot be recognized at a path corner or is recognized ambiguously, and enhances the stability of the navigation system.
Disclosure of Invention
In view of the above problems in the prior art, the main object of the invention is as follows: aiming at the high deployment cost and limited flexibility of conventional indoor navigation methods, a positioning and navigation method integrating inertial navigation measurement and ArUco markers is provided, which reduces cost and improves the flexibility and robustness of the system.
In order to achieve the purpose, the invention adopts the following technical scheme:
a positioning and navigation method fusing inertial navigation measurement and ArUco markers adopts an inertial navigation element and a camera, and comprises the following steps:
step 1: establishing an Aruco map;
and 2, step: establishing a path according to task needs, wherein the path is represented by a series of coordinate points, and setting a task target point P set
And 3, step 3: fixing the relative poses of the inertial navigation element and the camera, and calculating to obtain the rotation relation between the inertial element coordinate system and the camera coordinate system;
and 4, step 4: the inertial navigation element begins to solve the attitude;
and 5: the robot rotates at a fixed angular speed until the camera observes the ArUco mark, the posture of the camera under a world coordinate system is successfully obtained, the current time is set to be k time, and the current posture of the camera in the world coordinate system and the self-calculation posture of the inertial navigation element are recorded;
step 6: calculating the attitude of the inertial navigation element in the world coordinate system according to the attitude of the camera in the world coordinate system obtained in the step 5 and the self-solution attitude of the inertial navigation element;
and 7: setting the next moment as a moment k +1, and calculating the posture of the inertial navigation element at the moment k +1 in the world coordinate system according to the posture of the inertial navigation element in the world coordinate system obtained in the step 6;
and 8: calculating to obtain the posture of the camera at the moment k +1 in the world coordinate system according to the posture of the inertial navigation element at the moment k +1 in the world coordinate system obtained in the step 7;
and step 9: obtaining the position P of the robot in a world coordinate system through Aruco marking current And calculating to obtain a point P closest to the robot on the set path nearest ,P nearest Along the set path direction P set The nth point of the direction is marked as P aim
Step 10: calculating P nearest To P aim Vector V of aim ,V aim The direction vector of the robot is obtained;
step 11: according to the attitude vector V of the robot at the k +1 moment robot And a direction vector V aim Calculating a yaw angle alpha and a pitch angle beta of the robot; controlling the motion angular speed of the robot according to the sizes of alpha and beta, and controlling the motion angular speed of the robot according to P aim And a set point P set The distance between the two controls the linear speed of the movement of the robot;
step 12: judging the current position P of the robot current Whether or not to equal P set And if the difference is not equal, jumping to the step 7, and if the difference is equal, ending the navigation.
Preferably, in step 3,
when the inertial navigation element and the camera are fixed orthogonally, the rotation relationship between the camera coordinate system and the inertial navigation coordinate system can be derived directly;
when the inertial navigation element and the camera are jointly calibrated, the specific coordinate relationship between them can be determined by calibration.
Preferably, in step 4,
the initial quaternion of the inertial navigation element's attitude solution is generally set to the identity quaternion [1 0 0 0].
Preferably, in step 5, after the camera obtains its attitude in the world coordinate system by recognizing an ArUco marker, the average of the attitude data over a period of time is taken to improve positioning accuracy.
preferably, in step 6, the attitude of the inertial navigation element in the world coordinate system is calculated according to formula (1):
$$q_{b_0}^{w} = q_{c_k}^{w} \otimes \left(q_{c_k}^{b_0}\right)^{-1} \qquad (1)$$
where $q$ denotes a quaternion and $\otimes$ the quaternion rotation operation; $k$ denotes the moment at which the camera obtains its attitude in the world coordinate system by recognizing an ArUco marker; $q_{b_0}^{w}$ is the attitude of the inertial navigation element in the world coordinate system at the initial moment, i.e. time 0; $q_{c_k}^{w}$ is the attitude of the camera in the world coordinate system at time k; and $q_{c_k}^{b_0}$ is the rotation of the camera at time k relative to the inertial navigation element's time-0 coordinate system;
$q_{c_k}^{b_0}$ is obtained according to formula (2):
$$q_{c_k}^{b_0} = q_{b_k}^{b_0} \otimes q_{c}^{b} \qquad (2)$$
where $q_{b_k}^{b_0}$ is the self-solved attitude of the inertial navigation element at time k, and $q_{c}^{b}$ is the rotational relationship between the inertial navigation coordinate system and the camera coordinate system.
Preferably, in step 7, the attitude of the inertial navigation element in the world coordinate system at time k+1 is calculated according to formula (3):
$$q_{b_{k+1}}^{w} = q_{b_0}^{w} \otimes q_{b_{k+1}}^{b_0} \qquad (3)$$
where $q_{b_{k+1}}^{w}$ is the attitude of the inertial navigation element in the world coordinate system at time k+1, and $q_{b_{k+1}}^{b_0}$ is the self-solved attitude of the inertial navigation element at time k+1.
Preferably, in step 8, the attitude of the camera in the world coordinate system at time k+1 is calculated according to formula (4):
$$q_{c_{k+1}}^{w} = q_{b_{k+1}}^{w} \otimes q_{c}^{b} \qquad (4)$$
where $q_{c_{k+1}}^{w}$ is the attitude of the camera in the world coordinate system at time k+1; at this point the calculation of $q_{c_{k+1}}^{w}$ no longer depends on the recognition result of an ArUco marker at time k+1.
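To make formulas (1)-(4) concrete, the following is a minimal runnable Python sketch. It is an editorial illustration rather than code from the patent: quaternions are assumed to be unit quaternions in (w, x, y, z) order composed with the Hamilton product, and all numerical values, including the mounting quaternion q_b_c, are example assumptions.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_inv(q):
    """Inverse of a unit quaternion, i.e. its conjugate."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

# Example inputs (assumed values):
q_b_c   = np.array([0.5, -0.5, 0.5, -0.5])      # camera-to-IMU rotation from step 3
q_b0_bk = np.array([1.0, 0.0, 0.0, 0.0])        # IMU self-solved attitude at time k
q_w_ck  = np.array([0.9239, 0.0, 0.3827, 0.0])  # camera attitude in world at time k (from ArUco)

q_b0_ck = quat_mul(q_b0_bk, q_b_c)              # formula (2)
q_w_b0  = quat_mul(q_w_ck, quat_inv(q_b0_ck))   # formula (1)

# At time k+1, only the IMU's self-solved attitude is needed:
q_b0_bk1 = np.array([0.9962, 0.0, 0.0, 0.0872]) # IMU self-solved attitude at k+1
q_w_bk1  = quat_mul(q_w_b0, q_b0_bk1)           # formula (3)
q_w_ck1  = quat_mul(q_w_bk1, q_b_c)             # formula (4): no marker detection at k+1
print(q_w_ck1)
```

The point of formulas (3) and (4) is visible in the last two lines: once q_w_b0 is anchored by a single marker observation, the camera attitude keeps updating from inertial data alone.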
The invention has the following beneficial technical effects:
(1) In the ArUco map environment, the coordinate systems of the inertial navigation element and the camera are unified, which solves the loss of the mobile robot's current attitude caused by misrecognition or missed recognition of an ArUco marker.
(2) The limited field of view of the camera is overcome to a certain extent, the robot's attitude-sensing capability is improved, and the robot can complete more complex attitude adjustments during navigation.
Drawings
FIG. 1 is a schematic diagram of an indoor ArUco marker arrangement.
Fig. 2 is a schematic diagram of the effect of the ArUco map and the task path.
FIG. 3 is a schematic diagram of a coordinate system relationship of a navigation system.
FIG. 4 is a flowchart of AGV initial pose acquisition.
FIG. 5 is a schematic diagram of AGV navigation control.
FIG. 6 is a flow chart for implementing visual positioning and navigation by fusing inertial navigation measurement and ArUco marker.
Detailed Description
The invention is described in further detail below with reference to the following figures and detailed description:
the following figures and embodiments are described with reference to AGV (Automated Guided Vehicle) navigation as an example.
Step 1: As shown in fig. 1, ArUco markers are arranged indoors. The markers are generated from the official ArUco repository, are of uniform size, and are placed within the camera-recognizable range, for example on wall surfaces. The marker with ID 26 is designated as the reference marker, and visual mapping is completed, yielding the ArUco map shown in fig. 2. In addition, the camera intrinsics are calibrated using Zhang Zhengyou's calibration method.
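The positioning steps below rely on detecting a marker and recovering the camera pose from its corner points. The following is a minimal sketch of that operation assuming OpenCV's aruco module (pre-4.7 API); the patent does not specify an implementation library, and the dictionary choice and marker side length here are assumptions.

```python
import cv2
import numpy as np

def camera_pose_from_marker(image, camera_matrix, dist_coeffs, marker_len=0.15):
    """Detect an ArUco marker and return the camera pose in the marker frame.

    marker_len is the marker side length in meters (assumed value)."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(image, dictionary)
    if ids is None:
        return None
    # Marker pose in the camera frame via PnP on the four corner points
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_len, camera_matrix, dist_coeffs)
    R_cm, _ = cv2.Rodrigues(rvecs[0])       # marker-to-camera rotation
    t_cm = tvecs[0].reshape(3, 1)
    # Invert to get the camera pose in the marker frame; the ArUco map then
    # relates the marker frame to the world coordinate system.
    R_mc = R_cm.T
    t_mc = -R_mc @ t_cm
    return R_mc, t_mc
```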
Step 2: The navigation path is established according to the task requirements, as shown in fig. 2. A mathematical equation for the path is established in the coordinate system of marker No. 26 and discretized into a series of points at a density of 1000 points per meter. In this example, the navigation path contains a quarter (right-angle) turn; its equation is given below, and a sketch of the discretization follows.
(The path equation appears in the original only as an image and is not recoverable from the text.)
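To make the discretization concrete, here is a small Python sketch that builds an L-shaped (quarter-turn) path at 1000 points per meter in the reference-marker frame; the 2 m leg length is an invented example value, since the original path equation is only given as an image.

```python
import numpy as np

def build_quarter_turn_path(leg=2.0, density=1000):
    """L-shaped path: straight along +X, then a right-angle turn along +Y.

    leg is the length of each segment in meters; density is points per meter."""
    n = int(leg * density)
    leg1 = np.column_stack([np.linspace(0.0, leg, n), np.zeros(n)])
    leg2 = np.column_stack([np.full(n, leg), np.linspace(0.0, leg, n)])
    return np.vstack([leg1, leg2])   # shape (2n, 2): discretized path points

path_points = build_quarter_turn_path()
```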
Step 3: Fig. 3 shows the relationship between the coordinate systems of the navigation system. The invention obtains the position information of the camera in the world coordinate system using the camera and the ArUco markers, and obtains the attitude information of the camera in the world coordinate system with the aid of an inertial measurement unit (IMU). The world coordinate system in fig. 3 is the ArUco marker coordinate system specified by the ArUco map, denoted w; its X-Y-Z axes form a right-handed coordinate system. The relative pose of the camera coordinate system and the IMU coordinate system is fixed, with the IMU and the camera mounted orthogonally. The camera coordinate system is denoted c: its X axis is the AGV's direction of travel, its Y axis is perpendicular to the vehicle's plane of motion and points downward, and its Z axis is the camera's optical axis, pointing horizontally to the left of the vehicle's direction of travel. The IMU coordinate system is denoted b: its X axis points horizontally to the left of the vehicle's direction of travel, in the same direction as the camera's Z axis; its Y axis points opposite to the AGV's direction of travel; and its Z axis is perpendicular to the vehicle's plane of motion and points upward.
In fig. 3, rotating the IMU coordinate system by -90° around its X axis and then by 90° around its Y axis brings its axes into alignment with those of the camera coordinate system. The rotational relationship $R_c^b$ between b and c is therefore:
$$R_c^b = \begin{bmatrix} 0 & 0 & 1 \\ -1 & 0 & 0 \\ 0 & -1 & 0 \end{bmatrix}$$
Expressed in quaternion form:
$$q_c^b = f_Q\!\left(R_c^b\right) = \tfrac{1}{2}\begin{bmatrix} 1 & -1 & 1 & -1 \end{bmatrix}^T$$
where $q_c^b$ is the unit quaternion representing the rotation from c to b, and $f_Q$ is the mapping from a rotation matrix to the corresponding unit quaternion.
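The matrix and quaternion above can be checked numerically. The following Python sketch is an editorial addition, not from the patent: it encodes the axis conventions just described and verifies that the quaternion (1/2)[1, -1, 1, -1]^T reproduces $R_c^b$.

```python
import numpy as np

# R_c^b: columns are the camera axes (X=forward, Y=down, Z=left)
# expressed in the IMU frame (X=left, Y=backward, Z=up).
R_bc = np.array([[ 0,  0, 1],
                 [-1,  0, 0],
                 [ 0, -1, 0]], dtype=float)

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
        [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
        [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)],
    ])

q_b_c = 0.5 * np.array([1.0, -1.0, 1.0, -1.0])
assert np.allclose(quat_to_rot(q_b_c), R_bc)   # quaternion matches the matrix
# Camera forward axis (1,0,0) maps to -Y of the IMU frame, i.e. vehicle forward:
assert np.allclose(R_bc @ np.array([1.0, 0.0, 0.0]), [0, -1, 0])
```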
Step 4: The attitude solved by the inertial navigation element at its power-on moment is set as the initial attitude, $q_{b_0}^{b_0} = [1\;0\;0\;0]^T$.
Step 5: The AGV rotates at a fixed angular velocity. The marker recognition mode is first set to FAST to increase detection speed; once the camera observes an ArUco marker, the mode is switched to NORMAL to increase recognition accuracy. After the camera successfully recognizes the ArUco marker, the pixel coordinates of the marker's corner points are obtained, and the position and attitude of the camera in the world coordinate system are computed by a PnP algorithm. The attitude data are averaged over one second to improve positioning accuracy; the average attitude is obtained as follows:
1) Construct the following attitude cost function:
$$\bar{q} = \arg\min_{q} \sum_{i=1}^{N} \omega_i \left\| A(q) - A(q_i) \right\|_F^2$$
where $\bar{q}$ is the mean attitude quaternion to be found, $\omega_i$ is the weight of the i-th sample attitude quaternion (here all $\omega_i$ equal 1), $A(q_i)$ is the rotation matrix corresponding to the i-th sample attitude quaternion, $A(\bar{q})$ is the rotation matrix corresponding to $\bar{q}$, and $\left\|\cdot\right\|_F$ is the Frobenius norm.
2) From the definition of the Frobenius norm, and since $A(q_i)$ and $A(\bar{q})$ are all orthogonal matrices:
$$\left\| A(q) - A(q_i) \right\|_F^2 = \operatorname{tr}\{A(q)^T A(q)\} + \operatorname{tr}\{A(q_i)^T A(q_i)\} - 2\operatorname{tr}\{A(q)^T A(q_i)\}$$
For any orthogonal matrix $A$ of order 3, $\operatorname{tr}\{A^T A\} = 3$; thus:
$$\left\| A(q) - A(q_i) \right\|_F^2 = 6 - 2\operatorname{tr}\{A(q)^T A(q_i)\}$$
The minimization problem can therefore be converted into a trace-maximization problem:
$$\bar{q} = \arg\max_{q} \operatorname{tr}\Big\{ A(q)^T \sum_{i=1}^{N} \omega_i A(q_i) \Big\}$$
3) Let $B = \sum_{i=1}^{N} \omega_i A(q_i)$ and perform a singular value decomposition on it:
$$B = U_j \Sigma_j V_j^T$$
where $\Sigma_j$ is the diagonal matrix of singular values, and $U_j$ and $V_j$ are orthogonal matrices of singular vectors. The optimal solution of the above problem is:
$$A(\bar{q}) = U_j V_j^T$$
The mean quaternion $\bar{q}$ is then recovered from the attitude matrix $A(\bar{q})$, i.e. $\bar{q} = f_Q\!\left(A(\bar{q})\right)$.
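Below is a minimal numerical sketch of this SVD-based averaging (an editorial illustration, not the patent's code), assuming unit quaternions in (w, x, y, z) order and unit weights; the determinant correction in the last step is an added safeguard that keeps the solution a proper rotation, a detail the derivation above leaves implicit.

```python
import numpy as np

def quat_to_rot(q):
    """A(q): rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
        [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
        [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)],
    ])

def rot_to_quat(R):
    """f_Q: rotation matrix to unit quaternion (w, x, y, z); assumes tr(R) > -1."""
    w = 0.5 * np.sqrt(max(1.0 + np.trace(R), 0.0))
    return np.array([w,
                     (R[2, 1] - R[1, 2]) / (4 * w),
                     (R[0, 2] - R[2, 0]) / (4 * w),
                     (R[1, 0] - R[0, 1]) / (4 * w)])

def average_attitude(quats, weights=None):
    """SVD attitude averaging: maximize tr(A^T B) with B = sum_i w_i A(q_i)."""
    if weights is None:
        weights = np.ones(len(quats))
    B = sum(w * quat_to_rot(q) for w, q in zip(weights, quats))
    U, _, Vt = np.linalg.svd(B)
    # Determinant correction (added safeguard) keeps A a proper rotation.
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    return rot_to_quat(U @ D @ Vt)

# Example: average a small cluster of noisy attitudes around the identity.
samples = [np.array([1.0, 0.0, 0.0, 0.0]),
           np.array([0.9998, 0.02, 0.0, 0.0]),
           np.array([0.9998, 0.0, -0.02, 0.0])]
samples = [q / np.linalg.norm(q) for q in samples]
print(average_attitude(samples))
```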
Step 6: From the camera attitude $q_{c_k}^{w}$ obtained in step 5 and the self-solved attitude $q_{b_k}^{b_0}$ recorded at time k, with $q_{c_k}^{b_0} = q_{b_k}^{b_0} \otimes q_{c}^{b}$ as in formula (2), the initial solved attitude of the inertial navigation element in the world coordinate system is calculated:
$$q_{b_0}^{w} = q_{c_k}^{w} \otimes \left(q_{c_k}^{b_0}\right)^{-1}$$
where $q_{b_0}^{w}$ is the relative rotation between the inertial navigation element's power-on coordinate system and the world coordinate system, and $\otimes$ denotes the rotation operation expressed with quaternions.
Step 7: From the $q_{b_0}^{w}$ obtained in step 6, the attitude of the inertial navigation element in the world coordinate system at the next moment, $q_{b_{k+1}}^{w}$, is obtained:
$$q_{b_{k+1}}^{w} = q_{b_0}^{w} \otimes q_{b_{k+1}}^{b_0}$$
where k+1 denotes the next moment after the camera's attitude in the world coordinate system is first obtained, and $q_{b_{k+1}}^{b_0}$ is the self-solved attitude of the inertial navigation element at time k+1.
Step 8: From the $q_{b_{k+1}}^{w}$ obtained in step 7, the attitude of the camera in the world coordinate system at the next moment, $q_{c_{k+1}}^{w}$, is obtained:
$$q_{c_{k+1}}^{w} = q_{b_{k+1}}^{w} \otimes q_{c}^{b}$$
where the calculation of $q_{c_{k+1}}^{w}$ no longer depends on the recognition result of an ArUco marker at time k+1.
Step 9: The position P_current of the robot in the world coordinate system is obtained from the ArUco markers. As shown in fig. 5, the point P_nearest on the set path closest to the robot is calculated, and the nth point from P_nearest along the path toward the target is denoted P_aim; the value of n is proportional to the AGV's maximum travel speed and to the path point density.
Step 10: The vector V_aim from P_nearest to P_aim is calculated; V_aim is the target direction of the robot.
Step 11: The yaw angle α of the robot is calculated from the robot attitude vector V_robot at time k+1 and the direction vector V_aim. The robot's angular velocity is controlled according to the magnitude of α, and the robot's linear velocity is controlled according to the distance between P_aim and the set point P_set.
Step 12: It is judged whether the robot's current position P_current equals P_set; if not, the method jumps to step 7, and if so, the navigation ends. A sketch of steps 9-12 follows.
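The following Python sketch is one plausible editorial reading of steps 9-12, not the patent's implementation: nearest-point lookup, target-point selection, a signed yaw angle between V_robot and V_aim, and simple proportional velocity commands. The gains k_ang and k_lin, the lookahead n, and the stop tolerance are invented parameters.

```python
import numpy as np

def control_step(path_points, p_current, p_set, heading_vec,
                 n=50, k_ang=1.5, k_lin=0.8, tol=0.05):
    """One control iteration for steps 9-12.

    path_points: (M, 2) discretized path; p_current, p_set: (2,) positions;
    heading_vec: (2,) robot attitude vector V_robot projected into the plane.
    Returns (angular_velocity, linear_velocity, done)."""
    # Step 9: nearest path point, and the target point n indices further along
    i_near = int(np.argmin(np.linalg.norm(path_points - p_current, axis=1)))
    i_aim = min(i_near + n, len(path_points) - 1)
    p_nearest, p_aim = path_points[i_near], path_points[i_aim]

    # Step 10: target direction vector
    v_aim = p_aim - p_nearest

    # Step 11: signed yaw angle between V_robot and V_aim, wrapped to [-pi, pi]
    alpha = np.arctan2(v_aim[1], v_aim[0]) - np.arctan2(heading_vec[1], heading_vec[0])
    alpha = (alpha + np.pi) % (2 * np.pi) - np.pi
    angular = k_ang * alpha
    linear = k_lin * np.linalg.norm(p_aim - p_set)   # slow down near the goal

    # Step 12: stop when the goal is reached (within tolerance)
    done = bool(np.linalg.norm(p_current - p_set) < tol)
    return angular, linear, done
```

With path_points from the step-2 discretization, this function would be called once per control cycle, using the camera attitude propagated by formulas (3) and (4) to form heading_vec.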
It is to be understood that the above description is not intended to limit the present invention, and the present invention is not limited to the above examples, and those skilled in the art may make modifications, alterations, additions or substitutions within the spirit and scope of the present invention.

Claims (3)

1. A positioning and navigation method integrating inertial navigation measurement and ArUco markers, characterized in that an inertial navigation element and a camera are used, the method comprising the following steps:
step 1: establishing an ArUco map;
step 2: establishing a path according to the task, the path being represented by a series of coordinate points, and setting a task target point P_set;
step 3: fixing the relative pose of the inertial navigation element and the camera, and calculating the rotation relationship between the inertial navigation element coordinate system and the camera coordinate system;
step 4: the inertial navigation element begins to solve its attitude;
step 5: the robot rotates at a fixed angular velocity until the camera observes an ArUco marker and the attitude of the camera in the world coordinate system is successfully obtained; the current moment is denoted time k, and the camera's current attitude in the world coordinate system and the inertial navigation element's self-solved attitude are recorded;
step 6: calculating the attitude of the inertial navigation element in the world coordinate system from the camera attitude in the world coordinate system obtained in step 5 and the inertial navigation element's self-solved attitude;
step 7: denoting the next moment as time k+1, and calculating the attitude of the inertial navigation element in the world coordinate system at time k+1 from the attitude of the inertial navigation element in the world coordinate system obtained in step 6;
step 8: calculating the attitude of the camera in the world coordinate system at time k+1 from the attitude of the inertial navigation element at time k+1 in the world coordinate system obtained in step 7;
step 9: obtaining the position P_current of the robot in the world coordinate system from the ArUco markers, calculating the point P_nearest on the set path closest to the robot, and denoting the nth point from P_nearest along the path toward P_set as P_aim;
step 10: calculating the vector V_aim from P_nearest to P_aim; V_aim is the target direction vector of the robot;
step 11: calculating the yaw angle α and pitch angle β of the robot from the robot attitude vector V_robot at time k+1 and the direction vector V_aim; controlling the robot's angular velocity of motion according to the magnitudes of α and β, and controlling the robot's linear velocity of motion according to the distance between P_aim and the set point P_set;
step 12: judging whether the robot's current position P_current equals P_set; if not, jumping to step 7, and if so, ending the navigation;
in step 3,
when the inertial navigation element and the camera are fixed orthogonally, the rotation relationship between the camera coordinate system and the inertial navigation coordinate system can be derived directly;
when the inertial navigation element and the camera are jointly calibrated, the specific coordinate relationship between them can be determined by calibration;
in step 4,
the initial quaternion of the inertial navigation element's attitude solution is generally set to the identity quaternion [1 0 0 0];
in step 5, after the camera obtains its attitude in the world coordinate system by recognizing an ArUco marker, the average of the attitude data over a period of time is taken to improve positioning accuracy;
in step 6, the attitude of the inertial navigation element in the world coordinate system is calculated according to formula (1):
$$q_{b_0}^{w} = q_{c_k}^{w} \otimes \left(q_{c_k}^{b_0}\right)^{-1} \qquad (1)$$
where $q$ denotes a quaternion and $\otimes$ the quaternion rotation operation; $k$ denotes the moment at which the camera obtains its attitude in the world coordinate system by recognizing an ArUco marker; $q_{b_0}^{w}$ is the attitude of the inertial navigation element in the world coordinate system at the initial moment, i.e. time 0; $q_{c_k}^{w}$ is the attitude of the camera in the world coordinate system at time k; and $q_{c_k}^{b_0}$ is the rotation of the camera at time k relative to the inertial navigation element's time-0 coordinate system;
$q_{c_k}^{b_0}$ is obtained according to formula (2):
$$q_{c_k}^{b_0} = q_{b_k}^{b_0} \otimes q_{c}^{b} \qquad (2)$$
where $q_{b_k}^{b_0}$ is the self-solved attitude of the inertial navigation element at time k, and $q_{c}^{b}$ is the rotational relationship between the inertial navigation coordinate system and the camera coordinate system.
2. The positioning and navigation method integrating inertial navigation measurement and ArUco markers according to claim 1, characterized in that: in step 7, the attitude of the inertial navigation element in the world coordinate system at time k+1 is calculated according to formula (3):
$$q_{b_{k+1}}^{w} = q_{b_0}^{w} \otimes q_{b_{k+1}}^{b_0} \qquad (3)$$
where $q_{b_{k+1}}^{w}$ is the attitude of the inertial navigation element in the world coordinate system at time k+1, and $q_{b_{k+1}}^{b_0}$ is the self-solved attitude of the inertial navigation element at time k+1.
3. The positioning and navigation method integrating inertial navigation measurement and ArUco markers according to claim 1, characterized in that: in step 8, the attitude of the camera in the world coordinate system at time k+1 is calculated according to formula (4):
$$q_{c_{k+1}}^{w} = q_{b_{k+1}}^{w} \otimes q_{c}^{b} \qquad (4)$$
where $q_{c_{k+1}}^{w}$ is the attitude of the camera in the world coordinate system at time k+1; at this point the calculation of $q_{c_{k+1}}^{w}$ no longer depends on the recognition result of an ArUco marker at time k+1.
CN202010526692.9A 2020-06-11 2020-06-11 Positioning and navigation method integrating inertial navigation measurement and ArUco marker Active CN111649746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010526692.9A CN111649746B (en) 2020-06-11 2020-06-11 Positioning and navigation method integrating inertial navigation measurement and ArUco marker

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010526692.9A CN111649746B (en) 2020-06-11 2020-06-11 Positioning and navigation method integrating inertial navigation measurement and ArUco marker

Publications (2)

Publication Number Publication Date
CN111649746A CN111649746A (en) 2020-09-11
CN111649746B (en) 2023-03-28

Family

ID=72352846

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010526692.9A Active CN111649746B (en) 2020-06-11 2020-06-11 Positioning and navigation method integrating inertial navigation measurement and ArUco marker

Country Status (1)

Country Link
CN (1) CN111649746B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113917927A (en) * 2021-10-26 2022-01-11 沈阳航天新光集团有限公司 Bionic robot fish control system based on Leap Motion interaction

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106679648A (en) * 2016-12-08 2017-05-17 东南大学 Vision-inertia integrated SLAM (Simultaneous Localization and Mapping) method based on genetic algorithm
CN106705965A (en) * 2017-01-12 2017-05-24 苏州中德睿博智能科技有限公司 Scene three-dimensional data registration method and navigation system error correction method
CN109063703A (en) * 2018-06-29 2018-12-21 南京睿悦信息技术有限公司 Augmented reality location algorithm based on mark identification and Inertial Measurement Unit fusion
CN109374003A (en) * 2018-11-06 2019-02-22 山东科技大学 A kind of mobile robot visual positioning and air navigation aid based on ArUco code
CN109443389A (en) * 2018-11-28 2019-03-08 电子科技大学 Act the posture fusion method based on inertial sensor single step calibration in capture systems

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110057352B (en) * 2018-01-19 2021-07-16 北京图森智途科技有限公司 Camera attitude angle determination method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106679648A (en) * 2016-12-08 2017-05-17 东南大学 Vision-inertia integrated SLAM (Simultaneous Localization and Mapping) method based on genetic algorithm
CN106705965A (en) * 2017-01-12 2017-05-24 苏州中德睿博智能科技有限公司 Scene three-dimensional data registration method and navigation system error correction method
CN109063703A (en) * 2018-06-29 2018-12-21 南京睿悦信息技术有限公司 Augmented reality location algorithm based on mark identification and Inertial Measurement Unit fusion
CN109374003A (en) * 2018-11-06 2019-02-22 山东科技大学 A kind of mobile robot visual positioning and air navigation aid based on ArUco code
CN109443389A (en) * 2018-11-28 2019-03-08 电子科技大学 Act the posture fusion method based on inertial sensor single step calibration in capture systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Vision/inertial integrated velocity and height measurement method for aircraft in unknown environments; Su Ang et al.; Journal of National University of Defense Technology (《国防科技大学学报》); 2014-02-28 (No. 01); full text *

Also Published As

Publication number Publication date
CN111649746A (en) 2020-09-11

Similar Documents

Publication Publication Date Title
CN110869700B (en) System and method for determining vehicle position
KR102627453B1 (en) Method and device to estimate position
CN103175524B (en) A kind of position of aircraft without view-based access control model under marking environment and attitude determination method
KR102016636B1 (en) Calibration apparatus and method of camera and rader
CN111426320A (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN110488838B (en) Accurate repeated positioning method for indoor autonomous navigation robot
CN114415736A (en) Multi-stage visual accurate landing method and device for unmanned aerial vehicle
CN111649746B (en) Positioning and navigation method integrating inertial navigation measurement and ArUco marker
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN114820793A (en) Target detection and target point positioning method and system based on unmanned aerial vehicle
CN113158387B (en) Visual target point arrangement method based on laser radar grid map coupling
Niu et al. Camera-based lane-aided multi-information integration for land vehicle navigation
WO2021166845A1 (en) Information processing device, information processing method, and program
Burdziakowski Towards precise visual navigation and direct georeferencing for MAV using ORB-SLAM2
CN112074706A (en) Accurate positioning system
KR20210010309A (en) Apparatus and method for generating three dimensional map using aerial images
CN111521996A (en) Laser radar installation calibration method
Shmatko et al. Estimation of rotation measurement error of objects using computer simulation
US20240077880A1 (en) Slope location correction method and apparatus, robot and readable storage medium
CN112904883B (en) Terrain perception method, motion control method and system for quadruped robot
US11348278B2 (en) Object detection
Wang et al. Indoor visual navigation system based on paired-landmark for small UAVs
Cho et al. Analysis in long-range apriltag pose estimation and error modeling
KR20160070384A (en) System for detecting flying object by thermal image monitoring
CN111258306B (en) Vehicle positioning method and device based on imaging equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant