CN114252073A - Robot attitude data fusion method - Google Patents

Robot attitude data fusion method

Info

Publication number
CN114252073A
CN114252073A (application CN202111415634.XA; granted publication CN114252073B)
Authority
CN
China
Prior art keywords
angle
coordinate system
quaternion
acceleration
data fusion
Prior art date
Legal status: Granted
Application number
CN202111415634.XA
Other languages
Chinese (zh)
Other versions
CN114252073B (en)
Inventor
张琳
骆敏舟
杭小树
Current Assignee
Institute of Intelligent Manufacturing Technology JITRI
Original Assignee
Institute of Intelligent Manufacturing Technology JITRI
Priority date
Filing date
Publication date
Application filed by Institute of Intelligent Manufacturing Technology JITRI
Priority to CN202111415634.XA (filed 2021-11-25)
Publication of CN114252073A: 2022-03-29
Application granted; publication of CN114252073B: 2023-09-15
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 — Instruments for performing navigational calculations
    • G01C21/10 — Navigation by using measurements of speed or acceleration
    • G01C21/12 — Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 — Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 — Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot attitude data fusion method. An attitude angle quaternion is obtained by integrating the angular velocity output by a gyroscope; the measured three-axis acceleration is compared with the gravity true value [0, 0, 1]^T in the world coordinate system to obtain Δq_acc, which corrects the roll angle and pitch angle; Δq_mag, obtained from the output of the magnetometer, corrects the yaw angle; the attitude is then corrected accordingly. The map drawn after the data fusion of the invention has distinct edge lines, smooth walls and clear right-angled corners.

Description

Robot attitude data fusion method
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot attitude data fusion method.
Background
SLAM (simultaneous localization and mapping) is one of the key technologies for autonomous navigation of mobile robots. With the continuous improvement of artificial intelligence and hardware computing capability, SLAM technology has become increasingly diversified. SLAM based on two-dimensional laser radar has long been a research hotspot, but it faces relatively complicated environments and difficult corner cases. Gmapping is a common filtering-based open-source SLAM framework. The algorithm can draw an indoor map in real time and achieves high precision when mapping small scenes, but it depends heavily on odometer data. If the odometer estimates the robot's position only from data obtained from the motion sensors, the position fix may be biased; in practice there are also wheel slip and cumulative errors. Accurate odometer data is very important for the whole system, and a certain deviation occurs if the odometry is obtained from the encoder alone.
The IMU (inertial measurement unit) is a high-sensitivity sensor. Due to physical factors in the manufacturing process, there is a certain deviation between the actual and ideal coordinate axes of the IMU, which introduces noise, scale errors and axis misalignment. The collected data therefore need to be calibrated to compensate for these errors. When fusing the calibrated IMU data, the high-frequency noise of the accelerometer and magnetometer and the low-frequency noise of the gyroscope can cause large errors in attitude calculation. A proper data fusion method therefore needs to be selected.
Disclosure of Invention
The invention provides a robot attitude data fusion method aiming at the problems in the prior art.
The invention solves the technical problems through the following technical means:
the accelerometer is sensitive to acceleration, and an error of an inclination angle calculated by taking an instantaneous value is large; the acceleration measures the inclination angle, the dynamic response of which is slow, and the signal is not available at high frequency, so that the high frequency can be suppressed by low pass. The angle obtained by integrating the gyroscope is not influenced by the acceleration, but the error caused by integral drift and temperature drift along with the increase of time is larger, the response of the gyroscope is fast, and the inclination angle can be measured after integration.
The invention obtains an angle by integrating the angular velocity output by the gyroscope, then corrects the roll angle and pitch angle of this angle with the output of the accelerometer, and finally corrects the yaw angle with the magnetometer.
The method comprises the following specific steps:
1) Calculate the attitude angle quaternion at time k in the discrete system using the following formula:

{}^L_G q_{\omega,k} = {}^L_G q_{k-1} + {}^L_G \dot{q}_{\omega,k}\,\Delta t

where G represents the world coordinate system, L represents the current coordinate system, and {}^L_G \dot{q}_{\omega,k} represents the derivative of the quaternion q at time k.
2) Convert the three-axis acceleration to the world coordinate system using the following equation:

{}^G g_p = R({}^L_G q_{\omega,k})^{-1}\,{}^L a

where R({}^L_G q_{\omega,k})^{-1} represents the inverse transformation of the quaternion {}^L_G q_{\omega,k}.
3) Compare the obtained estimate {}^G g_p = [g_x, g_y, g_z]^T of the acceleration in the world coordinate system with the actual gravitational acceleration, and calculate the error quaternion of {}^L_G q_{\omega,k} from the following formula:

R(\Delta q_{acc})\,[0\ \ 0\ \ 1]^T = {}^G g_p

Through the calculation,

\Delta q_{acc} = \left[ \sqrt{\frac{g_z+1}{2}},\ -\frac{g_y}{\sqrt{2(g_z+1)}},\ \frac{g_x}{\sqrt{2(g_z+1)}},\ 0 \right]^T
4) calculating Δ q using the formulamag
Δqmag=[Δq0mag 0 0 Δq3mag]T
Through the calculation, the method has the advantages that,
Figure BDA0003375660940000031
5) Correct the attitude by the following formula:

{}^L_G q_k = {}^L_G q_{\omega,k} \otimes \Delta q_{acc} \otimes \Delta q_{mag}
The invention has the beneficial effects that the map drawn after the data fusion of the invention has distinct edge lines, smooth walls and clear right-angled corners.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a rendered map obtained without the method of the present invention;
FIG. 3 is a rendered map obtained by the method of the present invention;
fig. 4 is an application scenario of the robot pose data fusion method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
As shown in fig. 4, which is an application scenario of the robot attitude data fusion method of the present invention, the IMU attitude sensor is located on the central axis of the disinfection robot, coaxial with the laser radar; the robot is driven by wheels; the odometer is located in the DS servo driver connected with the hub motors; and the hub motors are distributed on both sides of the central axis, symmetric about it.
The rotation matrix R in SLAM can be represented by a quaternion q. A quaternion has one real part and three imaginary parts, written q = [q0, q1, q2, q3]^T or q = [w, x, y, z]^T, where q0 is the real part (a scalar) and [q1, q2, q3]^T is the imaginary part (a vector).
Fig. 1 is a flow chart of the method, and the method specifically includes the following steps:
Using a quaternion q to represent the rotation, the attitude angle quaternion at time k in the discrete system can be obtained by the following formula (1):

{}^L_G q_{\omega,k} = {}^L_G q_{k-1} + {}^L_G \dot{q}_{\omega,k}\,\Delta t ......(1)

In formula (1), G represents the world coordinate system and L represents the current coordinate system; {}^L_G q left-multiplies any vector in the world coordinate system and rotates that vector from the world coordinate system to the current coordinate system; {}^L_G \dot{q}_{\omega,k} is the derivative of the quaternion q at time k.
The quaternion derivative is calculated by formula (2):

{}^L_G \dot{q}_{\omega,k} = -\tfrac{1}{2}\,{}^L\omega_k \otimes {}^L_G q_{k-1} ......(2)

where {}^L\omega_k = [0, \omega_x, \omega_y, \omega_z]^T is the angular velocity output by the gyroscope at time k, written as a pure quaternion. In formula (2), the circle product is calculated by formula (3):

p \otimes q = [\,p_0 q_0 - p_1 q_1 - p_2 q_2 - p_3 q_3,\ \ p_0 q_1 + p_1 q_0 + p_2 q_3 - p_3 q_2,\ \ p_0 q_2 - p_1 q_3 + p_2 q_0 + p_3 q_1,\ \ p_0 q_3 + p_1 q_2 - p_2 q_1 + p_3 q_0\,]^T ......(3)
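As an illustration (an editorial sketch, not part of the patent text), the circle product of formula (3) maps directly to code. The following Python fragment assumes quaternions stored as numpy arrays in [q0, q1, q2, q3] order:

import numpy as np

def quat_multiply(p, q):
    # Circle (Hamilton) product of formula (3); inputs in [q0, q1, q2, q3] order.
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([
        p0 * q0 - p1 * q1 - p2 * q2 - p3 * q3,
        p0 * q1 + p1 * q0 + p2 * q3 - p3 * q2,
        p0 * q2 - p1 * q3 + p2 * q0 + p3 * q1,
        p0 * q3 + p1 * q2 - p2 * q1 + p3 * q0,
    ])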
Using forward difference calculation in formula (1),

{}^L_G \dot{q}_{\omega,k} = ({}^L_G q_{\omega,k} - {}^L_G q_{k-1}) / \Delta t ......(4)

{}^L_G q_{\omega,k} = {}^L_G q_{k-1} - \tfrac{\Delta t}{2}\,{}^L\omega_k \otimes {}^L_G q_{k-1} ......(5)

Expanding the circle product into ordinary dot-product (matrix) operations:

{}^L_G q_{\omega,k} = \left( I_4 - \tfrac{\Delta t}{2}\,\Omega({}^L\omega_k) \right) {}^L_G q_{k-1},\qquad \Omega(\omega) = \begin{bmatrix} 0 & -\omega_x & -\omega_y & -\omega_z \\ \omega_x & 0 & -\omega_z & \omega_y \\ \omega_y & \omega_z & 0 & -\omega_x \\ \omega_z & -\omega_y & \omega_x & 0 \end{bmatrix} ......(6)

{}^L_G q_{\omega,k} is thus calculated from the angular velocity.
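A minimal sketch of this prediction step, continuing the conventions of the quat_multiply fragment above (omega is the gyroscope reading [wx, wy, wz] in rad/s and dt the sampling period; names are illustrative, not the patent's reference code):

def gyro_predict(q_prev, omega, dt):
    # Formula (5): q_w,k = q_{k-1} - (dt/2) * (omega x q_{k-1}), omega as a pure quaternion.
    omega_quat = np.array([0.0, omega[0], omega[1], omega[2]])
    q_w = q_prev - 0.5 * dt * quat_multiply(omega_quat, q_prev)
    return q_w / np.linalg.norm(q_w)  # re-normalize to keep a unit quaternion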
The three-axis acceleration is converted into the world coordinate system, and the converted acceleration is compared with the true value [0, 0, 1]^T of gravitational acceleration in the world coordinate system to obtain Δq_acc. Because only three-axis acceleration data are used, only the roll angle and pitch angle can be calculated, so Δq_acc contains corrections of the roll angle and pitch angle only. With {}^L_G q_{\omega,k} obtained from the gyroscope, {}^L a is converted to the world coordinate system:

{}^G g_p = R({}^L_G q_{\omega,k})^{-1}\,{}^L a ......(8)

In formula (8), R({}^L_G q_{\omega,k})^{-1} represents the inverse transformation of the quaternion {}^L_G q_{\omega,k}.
The obtained estimate {}^G g_p = [g_x, g_y, g_z]^T of the acceleration in the world coordinate system is compared with the actual gravitational acceleration to calculate the error quaternion of {}^L_G q_{\omega,k}:

R(\Delta q_{acc})\,[0\ \ 0\ \ 1]^T = {}^G g_p ......(9)

R(q) represents the rotation matrix corresponding to the quaternion q, and the specific formula is:

R(q) = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2-q_1^2-q_2^2+q_3^2 \end{bmatrix} ......(10)

Substituting (10) into (9) and rearranging gives:

2(\Delta q_{1acc}\Delta q_{3acc} + \Delta q_{0acc}\Delta q_{2acc}) = g_x,\quad 2(\Delta q_{2acc}\Delta q_{3acc} - \Delta q_{0acc}\Delta q_{1acc}) = g_y,\quad \Delta q_{0acc}^2 - \Delta q_{1acc}^2 - \Delta q_{2acc}^2 + \Delta q_{3acc}^2 = g_z ......(11)

Let \Delta q_{3acc} = 0 (the accelerometer provides no yaw information), and finally solve:

\Delta q_{acc} = \left[ \sqrt{\frac{g_z+1}{2}},\ -\frac{g_y}{\sqrt{2(g_z+1)}},\ \frac{g_x}{\sqrt{2(g_z+1)}},\ 0 \right]^T ......(12)
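A hedged Python sketch of this accelerometer correction, continuing the conventions above (the world-to-body frame convention of q and the helper names are assumptions of this sketch, not the patent's reference code):

def quat_conjugate(q):
    # Conjugate equals the inverse for a unit quaternion.
    return np.array([q[0], -q[1], -q[2], -q[3]])

def rotate_to_world(q_lg, v_body):
    # Inverse transformation of q_lg (world -> body) applied to a body-frame vector.
    v = np.array([0.0, v_body[0], v_body[1], v_body[2]])
    r = quat_multiply(quat_multiply(quat_conjugate(q_lg), v), q_lg)
    return r[1:]

def accel_delta_q(q_w, a_body):
    # Predicted gravity g_p = R(q_w)^-1 a (formula (8)), then the closed form (12).
    a = np.asarray(a_body, dtype=float)
    gx, gy, gz = rotate_to_world(q_w, a / np.linalg.norm(a))
    s = np.sqrt(2.0 * (gz + 1.0))  # assumes gz > -1, i.e. not exactly upside-down
    return np.array([np.sqrt((gz + 1.0) / 2.0), -gy / s, gx / s, 0.0])

For a unit gravity estimate the result is already a unit quaternion, so no extra normalization is needed here.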
After superposition, the following can be obtained:

{}^L_G q'_k = {}^L_G q_{\omega,k} \otimes \Delta q_{acc} ......(13)
Because the accelerometer can only correct the roll angle and the pitch angle, the yaw angle is corrected by the magnetometer. The output of the magnetometer can be understood as a representation, or projection, of the angle between the magnetic field lines and the three axes of the body coordinate system.
\Delta q_{mag} = [\Delta q_{0mag}\ \ 0\ \ 0\ \ \Delta q_{3mag}]^T ......(14)

R(\Delta q_{mag})\,[\sqrt{\Gamma}\ \ 0\ \ l_z]^T = {}^G l ......(15)

where {}^G l = [l_x, l_y, l_z]^T is the magnetometer output converted to the world coordinate system. Coupling (14) and (15) yields:

\Delta q_{0mag} = \sqrt{\frac{\Gamma + l_x\sqrt{\Gamma}}{2\Gamma}},\qquad \Delta q_{3mag} = \frac{l_y}{\sqrt{2(\Gamma + l_x\sqrt{\Gamma})}} ......(16)

In formula (16), \Gamma = l_x^2 + l_y^2.
The attitude after the acceleration correction is further corrected by adding the yaw-angle correction:

{}^L_G q_k = {}^L_G q_{\omega,k} \otimes \Delta q_{acc} \otimes \Delta q_{mag} ......(17)
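Finally, a sketch of the magnetometer correction and one full fusion step, under the same assumptions as the earlier fragments (m_body is the raw magnetometer reading; function names are illustrative):

def mag_delta_q(q_prime, m_body):
    # World-frame field l = R(q')^-1 m, then the yaw-only correction of formulas (14)-(16).
    lx, ly, _lz = rotate_to_world(q_prime, np.asarray(m_body, dtype=float))
    gamma = lx * lx + ly * ly          # Gamma in formula (16); assumes gamma > 0
    s = np.sqrt(gamma + lx * np.sqrt(gamma))
    return np.array([s / np.sqrt(2.0 * gamma), 0.0, 0.0, ly / (np.sqrt(2.0) * s)])

def fuse_step(q_prev, omega, a_body, m_body, dt):
    # One cycle of the method: gyro prediction, roll/pitch correction, yaw correction (17).
    q_w = gyro_predict(q_prev, omega, dt)
    q_prime = quat_multiply(q_w, accel_delta_q(q_w, a_body))
    q_k = quat_multiply(q_prime, mag_delta_q(q_prime, m_body))
    return q_k / np.linalg.norm(q_k)

For example, starting from q = np.array([1.0, 0.0, 0.0, 0.0]) and calling fuse_step once per IMU sample tracks the attitude; a practical implementation would also guard the gz > -1 and Gamma > 0 singularities and low-pass the raw sensor data, as discussed above.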
the simulation experiment is carried out in a Turtlebot simulation experiment environment, wherein FIG. 2 corresponds to a drawn map obtained without data fusion according to the present invention, and FIG. 3 corresponds to a drawn map obtained with data fusion according to the present invention. As can be clearly seen from the simulation graph, the map before data fusion has the defects that the deviation of wall identification is unclear and particularly the position of a right-angle corner has a fuzzy condition. The map edge line after data fusion is obvious and the wall body is smooth, and the right-angle corner is also clear.
It is noted that, in this document, relational terms such as first and second, and the like, if any, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (5)

1. A robot attitude data fusion method is characterized by comprising the following steps:
step 1), obtaining an attitude angle quaternion by integrating the angular velocity output by a gyroscope;
step 2), comparing the three-axis acceleration with the acceleration true value [0, 0, 1]^T in the world coordinate system to obtain Δq_acc, so as to correct the roll angle and the pitch angle;
step 3), obtaining Δq_mag from the output of the magnetometer to correct the yaw angle;
step 4), performing attitude correction using the following formula:

{}^L_G q_k = {}^L_G q_{\omega,k} \otimes \Delta q_{acc} \otimes \Delta q_{mag}
2. The robot attitude data fusion method according to claim 1, wherein the attitude angle quaternion at time k in the discrete system in step 1) is obtained by the following calculation:

{}^L_G q_{\omega,k} = {}^L_G q_{k-1} + {}^L_G \dot{q}_{\omega,k}\,\Delta t

where G represents the world coordinate system, L represents the current coordinate system, and {}^L_G \dot{q}_{\omega,k} represents the derivative of the quaternion q at time k.
3. The robot attitude data fusion method according to claim 1, wherein the three-axis acceleration is converted to the world coordinate system in step 2) using the following formula:

{}^G g_p = R({}^L_G q_{\omega,k})^{-1}\,{}^L a

where R({}^L_G q_{\omega,k})^{-1} represents the inverse transformation of the quaternion {}^L_G q_{\omega,k}.
4. The robot attitude data fusion method according to claim 3, wherein the obtained estimate {}^G g_p of the acceleration in the world coordinate system is compared with the actual gravitational acceleration, and the error quaternion of {}^L_G q_{\omega,k} is calculated using the following formula:

R(\Delta q_{acc})\,[0\ \ 0\ \ 1]^T = {}^G g_p

Through the calculation,

\Delta q_{acc} = \left[ \sqrt{\frac{g_z+1}{2}},\ -\frac{g_y}{\sqrt{2(g_z+1)}},\ \frac{g_x}{\sqrt{2(g_z+1)}},\ 0 \right]^T
5. The robot attitude data fusion method according to claim 1, wherein in step 3), Δq_mag is calculated using the following formula:

\Delta q_{mag} = [\Delta q_{0mag}\ \ 0\ \ 0\ \ \Delta q_{3mag}]^T

Through the calculation,

\Delta q_{0mag} = \sqrt{\frac{\Gamma + l_x\sqrt{\Gamma}}{2\Gamma}},\qquad \Delta q_{3mag} = \frac{l_y}{\sqrt{2(\Gamma + l_x\sqrt{\Gamma})}}
CN202111415634.XA 2021-11-25 2021-11-25 Robot attitude data fusion method Active CN114252073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111415634.XA CN114252073B (en) 2021-11-25 2021-11-25 Robot attitude data fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111415634.XA CN114252073B (en) 2021-11-25 2021-11-25 Robot attitude data fusion method

Publications (2)

Publication Number Publication Date
CN114252073A true CN114252073A (en) 2022-03-29
CN114252073B CN114252073B (en) 2023-09-15

Family

ID=80793301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111415634.XA Active CN114252073B (en) 2021-11-25 2021-11-25 Robot attitude data fusion method

Country Status (1)

Country Link
CN (1) CN114252073B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7216055B1 (en) * 1998-06-05 2007-05-08 Crossbow Technology, Inc. Dynamic attitude measurement method and apparatus
CN102438521A (en) * 2009-04-24 2012-05-02 原子能和辅助替代能源委员会 System and method for determining the posture of a person
WO2018214226A1 (en) * 2017-05-22 2018-11-29 深圳市靖洲科技有限公司 Unmanned vehicle real-time posture measurement method
CN108225370A (en) * 2017-12-15 2018-06-29 路军 A kind of data fusion and calculation method of athletic posture sensor
CN109001787A (en) * 2018-05-25 2018-12-14 北京大学深圳研究生院 A kind of method and its merge sensor of solving of attitude and positioning
CN109108936A (en) * 2018-10-24 2019-01-01 电子科技大学 A kind of the self-balance robot control system and control method of Multiple Source Sensor data fusion
CN110231029A (en) * 2019-05-08 2019-09-13 西安交通大学 A kind of underwater robot Multi-sensor Fusion data processing method
CN110440746A (en) * 2019-08-05 2019-11-12 桂林电子科技大学 A kind of no-dig technique subterranean drill bit posture fusion method based on the decline of quaternary number gradient
CN110595434A (en) * 2019-09-10 2019-12-20 兰州交通大学 Quaternion fusion attitude estimation method based on MEMS sensor
CN111156998A (en) * 2019-12-26 2020-05-15 华南理工大学 Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN112082547A (en) * 2020-09-08 2020-12-15 北京邮电大学 Integrated navigation system optimization method and device, electronic equipment and storage medium
CN112256047A (en) * 2020-09-30 2021-01-22 江苏集萃智能制造技术研究所有限公司 Quaternion-based four-foot attitude control strategy
CN112665574A (en) * 2020-11-26 2021-04-16 江苏科技大学 Underwater robot attitude acquisition method based on momentum gradient descent method
CN113091738A (en) * 2021-04-09 2021-07-09 安徽工程大学 Mobile robot map construction method based on visual inertial navigation fusion and related equipment

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
SAURAV AGARWAL et al.: "Robust Pose-Graph SLAM Using Absolute Orientation Sensing", pages 981-988
WU Xiaohang: "Filtering and Information Fusion Algorithms for Nonlinear *** under Non-ideal Conditions", no. 1, pages 135-81
SUN Yujie: "Research on Attitude Detection and Attitude Error Compensation Technology for Collaborative Robots", no. 9, pages 140-311
ZHANG Zheng et al.: "Dynamic Step-Size Gradient Descent Attitude Calculation Algorithm with Motion Acceleration Suppression", no. 2, pages 136-143
ZHAO Ziqiao: "Indoor Mobile Robot Localization and Mapping Based on a 3D Laser Scanner", no. 1, pages 140-552

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117288187A (en) * 2023-11-23 2023-12-26 北京小米机器人技术有限公司 Robot pose determining method and device, electronic equipment and storage medium
CN117288187B (en) * 2023-11-23 2024-02-23 北京小米机器人技术有限公司 Robot pose determining method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114252073B (en) 2023-09-15

Similar Documents

Publication Publication Date Title
CN111207774B (en) Method and system for laser-IMU external reference calibration
CN110398245B (en) Indoor pedestrian navigation attitude estimation method based on foot-worn inertial measurement unit
CN111896007B (en) Attitude calculation method for quadruped robot for compensating foot-ground impact
CN112815939B (en) Pose estimation method of mobile robot and computer readable storage medium
CN113155129B (en) Holder attitude estimation method based on extended Kalman filtering
JP2012173190A (en) Positioning system and positioning method
CN110986928B (en) Real-time drift correction method for triaxial gyroscope of photoelectric pod
CN116817896B (en) Gesture resolving method based on extended Kalman filtering
CN116147624B (en) Ship motion attitude calculation method based on low-cost MEMS navigation attitude reference system
CN115540860A (en) Multi-sensor fusion pose estimation algorithm
CN114252073B (en) Robot attitude data fusion method
CN115752507A (en) Online single-steering-wheel AGV parameter calibration method and system based on two-dimensional code navigation
CN109443355B (en) Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF
CN111679681B (en) Underwater robot motion path tracking method based on sliding mode robust control
CN115993089B (en) PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method
CN113155152A (en) Camera and inertial sensor spatial relationship self-calibration method based on lie group filtering
CN112904883B (en) Terrain perception method, motion control method and system for quadruped robot
CN110733671B (en) Dynamic correction method for small celestial body spin angular velocity
CN113237485A (en) SLAM method and system based on multi-sensor fusion
CN113483762A (en) Pose optimization method and device
CN115717901B (en) Inertial/visual odometer installation error estimation method based on filtering
CN112589806B (en) Robot pose information determination method, device, equipment and storage medium
Min et al. Quadrotor hovering scheme based on improved optical flow
Doer et al. Inertial sensor data based motion estimation aided by image processing and differential barometry
CN116858230A (en) Lissy-based inertial navigation/radar combined navigation method of radar ranging and speed measuring emission system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant