CN109186596B - IMU measurement data generation method, system, computer device and readable storage medium - Google Patents

IMU measurement data generation method, system, computer device and readable storage medium

Info

Publication number
CN109186596B
CN109186596B (application CN201810939333.9A)
Authority
CN
China
Prior art keywords
information
pose
trajectory
motion model
interpolation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810939333.9A
Other languages
Chinese (zh)
Other versions
CN109186596A (en)
Inventor
凌勇
梁斌
刘厚德
姜军
王学谦
朱晓俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Research Institute Tsinghua University
Original Assignee
Shenzhen Research Institute Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Research Institute Tsinghua University filed Critical Shenzhen Research Institute Tsinghua University
Priority to CN201810939333.9A priority Critical patent/CN109186596B/en
Publication of CN109186596A publication Critical patent/CN109186596A/en
Application granted granted Critical
Publication of CN109186596B publication Critical patent/CN109186596B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an IMU measurement data generation method and system, a computer device and a readable storage medium. The IMU measurement data generation method comprises: obtaining pose trajectory data, wherein the pose trajectory data comprises a plurality of pieces of pose trajectory information; interpolating between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them; calculating acceleration information of a motion model according to the position information in the pose trajectory data; calculating angular velocity information of the motion model according to the rotation information in the pose trajectory data; and performing a pose transformation operation on the motion model. According to the invention, IMU measurement data can be generated from the camera's pose trajectory information, thereby avoiding the hardware synchronization problem and the camera-IMU relative mounting position problem that arise after an IMU is installed on the robot.

Description

IMU measurement data generation method, system, computer device and readable storage medium
Technical Field
The invention relates to the field of robots, in particular to an IMU measurement data generation method and system based on camera pose, a computer device and a readable storage medium.
Background
This section is intended to provide a background or context to the embodiments of the invention that are recited in the claims and the detailed description. The description herein is not admitted to be prior art by inclusion in this section.
With the continuous use and development of mobile robots in human life, cleaning robots, family-accompanying robots, meal-delivery robots and the like have steadily entered the public eye. Current robot development focuses on enabling a robot to acquire environmental information through various sensors, and to complete certain tasks by using artificial intelligence to recognize, understand, reason, and make judgments and decisions.
The SLAM (Simultaneous Localization And Mapping) problem can be described as follows: a robot is placed at an unknown position in an unknown environment; is there a method by which the robot can gradually build a map of the environment and estimate its own position and attitude from the map being generated? SLAM was first proposed by Smith, Self and Cheeseman in 1988 and, owing to its important theoretical and application value, is considered by many scholars to be key to realizing truly autonomous mobile robots. Visual SLAM is a SLAM method that uses a visual sensor as the robot's perception device. Visual sensors may be classified into monocular cameras, binocular cameras, RGB-D cameras, etc., where an RGB-D camera is a visual sensor capable of simultaneously obtaining color image information (RGB) and depth information (Depth).
The visual odometer is an important component of SLAM technology; it is mainly responsible for collecting sensor information and preliminarily estimating the camera's motion trajectory and motion state (i.e., those of the rigidly attached robot). After this preliminary estimation, the odometry part passes the result to the other parts of the SLAM system to complete a higher-accuracy estimation.
An IMU (Inertial Measurement Unit) is a device that measures the three-axis angular velocity and linear acceleration of an object. Typically, an IMU contains three single-axis accelerometers and three single-axis gyroscopes. The accelerometers detect acceleration signals of the object along three independent axes of the carrier coordinate system, while the gyroscopes detect angular velocity signals of the carrier relative to the navigation coordinate system; together they measure the object's angular velocity and acceleration in three-dimensional space, from which the object's attitude can be computed. The IMU therefore has important application value in robot navigation and control.
When measuring the performance of a visual-inertial odometer or a visual-inertial SLAM algorithm and testing its usability, an IMU can be installed on the robot. However, once an IMU is installed, the problem of hardware synchronization (i.e., measurement time synchronization) between IMU measurement data and camera measurement data and the problem of the relative mounting positions of the camera and the IMU must be considered; these involve the calibration of camera and IMU data, for which no perfect solution currently exists. Moreover, most open-source datasets in the field of visual SLAM contain no IMU measurement data.
Disclosure of Invention
In view of the above, the present invention provides an IMU measurement data generation method, system, computer device and readable storage medium, which can generate IMU data based on camera pose information, and which help avoid the hardware synchronization problem and the camera-IMU relative mounting position problem that arise after an IMU is installed on the robot.
An IMU measurement data generation method, comprising:
acquiring pose trajectory data, wherein the pose trajectory data comprises a plurality of pieces of pose trajectory information, and each piece of pose trajectory information comprises position information and rotation information;
interpolating between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them, and updating the pose trajectory data;
calculating acceleration information of a motion model according to the position information in the updated pose trajectory data, and calculating angular velocity information of the motion model according to the rotation information in the updated pose trajectory data;
and performing a pose transformation operation on the motion model to transform it into a motion model in the inertial measurement unit coordinate system.
Further, in the IMU measurement data generation method, the interpolating between the two adjacent pieces of pose trajectory information includes:
performing interpolation on the two adjacent pieces of pose trajectory information according to the frequency of the inertial measurement unit.
Further, in the IMU measurement data generation method, the interpolating between the two adjacent pieces of pose trajectory information includes:
performing an interpolation operation on the position information in the two adjacent pieces of pose trajectory information using a linear interpolation algorithm;
and performing an interpolation operation on the rotation information in the two adjacent pieces of pose trajectory information using spherical linear interpolation.
Further, in the IMU measurement data generation method, the interpolating between the two adjacent pieces of pose trajectory information includes:
performing an interpolation operation on the position information in the two adjacent pieces of pose trajectory information using a polynomial function algorithm;
and performing an interpolation operation on the rotation information in the two adjacent pieces of pose trajectory information using spherical linear interpolation.
Further, in the IMU measurement data generation method, the interpolating, using spherical linear interpolation, of the rotation information in the two adjacent pieces of pose trajectory information includes:
judging whether the dot product of the rotation information in first pose trajectory information and the rotation information in second pose trajectory information is negative, wherein the two adjacent pieces of pose trajectory information comprise the first pose trajectory information and the second pose trajectory information;
when the dot product is negative, performing the interpolation operation between the rotation information in the first pose trajectory information and the conjugate of the rotation information in the second pose trajectory information;
and when the dot product is positive, performing the interpolation operation between the rotation information in the first pose trajectory information and the rotation information in the second pose trajectory information.
Further, in the IMU measurement data generation method, the performing a pose transformation operation on the motion model includes:
multiplying the motion model by a homogeneous transformation matrix to obtain the motion model in the inertial measurement unit coordinate system.
Further, the IMU measurement data generation method further includes:
performing noise processing on the transformed motion model to generate the measurement data corresponding to the inertial measurement unit.
An IMU measurement data generation system, comprising:
an acquisition unit for acquiring pose trajectory data, wherein the pose trajectory data comprises a plurality of pieces of pose trajectory information, and each piece of pose trajectory information comprises position information and rotation information;
an interpolation unit for interpolating between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them, and updating the pose trajectory data;
a calculation unit for calculating the acceleration information of a motion model according to the position information in the updated pose trajectory data, and calculating the angular velocity information of the motion model according to the rotation information in the updated pose trajectory data;
and a transformation unit for performing a pose transformation operation on the motion model to transform it into a motion model in the inertial measurement unit coordinate system.
A computer device, comprising a processor, wherein the processor implements the steps of the IMU measurement data generation method when executing a computer program stored in a memory.
A readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the IMU measurement data generation method.
The IMU measurement data generation method, system, computer device and readable storage medium can generate IMU measurement data from the camera's pose trajectory information, thereby avoiding the hardware synchronization problem and the camera-IMU relative mounting position problem that arise after an IMU is installed on the robot.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of a preferred embodiment of the IMU measurement data generation method of the present invention;
FIG. 2 is a schematic diagram of an IMU measurement data generation system of the present invention applied to a computer device;
FIG. 3 is a block diagram of a preferred embodiment of the IMU measurement data generation system of the present invention.
Description of the main elements
Computer device 1
IMU measurement data generation system 417
Processor 401
Display screen 403
Memory 405
Input/output interface 407
Network interface 409
Acquisition unit 300
Interpolation unit 302
Calculation unit 304
Transform unit 306
Processing unit 308
Detailed Description
So that the manner in which the above recited objects, features and advantages of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments thereof which are illustrated in the appended drawings. In addition, the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention; the described embodiments are merely a subset of the embodiments of the present invention, rather than all of them. All other embodiments obtained by a person skilled in the art without any inventive step, based on the embodiments of the present invention, fall within the scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Fig. 1 is a flowchart of an IMU measurement data generation method according to a first embodiment of the present invention. It should be noted that the method of this embodiment is not limited to the steps and their order in the flowchart shown in Fig. 1; steps in the illustrated flowchart may be added, removed, or reordered according to different needs.
Step 100, pose trajectory data are obtained, wherein the pose trajectory data comprise a plurality of pose trajectory information, and each pose trajectory information comprises position information and rotation information.
It can be understood that a camera may be mounted on the robot to obtain pose data such as the robot's motion trajectory and motion state. The camera may have a six-degree-of-freedom trajectory (which, for a public dataset, may come from its ground truth). The pose data may be stored in a pose trajectory file or in a pose trajectory database, so when pose trajectory data is needed it can be read from the corresponding file or database. Preferably, the pose trajectory data may include multiple pieces of pose trajectory information, each of which may include a timestamp, position information and rotation information. Specifically, the timestamp may be of double-precision type (double); the position information may correspond to a three-degree-of-freedom position and consist of three double-precision values; and the rotation information may correspond to three-degree-of-freedom rotation and consist of four double-precision values, expressed as a quaternion of the form w + xi + yj + zk.
For example, for one piece of pose trajectory information, the timestamp may be 1305031102.211214; the first, second and third components of the position information may be -0.001420095, 0.007655886 and 0.010920254, respectively; and the quaternion components x, y, z and w of the rotation information may be -0.001964978, -0.003080067, -0.001635573 and 0.999992013, respectively.
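As a minimal sketch, a record laid out as above can be parsed as follows (the whitespace-separated layout matches the example record; the function name and dictionary keys are assumptions for illustration):

```python
# Hypothetical parser for one pose trajectory record laid out as:
# timestamp tx ty tz qx qy qz qw (all double precision)
def parse_pose_line(line):
    fields = [float(f) for f in line.split()]
    return {
        "timestamp": fields[0],   # double-precision timestamp
        "position": fields[1:4],  # three-degree-of-freedom position
        "rotation": fields[4:8],  # quaternion (x, y, z, w)
    }

pose = parse_pose_line(
    "1305031102.211214 -0.001420095 0.007655886 0.010920254 "
    "-0.001964978 -0.003080067 -0.001635573 0.999992013"
)
```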
Step 102, interpolating between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them, and updating the pose trajectory data.
In an embodiment, interpolation between the two adjacent pieces of pose trajectory information may be performed according to the frequency of the inertial measurement unit. The IMU's measurement frequency is generally higher than the camera's, in some cases more than five times higher. For example, the camera may measure at around 30 Hz while the IMU measures at around 200 Hz. In other embodiments, the two adjacent pieces of pose trajectory information may be interpolated in other ways (for example, by inserting a fixed number of pieces of pose trajectory information between them).
In an embodiment, the camera's measurement frequency may be derived from the camera's timestamps. For example, the pose trajectory data may include first, second, third and fourth pose trajectory information. The camera's measurement frequency may be based on the number of pose trajectory records per unit time (e.g., 1 second); thus, when the time intervals between different pairs of adjacent pose trajectory records differ (the difference between the timestamps of two adjacent records can be taken as the time interval), the number of poses inserted between different pairs may also differ. In other embodiments, the camera's measurement frequency may be obtained by other methods, such as reading the camera's parameters.
Preferably, after the measurement frequency of the camera is confirmed, interpolation can be performed between the two adjacent pieces of pose trajectory information according to the frequency of the inertial measurement unit. For example, N pieces of pose trajectory information may be inserted between the first and second pose trajectory information, M pieces between the second and third, and P pieces between the third and fourth. It will be appreciated that the frequency of the inertial measurement unit may be stored in a configuration file, from which it can then be read.
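A small sketch of how the number of inserted poses might be chosen from the two timestamps and the IMU frequency (the rounding policy and function name are assumptions, not taken from the patent):

```python
def num_insertions(t0, t1, imu_freq):
    """Number of interpolated poses to insert between two camera timestamps
    so the updated trajectory approaches the IMU rate. A sketch; the
    rounding policy is an assumption, not specified in the patent."""
    total_samples = int(round((t1 - t0) * imu_freq))
    return max(total_samples - 1, 0)

# ~30 Hz camera with a 200 Hz IMU: about 6 poses inserted per camera interval
n = num_insertions(0.0, 1.0 / 30.0, 200.0)
```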
It is to be understood that interpolating between two adjacent pieces of pose trajectory information may include interpolating the position information and interpolating the rotation information.

Preferably, in the interpolation of the position information, if the time interval between the two adjacent pieces of pose trajectory information is within a preset range (for example, within 0.03 s), indicating that the motion of the robot or carrier does not change drastically, an interpolation algorithm may be used to calculate pose trajectory information for a number of virtual cameras between the two adjacent pieces of pose trajectory information.
For the three-degree-of-freedom position information of the camera, a linear interpolation algorithm may be used to complete the interpolation, as follows:

y = t·y0 + (1 − t)·y1

where t is set in the interval [0, 1], y0 represents the pose trajectory information corresponding to the first timestamp of the two adjacent pieces of pose trajectory information, and y1 represents that corresponding to the second timestamp; the first timestamp may be earlier than the second, which ensures the interpolation result lies between y0 and y1. Multiple interpolation results can be obtained by setting multiple values of t. For example, to insert N pieces of pose trajectory information between the first and second pose trajectory information, N values of t may be used; to insert M pieces between the second and third, M values of t may be used. The timestamps of the interpolated pose trajectory information can also be determined from the same formula, by replacing y0 and y1 with the timestamps of the first and second pose trajectory information, respectively.
It will be appreciated that, in addition to linear interpolation, when the camera's motion changes more drastically the interpolation operation may be performed using a polynomial function, which can better model the motion's variation.
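The linear interpolation formula above can be sketched as follows, applied componentwise over the three position coordinates:

```python
def lerp(y0, y1, t):
    """Linear interpolation per y = t*y0 + (1 - t)*y1, componentwise over
    the three position coordinates; t = 1 returns y0 and t = 0 returns y1."""
    return [t * a + (1.0 - t) * b for a, b in zip(y0, y1)]

# t = 0.5 gives the midpoint between the two adjacent positions
mid = lerp([0.0, 0.0, 0.0], [1.0, 2.0, 3.0], 0.5)
```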
Preferably, a spherical linear interpolation algorithm may be used for the rotation information. Since the quaternions representing rotation lie on a manifold (the unit sphere in quaternion space), which is not closed under addition and subtraction, simple linear interpolation yields non-uniform results and cannot guarantee that the result is still a unit quaternion. The rotation information is therefore interpolated by spherical linear interpolation.
For example, for the rotation information Q1 in the first pose trajectory information and the rotation information Q2 in the adjacent second pose trajectory information, the dot product of Q1 and Q2 may be calculated first. If the dot product is negative, the angle between Q1 and Q2 corresponds to a rotation greater than 180 degrees; Q2 is then replaced by its conjugate, and interpolating between Q1 and the conjugate of Q2 gives the optimal (shorter-arc) interpolation. If the dot product is positive, the interpolation is performed directly between Q1 and Q2.
The interpolation weights are then calculated as follows:

K0 = sin((1.0 − t)·θ)/sin θ;

K1 = sin(t·θ)/sin θ, where θ = atan(sina/cosa);

where cosa is the absolute value of the dot product of Q1 and Q2, and sina = sqrt(1 − cosa·cosa).

The output result is K0·Q1 + K1·Q2 (or K0·Q1 + K1·conj(Q2) when the dot product was negative), and this interpolation result guarantees that the quaternions are uniformly distributed along the arc.
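The spherical linear interpolation described above, including the dot-product sign check and the K0/K1 weights, can be sketched as follows (quaternions as (w, x, y, z) lists; the near-parallel fallback to linear interpolation is an added safeguard, not from the patent):

```python
import math

def slerp(q1, q2, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z),
    following the dot-product sign check and the K0/K1 weights above."""
    dot = sum(a * b for a, b in zip(q1, q2))
    if dot < 0.0:           # shorter-arc correction: flip the second quaternion
        q2 = [-c for c in q2]
        dot = -dot
    if dot > 0.9995:        # nearly parallel: fall back to lerp, sin(theta) ~ 0
        out = [(1.0 - t) * a + t * b for a, b in zip(q1, q2)]
    else:
        sina = math.sqrt(1.0 - dot * dot)
        theta = math.atan2(sina, dot)
        k0 = math.sin((1.0 - t) * theta) / sina
        k1 = math.sin(t * theta) / sina
        out = [k0 * a + k1 * b for a, b in zip(q1, q2)]
    norm = math.sqrt(sum(c * c for c in out))
    return [c / norm for c in out]

# halfway between the identity and a 90-degree rotation about z: 45 degrees about z
q = slerp([1.0, 0.0, 0.0, 0.0],
          [math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)], 0.5)
```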
It is to be understood that after the interpolation operation between two adjacent pieces of pose trajectory information is completed, one or more pieces of interpolated pose trajectory information lie between them, and the pose trajectory data can then be updated. For example, when 3 pieces of pose trajectory information (first, second and third interpolated pose trajectory information) are inserted between the first and second pose trajectory information, the updated pose trajectory data may include the first pose trajectory information, the first interpolated pose trajectory information, the second interpolated pose trajectory information, the third interpolated pose trajectory information, and the second pose trajectory information.
Step 104, calculating the acceleration information of the motion model according to the position information in the updated pose trajectory data, and calculating the angular velocity information of the motion model according to the rotation information in the updated pose trajectory data.
It is to be understood that calculating the acceleration information a[k] of the motion model according to the position information in the updated pose trajectory data may be expressed as:

v[k] = (p[k+1] − p[k]) / Δt;

a[k] = (v[k+1] − v[k]) / Δt;

where p[k+1] and p[k] represent the position information in two adjacent pieces of pose trajectory information, Δt represents the difference between their timestamps, and k is a natural number. For example, for the updated pose trajectory data, when k = 1, p[k+1] and p[k] represent the position information in the first pose trajectory information and the first interpolated pose trajectory information, and Δt represents the difference between their timestamps.
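The finite differences above can be sketched directly; here a constant-acceleration trajectory is used as a check (the sample values are illustrative only):

```python
def finite_diff(values, dt):
    """First-order finite difference, as in v[k] = (p[k+1] - p[k]) / dt."""
    return [(b - a) / dt for a, b in zip(values, values[1:])]

# positions sampled every 0.005 s (200 Hz) along one axis with constant
# acceleration 2 m/s^2: p(t) = t^2, so a should come out as 2 everywhere
dt = 0.005
p = [(k * dt) ** 2 for k in range(5)]
v = finite_diff(p, dt)   # velocities
a = finite_diff(v, dt)   # accelerations
```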
It is to be understood that calculating the angular velocity information ω of the motion model according to the rotation information in the updated pose trajectory data may be expressed as:

ω[k] = 2·log(q[k]⁻¹ ⊗ q[k+1]) / Δt;
where q[k+1] and q[k] represent the rotation information in two adjacent pieces of pose trajectory information, Δt represents the difference between their timestamps, and k is a natural number. For example, for the updated pose trajectory data, when k = 1, q[k+1] and q[k] represent the rotation information in the first pose trajectory information and the first interpolated pose trajectory information, and Δt represents the difference between their timestamps.
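One common discretization of this quaternion finite difference is sketched below; the patent's exact formula is rendered as an image, so this log-map form is an assumption (quaternions are (w, x, y, z) tuples):

```python
import math

def quat_mul(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def angular_velocity(qk, qk1, dt):
    """Body-frame angular velocity from consecutive unit quaternions:
    omega = 2 * log(qk^-1 * qk1) / dt. A sketch of one common discretization;
    the patent's exact formula may differ."""
    qk_inv = (qk[0], -qk[1], -qk[2], -qk[3])  # conjugate = inverse for unit quats
    dq = quat_mul(qk_inv, qk1)
    vnorm = math.sqrt(dq[1]**2 + dq[2]**2 + dq[3]**2)
    if vnorm < 1e-12:
        return (0.0, 0.0, 0.0)
    angle = 2.0 * math.atan2(vnorm, dq[0])    # total rotation angle
    scale = angle / (vnorm * dt)
    return (dq[1]*scale, dq[2]*scale, dq[3]*scale)

# rotating about z at 1 rad/s, sampled 0.01 s apart (half-angle 0.005 rad)
q0 = (1.0, 0.0, 0.0, 0.0)
q1 = (math.cos(0.005), 0.0, 0.0, math.sin(0.005))
w = angular_velocity(q0, q1, 0.01)
```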
Step 106, performing a pose transformation operation on the motion model to transform it into a motion model in the inertial measurement unit coordinate system.
It will be appreciated that three coordinate systems are used throughout the system: the camera coordinate system F_C, the IMU coordinate system F_I, and the world coordinate system F_W. The world coordinate system is the actual east-north-up coordinate system, in which the earth's gravity vector is (0, 0, -9.78) (Shenzhen). The camera coordinate system is fixed to the camera at its origin as the camera moves, and the IMU coordinate system is fixed to the IMU as it moves. Because the IMU and the camera are, in general, both fixed on the carrier, there is a fixed homogeneous transformation matrix T_CS between the two, and a homogeneous transformation matrix T_CW also exists between the camera coordinate system and the world coordinate system. The motion model calculated from the updated pose trajectory data is therefore expressed in camera coordinates, and when the motion model is to be applied to the IMU, a pose transformation operation must be performed on it: for example, multiplying the motion model by the homogeneous transformation matrix T_CS transforms it into the motion model in the inertial measurement unit coordinate system.
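The transformation into the IMU coordinate system amounts to a homogeneous matrix multiplication; a minimal sketch follows (the example extrinsic T_CS, a 10 cm offset with no rotation, is hypothetical):

```python
def mat_vec(T, v):
    """Apply a 4x4 homogeneous transform T to a 3D point v = (x, y, z)."""
    x, y, z = v
    return [T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3] for i in range(3)]

# Hypothetical camera-to-IMU extrinsic: 10 cm translation along x, identity rotation
T_CS = [[1.0, 0.0, 0.0, 0.1],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0]]
p_imu = mat_vec(T_CS, (0.0, 0.0, 0.0))  # camera origin expressed in the IMU frame
```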
Step 108, performing noise processing on the transformed motion model to generate the measurement data corresponding to the inertial measurement unit.
The IMU is a miniature electronic measurement element whose measured data contain random-walk error and reading noise, modeled here as wiener noise (bias) and white Gaussian noise (noise), respectively. To make the simulated IMU data more realistic, both types of noise are added to every frame of generated IMU data using the following algorithm.
Algorithm: adding noise

Input: acceleration and angular velocity information of the motion model, and the variances of the Gaussian noise and the wiener noise.

Procedure: create a random number generator and define a standard Gaussian distribution model. Generate three random numbers from the standard Gaussian distribution; multiply the variance of the Gaussian noise by a 3x3 identity matrix to obtain the covariance matrix of the three-axis acceleration; multiply that covariance matrix by the vector formed from the three random numbers, and divide by the arithmetic square root of the time interval to obtain the Gaussian noise. For the wiener noise, generate three new random numbers from a standard Gaussian distribution, multiply the variance of the wiener noise by the arithmetic square root of the time interval and by the vector formed from the three random numbers, and add the result, as the update of the wiener (random-walk) noise, to the previous wiener noise to form the new wiener noise. The noise generation method for angular velocity is the same as that for acceleration, with the two acceleration variances replaced by the corresponding angular velocity variances.
The specific formulas are as follows, where w ~ N(0, I) is a standard Gaussian random vector and Δt is the time interval:

Gaussian noise: n[k] = σ_n · w / sqrt(Δt);

Wiener noise: b[k+1] = b[k] + σ_b · sqrt(Δt) · w.
In this way, the simulated IMU data can be output and the IMU measurement data obtained.
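A single-axis sketch of the noise model just described, adding white Gaussian noise scaled by 1/sqrt(Δt) and a random-walk bias incremented by σ_b·sqrt(Δt) at each step (parameter names and values are assumptions for illustration):

```python
import math
import random

def add_imu_noise(samples, dt, sigma_noise, sigma_bias, seed=0):
    """Add white Gaussian noise (scaled by 1/sqrt(dt)) and a random-walk bias
    (incremented by sigma_bias*sqrt(dt)*w each step) to a one-axis signal,
    following the discretization described above."""
    rng = random.Random(seed)
    bias = 0.0
    noisy = []
    for s in samples:
        white = sigma_noise / math.sqrt(dt) * rng.gauss(0.0, 1.0)
        bias += sigma_bias * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        noisy.append(s + bias + white)
    return noisy

clean = [9.81] * 200                          # 1 s of gravity readings at 200 Hz
noisy = add_imu_noise(clean, 0.005, 0.01, 0.001)
```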
The IMU measurement data generation method can generate IMU measurement data from the camera's pose trajectory information, avoiding the hardware synchronization problem and the camera-IMU relative mounting position problem that arise after an IMU is installed on the robot. In addition, simulated IMU data can be generated using the ground truth provided by a public dataset, which makes it convenient to test the performance change of a visual-inertial algorithm after inertial fusion is added, compared with a visual odometry method that uses no inertial sensor data.
Please refer to fig. 2, which is a schematic diagram of an exemplary structure of a computer device according to an embodiment of the present invention. The computer device 1 provided in this embodiment includes a memory 405, an input/output interface 407, a display 403, a network interface 409, and a processor 401 that exchanges data with the memory 405, the input/output interface 407, the network interface 409, and the display 403 through a bus 411. The input/output interface 407 can be connected to a mouse and/or a keyboard (not shown). A module, as referred to in this application, is a program segment that performs a particular function and is better suited than a whole program for describing the execution process of software on a processor.
The processor 401 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. The general-purpose processor may be a microprocessor or any conventional processor. The processor 401 is the control center of the computer device 1 and connects the various parts of the whole computer device 1 through various interfaces and lines.
The memory 405 may be used to store the computer programs and/or modules, and the processor 401 may implement the various functions of the computer device 1 by running or executing the computer programs and/or modules stored in the memory 405 and calling the data stored in the memory 405. The memory 405 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a graphical interface display function), and the like; the data storage area may store data created according to the use of the computer device (such as audio data and video data), and the like. Further, the memory 405 may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card (Flash Card), at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
In this embodiment, the display screen 403 may be a display screen with a touch function, so as to facilitate operations of a user. The memory 405 may store program code for execution by the processor 401 to implement the functionality of the IMU measurement data generation system 417.
Illustratively, the computer program may be partitioned into one or more modules/units, which are stored in the memory 405 and executed by the processor 401 to implement the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program in the computer device 1. For example, the computer program may be divided into an IMU measurement data generation system 417 as shown in fig. 3, which includes an acquisition unit 300, an interpolation unit 302, a calculation unit 304, a transformation unit 306, and a processing unit 308, and the specific functions of each module/unit are as follows:
the acquiring unit 300 is configured to acquire pose trajectory data, where the pose trajectory data includes pose trajectory information, and each pose trajectory information includes position information and rotation information.
It is understood that a camera may be disposed on the robot to obtain pose data such as the motion trajectory and motion state of the robot. The camera may have a six-degree-of-freedom trajectory (which may come from the ground truth of a data set). The pose data may be stored in a pose trajectory file or in a pose trajectory database, and thus the acquisition unit 300 may acquire the pose trajectory data from the corresponding pose trajectory file or database. Preferably, the pose trajectory data may include pose trajectory information, each of which may include a timestamp, position information, and rotation information. Specifically, the data type of the timestamp may be double precision (double); the position information may correspond to a three-degree-of-freedom position and may consist of three double-precision values; the rotation information may correspond to three-degree-of-freedom rotation and may consist of four double-precision values, where the rotation information may be a quaternion, represented as w + xi + yj + zk.
For example, for one piece of pose trajectory information, its timestamp may be 1305031102.211214; the first, second and third components of the position information may be -0.001420095, 0.007655886 and 0.010920254, respectively; and the quaternion components x, y, z and w of the rotation information may be -0.001964978, -0.003080067, -0.001635573 and 0.999992013, respectively.
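A record with this layout can be parsed with a few lines of code. This sketch assumes the eight values are whitespace-separated on one line, as in the TUM RGB-D ground-truth files:

```python
def parse_pose_record(line):
    # Layout: timestamp tx ty tz qx qy qz qw (eight double-precision values).
    vals = [float(v) for v in line.split()]
    timestamp, position, quaternion = vals[0], vals[1:4], vals[4:8]
    return timestamp, position, quaternion
```
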
The interpolation unit 302 is configured to interpolate between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them, and to update the pose trajectory data.
In one embodiment, the interpolation unit 302 may interpolate between the two adjacent pieces of pose trajectory information according to the frequency of the inertial measurement unit. The measurement frequency of IMU data is generally higher than that of the camera, often by a factor of 5 or more. For example, the camera may measure at around 30 Hz while the IMU measures at around 200 Hz. In other embodiments, the two adjacent pieces of pose trajectory information may be interpolated in other manners (for example, by inserting a fixed number of pieces of pose trajectory information between them).
In one embodiment, the interpolation unit 302 may obtain the camera measurement frequency from the camera timestamps. For example, the pose trajectory data may include first pose trajectory information, second pose trajectory information, third pose trajectory information, and fourth pose trajectory information. The camera measurement frequency may be taken as the number of pose trajectory records per unit time (e.g., 1 second); therefore, when the time intervals between different pairs of adjacent pose trajectory records differ (the difference between the timestamps of two adjacent records is the time interval), the number of pieces of pose trajectory information inserted between them may also differ. In other embodiments, the camera measurement frequency may be obtained in other ways, such as from the relevant parameters of the camera.
Preferably, after the camera measurement frequency is confirmed, the interpolation unit 302 may interpolate between two neighboring pieces of pose trajectory information according to the frequency of the inertial measurement unit. For example, the interpolation unit 302 may insert N pieces of pose trajectory information between the first and second pose trajectory information, M pieces between the second and third, and P pieces between the third and fourth. It will be appreciated that the frequency of the inertial measurement unit may be stored in a configuration file, from which it can be read.
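One plausible way to derive the insertion counts N, M and P from the IMU frequency and the camera timestamps is sketched below. The helper name and the rounding rule are assumptions, since the text only states that the count follows from the IMU frequency and the time interval:

```python
def poses_to_insert(t0, t1, imu_hz):
    # Number of IMU-rate samples spanning [t0, t1], minus the endpoint
    # pose that already exists; never negative.
    return max(0, round((t1 - t0) * imu_hz) - 1)
```

For a camera at about 30 Hz and an IMU at 200 Hz, this yields 6 interpolated poses per camera interval.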
It is to be understood that the interpolation between two adjacent pieces of pose trajectory information may include interpolation of the position information and interpolation of the rotation information.
Preferably, for the interpolation of the position information, if the time interval between the two adjacent pieces of pose trajectory information is within a preset range (e.g., within 0.03 s), indicating that the motion of the robot or carrier does not change drastically, the interpolation unit 302 may use an interpolation algorithm to calculate the pose trajectory information of a number of virtual cameras between them.
For the three-degree-of-freedom position information of the camera, a linear interpolation algorithm is used to complete the interpolation operation, as follows:
y = t·y0 + (1-t)·y1
wherein t is set in the interval [0, 1], y0 represents the pose trajectory information corresponding to the first timestamp of the two adjacent pieces of pose trajectory information, and y1 represents the pose trajectory information corresponding to the second timestamp, where the first timestamp is earlier than the second timestamp; this ensures that the interpolation result lies between y0 and y1. Multiple interpolation results can be obtained by setting multiple values of t: for example, to insert N pieces of pose trajectory information between the first and second pose trajectory information, t may take N values; to insert M pieces between the second and third, t may take M values. The timestamp of each interpolated record can also be determined from the same formula, by replacing y0 and y1 with the timestamps of the first and second pose trajectory information.
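A direct sketch of this interpolation, keeping the document's convention that t = 1 returns y0 and t = 0 returns y1:

```python
def lerp(y0, y1, t):
    # y = t*y0 + (1-t)*y1 applied componentwise to 3-DoF positions.
    return [t * a + (1.0 - t) * b for a, b in zip(y0, y1)]
```
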
It is to be understood that, in addition to linear interpolation, when the motion of the camera changes dramatically, the interpolation unit 302 may perform the interpolation with a polynomial function, which can better follow a polynomial-like variation.
Preferably, for the interpolation of the rotation information, the interpolation unit 302 may use a spherical interpolation algorithm. Since the quaternions of the rotation information are distributed on a manifold (a spherical space) that is not closed under addition and subtraction, simple linear interpolation yields non-uniform results and cannot guarantee that the result is still a unit quaternion. Therefore, the interpolation unit 302 interpolates the rotation information by spherical interpolation.
For example, for the rotation information Q1 in the first pose trajectory information and the rotation information Q2 in the adjacent second pose trajectory information, the dot product of Q1 and Q2 may be calculated first. If the dot product is negative, indicating that the rotation angle between Q1 and Q2 is greater than 180 degrees, the conjugate of Q2 is used in place of Q2, and interpolating between Q1 and the conjugate of Q2 gives the shorter path; if the dot product is positive, interpolation is performed between Q1 and Q2 directly.
The interpolation coefficients are then calculated as follows:
K0=sin((1.0-t)*atan(sina/cosa))/sina;
K1=sin(t*atan(sina/cosa))/sina;
wherein cosa is the absolute value of the dot product of Q1 and Q2, and sina = sqrt(1 - cosa * cosa).
The output result is K0 × Q1 + K1 × Q2 (or K0 × Q1 + K1 × conj(Q2) when the conjugate is used); this interpolation ensures a uniform distribution of the quaternions.
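The spherical interpolation steps above can be sketched as follows. Two deviations from the text are worth flagging: atan2(sina, cosa) replaces atan(sina/cosa) for numerical robustness, and the sign of Q2 is flipped when the dot product is negative (the common slerp convention), whereas the text takes the conjugate of Q2; a near-parallel fallback to linear interpolation is also added to avoid division by zero:

```python
import math

def slerp(q1, q2, t):
    # Quaternions are (w, x, y, z) tuples; t in [0, 1].
    cosa = sum(a * b for a, b in zip(q1, q2))
    if cosa < 0.0:
        # Negative dot product: flip q2 to take the shorter arc.
        q2 = tuple(-c for c in q2)
        cosa = -cosa
    if cosa > 0.9999:
        # Nearly parallel: linear interpolation avoids dividing by sina ~ 0.
        return tuple((1.0 - t) * a + t * b for a, b in zip(q1, q2))
    sina = math.sqrt(1.0 - cosa * cosa)
    angle = math.atan2(sina, cosa)
    k0 = math.sin((1.0 - t) * angle) / sina
    k1 = math.sin(t * angle) / sina
    return tuple(k0 * a + k1 * b for a, b in zip(q1, q2))
```
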
It is to be understood that after the interpolation is completed, one or more pieces of interpolated pose trajectory information lie between the two adjacent pieces of pose trajectory information, and the pose trajectory data can then be updated. For example, when 3 pieces of pose trajectory information (first, second and third interpolated pose trajectory information) are inserted between the first and second pose trajectory information, the updated pose trajectory data includes the first pose trajectory information, the first, second and third interpolated pose trajectory information, and the second pose trajectory information.
The calculation unit 304 may calculate the acceleration information of the motion model according to the position information in the updated pose trajectory data, and the angular velocity information of the motion model according to the rotation information in the updated pose trajectory data.
As can be understood, the calculation unit 304 may calculate the acceleration information a[k] of the motion model from the position information in the updated pose trajectory data as:
v[k]=(p[k+1]-p[k])/Δt;
a[k]=(v[k+1]-v[k])/Δt;
wherein p[k+1] and p[k] represent the position information in two adjacent pieces of pose trajectory information, Δt represents the difference between their timestamps, and k is a natural number. For example, for the updated pose trajectory data, when k = 1, p[k+1] and p[k] represent the position information in the first pose trajectory information and the first interpolated pose trajectory information, and Δt represents the difference between their timestamps.
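The finite-difference step above can be sketched as a small helper; positions are 3-element lists and timestamps are in seconds:

```python
def differentiate(positions, timestamps):
    # v[k] = (p[k+1] - p[k]) / dt, then a[k] = (v[k+1] - v[k]) / dt,
    # with dt taken as the timestamp difference of each adjacent pair.
    vel = [[(p1 - p0) / (timestamps[k + 1] - timestamps[k])
            for p0, p1 in zip(positions[k], positions[k + 1])]
           for k in range(len(positions) - 1)]
    acc = [[(v1 - v0) / (timestamps[k + 1] - timestamps[k])
            for v0, v1 in zip(vel[k], vel[k + 1])]
           for k in range(len(vel) - 1)]
    return vel, acc
```
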
As can be understood, the calculation unit 304 can calculate the angular velocity information ω of the motion model from the rotation information in the updated pose trajectory data as follows:
ω[k] = 2 · vec(q[k]⁻¹ ⊗ q[k+1]) / Δt

where ⊗ denotes quaternion multiplication and vec(·) takes the vector (imaginary) part of a quaternion.
wherein q[k+1] and q[k] represent the rotation information in two adjacent pieces of pose trajectory information, Δt represents the difference between their timestamps, and k is a natural number. For example, for the updated pose trajectory data, when k = 1, q[k+1] and q[k] represent the rotation information in the first pose trajectory information and the first interpolated pose trajectory information, and Δt represents the difference between their timestamps.
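A sketch of the corresponding computation; the vector-part formula ω[k] = 2·vec(q[k]⁻¹ ⊗ q[k+1])/Δt is a standard small-angle reconstruction and may differ in detail from the unreproduced figure:

```python
def quat_mul(q, r):
    # Hamilton product of quaternions stored as (w, x, y, z).
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

def angular_velocity(q_k, q_k1, dt):
    # For unit quaternions the inverse is the conjugate; the vector part
    # of the relative rotation, doubled and divided by dt, approximates
    # the body-frame angular velocity for small rotation increments.
    conj = (q_k[0], -q_k[1], -q_k[2], -q_k[3])
    rel = quat_mul(conj, q_k1)
    return tuple(2.0 * c / dt for c in rel[1:])
```
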
The transformation unit 306 may perform a pose transformation operation on the motion model to transform the motion model to a motion model in an inertial measurement unit coordinate system.
It will be appreciated that three coordinate systems are used throughout the system: the camera coordinate system F_C, the IMU coordinate system F_I, and the world coordinate system F_W. The world coordinate system is the actual East-North-Up coordinate system, in which the earth gravity vector is (0, 0, -9.78) (Shenzhen). The camera coordinate system has its origin fixed to the camera as the camera moves, and the IMU coordinate system is fixed to the IMU as the IMU moves. Because the IMU and the camera are generally both fixed to the carrier, there is a fixed homogeneous transformation matrix T_CS between the two, and there is likewise a homogeneous transformation matrix T_CW between the camera coordinate system and the world coordinate system. The motion model calculated from the updated pose trajectory data is therefore expressed in camera coordinates; before it can be applied to the IMU, a pose transformation operation is needed. For example, the transformation unit 306 can multiply the motion model by the homogeneous transformation matrix T_CS to transform it into the motion model in the inertial measurement unit coordinate system.
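The transformation can be sketched as below. This is a simplification: only the rotation block of the homogeneous matrix acts on the acceleration and angular velocity, since they are free vectors, and lever-arm effects between the two sensor origins are ignored; `T_cam_to_imu` is an assumed name for the matrix the text calls T_CS:

```python
import numpy as np

def transform_to_imu_frame(acc_cam, gyro_cam, T_cam_to_imu):
    # Extract the 3x3 rotation block of the 4x4 homogeneous transform
    # and rotate both camera-frame vectors into the IMU frame.
    R = np.asarray(T_cam_to_imu, dtype=float)[:3, :3]
    return R @ np.asarray(acc_cam, dtype=float), R @ np.asarray(gyro_cam, dtype=float)
```
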
The processing unit 308 may perform noise processing on the transformed motion model to generate the measurement data corresponding to the inertial measurement unit, so that the generated data better matches real measurements. In other embodiments, the processing unit 308 may be omitted.
The IMU is a miniature electronic measurement element whose measured data contain random-walk error and reading noise, modeled here as wiener noise (bias) and white Gaussian noise, respectively. To make the simulated IMU data more realistic, both types of noise are added to every frame of generated IMU data. The algorithm is as follows:
Algorithm: adding noise
Input: the acceleration and angular velocity information of the motion model, and the variances of the Gaussian noise and the wiener noise.
Procedure: create a random number generator and define a standard Gaussian distribution model. For the Gaussian noise, generate three random numbers from the standard Gaussian distribution, multiply the variance of the Gaussian noise by a 3x3 identity matrix to obtain the covariance matrix of the triaxial acceleration, multiply that covariance matrix by the vector formed from the three random numbers, and divide the result by the arithmetic square root of the time interval to obtain the Gaussian noise. For the wiener noise, generate three random numbers from a fresh standard Gaussian distribution, multiply the variance of the wiener noise by the arithmetic square root of the time interval and by the vector formed from the three random numbers, and add the product, as the update value of the wiener (random-walk) noise, onto the previous wiener noise to form the new wiener noise. The noise generation method for the angular velocity is the same as that for the acceleration, except that the two acceleration variances are replaced by the corresponding angular velocity variances.
The specific formula is as follows:
Gaussian noise:

n[k] = σ_n · w[k] / √Δt,  w[k] ~ N(0, I₃)
Wiener noise:

b[k+1] = b[k] + σ_b · √Δt · w[k],  w[k] ~ N(0, I₃)
The simulated IMU data can then be output, yielding the measurement data of the IMU.
The IMU measurement data generation system can generate the measurement data of the IMU from the pose trajectory information of a camera, which avoids both the hardware synchronization problem that arises after an IMU is mounted on a robot and the problem of the relative mounting positions of the camera and the IMU. In addition, simulated IMU data can be generated from the ground truth provided by a public data set, which makes it convenient to test the performance change of a visual-inertial algorithm after inertial fusion is added against a visual odometry method without inertial sensor data.
If the modules integrated in the computer device 1 are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the IMU measurement data generation method of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method of the above embodiments are implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units, modules or means recited in the system, apparatus or mobile terminal apparatus claims may also be implemented by one and the same unit, module or means by software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention.

Claims (7)

1. A method of generating IMU measurement data, the method comprising:
acquiring pose track data, wherein the pose track data comprises a plurality of pose track information, and each pose track information comprises position information and rotation information;
interpolating between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them, and updating the pose trajectory data;
calculating acceleration information of a motion model according to the position information in the updated pose trajectory data; calculating angular velocity information of the motion model according to the rotation information in the updated pose trajectory data;
performing a pose transformation operation on the motion model to transform the motion model to a motion model under an inertial measurement unit coordinate system;
the interpolating between the two adjacent pieces of pose trajectory information comprises:
interpolating the position information in the two adjacent pieces of pose trajectory information using a linear interpolation algorithm or a polynomial function algorithm, and interpolating the rotation information in the two adjacent pieces of pose trajectory information using spherical interpolation;
the acceleration information is calculated according to the following equation:
a[k] = (v[k+1] - v[k]) / Δt;
v[k] = (p[k+1] - p[k]) / Δt;
the angular velocity information is calculated according to the following equation:
ω[k] = 2 · vec(q[k]⁻¹ ⊗ q[k+1]) / Δt, where ⊗ denotes quaternion multiplication and vec(·) takes the vector part of a quaternion;
wherein a[k] represents the acceleration information, ω represents the angular velocity information, p[k+1] and p[k] represent the position information in the two adjacent pieces of pose trajectory information, q[k+1] and q[k] represent the rotation information in the two adjacent pieces of pose trajectory information, Δt represents the difference between the timestamps of the two adjacent pieces of pose trajectory information, and k is a natural number;
the performing a pose transformation operation on the motion model comprises:
and multiplying the motion model by a homogeneous transformation matrix to obtain the motion model under the coordinate system of the inertial measurement unit.
2. The IMU measurement data generation method of claim 1, wherein the interpolating between the two adjacent pieces of pose trajectory information comprises:
interpolating between the two adjacent pieces of pose trajectory information according to the frequency of the inertial measurement unit.
3. The IMU measurement data generation method of claim 1, wherein interpolating the rotation information in the two adjacent pieces of pose trajectory information using spherical interpolation comprises:
judging whether the dot product of the rotation information in first pose trajectory information and the rotation information in second pose trajectory information is negative, wherein the two adjacent pieces of pose trajectory information comprise the first pose trajectory information and the second pose trajectory information;
when the dot product is negative, performing the interpolation operation between the rotation information in the first pose trajectory information and the conjugate of the rotation information in the second pose trajectory information;
when the dot product is positive, performing the interpolation operation between the rotation information in the first pose trajectory information and the rotation information in the second pose trajectory information.
4. The IMU measurement data generation method of any one of claims 1-3, further comprising:
and carrying out noise processing on the transformed motion model to generate measurement data corresponding to the inertial measurement unit.
5. An IMU measurement data generation system, the system comprising:
the system comprises an acquisition unit, a rotation unit and a control unit, wherein the acquisition unit is used for acquiring pose track data, the pose track data comprises a plurality of pose track information, and each pose track information comprises position information and rotation information;
the interpolation unit is used for interpolating between two adjacent pieces of pose trajectory information to obtain one or more pieces of interpolated pose trajectory information located between them, and updating the pose trajectory data;
the calculation unit is used for calculating the acceleration information of the motion model according to the position information in the updated pose track data; calculating the angular velocity information of the motion model according to the rotation information in the updated pose trajectory data;
the transformation unit is used for executing pose transformation operation on the motion model so as to transform the motion model into a motion model under an inertial measurement unit coordinate system;
the interpolation unit interpolating between the two adjacent pieces of pose trajectory information comprises:
interpolating the position information in the two adjacent pieces of pose trajectory information using a linear interpolation algorithm or a polynomial function algorithm, and interpolating the rotation information in the two adjacent pieces of pose trajectory information using spherical interpolation;
the calculation unit calculates the acceleration information according to the following equation:
a[k] = (v[k+1] - v[k]) / Δt;
v[k] = (p[k+1] - p[k]) / Δt;
the calculation unit calculates the angular velocity information according to:
ω[k] = 2 · vec(q[k]⁻¹ ⊗ q[k+1]) / Δt, where ⊗ denotes quaternion multiplication and vec(·) takes the vector part of a quaternion;
wherein a[k] represents the acceleration information, ω represents the angular velocity information, p[k+1] and p[k] represent the position information in the two adjacent pieces of pose trajectory information, q[k+1] and q[k] represent the rotation information in the two adjacent pieces of pose trajectory information, Δt represents the difference between the timestamps of the two adjacent pieces of pose trajectory information, and k is a natural number;
the transformation unit performing a pose transformation operation on the motion model includes:
and multiplying the motion model by a homogeneous transformation matrix to obtain the motion model under the coordinate system of the inertial measurement unit.
6. A computer device, characterized in that the computer device comprises a processor configured to implement the steps of the IMU measurement data generation method of any one of claims 1-4 when executing a computer program stored in a memory.
7. A readable storage medium having stored thereon a computer program, wherein the computer program, when being executed by a processor, is adapted to carry out the steps of the IMU measurement data generation method according to any one of claims 1-4.
CN201810939333.9A 2018-08-14 2018-08-14 IMU measurement data generation method, system, computer device and readable storage medium Active CN109186596B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810939333.9A CN109186596B (en) 2018-08-14 2018-08-14 IMU measurement data generation method, system, computer device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810939333.9A CN109186596B (en) 2018-08-14 2018-08-14 IMU measurement data generation method, system, computer device and readable storage medium

Publications (2)

Publication Number Publication Date
CN109186596A CN109186596A (en) 2019-01-11
CN109186596B true CN109186596B (en) 2020-11-10

Family

ID=64918165

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810939333.9A Active CN109186596B (en) 2018-08-14 2018-08-14 IMU measurement data generation method, system, computer device and readable storage medium

Country Status (1)

Country Link
CN (1) CN109186596B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112113582A (en) * 2019-06-21 2020-12-22 上海商汤临港智能科技有限公司 Time synchronization processing method, electronic device, and storage medium
CN110906922A (en) * 2019-11-08 2020-03-24 沈阳无距科技有限公司 Unmanned aerial vehicle pose information determining method and device, storage medium and terminal
CN112455606B (en) * 2020-12-03 2023-01-31 天津小鲨鱼智能科技有限公司 Vehicle steering control system and method
CN112948411B (en) * 2021-04-15 2022-10-18 深圳市慧鲤科技有限公司 Pose data processing method, interface, device, system, equipment and medium
CN113658260A (en) * 2021-07-12 2021-11-16 南方科技大学 Robot pose calculation method and system, robot and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107123142A (en) * 2017-05-09 2017-09-01 北京京东尚科信息技术有限公司 Position and orientation estimation method and device
CN108375382A (en) * 2018-02-22 2018-08-07 北京航空航天大学 Position and attitude measuring system precision calibration method based on monocular vision and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014068302A1 (en) * 2012-11-05 2014-05-08 The Chancellor Masters And Scholars Of The University Of Oxford Extrinsic calibration of imaging sensing devices and 2d lidars mounted on transportable apparatus
US10012504B2 (en) * 2014-06-19 2018-07-03 Regents Of The University Of Minnesota Efficient vision-aided inertial navigation using a rolling-shutter camera with inaccurate timestamps


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Relative pose calibration method for IMU-Camera and its application; Tian Ying; China Masters' Theses Full-text Database, Information Science and Technology; 2018-02-15 (No. 2); I138-1891 *
Spatio-temporal initialization for IMU to camera registration; Elmar Mair et al.; Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics; 2011-12-11; 557-564 *
Robust electronic image stabilization method based on IMU-Camera calibration; Fang Ming et al.; Information and Control; 2018-04-30; Vol. 47 (No. 2); 156-165 *
Color information acquisition and fusion processing of vehicle-borne laser point clouds; Wu Shenghao; China Masters' Theses Full-text Database, Information Science and Technology; 2011-08-15 (No. 8); I140-68 *

Also Published As

Publication number Publication date
CN109186596A (en) 2019-01-11

Similar Documents

Publication Publication Date Title
CN109186596B (en) IMU measurement data generation method, system, computer device and readable storage medium
Panahandeh et al. Vision-aided inertial navigation based on ground plane feature detection
Piao et al. Real-time visual–inertial SLAM based on adaptive keyframe selection for mobile AR applications
EP3552077B1 (en) Systems and methods for tracking motion and gesture of heads and eyes
WO2020253260A1 (en) Time synchronization processing method, electronic apparatus, and storage medium
CN110310362A (en) High dynamic scene three-dimensional reconstruction method, system based on depth map and IMU
CN108170297B (en) Real-time six-degree-of-freedom VR/AR/MR device positioning method
CN111156998A (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN108062776A (en) Camera Attitude Tracking method and apparatus
CN108846857A (en) The measurement method and visual odometry of visual odometry
CN109186592B (en) Method and device for visual and inertial navigation information fusion and storage medium
US11181379B2 (en) System and method for enhancing non-inertial tracking system with inertial constraints
CN105164726A (en) Camera pose estimation for 3d reconstruction
CN104658012A (en) Motion capture method based on inertia and optical measurement fusion
CN109211277A (en) The state of vision inertia odometer determines method, apparatus and electronic equipment
CN110388919B (en) Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality
Fang et al. Multi-sensor based real-time 6-DoF pose tracking for wearable augmented reality
CN112486331A (en) IMU-based three-dimensional space handwriting input method and device
CN107302845B (en) Low delay simulation apparatus, method and computer readable medium using direction prediction
US20140285513A1 (en) Animation of a virtual object
Ratchatanantakit et al. A sensor fusion approach to MARG module orientation estimation for a real-time hand tracking application
CN112284381B (en) Visual inertia real-time initialization alignment method and system
Cao et al. The WHU rolling shutter visual-inertial dataset
Irmisch et al. Simulation framework for a visual-inertial navigation system
CN111127661B (en) Data processing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant