CN110567486B - Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof - Google Patents


Info

Publication number: CN110567486B
Application number: CN201910753049.7A
Authority: CN (China)
Prior art keywords: data, conversion, coefficient, mathematical model, detection data
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN110567486A
Inventors: 吴昆临, 许秋子
Current assignee: Shenzhen Realis Multimedia Technology Co Ltd
Original assignee: Shenzhen Realis Multimedia Technology Co Ltd
Application filed by Shenzhen Realis Multimedia Technology Co Ltd
Priority to CN202110215033.8A (published as CN113008271A)
Priority to CN201910753049.7A (published as CN110567486B)
Publication of CN110567486A; application granted; publication of CN110567486B


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/02 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
    • G01B21/04 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
    • G01B21/042 - Calibration or calibration artifacts
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computational Mathematics (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Remote Sensing (AREA)
  • Pure & Applied Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Gyroscopes (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

A mathematical model construction method for calibrating 3D rotation difference, a calibration method and a device thereof are provided. The mathematical model construction method comprises the following steps: acquiring first detection data of a motion detection device and second detection data of a 3D measurement device; presetting a conversion function model by which the first detection data can be converted into corresponding first conversion data; establishing a loss function, and optimally solving the coefficient q_a and the coefficient q_b using a nonlinear least square method; and introducing the optimally solved coefficient q_a and coefficient q_b into the conversion function model to establish the mathematical model for calibrating the 3D rotation difference. The established mathematical model can rotate the coordinate direction of the motion detection device into agreement with the coordinate direction of the 3D measurement device, and can also bring the zero-degree reference positions and coordinate directions of the motion detection device and the 3D measurement device into agreement, thereby providing the conditions for calibrating the 3D rotation difference between the motion detection device and the 3D measurement device.

Description

Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof
Technical Field
The invention relates to the technical field of 3D measurement, in particular to a mathematical model construction method for calibrating 3D rotation difference, a calibration method and a device thereof.
Background
The attitude positioning of the target plays an important role in the identification of the target and the analysis of the image, and the target attitude positioning in a complex background has important application in the fields of military, industrial monitoring, traffic control management and the like. Currently, there are technical means for performing three-dimensional measurement on a target object by using a laser measurement technology, technical means for acquiring three-dimensional information of the target object by using binocular stereo vision, and technical means for detecting three-dimensional spatial information of the target object by using an angular motion detection technology. In practical applications, a target object is usually measured in three dimensions by means of a 3D measuring instrument to obtain three-dimensional surface information or three-dimensional motion information of the target object.
In the prior art, there are various ways to measure the orientation (i.e., the rotation or attitude) of a rigid body (an object of constant shape and size), for example measuring the 3D rotation state of the rigid body, i.e. its attitude information, with a gyroscope or a 3D measuring instrument. However, under the influence of the 3D measuring instrument's own precision or of environmental factors, the attitude information measured by the 3D measuring instrument may contain errors, resulting in lower measurement precision. A high-precision, calibrated gyroscope is then needed as a reference to determine whether the data obtained by the 3D measuring instrument is accurate. However, the gyroscope has its own defined coordinate system, which is often inconsistent with the coordinate system adopted by the 3D measuring instrument. For example, when a rigid body rotates 90° to the left, the 3D measuring instrument may record the rigid body as rotating 90° about the Y axis, while the gyroscope may define it as rotating 90° about the Z axis or the X axis; the two devices thus describe the same 90° rotation with different values, producing a rotation difference in three-dimensional space. In addition, a difference between the data also arises when the reference positions (zero-degree reference positions) from which the gyroscope and the 3D measuring instrument measure rotation differ. The prior art lacks a simple and effective calibration method for keeping the coordinate systems, reference positions and so on of the gyroscope and the 3D measuring instrument consistent.
Disclosure of Invention
The invention mainly solves the technical problem of how to calibrate the rotation difference between two orientation-measuring instruments, so as to avoid inaccurate three-dimensional measurement results. To overcome this technical problem, the application discloses a mathematical model construction method for calibrating 3D rotation difference, a calibration method and a device thereof.
According to a first aspect, an embodiment provides a mathematical model construction method for calibrating 3D rotation difference, comprising: acquiring first detection data of a motion detection device and second detection data of a 3D measurement device, wherein the motion detection device is arranged on an object to be detected, and both the motion detection device and the 3D measurement device are used for detecting the three-dimensional motion state of the object to be detected; presetting a conversion function model by which the first detection data can be converted into corresponding first conversion data, the conversion function model performing 3D rotation-difference conversion on the first detection data through a redefined coordinate-system conversion coefficient q_a of the motion detection device and a redefined zero-degree reference position conversion coefficient q_b of the motion detection device; establishing a loss function according to the first detection data, the second detection data and the first conversion data, and optimally solving the coefficient q_a and the coefficient q_b using a nonlinear least square method; and introducing the optimally solved coefficient q_a and coefficient q_b into the conversion function model to establish the mathematical model for calibrating the 3D rotation difference, the mathematical model being used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device.
The presetting of the conversion function model, by which the first detection data can be converted into corresponding first conversion data, includes: denoting the conversion function model as q_gyro = f(q_raw), where q_gyro represents the first conversion data obtained by converting the first detection data, and q_raw represents the first detection data.
The preset process of the conversion function model comprises the following steps: redefining the coordinate-system conversion coefficient of the motion detection device

q_a = w_a + x_a·i + y_a·j + z_a·k;

converting the coordinate system of the first detection data according to the coefficient q_a, so that the coordinate direction of the motion detection device is rotated into agreement with the coordinate direction of the 3D measurement device, and obtaining the data after coordinate-system conversion:

q_raw′ = q_a · q_raw · q_a⁻¹,

where q_raw = w_raw + x_raw·i + y_raw·j + z_raw·k represents the first detection data and q_a⁻¹ is the inverse of q_a; redefining the zero-degree reference position conversion coefficient of the motion detection device

q_b = w_b + x_b·i + y_b·j + z_b·k;

performing zero-degree reference position conversion on the coordinate-converted data according to the coefficient q_b, so that the zero-degree reference positions and coordinate directions of the motion detection device and the 3D measurement device agree, and obtaining the data after 3D rotation-difference conversion of the first detection data:

q_gyro = q_b · q_raw′.

The conversion function model is thus built and denoted q_gyro = f(q_raw) = q_b · q_a · q_raw · q_a⁻¹, and

q_gyro = w_gyro + x_gyro·i + y_gyro·j + z_gyro·k

is taken as the first conversion data.
Establishing the loss function according to the first detection data, the second detection data and the first conversion data, and optimally solving the coefficient q_a and the coefficient q_b using a nonlinear least square method, comprises the following steps. The loss function is defined as:

Loss = Σ [(w_3D − w_gyro)² + (x_3D − x_gyro)² + (y_3D − y_gyro)² + (z_3D − z_gyro)²] + (w_a² + x_a² + y_a² + z_a² − 1)² + (w_b² + x_b² + y_b² + z_b² − 1)²,

where q_3D = w_3D + x_3D·i + y_3D·j + z_3D·k represents the second detection data, q_raw represents the first detection data, w_gyro, x_gyro, y_gyro, z_gyro represent the first conversion data obtained from the conversion function model and the first detection data, and w_a, x_a, y_a, z_a and w_b, x_b, y_b, z_b represent the data values of the coefficient q_a and the coefficient q_b respectively. The corresponding data values are substituted into the loss function, and the values of the coefficient q_a and the coefficient q_b that minimize the loss function are optimally solved using the nonlinear least square method.
Introducing the solved coefficient q_a and coefficient q_b into the conversion function model and establishing the mathematical model for calibrating the 3D rotation difference comprises: introducing the optimally solved coefficient q_a and coefficient q_b into the conversion function model q_gyro = f(q_raw), and obtaining the mathematical model formula for calibrating the 3D rotation difference:

q_gyro = f(q_raw) = q_b′ · q_a′ · q_raw · (q_a′)⁻¹,

where q_a′ denotes the optimally solved value of q_a and q_b′ denotes the optimally solved value of q_b.
According to a second aspect, an embodiment provides a method of calibrating 3D rotation difference, comprising: acquiring first detection data of a motion detection device and second detection data of a 3D measurement device, wherein the motion detection device is arranged on an object to be detected, and both the motion detection device and the 3D measurement device are used for detecting the three-dimensional motion state of the object to be detected; establishing the mathematical model for calibrating the 3D rotation difference according to the mathematical model construction method of the first aspect, the mathematical model being used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device; and converting the first detection data into corresponding first conversion data according to the mathematical model for calibrating the 3D rotation difference, so as to complete the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device, the first conversion data being used for comparison with the second detection data in order to determine the accuracy of the second detection data.
The use of the first conversion data for comparison with the second detection data to determine the accuracy of the second detection data includes: comparing the first conversion data with the second detection data, judging from the comparison result whether the second detection data is credible, and, if the second detection data is determined not to be credible, replacing the second detection data with the first conversion data.
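As an illustrative sketch of this comparison step, the first conversion data and the second detection data, both unit quaternions, can be compared by their angular distance; the function names and the tolerance value below are assumptions for the sketch, not values specified by the application.

```python
import math

def quat_angle_deg(q1, q2):
    """Rotation angle in degrees between two unit quaternions (w, x, y, z)."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))  # |q1 . q2|; abs handles q vs -q
    dot = min(1.0, dot)                            # guard against rounding past 1.0
    return math.degrees(2.0 * math.acos(dot))

def second_data_credible(q_first_conv, q_second, tol_deg=2.0):
    """Judge the second detection data credible if it lies within an
    (assumed) angular tolerance of the converted first detection data."""
    return quat_angle_deg(q_first_conv, q_second) <= tol_deg
```

The tolerance would in practice be chosen from the 3D measurement device's expected precision.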
According to a third aspect, an embodiment provides a 3D rotation difference calibration apparatus, comprising: an acquisition unit for acquiring first detection data of a motion detection device and second detection data of a 3D measurement device, wherein the motion detection device is arranged on an object to be detected, and both the motion detection device and the 3D measurement device are used for detecting the three-dimensional motion state of the object to be detected; a building unit for establishing the mathematical model for calibrating the 3D rotation difference according to the mathematical model construction method of the first aspect, the mathematical model being used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device; and a calibration unit for converting the first detection data into corresponding first conversion data according to the mathematical model for calibrating the 3D rotation difference, so as to complete the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device, the first conversion data being used for comparison with the second detection data in order to determine the accuracy of the second detection data.
According to a fourth aspect, there is provided in an embodiment an apparatus comprising: a memory, a processor and a 3D rotational disparity calibration method program stored on the memory and executable on the processor, the 3D rotational disparity calibration method program when executed by the processor implementing the steps of the 3D rotational disparity calibration method described in the second aspect above.
According to a fifth aspect, an embodiment provides a computer readable storage medium comprising a program executable by a processor to implement the method of the first and/or second aspect.
The beneficial effect of this application is:
according to the aboveThe embodiment provides a mathematical model construction method, a calibration method and a device for calibrating 3D rotation difference, wherein the mathematical model construction method comprises the following steps: acquiring first detection data of a motion detection device and second detection data of a 3D measurement device; the motion detection device is arranged on an object to be detected, and both the motion detection device and the 3D measurement device are used for detecting the three-dimensional motion state of the object to be detected; presetting a conversion function model by which the first detection data is converted into corresponding first conversion data, the conversion function model converting the coefficient q by redefining the coordinate system of the motion detection apparatusaAnd redefining zero degree reference position conversion coefficient q of motion detection devicebPerforming 3D rotational disparity conversion on the first detection data; establishing a loss function according to the first detection data, the second detection data and the first conversion data, and utilizing a nonlinear least square method to carry out coefficient qaAnd coefficient qbCarrying out optimization solution; coefficient q of the optimization solutionaAnd coefficient qbIntroducing into a transfer function model, and establishing a mathematical model of the calibrated 3D rotation difference, the mathematical model of the calibrated 3D rotation difference being used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device. 
In the first aspect, because the mathematical model is constructed using the first detection data of the motion detection device and the second detection data of the 3D measurement device, the establishment process of the mathematical model stays as close as possible to the application environment, which helps ensure the accuracy of the model. In the second aspect, since the loss function is established from the first detection data, the second detection data and the first conversion data, and the coefficient q_a and the coefficient q_b are optimally solved by the nonlinear least square method, the solved coefficient q_a and coefficient q_b respectively reflect the coordinate-system difference and the zero-degree reference position difference between the motion detection device and the 3D measurement device, providing accurate guidance for the conversion of the detection data. In the third aspect, the established mathematical model can rotate the coordinate direction of the motion detection device into agreement with the coordinate direction of the 3D measurement device, and can also bring the zero-degree reference positions and coordinate directions of the motion detection device and the 3D measurement device into agreement, thereby providing the conditions for calibrating the 3D rotation difference between the motion detection device and the 3D measurement device. In the fourth aspect, by adopting the obtained mathematical model, the calibration method can conveniently calibrate the 3D rotation measurement difference between the motion detection device and the 3D measurement device, so that the converted data can be compared with other orientation measurements; this ensures the accuracy of the measured data, improves the efficiency of product development, requires no physical alignment of the data, and brings convenience to technicians.
Drawings
FIG. 1 is a flow chart of a mathematical model construction method for calibrating 3D rotational differences according to the present application;
FIG. 2 is a flow chart of a transfer function model building process;
FIG. 3 is a flow chart of optimally solving the coefficient q_a and the coefficient q_b according to the loss function;
FIG. 4 is a flow chart of a 3D rotational disparity calibration method of the present application;
FIG. 5 is a flow chart of determining accuracy of second sensed data;
FIG. 6 is a schematic diagram of a 3D rotational disparity calibration apparatus according to the present application;
fig. 7 is a schematic structural diagram of an apparatus for implementing a calibration method program in the present application.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Wherein like elements in different embodiments are numbered with like associated elements. In the following description, numerous details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of the features may be omitted or replaced with other elements, materials, methods in different instances. In some instances, certain operations related to the present application have not been shown or described in detail in order to avoid obscuring the core of the present application from excessive description, and it is not necessary for those skilled in the art to describe these operations in detail, so that they may be fully understood from the description in the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Also, the various steps or actions in the method descriptions may be exchanged or adjusted in order, in a manner apparent to one of ordinary skill in the art. Thus, the various sequences in the specification and drawings are for the purpose of describing certain embodiments only and are not intended to imply a required sequence unless otherwise indicated where such a sequence must be followed.
The numbering of the components as such, e.g., "first", "second", etc., is used herein only to distinguish the objects as described, and does not have any sequential or technical meaning. The term "connected" and "coupled" when used in this application, unless otherwise indicated, includes both direct and indirect connections (couplings).
First embodiment
Referring to fig. 1, the present application discloses a mathematical model construction method for calibrating 3D rotation difference, which includes steps S110-S140, which are described below.
In step S110, first detection data of a motion detection device and second detection data of a 3D measurement device are obtained.
In this embodiment, the motion detection device is disposed on the object to be detected, and is configured to detect a three-dimensional motion state of the object to be detected; and, the 3D measuring device is also used to measure the three-dimensional motion attitude of the object to be detected. Therefore, the first detection data output by the motion detection device and the second detection data output by the 3D measurement device are both detection data of the three-dimensional motion state of the object to be detected.
It should be noted that the motion detection device may be an angular-motion detection instrument such as a gyroscope. A gyroscope is an angular-motion detection device that uses the moment of momentum of a high-speed rotor to sense, relative to inertial space, angular motion of its housing about one or two axes orthogonal to the spin axis. Gyroscopes can be classified in particular as piezoelectric gyroscopes, micromechanical gyroscopes, fiber-optic gyroscopes and laser gyroscopes, all of which are electronic devices and which, together with accelerometers, magnetoresistive chips and GPS, can form an orientation-indicating system for inertial navigation.
In general, a 3D measurement device for measuring the three-dimensional motion attitude of a target object uses non-contact measuring means, such as laser, radar, structured gratings or binocular vision images, to measure the shape, appearance or attitude of the target object. Various 3D measurement devices are currently available on the market, such as three-dimensional image measuring instruments, three-dimensional scanners and depth cameras; these belong to the prior art, so a detailed description is omitted.
Step S120, presetting a conversion function model by which the first detection data can be converted into corresponding first conversion data. The first conversion data is unknown at this point and can only be calculated once the mathematical model for calibrating the 3D rotation difference has been established; the conversion function model is preset so that the unknown first conversion data can be expressed through the conversion function model and the known first detection data, which facilitates the subsequent establishment and solution of the loss function.
In this embodiment, the conversion function model can be expressed as:

q_gyro = f(q_raw)    (1-1)

where q_gyro denotes the first conversion data obtained by converting the first detection data, and q_raw denotes the first detection data. The first detection data may then be converted into the corresponding first conversion data according to formula (1-1).
In this embodiment, q_gyro and q_raw may be expressed in quaternion form. Since some quaternion operations are involved in the conversion process, a few explanations of quaternions are given here for ease of understanding. In general, a quaternion can be understood as a simple hypercomplex number consisting of one real part and three imaginary parts.

A unit quaternion can be expressed as:

q = w + x·i + y·j + z·k

or, in component form, as q = (w, x, y, z) with w² + x² + y² + z² = 1.

Here w, x, y and z are real numbers, and i, j and k take part in the operations as special imaginary units whose geometric meaning can be understood as rotations: the i rotation represents the positive rotation from the X axis toward the Y axis in the plane spanned by the X and Y axes, the j rotation represents the positive rotation from the Y axis toward the Z axis in the plane spanned by the Y and Z axes, and the k rotation represents the positive rotation from the Z axis toward the X axis in the plane spanned by the Z and X axes.

The inverse of a unit quaternion is then expressed as:

q⁻¹ = w − x·i − y·j − z·k, i.e. q⁻¹ = (w, −x, −y, −z),

where −i, −j and −k represent the rotations opposite to the i, j and k rotations respectively.

Taking unit quaternions q_1 and q_2 as an example, the multiplication of unit quaternions is given by:

q_1 q_2 = (w_1 w_2 − x_1 x_2 − y_1 y_2 − z_1 z_2) + (w_1 x_2 + x_1 w_2 + y_1 z_2 − z_1 y_2)·i + (w_1 y_2 − x_1 z_2 + y_1 w_2 + z_1 x_2)·j + (w_1 z_2 + x_1 y_2 − y_1 x_2 + z_1 w_2)·k
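The quaternion operations above (Hamilton product and unit-quaternion inverse) can be sketched in Python; the helper names are illustrative, not part of the application.

```python
from typing import Tuple

Quat = Tuple[float, float, float, float]  # (w, x, y, z)

def quat_mul(q1: Quat, q2: Quat) -> Quat:
    """Hamilton product q1 * q2, following the multiplication formula above."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

def quat_inv(q: Quat) -> Quat:
    """Inverse of a unit quaternion, i.e. its conjugate (w, -x, -y, -z)."""
    w, x, y, z = q
    return (w, -x, -y, -z)
```

For example, `quat_mul` applied to the imaginary units i and j yields k, matching the geometric meaning described above.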
in the present embodiment, the conversion function model can be converted by redefining the coordinate system conversion coefficient q of the motion detection apparatus by the unit quaternion defined aboveaAnd redefining zero degree reference position conversion coefficient q of motion detection devicebThe first detection data is subjected to 3D rotation difference conversion. In one embodiment, referring to FIG. 2, the process of creating the transfer function model includes steps S121-S123, which are described below.
Step S121, redefining the coordinate-system conversion coefficient of the motion detection device

q_a = w_a + x_a·i + y_a·j + z_a·k.

The first detection data is subjected to coordinate-system conversion according to the coefficient q_a, so that the coordinate direction of the motion detection device is rotated into agreement with the coordinate direction of the 3D measurement device; the coordinate-system-converted data may be expressed as:

q_raw′ = q_a · q_raw · q_a⁻¹    (1-2)

where q_raw = w_raw + x_raw·i + y_raw·j + z_raw·k represents the first detection data and q_a⁻¹ is the inverse of q_a; here, the coefficient q_a is temporarily unknown.

It should be noted that one reason for the data difference between the motion detection device and the 3D measurement device is that the coordinate system (X-Y-Z directions) defined by the motion detection device itself is inconsistent with the coordinate system of the 3D measurement device, so the coordinate direction of the motion detection device needs to be rotated into agreement with the coordinate direction of the 3D measurement device. As can be seen by expanding formula (1-2) of this step, w_raw is held constant while the components on the three axes are adjusted by the rotation generated by q_a into agreement with the coordinates of the 3D measurement device.
Step S122, redefining the zero-degree reference position conversion coefficient of the motion detection device

q_b = w_b + x_b·i + y_b·j + z_b·k.

The coordinate-system-converted data q_raw′ is subjected to zero-degree reference position conversion according to the coefficient q_b, so that the zero-degree reference positions and coordinate directions of the motion detection device and the 3D measurement device agree, thereby obtaining the data after 3D rotation-difference conversion of the first detection data:

q_gyro = q_b · q_raw′    (1-3)

where the zero-degree reference position conversion coefficient q_b is temporarily unknown.

It should be noted that another reason for the data difference between the motion detection device and the 3D measurement device is that their zero-degree reference positions do not coincide; since a rotation is measured as an offset relative to the reference position, different reference positions yield different rotation results. This step multiplies formula (1-2) by the reference-position difference quaternion, i.e. by the coefficient q_b.
Step S123, building the conversion function model according to the above formulas:

q_gyro = f(q_raw) = q_b · q_a · q_raw · q_a⁻¹    (1-4)

In this embodiment,

q_gyro = w_gyro + x_gyro·i + y_gyro·j + z_gyro·k

is taken as the first conversion data; while the first conversion data is still unknown, it can be expressed through the coefficient q_a, the coefficient q_b and the first detection data.
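A minimal sketch of applying the built model of formula (1-4), assuming the standard Hamilton product and unit-quaternion conjugate as the quaternion operations; all function names here are illustrative, not taken from the patent's implementation.

```python
def quat_mul(q1, q2):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_conj(q):
    """Conjugate, equal to the inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def convert(q_raw, q_a, q_b):
    """q_gyro = q_b * (q_a * q_raw * q_a^-1): coordinate-system rotation
    by q_a followed by the zero-degree reference correction q_b."""
    q_raw_prime = quat_mul(quat_mul(q_a, q_raw), quat_conj(q_a))
    return quat_mul(q_b, q_raw_prime)
```

With both coefficients set to the identity quaternion, the conversion leaves the first detection data unchanged, which matches the case of already-aligned devices.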
Step S130, establishing a loss function according to the first detection data, the second detection data and the first conversion data, and optimally solving the coefficient q_a and the coefficient q_b using a nonlinear least square method to obtain their respective data values. In one embodiment, referring to FIG. 3, step S130 may include steps S131-S132, described below.
Step S131, define a loss function, expressed by the formula:

Loss = Σ [(w_3D − w_gyro)² + (x_3D − x_gyro)² + (y_3D − y_gyro)² + (z_3D − z_gyro)²] + (w_a² + x_a² + y_a² + z_a² − 1)² + (w_b² + x_b² + y_b² + z_b² − 1)²    (1-5)

where the sum runs over the collected pairs of detection data; q_3D = (w_3D, x_3D, y_3D, z_3D) represents the second detection data; q_raw represents the first detection data; w_gyro, x_gyro, y_gyro, z_gyro represent the first conversion data, expressed through the conversion function model in terms of the coefficient q_a, the coefficient q_b and the first detection data; and w_a, x_a, y_a, z_a and w_b, x_b, y_b, z_b respectively represent the data values of the coefficient q_a and the coefficient q_b.
It should be noted that q_gyro is an expression in q_a and q_b, so after expansion, formula (1-5) becomes an equation in the unknowns w_a, x_a, y_a, z_a and w_b, x_b, y_b, z_b, from which the unknowns can be solved by optimization.
Step S132, substitute the corresponding data values w_3D, x_3D, y_3D, z_3D, w_gyro, x_gyro, y_gyro, z_gyro, w_a, x_a, y_a, z_a and w_b, x_b, y_b, z_b into the loss function formula (1-5), and use the nonlinear least squares method to find the values of w_a, x_a, y_a, z_a and w_b, x_b, y_b, z_b that minimize the value of the loss function, thereby obtaining the solved values of the coefficient q_a and the coefficient q_b.
It should be noted that, in formula (1-5), minimizing the value of Loss requires the last two calculation terms to ensure that the coefficient q_a and the coefficient q_b are unit quaternions, i.e. that w_a² + x_a² + y_a² + z_a² and w_b² + x_b² + y_b² + z_b² are each equal to 1. In theory, two sets of corresponding q_3D and q_raw data suffice to compute the coefficient q_a and the coefficient q_b using a tool such as the Ceres Solver library, but because of noise and other sources of imprecision, the more data the better; preferably the data cover rotations over all 3 degrees of freedom of three-dimensional space.
It should be noted that the nonlinear least squares method is a parameter estimation method that estimates the parameters of a nonlinear static model using the minimum sum of squared errors as its criterion, and it is often used for optimization problems, for example for modeling and solving large-scale complex optimization problems with the Ceres Solver library (an open-source C++ library), in particular nonlinear least squares problems with bound constraints. For solving the optimization problem of the above loss function with the Ceres Solver library, reference may be made to its usage flow: (1) define the residual function f_i, define a cost-function structure (struct), and overload the operator() in the structure; (2) create a CostFunction and pass f_i to it as a parameter; (3) add a residual block (Problem::AddResidualBlock); (4) solve the problem (ceres::Solve). Since the process of optimizing and solving parameters by the nonlinear least squares method belongs to the prior art, it is not described in detail here.
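The patent names the Ceres Solver (a C++ library); as an illustrative stand-in, the same optimization can be run with SciPy's `least_squares`. Everything below (`model`, `residuals`, `calibrate`, the synthetic data) is my own sketch of the step, not the patent's implementation:

```python
import numpy as np
from scipy.optimize import least_squares

def quat_mul(p, q):
    # Hamilton product, (w, x, y, z) ordering.
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def model(q_a, q_b, q_raw):
    # q_gyro = q_b * q_a * q_raw * q_a^-1  (formula 1-4)
    return quat_mul(q_b, quat_mul(q_a, quat_mul(q_raw, quat_conj(q_a))))

def residuals(params, q_raw_list, q_3d_list):
    # One 4-vector residual per data pair, plus the two unit-norm
    # constraints that correspond to the last two terms of formula (1-5).
    q_a, q_b = params[:4], params[4:]
    res = [q_3d - model(q_a, q_b, q_raw)
           for q_raw, q_3d in zip(q_raw_list, q_3d_list)]
    res.append([q_a @ q_a - 1.0, q_b @ q_b - 1.0])
    return np.concatenate(res)

def calibrate(q_raw_list, q_3d_list):
    # Start both coefficients from the identity quaternion.
    x0 = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
    sol = least_squares(residuals, x0, args=(q_raw_list, q_3d_list))
    return sol.x[:4], sol.x[4:]
```

On noise-free synthetic data generated from known small rotations q_a and q_b, the solver recovers coefficients that reproduce the second detection data (note that q and −q encode the same rotation, so the recovered quaternions may differ in sign from the generating ones).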
Step S140, introduce the optimized and solved coefficient q_a and coefficient q_b into the conversion function model q_gyro = f(q_raw) preset in step S120, thereby obtaining the mathematical model for calibrating the 3D rotation difference, formulated as:
q_gyro = f(q_raw) = q_b′ · q_a′ · q_raw · (q_a′)⁻¹    (1-6)
wherein, q'aExpressed as the solved-for-optimization qaValue of (1), q'bExpressed as the solved-for-optimization qbThe value of (c). It will be appreciated that this mathematical model of the 3D rotation difference is used to calibrate the 3D rotation measurement difference between the motion detection means and the 3D measurement means.
Embodiment II
Referring to fig. 4, on the basis of the mathematical model construction method for calibrating 3D rotation difference disclosed in the first embodiment, the present application further discloses a calibration method for 3D rotation difference, which includes steps S210-S240.
In step S210, first detection data of a motion detection device and second detection data of a 3D measurement device are obtained.
In this embodiment, the motion detection device may be disposed on the object to be detected and is configured to detect its three-dimensional motion state; the 3D measurement device may be aimed at the object to be detected and is likewise used to detect its three-dimensional motion state. The first detection data may then be expressed as q_raw = (w_raw, x_raw, y_raw, z_raw), and the second detection data may be expressed as q_3D = (w_3D, x_3D, y_3D, z_3D).
Step S220, a mathematical model for calibrating the 3D rotation difference is established according to the mathematical model construction method in the first embodiment, and the mathematical model for calibrating the 3D rotation difference is used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device.
It should be noted that the expression of the established mathematical model for calibrating the 3D rotation difference is:

q_gyro = f(q_raw) = q_b′ · q_a′ · q_raw · (q_a′)⁻¹

where q_a′ denotes the optimized value of q_a and q_b′ denotes the optimized value of q_b. For the construction process of the mathematical model, reference may be made to steps S110 to S140 in the first embodiment, which are not repeated here.
Step S230, converting the first detection data into corresponding first conversion data according to the mathematical model for calibrating the 3D rotation difference, so as to complete the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device.
In the present embodiment, the first detection data q_raw = (w_raw, x_raw, y_raw, z_raw) is substituted into the mathematical model for calibrating the 3D rotation difference, obtaining the first conversion data q_gyro = (w_gyro, x_gyro, y_gyro, z_gyro). The first conversion data serves as the calibrated target data that makes the coordinate directions and the zero degree reference position of the motion detection device and the 3D measurement device consistent; once the first conversion data is obtained, the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device is complete.
Step S240, compare the first conversion data with the second detection data to determine the accuracy of the second detection data q_3D. In a specific embodiment, referring to FIG. 5, step S240 may include steps S241-S244, described below.
Step S241, compare the first conversion data q_gyro with the second detection data q_3D to obtain a comparison result. Step S242, judge from the comparison result whether the second detection data q_3D is credible; if the second detection data q_3D is determined not to be credible, proceed to step S244, otherwise proceed to step S243.
In one embodiment, the comparison result, such as the difference between the two, may be compared with a preset threshold range; if the comparison result falls within the threshold range, the second detection data q_3D is determined to be credible, and otherwise it is determined not to be credible.
Step S243, when the second detection data q_3D is determined to be credible, directly output the second detection data q_3D.
Step S244, when the second detection data q_3D is determined not to be credible, output the first conversion data q_gyro in place of the second detection data q_3D; in this way, the purpose of calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device is achieved.
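The credibility gate of steps S241-S244 can be sketched as follows; the function name and the threshold value are illustrative assumptions, since the patent leaves the threshold range unspecified:

```python
import numpy as np

def select_output(q_gyro, q_3d, threshold=0.1):
    # Steps S241-S244: compare the calibrated gyro quaternion with the
    # optical measurement; output q_3d when credible, otherwise fall back
    # to q_gyro. The min over the sign accounts for q and -q encoding the
    # same rotation.
    diff = min(np.linalg.norm(q_3d - q_gyro),
               np.linalg.norm(q_3d + q_gyro))
    return q_3d if diff <= threshold else q_gyro
```

When the two measurements agree, the 3D measurement is passed through; when they diverge beyond the threshold, the converted gyro data is output instead.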
Embodiment III
Referring to fig. 6, the present application discloses a 3D rotation difference calibration apparatus 3, which mainly includes an obtaining unit 31, a constructing unit 32, and a calibrating unit 33, which are respectively described below.
The acquiring unit 31 is used for acquiring first detection data of a motion detection device and second detection data of a 3D measurement device. As shown in fig. 6, the motion detection device is disposed on an object to be detected, and is configured to detect a three-dimensional motion state of the object to be detected; and the 3D measuring device is arranged on one side of the object to be detected and is also used for measuring the three-dimensional motion state of the object to be detected. Regarding the specific functions of the acquiring unit 31, reference may be made to relevant contents in step S210, and detailed description is not given here.
The construction unit 32 is connected to the obtaining unit 31, and is configured to obtain a mathematical model for calibrating the 3D rotation difference according to the mathematical model construction method disclosed in the first embodiment. The mathematical model for calibrating 3D rotation differences is used to calibrate the 3D rotation measurement differences between the motion detection means and the 3D measurement means. Regarding the specific functions of the constructing unit 32, reference may be made to the relevant contents in step S220, and detailed description is not given here.
The calibration unit 33 is connected to the construction unit 32 for converting the first detection data into corresponding first conversion data according to the established mathematical model for calibrating the 3D rotational difference to complete the calibration of the 3D rotational difference between the motion detection device and the 3D measurement device. Wherein the first converted data is used for comparing with the second detection data to determine the accuracy of the second detection data. Regarding the specific functions of the calibration unit 33, reference may be made to relevant contents in steps S230-S240, and detailed description is omitted here.
Embodiment IV
Referring to fig. 7, the present application further discloses a device 4, wherein the device 4 may include a memory 41 and a processor 42, and a 3D rotation difference calibration method program stored on the memory 41 and executable on the processor 42. It should be noted that the 3D rotation difference calibration method program herein can implement the steps (i.e., steps S210-S240) of the 3D rotation difference calibration method in the second embodiment when executed by the processor 42.
It should be noted that, in the present embodiment, the 3D rotation difference calibration method program stored on the memory 41 may include several functional modules, such as the acquiring unit 31, the constructing unit 32 and the calibration unit 33 described in the third embodiment. The acquiring unit 31 is configured to acquire first detection data of a motion detection device and second detection data of a 3D measurement device; the constructing unit 32 is connected to the acquiring unit 31 and is configured to establish the mathematical model for calibrating the 3D rotation difference according to the mathematical model construction method disclosed in the first embodiment; the calibration unit 33 is connected to the constructing unit 32 and is configured to convert the first detection data into corresponding first conversion data according to the established mathematical model for calibrating the 3D rotation difference, so as to complete the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device.
Those skilled in the art will appreciate that all or part of the functions of the various methods in the above embodiments may be implemented by hardware, or may be implemented by computer programs. When all or part of the functions of the above embodiments are implemented by a computer program, the program may be stored in a computer-readable storage medium, and the storage medium may include: a read only memory, a random access memory, a magnetic disk, an optical disk, a hard disk, etc., and the program is executed by a computer to realize the above functions. For example, the program may be stored in a memory of the device, and when the program in the memory is executed by the processor, all or part of the functions described above may be implemented. In addition, when all or part of the functions in the above embodiments are implemented by a computer program, the program may be stored in a storage medium such as a server, another computer, a magnetic disk, an optical disk, a flash disk, or a removable hard disk, and may be downloaded or copied to a memory of a local device, or may be version-updated in a system of the local device, and when the program in the memory is executed by a processor, all or part of the functions in the above embodiments may be implemented.
The present invention has been described in terms of specific examples, which are provided to aid understanding of the invention and are not intended to be limiting. For a person skilled in the art to which the invention pertains, several simple deductions, modifications or substitutions may be made according to the idea of the invention.

Claims (8)

1. A mathematical model construction method for calibrating 3D rotation difference is characterized by comprising the following steps:
acquiring first detection data of a motion detection device and second detection data of a 3D measurement device; the motion detection device is arranged on an object to be detected, and the motion detection device and the 3D measurement device are both used for detecting the three-dimensional motion state of the object to be detected;
presetting a conversion function model by which the first detection data can be converted into corresponding first conversion data, the conversion function model performing 3D rotation difference conversion on the first detection data by redefining a coordinate system conversion coefficient q_a of the motion detection apparatus and redefining a zero degree reference position conversion coefficient q_b of the motion detection apparatus; wherein the step of converting the first detection data into corresponding first conversion data by the preset conversion function model comprises:
letting the conversion function model be denoted as q_gyro = f(q_raw), wherein q_gyro denotes the first conversion data after conversion of said first detection data, and q_raw denotes said first detection data;
the preset process of the conversion function model comprises the following steps:
redefining the coordinate system conversion coefficient of the motion detection device, q_a = (w_a, x_a, y_a, z_a);

converting a coordinate system of the first detection data according to said coefficient q_a, so that the coordinate directions of the motion detection device are rotated to be consistent with the coordinate directions of the 3D measurement device, and obtaining the data after the coordinate system conversion: q_raw′ = q_a · q_raw · q_a⁻¹, wherein q_raw represents the first detection data, q_raw = (w_raw, x_raw, y_raw, z_raw), and q_a⁻¹ is the inverse of q_a;
redefining the zero degree reference position conversion coefficient of the motion detection device, q_b = (w_b, x_b, y_b, z_b);

performing zero degree reference position conversion on the data after the coordinate system conversion according to said coefficient q_b, so that the zero degree reference positions of the motion detection device and the 3D measurement device are consistent along with the coordinate directions, and obtaining the data after the 3D rotation difference conversion is performed on the first detection data: q_gyro = q_b · q_raw′;
building the conversion function model, denoted as q_gyro = f(q_raw) = q_b · q_a · q_raw · q_a⁻¹, and taking q_gyro = (w_gyro, x_gyro, y_gyro, z_gyro) as the first conversion data;
establishing a loss function according to the first detection data, the second detection data and the first conversion data, and performing an optimization solution on said coefficient q_a and said coefficient q_b using a nonlinear least squares method;
the coefficient q of the optimization solutionaAnd said coefficient qbAnd introducing the data into the transfer function model, and establishing a mathematical model for obtaining a calibration 3D rotation difference, wherein the mathematical model for calibrating the 3D rotation difference is used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device.
2. The mathematical model construction method of claim 1, wherein establishing a loss function according to said first detection data, said second detection data and said first conversion data, and performing an optimization solution on said coefficient q_a and said coefficient q_b using a nonlinear least squares method, comprises:
defining the loss function as:

Loss = Σ [(w_3D − w_gyro)² + (x_3D − x_gyro)² + (y_3D − y_gyro)² + (z_3D − z_gyro)²] + (w_a² + x_a² + y_a² + z_a² − 1)² + (w_b² + x_b² + y_b² + z_b² − 1)²

wherein the sum runs over the collected pairs of detection data, q_3D = (w_3D, x_3D, y_3D, z_3D) represents the second detection data, q_raw represents the first detection data, w_gyro, x_gyro, y_gyro, z_gyro represent the first conversion data obtained from the conversion function model and the first detection data, and w_a, x_a, y_a, z_a and w_b, x_b, y_b, z_b respectively represent the data values of said coefficient q_a and said coefficient q_b;
substituting the corresponding data values into the loss function, and using the nonlinear least squares method to optimize and solve for the values of said coefficient q_a and said coefficient q_b at which the value of the loss function is minimized.
3. The mathematical model construction method of claim 2, wherein introducing the optimized and solved coefficient q_a and coefficient q_b into the conversion function model and establishing the mathematical model for calibrating the 3D rotation difference comprises:

introducing the optimized and solved coefficient q_a and coefficient q_b into the conversion function model q_gyro = f(q_raw), and obtaining the mathematical model for calibrating the 3D rotation difference, expressed by the formula:

q_gyro = f(q_raw) = q_b′ · q_a′ · q_raw · (q_a′)⁻¹

wherein q_a′ denotes the optimized value of q_a, and q_b′ denotes the optimized value of q_b.
4. A method for calibrating 3D rotational disparity, comprising:
acquiring first detection data of a motion detection device and second detection data of a 3D measurement device; the motion detection device is arranged on an object to be detected, and the motion detection device and the 3D measurement device are both used for detecting the three-dimensional motion state of the object to be detected;
establishing a mathematical model resulting in a calibrated 3D rotation difference according to the mathematical model construction method of any of claims 1-3, the mathematical model of calibrated 3D rotation difference being used to calibrate the 3D rotation measurement difference between the motion detection device and the 3D measurement device;
and converting the first detection data into corresponding first conversion data according to the mathematical model for calibrating the 3D rotation difference so as to finish the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device, wherein the first conversion data is used for comparing with the second detection data so as to determine the accuracy of the second detection data.
5. The calibration method of claim 4, wherein the first converted data is used to compare with the second detected data to determine the accuracy of the second detected data, comprising:
comparing the first conversion data with the second detection data, judging whether the second detection data is credible according to the comparison result, and, if the second detection data is determined not to be credible, replacing the second detection data with the first conversion data for output.
6. A device for calibrating 3D rotational disparity, comprising:
an acquisition unit for acquiring first detection data of a motion detection device and second detection data of a 3D measurement device; the motion detection device is arranged on an object to be detected, and the motion detection device and the 3D measurement device are both used for detecting the three-dimensional motion state of the object to be detected;
a construction unit for building a mathematical model resulting in a calibrated 3D rotation difference according to the mathematical model construction method of any of claims 1-3, the mathematical model of calibrated 3D rotation difference being used for calibrating the 3D rotation measurement difference between the motion detection device and the 3D measurement device;
and the calibration unit is used for converting the first detection data into corresponding first conversion data according to the mathematical model of the calibration 3D rotation difference so as to finish the calibration of the 3D rotation difference between the motion detection device and the 3D measurement device, and the first conversion data is used for comparing with the second detection data so as to determine the accuracy of the second detection data.
7. A device for calibrating 3D rotational differences, characterized in that it comprises: memory, a processor and a 3D rotational disparity calibration method program stored on the memory and executable on the processor, the 3D rotational disparity calibration method program when executed by the processor implementing the steps of the 3D rotational disparity calibration method according to any of claims 4-5.
8. A computer-readable storage medium, characterized by comprising a program executable by a processor to implement the method of any one of claims 1-5.
CN201910753049.7A 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof Active CN110567486B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110215033.8A CN113008271A (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof
CN201910753049.7A CN110567486B (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910753049.7A CN110567486B (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110215033.8A Division CN113008271A (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Publications (2)

Publication Number Publication Date
CN110567486A CN110567486A (en) 2019-12-13
CN110567486B true CN110567486B (en) 2021-04-13

Family

ID=68775478

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110215033.8A Pending CN113008271A (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof
CN201910753049.7A Active CN110567486B (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110215033.8A Pending CN113008271A (en) 2019-08-15 2019-08-15 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Country Status (1)

Country Link
CN (2) CN113008271A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113008271A (en) * 2019-08-15 2021-06-22 深圳市瑞立视多媒体科技有限公司 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114739313B (en) * 2022-02-18 2023-04-18 西北大学 Calibration method of three-dimensional laser scanner

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012202749A (en) * 2011-03-24 2012-10-22 Yokogawa Denshikiki Co Ltd Orientation detection device
CN103337066A (en) * 2013-05-27 2013-10-02 清华大学 Calibration method for 3D (three-dimensional) acquisition system
CN103424114A (en) * 2012-05-22 2013-12-04 同济大学 Visual navigation/inertial navigation full combination method
CN103728647A (en) * 2013-12-20 2014-04-16 西安电子工程研究所 Projectile roll angle measurement method based on satellite carrier signal modulation
CN107167130A (en) * 2017-05-18 2017-09-15 上海谦尊升网络科技有限公司 Map match localization method and system
CN107796391A (en) * 2017-10-27 2018-03-13 哈尔滨工程大学 Strapdown inertial navigation system/visual odometry combined navigation method

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1026154C (en) * 1990-01-24 1994-10-05 李秋山 Three-dimensional synchronous interpretation system
CN101539405B (en) * 2009-04-09 2011-03-30 南京航空航天大学 Multiple-view measurement data self-matching method based on attitude sensor
EP2325612B1 (en) * 2009-11-13 2016-01-27 SICK Sensors Ltd. Method for calibrating a rotational angle sensor by means of a laser gyroscope
CN102297693B (en) * 2010-06-24 2013-03-27 鼎亿数码科技(上海)有限公司 Method for measuring position and azimuths of object
JP2012035399A (en) * 2010-08-11 2012-02-23 Shin Nippon Koki Co Ltd Correction matrix derivation device, error correction device, and machine tool
KR101297317B1 (en) * 2011-11-30 2013-08-16 한국과학기술연구원 Calibration Method of Motion Sensor for Motion Tracking
CN102997919B (en) * 2012-11-22 2015-07-15 北京理工大学 Method for improving error inhibition effect of rotary type strapdown inertial navigation by insulation of carrier movement
KR101405317B1 (en) * 2013-01-15 2014-06-11 한국과학기술연구원 Assist Apparatus for calibarating Camera Sensor and Laser Sensor, and Sensor Calibration System and Method using the same
CN103411623B (en) * 2013-07-10 2015-12-23 西北工业大学 Rate gyro calibration steps
AT516248B1 (en) * 2014-12-12 2016-04-15 System 7 Railsupport Gmbh Method for calibrating a device for measuring tracks
CN104764443B (en) * 2015-04-24 2017-07-18 国家测绘地理信息局卫星测绘应用中心 A kind of tight imaging geometry model building method of Optical remote satellite
US20170050080A1 (en) * 2015-08-19 2017-02-23 Seiko Epson Corporation Pedaling measurement apparatus, pedaling measurement system, pedaling measurement method, and recording medium
CN105180937B (en) * 2015-10-15 2018-01-02 常熟理工学院 A kind of MEMS IMU Initial Alignment Methods
CN105606127A (en) * 2016-01-11 2016-05-25 北京邮电大学 Calibration method for relative attitude of binocular stereo camera and inertial measurement unit
CN106052683A (en) * 2016-05-25 2016-10-26 速感科技(北京)有限公司 Robot motion attitude estimating method
US10012517B2 (en) * 2016-08-01 2018-07-03 Infinity Augmented Reality Israel Ltd. Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data
CN107782309A (en) * 2017-09-21 2018-03-09 天津大学 Non-inertial-frame vision and dual-gyroscope multi-rate CKF fusion attitude measurement method
CN109062048B (en) * 2018-08-20 2019-07-19 西北工业大学 The nonsingular sliding-mode control of MEMS gyroscope default capabilities based on Hybrid Learning
CN109405850A (en) * 2018-10-31 2019-03-01 张维玲 Inertial navigation positioning calibration method and system based on vision and prior knowledge
CN113008271A (en) * 2019-08-15 2021-06-22 深圳市瑞立视多媒体科技有限公司 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012202749A (en) * 2011-03-24 2012-10-22 Yokogawa Denshikiki Co Ltd Orientation detection device
CN103424114A (en) * 2012-05-22 2013-12-04 同济大学 Visual navigation/inertial navigation full combination method
CN103337066A (en) * 2013-05-27 2013-10-02 清华大学 Calibration method for 3D (three-dimensional) acquisition system
CN103728647A (en) * 2013-12-20 2014-04-16 西安电子工程研究所 Projectile roll angle measurement method based on satellite carrier signal modulation
CN107167130A (en) * 2017-05-18 2017-09-15 上海谦尊升网络科技有限公司 Map match localization method and system
CN107796391A (en) * 2017-10-27 2018-03-13 哈尔滨工程大学 Strapdown inertial navigation system/visual odometry combined navigation method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Adaptive control of kinematically redundant robots;SHENGWU LUO 等;《IMA Journal of Mathematical Control & Information (19997)》;19971231;第14卷;第225-253页 *
Static calibration technology for attitude-detection geomagnetic sensors; LIU Jianjing et al.; Journal of Nanjing University of Science and Technology; 2012-02-29; Vol. 36, No. 1, pp. 127-131 *


Also Published As

Publication number Publication date
CN110567486A (en) 2019-12-13
CN113008271A (en) 2021-06-22

Similar Documents

Publication Publication Date Title
Hol et al. Modeling and calibration of inertial and vision sensors
Zhang Two-step calibration methods for miniature inertial and magnetic sensor units
WO2011091083A1 (en) Apparatus and methodology for calibration of a gyroscope and a compass included in a handheld device
Klug et al. A complete workflow for automatic forward kinematics model extraction of robotic total stations using the Denavit-Hartenberg convention
CN104715469A (en) Data processing method and electronic device
CN110567486B (en) Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof
Wu et al. A novel calibration method for non-orthogonal shaft laser theodolite measurement system
El-Ashmawy Using direct linear transformation (DLT) method for aerial photogrammetry applications
Schreve How accurate can a stereovision measurement be?
Alves et al. Camera-inertial sensor modelling and alignment for visual navigation
de Araujo et al. Cloud-based approach for automatic CNC workpiece origin localization based on image analysis
CN109506617B (en) Sensor data processing method, storage medium, and electronic device
US10267649B2 (en) Method and apparatus for calculating azimuth
Avrutov et al. Calibration of an inertial measurement unit
Wang et al. Calibration technology of the articulated arm flexible CMM
CN105758422B (en) A kind of test method of integration type closed-loop fiber optic gyroscope
CN113436272B (en) Spherical refraction correction method for underwater stereoscopic vision system and electronic equipment
Wang An efficient coordinate frame calibration method for 3-D measurement by multiple camera systems
KR100735494B1 (en) Azimuth measurement device and azimuth measurement method
JP6872324B2 (en) Measurement system, measurement method and measurement program
Schmitt et al. Estimation of the absolute camera pose for environment recognition of industrial robotics
CN115839726B (en) Method, system and medium for jointly calibrating magnetic sensor and angular velocity sensor
Dolereit et al. Calibration of shared flat refractive stereo systems
Zhu et al. Uncertainty and sensitivity analysis for camera calibration
CN116592861B (en) Magnetic compass calibration model construction method, magnetic compass calibration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant