CN111275769B - Monocular vision parameter correction method and device - Google Patents

Monocular vision parameter correction method and device

Info

Publication number
CN111275769B
CN111275769B (application CN202010052244.XA)
Authority
CN
China
Prior art keywords
data
external parameter
image acquisition
scale factor
measurement unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010052244.XA
Other languages
Chinese (zh)
Other versions
CN111275769A (en)
Inventor
范锡睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202010052244.XA priority Critical patent/CN111275769B/en
Publication of CN111275769A publication Critical patent/CN111275769A/en
Application granted granted Critical
Publication of CN111275769B publication Critical patent/CN111275769B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Navigation (AREA)

Abstract

The application provides a method for correcting monocular vision parameters, which comprises the following steps: determining first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit relative to an inertial measurement unit in a device; comparing the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit; and correcting the input data at the next moment based on the first scale factor. The application further provides an apparatus for correcting monocular vision parameters.

Description

Monocular vision parameter correction method and device
Technical Field
The present application relates to visual parameter correction technology, and in particular to a method and an apparatus for correcting monocular vision parameters.
Background
Augmented Reality (AR) is a technology that seamlessly integrates virtual information with the real world. In the AR field, the monocular Visual-Inertial Odometry (VIO) system is the dominant solution for Simultaneous Localization and Mapping (SLAM). In practical applications, the scale of an Inertial Measurement Unit (IMU) is inherently uncertain and changes continuously as the system runs, so the trajectory obtained by a monocular VIO system often faces the problem of scale drift. That is, the trajectory output by the monocular VIO system is accurate in shape, but larger or smaller in size than the actual motion trajectory.
Disclosure of Invention
In view of the above, embodiments of the present application desirably provide a method and an apparatus for correcting monocular vision parameters, capable of solving the problem of trajectory scale drift caused by an inaccurate IMU scale.
In order to achieve the above purpose, the technical solution of the application is realized as follows:
According to an aspect of the present application, there is provided a method of correcting monocular vision parameters, including:
determining first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit relative to an inertial measurement unit in a device;
comparing the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
and correcting the input data at the next moment based on the first scale factor.
In the above solution, before the determining of the first external parameter data corresponding to the current moment, the method further includes:
calibrating the image acquisition unit and the inertial measurement unit to obtain the second external parameter data, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit.
In the above solution, the determining of the first external parameter data corresponding to the current moment includes:
acquiring image data acquired by the image acquisition unit at the current moment;
determining first pose data corresponding to a three-dimensional space point at the current moment based on the three-dimensional space point in the image data and the second external parameter data;
and comparing the first pose data with second pose data corresponding to the three-dimensional space point at an adjacent moment to obtain the first external parameter data.
In the above solution, the comparing of the first external parameter data with the second external parameter data to obtain the first scale factor corresponding to the current moment includes:
determining a scale change value between the first external parameter data and the second external parameter data;
and determining the scale change value as the first scale factor corresponding to the current moment.
In the above solution, the correcting of the input data at the next moment based on the first scale factor includes:
acquiring motion data acquired by the inertial measurement unit at the next moment;
and determining real trajectory data corresponding to the motion data based on the motion data and the first scale factor.
In the above solution, the determining of the real trajectory data corresponding to the motion data based on the motion data and the first scale factor includes:
calculating a quotient between the motion data and the first scale factor;
and correcting the motion data based on the quotient to obtain the real trajectory data corresponding to the motion data.
In the above solution, the motion data includes at least one of: acceleration data, angular velocity data, and distance data between the current moment and the previous moment.
In the above solution, the method further includes:
acquiring corresponding third external parameter data at the next moment, wherein the third external parameter data characterizes the positional relationship of the image acquisition unit relative to the inertial measurement unit at the next moment;
comparing the third external parameter data with the first external parameter data to obtain a second scale factor corresponding to the next moment;
and correcting the input data at the next moment based on the second scale factor.
According to another aspect of the present application, there is provided an apparatus for correcting monocular vision parameters, including:
a determining unit, configured to determine first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit in a device relative to an inertial measurement unit;
a comparing unit, configured to compare the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
and a correcting unit, configured to correct the input data at the next moment based on the first scale factor.
According to a third aspect of the present application, there is provided an apparatus for correcting monocular vision parameters, including: a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor, when executing the program, implements the steps of the correction method according to any one of claims 1 to 8.
The application provides a method and an apparatus for correcting monocular vision parameters. First external parameter data corresponding to the current moment is determined, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit relative to an inertial measurement unit in a device; the first external parameter data is compared with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit; and the input data at the next moment is corrected based on the first scale factor. In this way, the motion data acquired by the inertial measurement unit at the next moment is corrected by combining the image data acquired by the image acquisition unit with the motion data acquired by the inertial measurement unit, so that the problem of trajectory scale drift caused by an inaccurate IMU scale can be solved without additional hardware cost, improving the stability and accuracy of the device.
Drawings
FIG. 1 is a first schematic flowchart of a method for correcting monocular vision parameters according to the present application;
FIG. 2 is a second schematic flowchart of a method for correcting monocular vision parameters according to the present application;
FIG. 3 is a first schematic diagram of the structural composition of an apparatus for correcting monocular vision parameters according to the present application;
FIG. 4 is a second schematic diagram of the structural composition of an apparatus for correcting monocular vision parameters according to the present application;
FIG. 5 is a third schematic diagram of the structural composition of an apparatus for correcting monocular vision parameters according to the present application.
Detailed Description
The technical solution of the application is further elaborated below with reference to the drawings in the specification and to specific embodiments.
FIG. 1 is a first schematic flowchart of a method for correcting monocular vision parameters according to the present application. As shown in FIG. 1, the method includes:
Step 101: determining first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit in a device relative to an inertial measurement unit.
Here, the method is mainly applied to an AR device, which may be a mobile phone, AR glasses, a tablet computer, a smart watch, a projector, or the like. The AR device may include an image acquisition unit, through which image data corresponding to the AR device at the current moment may be acquired, and an Inertial Measurement Unit (IMU), through which motion data of the AR device at the current moment may be acquired. For example, the motion data includes acceleration data, angular velocity data, distance data between the current moment and the previous moment, and the like.
In the application, the AR device acquires the image data corresponding to the current moment through the image acquisition unit, and the first external parameter data of the image acquisition unit relative to the inertial measurement unit at the current moment can be determined from that image data together with the initial positional relationship of the image acquisition unit relative to the inertial measurement unit.
Here, the first external parameter data characterizes the positional relationship of the image acquisition unit relative to the inertial measurement unit in the AR device at the current moment. The initial positional relationship of the image acquisition unit relative to the inertial measurement unit may include a three-dimensional rotational angle relationship, a three-dimensional translational distance relationship, and the like.
The image acquisition unit may specifically be at least one of a monocular camera, an image acquisition card, an image acquisition chip, and the like.
When the AR device determines the first external parameter data corresponding to the current moment, image data at the current moment is acquired through the image acquisition unit in the AR device and projected into the camera coordinate system, to obtain the three-dimensional space point corresponding to the image data in the camera coordinate system; based on the three-dimensional space point and the second external parameter data of the image acquisition unit relative to the inertial measurement unit, the first pose data corresponding to the three-dimensional space point at the current moment can be determined; and the first pose data is compared with second pose data corresponding to the three-dimensional space point at an adjacent moment to obtain the first external parameter data.
Here, the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit. The second pose data may be preset initial pose data, or pose data obtained in the same manner as the first pose data.
In the application, before determining the first external parameter data of the image acquisition unit relative to the inertial measurement unit at the current moment, the AR device also needs to calibrate the image acquisition unit and the inertial measurement unit, and obtains the second external parameter data of the image acquisition unit relative to the inertial measurement unit from the calibration result, wherein the second external parameter data represents the initial positional relationship of the image acquisition unit relative to the inertial measurement unit. The second external parameter data includes a three-dimensional rotation matrix R and a three-dimensional translation vector T of the image acquisition unit relative to the inertial measurement unit. Through the three-dimensional rotation matrix R and the three-dimensional translation vector T, image feature point data in the coordinate system of the image acquisition unit at the current moment can be converted into image feature point data in the coordinate system of the Inertial Measurement Unit (IMU).
For example, after the AR device is assembled, the three-dimensional rotation matrix R (which may equivalently be represented as a quaternion) and the three-dimensional translation vector T of the image acquisition unit relative to the inertial measurement unit are obtained by calibrating the image acquisition unit and the inertial measurement unit. Image data at the current moment is then acquired by the image acquisition unit; by projecting the image data into the camera coordinate system, the three-dimensional space point p_c of the image data in the camera coordinate system can be obtained, and p_c can then be converted via R and T into the IMU coordinate system, yielding the three-dimensional space point p_b. The conversion formula may be: p_b = R * p_c + T, where R and T are the initial extrinsic values of the image acquisition unit relative to the inertial measurement unit.
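A minimal sketch of this camera-to-IMU conversion is given below; the rotation and translation values are placeholders standing in for calibration output, not values from the application:

```python
import numpy as np

# Placeholder extrinsics standing in for the calibrated values:
# rotation R and translation T of the camera relative to the IMU.
R = np.eye(3)                      # assumed rotation matrix
T = np.array([0.02, 0.0, 0.01])    # assumed translation vector, in metres

def camera_to_imu(p_c: np.ndarray) -> np.ndarray:
    """Apply the conversion formula p_b = R * p_c + T."""
    return R @ p_c + T

p_c = np.array([0.5, -0.1, 2.0])   # a 3D point in the camera coordinate system
p_b = camera_to_imu(p_c)           # the same point in the IMU coordinate system
```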
In the application, the image acquisition unit and the inertial measurement unit are calibrated; the calibration can be performed with the calibration tool Kalibr. The Kalibr calibration process is as follows:
First, install the Kalibr tool.
After the Kalibr tool is installed, prepare the calibration target. Three kinds of calibration targets are supported by Kalibr: Aprilgrid, Checkerboard, and Circlegrid. The Checkerboard is the most commonly used, but the Aprilgrid is the most accurate, because it provides tag index information that prevents jumps during pose estimation.
Second, print the calibration plate. Print the calibration plate required for Kalibr calibration, and measure the side lengths and spacing of the checkerboard squares on the plate.
Third, acquire data. After the calibration plate is prepared, the image acquisition unit in the AR device is used to record the calibration plate, shooting it from multiple angles to obtain the calibration data. Here, the calibration data includes image data of the calibration plate, with the image feature points of the image data in the camera coordinate system paired with the corresponding IMU data in the IMU coordinate system.
Fourth, output the external parameter data. The measured checkerboard length data and the acquired calibration data are input into the Kalibr tool, which computes and outputs the external parameter data between the image acquisition unit and the IMU in the AR device.
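For reference, a typical invocation of Kalibr's camera-IMU calibrator is sketched below. The executable name and flags follow Kalibr's commonly documented usage, but exact names and the configuration file contents depend on the installed Kalibr version, so this is an assumed example rather than a guaranteed command line:

```python
import subprocess

# Run Kalibr's camera-IMU calibration on the recorded data. The YAML file
# names here are hypothetical: the target file describes the measured
# calibration-plate geometry, camchain the camera model, imu the IMU noise.
subprocess.run([
    "rosrun", "kalibr", "kalibr_calibrate_imu_camera",
    "--target", "aprilgrid.yaml",   # target geometry from the print step
    "--cam", "camchain.yaml",       # camera intrinsics and model
    "--imu", "imu.yaml",            # IMU noise parameters
    "--bag", "calib_data.bag",      # multi-angle recording from the data step
], check=True)
```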
Step 102: comparing the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit.
In the application, the positional relationship of the image acquisition unit relative to the inertial measurement unit changes slightly during the actual movement of the AR device, so the AR device needs to continuously optimize the external parameter data of the image acquisition unit relative to the inertial measurement unit during operation, and the optimized result stays in the vicinity of the initial extrinsic values. Equivalently, the VIO system installed in the AR device requires continuous optimization of the external parameter data of the image acquisition unit relative to the inertial measurement unit during operation.
When the AR device moves, the motion information obtained by the Inertial Measurement Unit (IMU) in the AR device may undergo scale drift relative to the actual motion. That is, the motion information of the AR device obtained by the IMU (including the acceleration a, the velocity v, and the distance d) differs from the actual motion information by a scale factor λ, becoming λa, λv, and λd. The external parameter data of the image acquisition unit relative to the inertial measurement unit in the AR device also changes with the change of the motion scale obtained by the IMU, so after the optimization of the VIO system the scale factor also acts on the external parameter data, and the extrinsic translation T becomes λT. Through the calibration of the position of the image acquisition unit relative to the inertial measurement unit, the original extrinsic values R and T can be obtained, and the real-time scale factor λ at each moment t can therefore be determined from them.
In the application, in order to eliminate scale drift of the movement trajectory of the AR device during actual motion, after the AR device obtains the first external parameter data of the image acquisition unit relative to the inertial measurement unit, it compares the first external parameter data with the second external parameter data to obtain the first scale factor corresponding to the current moment, and removes the influence of scale drift on the monocular VIO system in the AR device based on the first scale factor.
In the present application, when the AR device compares the first external parameter data with the second external parameter data to obtain the first scale factor corresponding to the current moment, the AR device may specifically determine the scale change value between the first external parameter data and the second external parameter data, and then determine that scale change value as the first scale factor corresponding to the current moment.
For example, the AR device may calculate the quotient between the first external parameter data and the second external parameter data, and use the calculated quotient as the first scale factor corresponding to the current moment.
In this case, the quotient is the scale change value between the first and second external parameter data.
Alternatively, the difference between the first external parameter data and the second external parameter data may be calculated and used as the first scale factor corresponding to the current moment.
In this case, the difference is the scale change value between the first and second external parameter data.
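A minimal sketch of the quotient variant follows. Since the application does not pin down how a single quotient is taken from vector-valued extrinsics, the sketch assumes the scale change is measured on the extrinsic translation vectors, reduced to their Euclidean norms:

```python
import numpy as np

def first_scale_factor(T_current: np.ndarray, T_initial: np.ndarray) -> float:
    """Scale change value as the quotient |T(t)| / |T0| of the current
    (first) and initial (second) extrinsic translation magnitudes."""
    return float(np.linalg.norm(T_current) / np.linalg.norm(T_initial))

T0 = np.array([0.020, 0.000, 0.010])   # second external parameter data (calibrated)
Tt = np.array([0.022, 0.000, 0.011])   # first external parameter data (optimized)
lam = first_scale_factor(Tt, T0)       # 1.1: the trajectory runs 10% too large
```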
Step 103: correcting the input data at the next moment based on the first scale factor.
Here, when the AR device corrects the input data at the next moment based on the first scale factor, the motion data of the AR device acquired by the inertial measurement unit in the AR device at the next moment is obtained, and the real trajectory data corresponding to that motion data is determined based on the motion data and the first scale factor.
Here, the motion data includes at least one of: acceleration data, angular velocity data, and distance data between the current moment and the previous moment.
In the application, when the AR device determines the real trajectory data corresponding to the motion data based on the motion data and the first scale factor, the quotient between the motion data and the first scale factor may be calculated, and the motion data is corrected based on the quotient to obtain the real trajectory data corresponding to the motion data.
For example, the first scale factor corresponding to the current moment is fed back to the input data of the IMU, and the motion data obtained by the IMU at the next moment is divided by the first scale factor; the influence of scale drift on the VIO system can thereby be removed, yielding a true and accurate trajectory.
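A sketch of this feedback step is given below; the dictionary layout of the IMU sample is an assumption made for illustration:

```python
import numpy as np

def correct_motion(motion: dict, lam: float) -> dict:
    """Divide each IMU quantity by the scale factor so that the drifted
    values (lambda*a, lambda*v, lambda*d) recover a, v, and d."""
    return {name: np.asarray(values) / lam for name, values in motion.items()}

imu_sample = {                          # hypothetical IMU output at the next moment
    "acceleration": [0.11, 0.0, 9.92],  # lambda * a
    "velocity":     [0.55, 0.0, 0.0],   # lambda * v
    "distance":     [0.011],            # lambda * d since the previous moment
}
true_motion = correct_motion(imu_sample, lam=1.1)
```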
In the application, since scale drift correction needs to be performed on the movement trajectory of the AR device at every moment, the AR device may also acquire corresponding third external parameter data at the next moment, wherein the third external parameter data characterizes the positional relationship of the image acquisition unit relative to the inertial measurement unit in the AR device at the next moment; the third external parameter data is then compared with the first external parameter data to obtain a second scale factor corresponding to the next moment; and the input data at the next moment is corrected based on the second scale factor.
Here, the moment corresponding to the first external parameter data and the moment corresponding to the third external parameter data are adjacent moments.
According to the method for correcting monocular vision parameters provided by the application, the motion data acquired by the inertial measurement unit at the next moment is corrected by combining the image data acquired by the image acquisition unit with the motion data acquired by the inertial measurement unit, so that the problem of trajectory scale drift caused by an inaccurate IMU scale can be solved without additional hardware cost, improving the stability and accuracy of the device.
FIG. 2 is a second schematic flowchart of a method for correcting monocular vision parameters according to the present application. As shown in FIG. 2, the method includes the following steps:
Step 201: calibrating the image acquisition unit and the inertial measurement unit in the device, and obtaining the initial extrinsic relation of the image acquisition unit in the device relative to the inertial measurement unit.
Specifically, an AR device with a VIO system is calibrated, and the initial extrinsic relation T0 of the camera in the AR device relative to the IMU is obtained.
Here, the purpose of calibrating the VIO system is to obtain the three-dimensional rotation matrix R and the three-dimensional translation vector T. That is, the initial extrinsics T0 of the VIO system include at least a three-dimensional rotation matrix R and a three-dimensional translation vector T, and the coordinate conversion relationship from the camera to the IMU can be determined by calibrating the VIO system.
For example: given a three-dimensional point p_c in the camera coordinate system, p_c can be converted to the point p_b in the IMU coordinate system through the transformation R and T. The correspondence is p_b = R * p_c + T.
In the application, the extrinsic calibration of the VIO system can be performed with the calibration tool Kalibr. The specific steps are as follows:
1. Print the calibration plate required for Kalibr calibration, and measure the side lengths and spacing of at least one checkerboard on the plate;
2. Acquire calibration data against the checkerboard using the image acquisition unit in the AR device to be calibrated; the calibration data comprises the image data and the corresponding IMU data (i.e., motion data, including angular velocity, acceleration, and distance);
3. Input the measured length data and the calibration data into Kalibr, which computes and outputs the external parameter data between the image acquisition unit and the IMU.
Step 202: optimizing the extrinsic relation of the device to obtain the real-time extrinsic relation at the current moment.
Specifically, during operation of the AR device, the extrinsic relation of the VIO system is continuously optimized to obtain the real-time extrinsic relation T(t) at the current moment t.
Here, the VIO system needs to jointly optimize the image data and the IMU data when computing the pose. First, a model based on minimizing the reprojection error (Bundle Adjustment, BA) is constructed. The projection chain of the BA model is: camera coordinate system → IMU coordinate system → world coordinate system → IMU coordinate system → camera coordinate system. A pair of matched image points p_i and p_j at the i-th and j-th moments is first obtained through image matching, and p_i is then reprojected into the camera coordinate system at the j-th moment along the above chain, so that the reprojection of p_i and the actually observed image point p_j at the j-th moment form a residual. In this projection process, the image feature point p_i at the i-th moment is first back-projected into the camera coordinate system to obtain a three-dimensional space point; the point is converted into the IMU coordinate system by combining R and T; the pose R_i, T_i corresponding to the i-th moment then takes it into the world coordinate system; the pose R_j, T_j corresponding to the j-th moment takes it back into the IMU coordinate system; and finally R and T convert it into the camera coordinate system, yielding the predicted feature point at the j-th moment.
Thus, the first scale factor corresponding to the i-th moment can be obtained through the residual between the predicted feature point and the observed image point p_j, and the motion trajectory at the j-th moment can be corrected with this first scale factor.
Here, after the residual is obtained, it may be minimized by a gradient descent method, so as to obtain higher-precision poses R_i, T_i, R_j, and T_j.
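A condensed sketch of the reprojection residual follows. For brevity it collapses the camera → IMU → world chain into a single world-to-camera projection with an assumed pinhole intrinsic matrix K; in the full BA model the camera-IMU extrinsics R and T enter each conversion:

```python
import numpy as np

def project(p_w: np.ndarray, R_j: np.ndarray, t_j: np.ndarray,
            K: np.ndarray) -> np.ndarray:
    """Project a world point into the image at moment j using pose (R_j, t_j)
    and pinhole intrinsics K."""
    p_cam = R_j @ p_w + t_j
    uv = K @ p_cam
    return uv[:2] / uv[2]

def reprojection_residual(p_w, obs_j, R_j, t_j, K):
    """Residual between the point reprojected at moment j and the actually
    observed image point p_j; BA minimizes this over the poses."""
    return project(p_w, R_j, t_j, K) - obs_j

K = np.array([[500.0, 0.0, 320.0],     # assumed intrinsics
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
p_w = np.array([0.5, -0.1, 2.0])       # triangulated 3D point
r = reprojection_residual(p_w, np.array([450.0, 210.0]),
                          np.eye(3), np.zeros(3), K)
```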
Step 203: comparing the initial extrinsic relation with the real-time extrinsic relation to determine the scale change value corresponding to the current moment.
Specifically, the initial extrinsic relation T0 is compared with the real-time extrinsic relation T(t) to determine the magnitude λ(t) of the scale change corresponding to the current moment t. A specific calculation formula may be: λ(t) = T(t) / T0.
Here, the moment corresponding to the extrinsic relation T0 and the moment corresponding to the real-time extrinsic relation T(t) are adjacent moments.
Step 204: obtaining the motion data at the next moment, and processing the motion data at the next moment based on the scale change value corresponding to the current moment to obtain accurate motion trajectory data corresponding to the next moment.
Specifically, the IMU is used to acquire the motion data of the AR device at the next moment, and the motion data at the next moment is divided by λ(t) to obtain accurate motion trajectory data, thereby removing the effect of the motion scale factor on the VIO system.
Step 205: continuously updating the scale change value corresponding to the current moment.
Specifically, the flow returns to step 202 and λ(t) continues to be updated.
Here, when λ(t) is updated, the extrinsic relation adopted at the current moment is compared with the adjacent historical extrinsic relation to determine the scale magnitude corresponding to the current moment.
Compared with accurately calibrating the IMU with dedicated equipment before it is installed, or correcting the scale drift of the IMU with additional sensors (such as a binocular camera or a depth camera) after it is installed, as in the prior art, the correction method provided by the application removes the influence of scale drift on the VIO system by combining the IMU with a monocular camera. No additional hardware cost is needed, the scale drift problem can be solved even with a low-cost IMU, the cost is reduced, and the stability and accuracy of the VIO system can be improved.
FIG. 3 is a first schematic diagram of the structural composition of an apparatus for correcting monocular vision parameters according to the present application. As shown in FIG. 3, the apparatus includes:
a determining unit 301, configured to determine first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship between the image acquisition unit and the inertial measurement unit in the device;
a comparing unit 302, configured to compare the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
and a correcting unit 303, configured to correct the input data at the next moment based on the first scale factor.
FIG. 4 is a second schematic diagram of the structural composition of the apparatus for correcting monocular vision parameters according to the present application. As shown in FIG. 4, the apparatus includes:
a calibration unit 304, configured to calibrate the image acquisition unit and the inertial measurement unit to obtain the second external parameter data, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
the determining unit 301, configured to determine the first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship between the image acquisition unit and the inertial measurement unit in the device;
the comparing unit 302, configured to compare the first external parameter data with the second external parameter data to obtain the first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
and the correcting unit 303, configured to correct the input data at the next moment based on the first scale factor.
In a preferred embodiment of the present application, the apparatus further includes:
an acquiring unit 305, configured to acquire the image data acquired by the image acquisition unit at the current moment. After the acquiring unit 305 acquires the image data at the current moment, it triggers the determining unit 301, and the determining unit 301 determines the first pose data corresponding to the three-dimensional space point at the current moment based on the three-dimensional space point in the image data and the second external parameter data. After determining the first pose data, the determining unit 301 triggers the comparing unit 302, and the comparing unit 302 compares the first pose data with the second pose data corresponding to the three-dimensional space point at the adjacent moment to obtain the first external parameter data.
In a preferred embodiment of the present application, the determining unit 301 is specifically configured to determine the scale change value between the first external parameter data and the second external parameter data, and to determine the scale change value as the first scale factor corresponding to the current moment.
In a preferred embodiment of the present application, the acquiring unit 305 is further configured to acquire the motion data acquired by the inertial measurement unit at the next moment. After the motion data at the next moment is acquired, the determining unit 301 is triggered and determines the real trajectory data corresponding to the motion data based on the motion data and the first scale factor; after the real trajectory data is determined, the determining unit 301 triggers the correcting unit 303, and the correcting unit 303 corrects the input data at the next moment with the real trajectory data.
In a preferred embodiment of the present application, the determining unit 301 is specifically configured to calculate the quotient between the motion data and the first scale factor; after the quotient is obtained, the correcting unit 303 is triggered and corrects the motion data based on the quotient to obtain the real trajectory data corresponding to the motion data.
Here, the motion data includes at least one of: acceleration data, angular velocity data, and distance data between the current moment and the previous moment.
In a preferred embodiment of the present application, the acquiring unit 305 is further configured to acquire corresponding third external parameter data at the next moment, wherein the third external parameter data characterizes the positional relationship between the image acquisition unit and the inertial measurement unit at the next moment. After the acquiring unit 305 obtains the third external parameter data at the next moment, it triggers the comparing unit 302, which compares the third external parameter data with the first external parameter data to obtain the second scale factor corresponding to the next moment; after the comparing unit 302 obtains the second scale factor, the correcting unit 303 is triggered and corrects the input data at the next moment based on the second scale factor.
According to the apparatus for correcting monocular vision parameters provided by the application, the influence of scale drift on the VIO system is removed by combining the IMU with a monocular camera, so that no additional hardware cost is needed, the scale drift problem can be solved with a low-cost IMU, the cost is reduced, and the stability and accuracy of the VIO system can be improved.
It should be noted that when the apparatus for correcting monocular vision parameters provided in the above embodiment corrects the scale drift of the VIO system, the division into the program modules described above is merely illustrative; in practical applications, the processing may be distributed among different program modules as needed, that is, the internal structure of the apparatus may be divided into different program modules to complete all or part of the processing described above. In addition, the apparatus for correcting monocular vision parameters provided in the above embodiment belongs to the same concept as the embodiments of the method for correcting monocular vision parameters; for the detailed implementation process, refer to the method embodiments, which are not repeated here.
The embodiment of the application further provides an apparatus for correcting monocular vision parameters, including: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor, when executing the computer program, performs: determining first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit relative to an inertial measurement unit in a device; comparing the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit; and correcting the input data at the next moment based on the first scale factor.
The processor, when executing the computer program, further performs: calibrating the image acquisition unit and the inertial measurement unit to obtain the second external parameter data, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit.
The processor, when executing the computer program, further performs: acquiring image data acquired by the image acquisition unit at the current moment; determining first pose data corresponding to a three-dimensional space point at the current moment based on the three-dimensional space point in the image data and the second external parameter data; and comparing the first pose data with second pose data corresponding to the three-dimensional space point at an adjacent moment to obtain the first external parameter data.
The processor, when executing the computer program, further performs: determining the scale change value between the first external parameter data and the second external parameter data; and determining the scale change value as the first scale factor corresponding to the current moment.
The processor, when executing the computer program, further performs: acquiring the motion data acquired by the inertial measurement unit at the next moment; and determining the real trajectory data corresponding to the motion data based on the motion data and the first scale factor.
The processor, when executing the computer program, further performs: calculating the quotient between the motion data and the first scale factor; and correcting the motion data based on the quotient to obtain the real trajectory data corresponding to the motion data.
The motion data includes at least one of: acceleration data, angular velocity data, and distance data between the current moment and the previous moment.
The processor, when executing the computer program, further performs: acquiring corresponding third external parameter data at the next moment, wherein the third external parameter data characterizes the positional relationship between the image acquisition unit and the inertial measurement unit at the next moment; comparing the third external parameter data with the first external parameter data to obtain a second scale factor corresponding to the next moment; and correcting the input data at the next moment based on the second scale factor.
FIG. 5 is a third schematic diagram of the structural composition of the apparatus for correcting monocular vision parameters according to the present application. The apparatus 400 for correcting monocular vision parameters may be an AR device, a mobile phone, a computer, a digital broadcasting terminal, an information transceiver device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, glasses, or the like. The apparatus 400 shown in FIG. 5 includes: at least one processor 401, a memory 402, at least one network interface 404, and a user interface 403. The various components in the apparatus 400 are coupled together through a bus system 405. It can be understood that the bus system 405 is used to implement connection and communication between these components. In addition to a data bus, the bus system 405 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 405 in FIG. 5.
The user interface 403 may include, among other things, a display, keyboard, mouse, trackball, click wheel, keys, buttons, touch pad, or touch screen, etc.
It is to be appreciated that the memory 402 can be either volatile memory or non-volatile memory, and can include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be a magnetic disk memory or a magnetic tape memory. The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 402 described in the embodiments of the present application is intended to include, without being limited to, these and any other suitable types of memory.
The memory 402 in the embodiment of the present application is used to store various types of data to support the operation of the apparatus 400 for correcting monocular vision parameters. Examples of such data include: any computer program for operating on the apparatus 400, such as an operating system 4021 and application programs 4022; contact data; phonebook data; messages; pictures; video; and the like. The operating system 4021 includes various system programs, such as a framework layer, a core library layer, and a driver layer, for implementing various basic services and processing hardware-based tasks. The application programs 4022 may include various application programs, such as a Media Player and a Browser, for implementing various application services. A program implementing the method of the embodiment of the present application may be included in the application programs 4022.
The method disclosed in the above embodiment of the present application may be applied to the processor 401 or implemented by the processor 401. The processor 401 may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 401 or by instructions in the form of software. The processor 401 may be a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The processor 401 may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, any conventional processor, or the like. The steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium, the storage medium being located in the memory 402; the processor 401 reads the information in the memory 402 and completes the steps of the foregoing method in combination with its hardware.
In an exemplary embodiment, the apparatus 400 for correcting monocular vision parameters may be implemented by one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), general-purpose processors, controllers, Micro-Controller Units (MCUs), microprocessors, or other electronic components, for performing the foregoing methods.
In an exemplary embodiment, the present application further provides a computer-readable storage medium, for example, a memory 402 comprising a computer program, the computer program being executable by the processor 401 of the apparatus 400 for correcting monocular vision parameters to complete the steps of the foregoing method. The computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface memory, optical disc, or CD-ROM; it may also be any device including one of or any combination of the above memories, such as a mobile phone, a computer, a tablet device, or a personal digital assistant.
A computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs: determining first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit relative to an inertial measurement unit in a device;
comparing the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
and correcting the input data at the next moment based on the first scale factor.
The computer program, when executed by the processor, further performs:
calibrating the image acquisition unit and the inertial measurement unit to obtain the second external parameter data, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit.
The computer program, when executed by the processor, further performs:
acquiring image data acquired by the image acquisition unit at the current moment;
determining first pose data corresponding to a three-dimensional space point at the current moment based on the three-dimensional space point in the image data and the second external parameter data;
and comparing the first pose data with second pose data corresponding to the three-dimensional space point at an adjacent moment to obtain the first external parameter data.
The computer program, when executed by the processor, further performs:
determining the scale change value between the first external parameter data and the second external parameter data;
and determining the scale change value as the first scale factor corresponding to the current moment.
The computer program, when executed by the processor, further performs:
acquiring the motion data acquired by the inertial measurement unit at the next moment;
and determining the real trajectory data corresponding to the motion data based on the motion data and the first scale factor.
The computer program, when executed by the processor, further performs:
calculating the quotient between the motion data and the first scale factor;
and correcting the motion data based on the quotient to obtain the real trajectory data corresponding to the motion data.
The motion data includes at least one of: acceleration data, angular velocity data, and distance data between the current moment and the previous moment.
The computer program, when executed by the processor, further performs:
acquiring corresponding third external parameter data at the next moment, wherein the third external parameter data characterizes the positional relationship between the image acquisition unit and the inertial measurement unit at the next moment;
comparing the third external parameter data with the first external parameter data to obtain a second scale factor corresponding to the next moment;
and correcting the input data at the next moment based on the second scale factor.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and there may be other divisions in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The methods disclosed in the method embodiments provided by the application can be combined arbitrarily, provided there is no conflict, to obtain new method embodiments.
The features disclosed in the several product embodiments provided by the application can be combined arbitrarily, provided there is no conflict, to obtain new product embodiments.
The features disclosed in the method or apparatus embodiments provided by the application can be combined arbitrarily, provided there is no conflict, to obtain new method or apparatus embodiments.
The foregoing is merely specific embodiments of the present application, and the protection scope of the present application is not limited thereto. Any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A method of correcting monocular vision parameters, comprising:
determining first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit relative to an inertial measurement unit in a device;
comparing the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
acquiring motion data acquired by the inertial measurement unit at the next moment;
and determining real trajectory data corresponding to the motion data based on the motion data and the first scale factor.
2. The method of claim 1, further comprising, before the determining of the first external parameter data corresponding to the current moment:
calibrating the image acquisition unit and the inertial measurement unit to obtain the second external parameter data, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit.
3. The method of claim 1, wherein the determining of the first external parameter data corresponding to the current moment comprises:
acquiring image data acquired by the image acquisition unit at the current moment;
determining first pose data corresponding to a three-dimensional space point at the current moment based on the three-dimensional space point in the image data and the second external parameter data;
and comparing the first pose data with second pose data corresponding to the three-dimensional space point at an adjacent moment to obtain the first external parameter data.
4. The method of claim 1, wherein the comparing of the first external parameter data with the second external parameter data to obtain the first scale factor corresponding to the current moment comprises:
determining a scale change value between the first external parameter data and the second external parameter data;
and determining the scale change value as the first scale factor corresponding to the current moment.
5. The method of claim 4, wherein the determining of the real trajectory data corresponding to the motion data based on the motion data and the first scale factor comprises:
calculating a quotient between the motion data and the first scale factor;
and correcting the motion data based on the quotient to obtain the real trajectory data corresponding to the motion data.
6. The method of claim 4, wherein the motion data comprises at least one of: acceleration data, angular velocity data, and distance data between the current moment and the previous moment.
7. The method of claim 1, further comprising:
acquiring corresponding third external parameter data at the next moment, wherein the third external parameter data characterizes the positional relationship between the image acquisition unit and the inertial measurement unit at the next moment;
comparing the third external parameter data with the first external parameter data to obtain a second scale factor corresponding to the next moment;
and correcting the input data at the next moment based on the second scale factor.
8. An apparatus for correcting monocular vision parameters, comprising:
a determining unit, configured to determine first external parameter data corresponding to the current moment, wherein the first external parameter data characterizes the positional relationship of an image acquisition unit in a device relative to an inertial measurement unit;
a comparing unit, configured to compare the first external parameter data with second external parameter data to obtain a first scale factor corresponding to the current moment, wherein the second external parameter data characterizes the initial positional relationship of the image acquisition unit relative to the inertial measurement unit;
and a correcting unit, configured to acquire motion data acquired by the inertial measurement unit at the next moment, and to determine real trajectory data corresponding to the motion data based on the motion data and the first scale factor.
9. An apparatus for correcting monocular vision parameters, comprising: a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor, when executing the program, implements the steps of the correction method according to any one of claims 1 to 7.
CN202010052244.XA 2020-01-17 2020-01-17 Monocular vision parameter correction method and device Active CN111275769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010052244.XA CN111275769B (en) 2020-01-17 2020-01-17 Monocular vision parameter correction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010052244.XA CN111275769B (en) 2020-01-17 2020-01-17 Monocular vision parameter correction method and device

Publications (2)

Publication Number Publication Date
CN111275769A CN111275769A (en) 2020-06-12
CN111275769B true CN111275769B (en) 2023-10-24

Family

ID=71003501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010052244.XA Active CN111275769B (en) 2020-01-17 2020-01-17 Monocular vision parameter correction method and device

Country Status (1)

Country Link
CN (1) CN111275769B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114812610B (en) * 2020-11-16 2024-07-16 浙江商汤科技开发有限公司 Parameter calibration method and device of visual inertial system, electronic equipment and medium
CN114147727B (en) * 2022-02-07 2022-05-20 杭州灵西机器人智能科技有限公司 Method, device and system for correcting pose of robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018077176A1 (en) * 2016-10-26 2018-05-03 北京小鸟看看科技有限公司 Wearable device and method for determining user displacement in wearable device
CN108489482A (en) * 2018-02-13 2018-09-04 视辰信息科技(上海)有限公司 The realization method and system of vision inertia odometer
CN109520476A (en) * 2018-10-24 2019-03-26 天津大学 Resection dynamic pose measurement system and method based on Inertial Measurement Unit
CN110118554A (en) * 2019-05-16 2019-08-13 深圳前海达闼云端智能科技有限公司 SLAM method, apparatus, storage medium and device based on visual inertia
CN110378968A (en) * 2019-06-24 2019-10-25 深圳奥比中光科技有限公司 The scaling method and device of camera and Inertial Measurement Unit relative attitude

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a multi-inertial-measurement-unit data acquisition system based on STM32; Peng Fei; Yang Aolei; Instrument Technique (07); full text *

Also Published As

Publication number Publication date
CN111275769A (en) 2020-06-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant