CN113055598B - Orientation data compensation method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number: CN113055598B
Authority: CN (China)
Prior art keywords: orientation data, sequence, orientation, shooting time, time
Legal status: Active
Application number: CN202110321774.4A
Other languages: Chinese (zh)
Other versions: CN113055598A
Inventors: 黄凯, 王楠, 章国锋
Current and original assignee: Zhejiang Shangtang Technology Development Co Ltd
Application filed by Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202110321774.4A
Publication of CN113055598A
Application granted
Publication of CN113055598B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an orientation data compensation method and device, an electronic device, and a computer-readable storage medium. The compensation method comprises the following steps: acquiring a first orientation data sequence and a shooting time sequence of an image sequence captured within a predetermined time; acquiring a second orientation data sequence of the orientation meter within the predetermined time; calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence; obtaining compensation data corresponding to each second orientation data according to the offset speed; and compensating each second orientation data in the second orientation data sequence based on the compensation data. The method compensates the yaw angle, so that the negative influence of yaw-angle drift on tracking accuracy is eliminated or reduced.

Description

Orientation data compensation method and device, electronic equipment and readable storage medium
Technical Field
The invention relates to the technical field of computer vision, in particular to an orientation data compensation method, an orientation data compensation device, electronic equipment and a readable storage medium.
Background
Visual odometry is a core problem in many current scenarios that apply computer vision technology, with important applications in fields such as AR/VR, autonomous driving, and indoor navigation.
On a smartphone platform, the system estimates and maintains the phone's orientation based on an accelerometer and a gyroscope. With this approach, under continuous motion of the phone, the estimated orientation exhibits an unbounded continuous drift in the yaw angle. This constant drift causes an ever-growing error in the yaw-angle component of the odometry, which ultimately has an increasingly negative effect on the tracking accuracy of the odometry.
Disclosure of Invention
The invention provides an orientation data compensation method and device, an electronic device, and a readable storage medium, which are used to eliminate or reduce the negative influence of yaw-angle drift on tracking accuracy.
In order to solve the above technical problems, a first technical solution provided by the present invention is: there is provided an orientation data compensation method, comprising: acquiring a first orientation data sequence and a shooting time sequence of an image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time; calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence; obtaining compensation data corresponding to each second orientation data according to the offset speed; compensating each of the second orientation data in the second orientation data sequence based on the compensation data.
Wherein the calculating of the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence comprises: calculating a first difference between each first orientation data and the initial first orientation data in the first orientation data sequence; calculating a second difference between each second orientation data and the initial second orientation data in the second orientation data sequence; calculating a shooting-time difference between each shooting time and the first shooting time in the shooting time sequence; and calculating the offset speed of the yaw angle of the orientation meter using the first difference, the second difference, and the shooting-time difference corresponding to each shooting time.
Wherein the calculating of the offset speed of the yaw angle of the orientation meter using the first difference, the second difference, and the shooting-time difference corresponding to each shooting time comprises: calculating the offset speed of the yaw angle of the orientation meter corresponding to each shooting time using the first difference, the second difference, and the shooting-time difference corresponding to that shooting time; and calculating the overall offset speed of the yaw angle of the orientation meter from the offset speeds corresponding to all shooting times.
Wherein the method further comprises: acquiring a statistical time sequence corresponding to the second orientation data sequence, wherein the statistical times in the statistical time sequence correspond one-to-one to the second orientation data in the second orientation data sequence; and the calculating of the offset speed of the yaw angle of the orientation meter corresponding to each shooting time using the first difference, the second difference, and the shooting-time difference corresponding to that shooting time comprises: for any target shooting time, determining the corresponding target first difference from the first orientation data corresponding to the target shooting time; determining the target statistical time closest to the target shooting time in the statistical time sequence, and determining the corresponding target second difference from the second orientation data corresponding to the target statistical time; and calculating the offset speed of the yaw angle of the orientation meter corresponding to the target shooting time from the target first difference, the target second difference, and the shooting-time difference corresponding to the target shooting time.
Wherein the initial first orientation data and the initial second orientation data correspond to the same time.
Wherein the method further comprises: acquiring a statistical time sequence corresponding to the second orientation data sequence, wherein the statistical times in the statistical time sequence correspond one-to-one to the second orientation data in the second orientation data sequence; and the obtaining of the compensation data corresponding to each second orientation data according to the offset speed comprises: calculating the statistical time difference between each statistical time and the first statistical time in the statistical time sequence; and calculating the compensation data corresponding to each second orientation data from the statistical time difference and the offset speed.
Wherein the compensating each second orientation data in the sequence of second orientation data based on the compensation data comprises: and calculating the product of each second orientation data and the compensation data to obtain each compensated second orientation data.
Wherein the acquiring of the first orientation data sequence and the capturing time sequence of the image sequence captured within the predetermined time includes: acquiring an image sequence, and acquiring the shooting time of each image to obtain the shooting time sequence; processing each image by using a neural network algorithm to obtain first orientation data corresponding to each image, and obtaining the first orientation data sequence according to each first orientation data.
In order to solve the above technical problems, a second technical solution provided by the present invention is: provided is a yaw angle compensation device including: the data acquisition module is used for acquiring a first orientation data sequence and a shooting time sequence of the image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time; the speed calculation module is used for calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence; the compensation data calculation module is used for obtaining compensation data corresponding to each second orientation data according to the offset speed; a compensation module for compensating each of the second orientation data in the second orientation data sequence based on the compensation data.
In order to solve the above technical problems, a third technical solution provided by the present invention is: provided is an electronic device including: the orientation data compensation method comprises a memory and a processor, wherein the memory stores program instructions, and the processor calls the program instructions from the memory to execute the orientation data compensation method.
In order to solve the above technical problems, a fourth technical solution provided by the present invention is: there is provided a computer readable storage medium storing a program file executable to implement the orientation data compensation method as described above.
The method has the advantage, differing from the prior art, that the offset speed of the yaw angle of the orientation meter is obtained by calculation; compensation data corresponding to each second orientation data is obtained according to the offset speed; and each second orientation data in the second orientation data sequence is compensated based on the compensation data, so as to compensate the yaw angle and thereby eliminate or reduce the negative influence of yaw-angle drift on tracking accuracy.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without inventive effort, wherein:
FIG. 1 is a flow chart illustrating a method for orientation data compensation according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating an embodiment of step S11 in FIG. 1;
FIG. 3 is a flowchart illustrating an embodiment of step S12 in FIG. 1;
FIG. 4 is a flowchart illustrating an embodiment of step S13 in FIG. 1;
FIG. 5 is a schematic structural diagram of a yaw angle compensation apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
FIG. 7 is a schematic structural diagram of a storage medium according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only some, and not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present application.
Visual odometry is a core problem in many current scenarios that apply computer vision technology, with important applications in fields such as AR/VR, autonomous driving, and indoor navigation.
On a smartphone platform, the system estimates and maintains the phone's orientation based on an accelerometer and a gyroscope: the bottom layer estimates the current orientation of the phone through a series of filtering algorithms and constructs a software-sensor orientation meter for the visual odometry algorithm. Re-estimating the camera orientation from the raw accelerometer and gyroscope data inside the visual odometry algorithm would be redundant; in particular, at an application layer far from the bottom layer (such as the Web end of a non-native application), the computation time multiplies, so directly reading the orientation meter is the best-performing scheme. However, most smartphones do not use a magnetometer and maintain orientation based only on the accelerometer and gyroscope; this algorithm is unobservable in the yaw-angle dimension, so under continuous motion the estimated orientation exhibits an unbounded continuous drift in yaw. This constant drift causes an ever-growing error in the yaw-angle component of the odometry, which ultimately has an increasingly negative effect on the tracking accuracy of the odometry.
Referring to fig. 1, a schematic flow chart of a first embodiment of the orientation data compensation method of the present invention, the method includes:
step S11: acquiring a first orientation data sequence and a shooting time sequence of an image sequence shot in a preset time; and acquiring a second orientation data sequence of the orientation meter in a preset time.
In this embodiment, a mobile terminal is taken as the example for orientation data compensation; the mobile terminal may be a mobile phone, smart glasses, a watch, etc., and a mobile phone is used in the following description. The mobile phone is provided with a camera and shoots continuously within a predetermined time period to obtain all images captured in that period. Each captured image has a recorded shooting time, forming a shooting time sequence, and the captured images can be sorted by shooting time to form an image sequence. Furthermore, the mobile phone can obtain corresponding first orientation data from the captured images, where the first orientation data represents the orientation of the camera in the mobile phone at the image's shooting time and may take the form of a unit quaternion.
The mobile phone is also provided with an orientation meter, a software sensor with a data processing function integrated in the mobile terminal that can be used to measure the orientation of the camera in the terminal. The orientation meter records second orientation data of the yaw angle within each predetermined time period, together with the statistical time of each record. The second orientation data is a state quantity representing the orientation of the camera in the mobile terminal at the corresponding moment and may take the form of a unit quaternion.
Specifically, the mobile phone obtains each second orientation data counted by the orientation meter within the predetermined time period, together with the statistical time corresponding to each second orientation data, to obtain the second orientation data sequence and the statistical time sequence. For example, suppose the orientation meter in a mobile phone acquires and counts orientation data several times in the period 9:00 to 9:10, giving second orientation data qa0, qa1, qa2, qa3, ..., qai with statistical times ta0, ta1, ta2, ta3, ..., tai. Sorting the second orientation data chronologically, the second orientation data sequence counted by the orientation meter in 9:00 to 9:10 is qa = (qa0, qa1, qa2, qa3, ..., qai), and the corresponding statistical time sequence is ta = (ta0, ta1, ta2, ta3, ..., tai). Note that these sequences are ordered chronologically; in practice they could also be ordered in reverse or randomly, but a given second orientation data in the second orientation data sequence corresponds to the statistical time at the same position in the statistical time sequence; for example, the statistical time of the nth second orientation data is the nth entry in the statistical time sequence.
The first orientation data and the second orientation data are unit quaternions, state quantities representing the orientation of the camera in the mobile terminal at the corresponding moment; yaw here refers to a relative change, e.g., the change in yaw between time tai and time ta0 is denoted yaw(qai - qa0).
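The notation yaw(qai - qa0) can be made concrete with a small sketch. This is illustrative only; the patent does not fix a convention, so the Hamilton product, the (w, x, y, z) component order, and the Z-Y-X Euler decomposition used here are assumptions:

```python
import math

def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    # Conjugate (inverse for a unit quaternion)
    w, x, y, z = q
    return (w, -x, -y, -z)

def relative_yaw(q0, qi):
    # yaw(qi - q0): yaw component of the relative rotation taking q0 to qi,
    # extracted with a Z-Y-X Euler convention (yaw about the z axis)
    w, x, y, z = quat_mul(quat_conj(q0), qi)
    return math.atan2(2.0 * (w*z + x*y), 1.0 - 2.0 * (y*y + z*z))
```

For a pure rotation about the z axis relative to the identity orientation, `relative_yaw` returns exactly the rotation angle.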
Referring to fig. 2, step S11 includes:
step S111: acquiring an image sequence, and acquiring the shooting time of each image to obtain the shooting time sequence.
In a specific embodiment, the mobile phone shoots continuously within the predetermined time period using the camera to obtain the images captured in that period; each captured image has a recorded shooting time, and the images can be sorted by time to obtain the corresponding image sequence. For example, if images are captured at times tv0, tv1, tv2, tv3, ..., tvi within the period 9:00 to 9:10, the shooting time sequence is tv = (tv0, tv1, tv2, tv3, ..., tvi), and the images can be sorted by shooting time to form the image sequence.
Step S112: processing each image by using a neural network algorithm to obtain first orientation data corresponding to each image, and obtaining a first orientation data sequence according to each first orientation data.
Specifically, each captured image may be processed with a neural network algorithm to obtain the first orientation data corresponding to that image, and the first orientation data sequence corresponding to the image sequence is then obtained from these first orientation data. Neural network algorithms include, but are not limited to, convolutional neural network algorithms, deep neural network algorithms, and the like. For example, the first orientation data sequence corresponding to the images captured in 9:00 to 9:10 is qv = (qv0, qv1, qv2, qv3, ..., qvi). The processing of the images by the neural network algorithm may be executed locally by the mobile terminal; it may also be executed by a cloud server, i.e., after the mobile terminal obtains the images, it sends them to the cloud server, which processes them with the neural network algorithm to obtain the corresponding first orientation data and returns the data to the mobile terminal. Through the neural network algorithm, accurate orientation data can be obtained from the visual images.
Step S12: and calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence.
In a specific implementation, the yaw angle of the mobile phone can be estimated from data collected by the orientation meter, but the estimate has an error; that is, the yaw angle estimated by the orientation meter (hereinafter, "the yaw angle of the orientation meter") drifts continuously, with the error accumulating over time. Accordingly, an offset speed of the yaw angle of the orientation meter can be defined to characterize the speed at which the orientation meter's error accumulates; as before, the change in yaw between time tai and time ta0 is denoted yaw(qai - qa0). In one embodiment, the offset speed of the yaw angle of the orientation meter can be calculated from the first orientation data sequence, the second orientation data sequence, and the shooting time sequence.
Specifically, referring to fig. 3, step S12 includes:
step S121: calculating a first difference value of each first orientation data and the first orientation data in the first orientation data sequence; calculating a second difference value between each second orientation data and the first second orientation data in the second orientation data sequence; and calculating the difference value of the shooting time of each shooting time and the first shooting time in the shooting time sequence.
In an embodiment, the shooting times in the shooting time sequence and the statistical times in the statistical time sequence are sorted in chronological order, and the first shooting time and the first statistical time may be the same (initial) time, i.e., tv0 = ta0, so that the initial first orientation data and the initial second orientation data correspond to the same time; the orientation at this initial time can be considered free of yaw-angle drift.
Specifically, the first difference between each first orientation data in the first orientation data sequence qv and the initial first orientation data is calculated: Δv0 = qv0 - qv0, Δv1 = qv1 - qv0, Δv2 = qv2 - qv0, Δv3 = qv3 - qv0, ..., Δvi = qvi - qv0. The second difference between each second orientation data and the initial second orientation data in the second orientation data sequence is calculated: Δa0 = qa0 - qa0, Δa1 = qa1 - qa0, Δa2 = qa2 - qa0, Δa3 = qa3 - qa0, ..., Δai = qai - qa0. The shooting-time difference between each shooting time and the first shooting time in the shooting time sequence is calculated: Δtv0 = tv0 - tv0, Δtv1 = tv1 - tv0, Δtv2 = tv2 - tv0, ..., Δtvi = tvi - tv0.
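The three difference sequences of step S121 all follow the same pattern and can be sketched with one helper (a scalar sketch; in the patent the first and second differences are quaternion-valued, with the yaw extracted later):

```python
def differences_to_first(seq):
    # Δi = seq[i] - seq[0]: difference of every element against the first
    # element of the sequence; works for orientation values (as scalars
    # here) and for shooting times alike
    base = seq[0]
    return [s - base for s in seq]
```

Applied to qv, qa, and tv in turn, this yields the Δv, Δa, and Δtv sequences; the first entry of each is zero by construction.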
Step S122: and calculating the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference and the shooting time difference corresponding to the shooting time.
When the first difference, the second difference, and the shooting-time difference have been obtained, the offset speed of the yaw angle of the orientation meter can be calculated from them. The first difference is obtained from the captured images, and can therefore be regarded as global information obtained by visual positioning, from which the drift behavior of the orientation meter's yaw angle is determined.
Analysis shows that the drift of the orientation meter's yaw angle fluctuates over short time windows but is relatively stable over long time windows, with the drift angle linear in time. The orientation meter produces the second orientation data sequence at a high frequency, but during actual visual-odometry tracking each image frame corresponds to one first orientation data, and images are captured at fixed intervals, so the first orientation data sequence is obtained at a lower frequency. Therefore the shooting times are taken as the reference: for each shooting time, the corresponding first difference, second difference, and shooting-time difference are obtained, and the offset speed of the yaw angle of the orientation meter corresponding to that shooting time is calculated.
Specifically, any shooting time in the shooting time sequence may be taken as a target shooting time. For the target shooting time there is a corresponding captured image and first orientation data, and from the first orientation data the corresponding first difference, called the target first difference, can be determined. Next, the target statistical time closest to the target shooting time is determined in the statistical time sequence; for example, if the target shooting time is 9:01:10 and the two neighboring statistical times are 9:01:06 and 9:01:11, then 9:01:11 is the target statistical time. After the target statistical time is determined, the corresponding second orientation data can be found, and from it the corresponding second difference, called the target second difference. The offset speed of the yaw angle of the orientation meter corresponding to the target shooting time can then be determined from the target first difference, the target second difference, and the shooting-time difference corresponding to the target shooting time (i.e., the difference between the target shooting time and the first shooting time): vdi = (Δvi - Δai)/Δtvi = (yaw(qvi - qv0) - yaw(qai - qa0))/(tvi - tv0).
For each shooting time in the shooting time sequence, the above steps can be executed to obtain the offset speed of the yaw angle corresponding to that shooting time, yielding an offset speed sequence. The average of the offset speeds corresponding to all shooting times can then be calculated to obtain the offset speed of the yaw angle of the orientation meter over the whole predetermined time period, i.e., the mean of all offset speeds in the offset speed sequence, which characterizes the drift behavior of the orientation meter. Of course, in practice, statistics such as the median or mode of the offset speed sequence may also be used as the offset speed of the yaw angle of the orientation meter.
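Steps S121 and S122 together can be condensed into a short scalar sketch. The yaw values are assumed to be already extracted from the quaternions, and the first shooting time is skipped because its time difference is zero (function and variable names are illustrative):

```python
def yaw_offset_speeds(yaw_v, yaw_a, t):
    # vdi = (Δvi - Δai) / Δtvi for each shooting time after the first:
    # the visually observed yaw change minus the orientation meter's yaw
    # change, divided by the elapsed time since the first shooting time
    speeds = []
    for i in range(1, len(t)):
        dv = yaw_v[i] - yaw_v[0]   # first difference (from images)
        da = yaw_a[i] - yaw_a[0]   # second difference (orientation meter)
        speeds.append((dv - da) / (t[i] - t[0]))
    return speeds

def overall_offset_speed(speeds):
    # The patent averages the per-time offset speeds; the median or mode
    # of the sequence are noted as alternatives
    return sum(speeds) / len(speeds)
```

For a meter that drifts linearly on top of the true yaw, every per-time speed equals the negated drift rate and the mean recovers it exactly.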
Step S13: and obtaining compensation data corresponding to each second orientation data according to the offset speed.
After the offset speed of the yaw angle of the heading meter in the whole preset time period is determined, the compensation data corresponding to each piece of second heading data can be determined according to the offset speed.
Referring to fig. 4, step S13 specifically includes:
step S131: and calculating to obtain the statistical time difference between each statistical time and the first statistical time in the statistical time sequence.
Specifically, the statistical time difference between each statistical time and the first statistical time in the statistical time sequence is calculated: Δta0 = ta0 - ta0, Δta1 = ta1 - ta0, Δta2 = ta2 - ta0, ..., Δtai = tai - ta0.
Step S132: and calculating to obtain compensation data corresponding to each second orientation data according to the statistical time difference and the offset speed.
Specifically, for each second orientation data in the second orientation data sequence counted by the orientation meter, the elapsed time Δtai = tai - ta0 is used, and the compensation data corresponding to each second orientation data is obtained from the offset speed vd calculated in step S12 as X = vd × Δtai; the calculated compensation data is then converted into the corresponding relative rotation, Y = rot(X).
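The conversion rot(X) from the accumulated drift angle X = vd × Δtai to a relative rotation can be sketched as a unit quaternion rotating about the z (yaw) axis. The (w, x, y, z) component order and the sign convention are assumptions here, since the patent leaves them implicit:

```python
import math

def compensation_rotation(offset_speed, dt):
    # X = vd * Δta: yaw drift accumulated since the first statistical
    # time; rot(X) returned as a unit quaternion for a rotation of X
    # about the z axis
    angle = offset_speed * dt
    return (math.cos(angle / 2.0), 0.0, 0.0, math.sin(angle / 2.0))
```

With zero offset speed the result is the identity quaternion, i.e., no compensation is applied.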
Step S14: each second orientation data in the second orientation data sequence is compensated based on the compensation data.
Specifically, the product of each second orientation data and the compensation data is calculated to obtain each compensated second orientation data. The compensated second orientation data is then input into the visual odometry algorithm for computer visual positioning, making the visual positioning result, and hence the tracking accuracy of the visual odometry, more accurate.
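Reduced to the yaw dimension, step S14 amounts to adding the accumulated drift back onto each meter reading; the following scalar sketch is the yaw-only equivalent of multiplying each second orientation data by its compensation rotation (names are illustrative):

```python
def compensate_yaw(yaw_a, t_a, offset_speed):
    # Compensated yaw: each orientation-meter reading plus vd * (tai - ta0),
    # the scalar-yaw equivalent of composing with the compensation rotation
    t0 = t_a[0]
    return [y + offset_speed * (t - t0) for y, t in zip(yaw_a, t_a)]
```

Continuing the linear-drift example, applying the computed offset speed exactly restores the true yaw sequence.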
The method calculates the offset speed of the yaw angle of the orientation meter; obtains compensation data corresponding to each second orientation data according to the offset speed; and compensates each second orientation data in the second orientation data sequence based on the compensation data, thereby compensating the yaw angle and eliminating or reducing the negative influence of yaw angle drift on tracking precision. The global information of visual positioning is used to make up for the continuous drift of the yaw angle of the orientation meter, improving the tracking precision of visual odometry.
Please refer to fig. 5, which is a schematic structural diagram of a yaw angle compensation apparatus of the present invention, including: a data acquisition module 51, a speed calculation module 52, a compensation data calculation module 53, and a compensation module 54.
The data acquisition module 51 is configured to acquire a first orientation data sequence and a shooting time sequence of an image sequence shot in a predetermined time; and acquiring a second orientation data sequence of the orientation meter in the preset time.
The speed calculation module 52 is configured to calculate the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence.
The compensation data calculating module 53 is configured to obtain compensation data corresponding to each second orientation data according to the offset speed.
The compensation module 54 is configured to compensate each second orientation data in the second orientation data sequence based on the compensation data. In an embodiment, the product of each second orientation data and the compensation data is calculated to obtain each compensated second orientation data. The compensated second orientation data may then be input into the visual odometry computation, thereby making the visual positioning result more accurate.
Further, the speed calculation module 52 is further configured to calculate a first difference between each first orientation data in the first orientation data sequence and the initial first orientation data; calculate a second difference between each second orientation data in the second orientation data sequence and the initial second orientation data; calculate a shooting time difference between each shooting time in the shooting time sequence and the first shooting time; and calculate the offset speed of the yaw angle of the orientation meter using the first difference, the second difference, and the shooting time difference corresponding to each shooting time.
Further, the speed calculation module 52 is further configured to calculate, using the first difference, the second difference, and the shooting time difference corresponding to each shooting time, the offset speed of the yaw angle of the orientation meter corresponding to that shooting time; and calculate the overall offset speed of the yaw angle of the orientation meter from the offset speeds corresponding to all the shooting times.
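The two calculations performed by the speed calculation module can be sketched as follows. This is an illustrative reading of the patent: yaw angles are treated as scalars in radians, and averaging is assumed as the final combination step (the exact rule for combining the per-frame speeds is not specified):

```python
def estimate_offset_speed(yaw_vis, yaw_meter, t_shoot):
    """Overall yaw-offset speed of the orientation meter.

    yaw_vis:   first orientation data (from the images), one yaw per frame
    yaw_meter: second orientation data (orientation meter), matched per frame
    t_shoot:   shooting times of the images
    """
    per_frame = []
    for i in range(1, len(t_shoot)):
        d1 = yaw_vis[i] - yaw_vis[0]       # first difference (visual)
        d2 = yaw_meter[i] - yaw_meter[0]   # second difference (meter)
        dt = t_shoot[i] - t_shoot[0]       # shooting time difference
        per_frame.append((d2 - d1) / dt)   # offset speed at shooting time i
    return sum(per_frame) / len(per_frame)  # overall offset speed
```

If the visual yaw stays at zero while the meter drifts 0.1 rad per second, the estimate recovers 0.1 rad/s.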
Further, the data obtaining module 51 is further configured to obtain a statistical time sequence corresponding to the second orientation data sequence, where statistical time in the statistical time sequence corresponds to second orientation data in the second orientation data sequence one to one;
the speed calculation module 52 is further configured to determine, for any target shooting time, a corresponding target first difference according to the first orientation data corresponding to the target shooting time; determine the target statistical time closest to the target shooting time in the statistical time sequence, and determine a corresponding target second difference according to the second orientation data corresponding to the target statistical time; and calculate the offset speed of the yaw angle of the orientation meter corresponding to the target shooting time from the target first difference, the target second difference, and the shooting time difference corresponding to the target shooting time.
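Finding the statistical time closest to a target shooting time (used here to pair a second orientation data with an image) can be done with a binary search over the sorted statistical time sequence. This helper is an assumption for illustration, not part of the patent:

```python
import bisect

def nearest_statistic_index(ta_seq, t_target):
    """Index of the statistical time in sorted ta_seq closest to t_target."""
    pos = bisect.bisect_left(ta_seq, t_target)
    if pos == 0:
        return 0
    if pos == len(ta_seq):
        return len(ta_seq) - 1
    # choose the nearer of the two neighboring statistical times
    before, after = ta_seq[pos - 1], ta_seq[pos]
    return pos if after - t_target < t_target - before else pos - 1
```

Targets outside the sequence clamp to the first or last index, so every shooting time gets a valid pairing.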
Further, the initial first orientation data and the initial second orientation data correspond to the same time.
Further, the data obtaining module 51 is further configured to obtain a statistical time sequence corresponding to the second orientation data sequence, where statistical time in the statistical time sequence corresponds to second orientation data in the second orientation data sequence one to one;
the compensation data calculating module 53 is further configured to calculate a statistical time difference between each statistical time in the statistical time sequence and the first statistical time; and calculating to obtain the compensation data corresponding to each second orientation data according to the statistical time difference and the offset speed.
Further, the compensation module 54 is further configured to calculate a product of each second orientation data and the compensation data, so as to obtain each compensated second orientation data.
Further, the data obtaining module 51 is further configured to obtain an image sequence, and obtain a shooting time of each image to obtain the shooting time sequence; processing each image by using a neural network algorithm to obtain first orientation data corresponding to each image, and obtaining the first orientation data sequence according to each first orientation data.
For the specific functional implementation of each module of the yaw angle compensation device, reference may be made to the embodiments of the orientation data compensation method, which are not repeated here. The yaw angle compensation device calculates the offset speed of the yaw angle of the orientation meter; obtains compensation data corresponding to each second orientation data according to the offset speed; and compensates each second orientation data in the second orientation data sequence based on the compensation data, thereby compensating the yaw angle and eliminating or reducing the negative influence of yaw angle drift on tracking precision. The global information of visual positioning is used to make up for the continuous drift of the yaw angle of the orientation meter, improving the tracking precision of visual odometry.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention. The electronic device comprises a memory 202 and a processor 201 connected to each other.
The memory 202 is used for storing program instructions that implement the orientation data compensation method of the device; for details, reference may be made to the embodiments of the orientation data compensation method, which are not repeated here.
The processor 201 is used to execute program instructions stored by the memory 202.
The processor 201 may also be referred to as a Central Processing Unit (CPU). The processor 201 may be an integrated circuit chip having signal processing capabilities. The processor 201 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 202 may be a memory bank, a TF card, etc., and can store all information in the electronic device, including the input raw data, the computer program, intermediate operation results, and final operation results. It stores and retrieves information at the location specified by the controller; only with the memory can the electronic device have the storage function needed for normal operation. The storage of an electronic device can be classified by use into main storage (internal memory) and auxiliary storage (external memory). External memory is usually a magnetic medium, an optical disk, or the like, and can store information for long periods. Internal memory refers to the storage components on the main board, which hold the data and programs currently being executed; it is only used for temporary storage, and its contents are lost when the power is turned off.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a system server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application.
Fig. 7 is a schematic structural diagram of a computer-readable storage medium according to the present invention. The storage medium of the present application stores a program file 203 capable of implementing all of the above orientation data compensation methods, wherein the program file 203 may be stored in the storage medium in the form of a software product, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a mobile hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, or terminal devices such as a computer, a server, a mobile phone, or a tablet.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An orientation data compensation method, comprising:
acquiring a first orientation data sequence and a shooting time sequence of an image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time;
calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence;
obtaining compensation data corresponding to each second orientation data according to the offset speed;
compensating each of the second orientation data in the sequence of second orientation data based on the compensation data;
the calculating of the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence includes:
calculating a first difference value between each first orientation data in the first orientation data sequence and the first one of the first orientation data; calculating a second difference value between each second orientation data in the second orientation data sequence and the first one of the second orientation data; calculating a shooting time difference value between each shooting time in the shooting time sequence and the first shooting time;
and calculating the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference and the shooting time difference corresponding to the shooting time.
2. The compensation method according to claim 1, wherein
the calculating the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference and the shooting time difference corresponding to the shooting time includes:
calculating the offset speed of the yaw angle of the orientation meter corresponding to each shooting time by using the first difference, the second difference and the shooting time difference corresponding to each shooting time;
and calculating the offset speed of the yaw angle of the orientation meter according to the offset speeds of the yaw angle of the orientation meter corresponding to all the shooting times.
3. The compensation method of claim 2, further comprising:
acquiring a statistical time sequence corresponding to the second orientation data sequence, wherein the statistical time in the statistical time sequence corresponds to the second orientation data in the second orientation data sequence one to one;
the calculating the offset speed of the yaw angle of the orientation meter corresponding to each shooting time by using the first difference, the second difference and the shooting time difference corresponding to each shooting time includes:
for any target shooting time, determining a corresponding target first difference value according to first orientation data corresponding to the target shooting time;
determining a target statistical time closest to the target shooting time in the statistical time sequence, and determining a corresponding target second difference according to second orientation data corresponding to the target statistical time;
and calculating the offset speed of the yaw angle of the orientation meter corresponding to the target shooting time according to the target first difference, the target second difference and the shooting time difference corresponding to the target shooting time.
4. The compensation method of claim 1, wherein the first one of the first orientation data and the first one of the second orientation data correspond to the same time.
5. The compensation method of claim 1, wherein the method comprises:
acquiring a statistical time sequence corresponding to the second orientation data sequence, wherein the statistical time in the statistical time sequence corresponds to the second orientation data in the second orientation data sequence one to one;
the obtaining compensation data corresponding to each second orientation data according to the offset speed includes:
calculating to obtain a statistical time difference value between each statistical time and the first statistical time in the statistical time sequence;
and calculating to obtain the compensation data corresponding to each second orientation data according to the statistical time difference and the offset speed.
6. The compensation method according to claim 5, wherein
the compensating each second orientation data in the sequence of second orientation data based on the compensation data comprises:
and calculating the product of each second orientation data and the compensation data to obtain each compensated second orientation data.
7. The compensation method according to claim 1, wherein
the acquiring of the first orientation data sequence and the capturing time sequence of the image sequence captured within the predetermined time includes:
acquiring an image sequence, and acquiring the shooting time of each image to obtain the shooting time sequence;
processing each image by using a neural network algorithm to obtain first orientation data corresponding to each image, and obtaining the first orientation data sequence according to each first orientation data.
8. An orientation data compensating apparatus, comprising:
the data acquisition module is used for acquiring a first orientation data sequence and a shooting time sequence of the image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time;
the speed calculation module is used for calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence;
the compensation data calculation module is used for obtaining compensation data corresponding to each second orientation data according to the offset speed;
a compensation module for compensating each of the second orientation data in the second orientation data sequence based on the compensation data;
wherein the speed calculation module calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence includes:
calculating a first difference value of each first orientation data and a first orientation data in the first orientation data sequence; calculating a second difference value between each second orientation data and the first second orientation data in the second orientation data sequence; calculating the difference value of the shooting time of each shooting time and the first shooting time in the shooting time sequence;
and calculating the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference and the shooting time difference corresponding to the shooting time.
9. An electronic device comprising a memory and a processor, wherein the memory stores program instructions that the processor retrieves from the memory to perform the orientation data compensation method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that a program file is stored, which can be executed to implement the orientation data compensation method according to any one of claims 1-7.
CN202110321774.4A 2021-03-25 2021-03-25 Orientation data compensation method and device, electronic equipment and readable storage medium Active CN113055598B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110321774.4A CN113055598B (en) 2021-03-25 2021-03-25 Orientation data compensation method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110321774.4A CN113055598B (en) 2021-03-25 2021-03-25 Orientation data compensation method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN113055598A CN113055598A (en) 2021-06-29
CN113055598B true CN113055598B (en) 2022-08-05

Family

ID=76515692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110321774.4A Active CN113055598B (en) 2021-03-25 2021-03-25 Orientation data compensation method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113055598B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017106891A (en) * 2015-11-30 2017-06-15 株式会社リコー Inertia device, program, and positioning method
JP2018190024A (en) * 2017-04-28 2018-11-29 公立大学法人大阪市立大学 Flight body control system and flight body

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2353629C (en) * 1998-12-17 2005-12-27 Tokin Corporation Orientation angle detector
JP4328551B2 (en) * 2003-03-05 2009-09-09 富士重工業株式会社 Imaging posture control device
US8201436B2 (en) * 2006-04-28 2012-06-19 Nokia Corporation Calibration
US9760186B2 (en) * 2010-01-06 2017-09-12 Cm Hk Limited Electronic device for use in motion detection and method for obtaining resultant deviation thereof
FR3020169A1 (en) * 2014-04-16 2015-10-23 Parrot ROTATING WING DRONE WITH VIDEO CAMERA DELIVERING STABILIZED IMAGE SEQUENCES
US10665115B2 (en) * 2016-01-05 2020-05-26 California Institute Of Technology Controlling unmanned aerial vehicles to avoid obstacle collision
CN107655470B (en) * 2016-07-26 2020-02-21 广州亿航智能技术有限公司 Method and system for calibrating yaw angle value of unmanned aerial vehicle
CN107167131B (en) * 2017-05-23 2019-07-02 北京理工大学 A kind of depth integration of micro-inertia measuring information and the method and system of real-time compensation
CN107945220B (en) * 2017-11-30 2020-07-10 华中科技大学 Binocular vision-based reconstruction method
CN108561274B (en) * 2017-12-29 2020-07-07 华润电力风能(汕头潮南)有限公司 Fan yaw correction method and device, computer device and readable storage medium
JP7250824B2 (en) * 2018-05-30 2023-04-03 オーリス ヘルス インコーポレイテッド Systems and methods for location sensor-based branch prediction
CN108737734B (en) * 2018-06-15 2020-12-01 Oppo广东移动通信有限公司 Image compensation method and apparatus, computer-readable storage medium, and electronic device
JP2021028188A (en) * 2019-08-09 2021-02-25 富士通株式会社 Drone imaging device and method
CN110595464A (en) * 2019-08-19 2019-12-20 北京数研科技发展有限公司 IMU and visual sensor fusion positioning method and device
CN110377058B (en) * 2019-08-30 2021-11-09 深圳市道通智能航空技术股份有限公司 Aircraft yaw angle correction method and device and aircraft
CN110775288B (en) * 2019-11-26 2021-05-25 哈尔滨工业大学(深圳) Bionic-based flight mechanical neck eye system and control method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017106891A (en) * 2015-11-30 2017-06-15 株式会社リコー Inertia device, program, and positioning method
JP2018190024A (en) * 2017-04-28 2018-11-29 公立大学法人大阪市立大学 Flight body control system and flight body

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Simulation Research on Optimal Control of Attitude Stability of Unmanned Aerial Vehicles; Zhang Xiao et al.; Computer Simulation; 2017-12-15; full text *

Also Published As

Publication number Publication date
CN113055598A (en) 2021-06-29

Similar Documents

Publication Publication Date Title
CN111121768B (en) Robot pose estimation method and device, readable storage medium and robot
EP1489381B1 (en) Method and apparatus for compensating for acceleration errors and inertial navigation system employing the same
CN113029128B (en) Visual navigation method and related device, mobile terminal and storage medium
CN111415387A (en) Camera pose determining method and device, electronic equipment and storage medium
CN113034594A (en) Pose optimization method and device, electronic equipment and storage medium
CN108827341A (en) The method of the deviation in Inertial Measurement Unit for determining image collecting device
CN111220155A (en) Method, device and processor for estimating pose based on binocular vision inertial odometer
CN111721305B (en) Positioning method and apparatus, autonomous vehicle, electronic device, and storage medium
CN112762944A (en) Zero-speed interval detection and zero-speed updating method
CN114119744B (en) Method, device, equipment and storage medium for constructing point cloud map
CN114593735B (en) Pose prediction method and device
CN111998870B (en) Calibration method and device of camera inertial navigation system
CN113055598B (en) Orientation data compensation method and device, electronic equipment and readable storage medium
CN114964270B (en) Fusion positioning method, device, vehicle and storage medium
CN112461258A (en) Parameter correction method and device
CN115906641A (en) IMU gyroscope random error compensation method and device based on deep learning
CN115727871A (en) Track quality detection method and device, electronic equipment and storage medium
CN114322996B (en) Pose optimization method and device of multi-sensor fusion positioning system
CN115311624A (en) Slope displacement monitoring method and device, electronic equipment and storage medium
CN115014421A (en) Calibration method and device of sensor data, anti-shake method and camera module
CN108519100A (en) For the method for estimating step length, cloud system, equipment and computer program product
WO2022179047A1 (en) State information estimation method and apparatus
CN116448105B (en) Pose updating method and device, electronic equipment and storage medium
CN114979456B (en) Anti-shake processing method and device for video data, computer equipment and storage medium
CN114545017A (en) Velocity fusion method and device based on optical flow and accelerometer and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant