WO2020253260A1 - Time synchronization processing method, electronic device, and storage medium - Google Patents

Time synchronization processing method, electronic device, and storage medium

Info

Publication number: WO2020253260A1 (application PCT/CN2020/076836, CN2020076836W)
Authority: WO (WIPO, PCT)
Prior art keywords: angular velocity, information, different sensors, velocity information, sensors
Application number: PCT/CN2020/076836
Other languages: English (en), French (fr), Chinese (zh)
Inventors: 王潇峰, 刘余钱, 章国锋
Original assignee: 上海商汤临港智能科技有限公司
Application filed by 上海商汤临港智能科技有限公司
Priority claims: JP2021531851A (JP2022510418A), KR1020217017070A (KR20210084622A)
Publication: WO2020253260A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Initial alignment, calibration or starting-up of inertial devices
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/25: Fusion techniques
    • G06F18/251: Fusion techniques of input or preprocessed data

Definitions

  • The embodiments of the present disclosure relate to the field of computer vision, and specifically to time synchronization processing methods, electronic devices, and storage media.
  • Visual inertial odometry is currently a hot research topic in the field of computer vision. It is widely used in the navigation and entertainment of electronic devices.
  • The main principle is to fuse visual and inertial sensor measurements to estimate the position and pose of the camera itself during movement, so as to obtain accurate positioning information; this belongs to autonomous navigation.
  • When fusing the measurement results of different sensors, the electronic device needs to determine the delay time information between them.
  • Usually, motion in three-dimensional space is used to calibrate the delay time information between different sensors, or the time delay between different sensors is treated as a constant.
  • In practice, however, the delay time information between different sensors changes over time, so time synchronization accuracy is low.
  • the embodiments of the present disclosure expect to provide a time synchronization processing method, electronic equipment and storage medium.
  • An embodiment of the present disclosure provides a time synchronization processing method. The method includes: obtaining angular velocity information collected by each of two different sensors, where the angular velocity information is the angular velocity information of the electronic device as it rotates, and the two different sensors are both installed on the electronic device and rigidly connected; aligning the two pieces of angular velocity information obtained to determine the delay time information between the two different sensors; and performing time synchronization processing on the respective measurement results of the two different sensors according to the delay time information.
  • An embodiment of the present disclosure provides a time synchronization processing device. The device includes a first acquisition module, an alignment module, and a synchronization module. The first acquisition module is configured to obtain the angular velocity information collected by each of two different sensors, where the angular velocity information is the angular velocity information of the electronic device as it rotates, and the two different sensors are both arranged on the electronic device and rigidly connected. The alignment module is configured to align the two pieces of angular velocity information obtained and determine the delay time information between the two different sensors. The synchronization module is configured to perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information.
  • Embodiments of the present disclosure provide an electronic device that includes at least a processor and a memory for storing a computer program that can run on the processor; the processor, when running the computer program, performs the steps of the above time synchronization processing method.
  • embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned time synchronization processing method are implemented.
  • the embodiments of the present disclosure provide a time synchronization processing method, an electronic device, and a storage medium.
  • The angular velocity information collected by two different sensors that are arranged on the electronic device and rigidly connected is obtained.
  • The two pieces of angular velocity information are aligned to determine the delay time information between the two different sensors, and the respective measurement results of the two sensors are then time synchronized according to that delay time information. This makes full use of the principle that rigidly connected sensors share the same angular velocity to realize time synchronization between different sensors.
  • FIG. 1 is a schematic diagram 1 of a system structure suitable for a time synchronization processing method according to an embodiment of the present disclosure
  • FIG. 2A is a schematic diagram 2 of a system architecture suitable for a time synchronization processing method according to an embodiment of the present disclosure
  • FIG. 2B is a third schematic diagram of a system architecture suitable for a time synchronization processing method according to an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram 1 of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of an exemplary time delay between the angular velocity of the visual sensor and the angular velocity of the inertial sensor in the embodiment of the disclosure;
  • FIG. 5 is a schematic diagram after the angular velocity of the visual sensor and the angular velocity of the inertial sensor are synchronized in an exemplary embodiment of the disclosure
  • FIG. 6 is a second schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the disclosure.
  • FIG. 7 is a third schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the disclosure.
  • FIG. 8 is a fourth schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure.
  • FIG. 9 is a first schematic diagram of measuring pose rotation matrix information by using a quaternion to represent a pose sensor in an embodiment of the disclosure.
  • FIG. 10 is a second schematic diagram of the embodiment of the disclosure using a quaternion to represent the pose sensor to measure the pose rotation matrix information
  • FIG. 11 is a schematic diagram of the composition structure of a time synchronization processing device provided by an embodiment of the disclosure.
  • FIG. 12 is a schematic diagram of the composition structure of an electronic device provided by an embodiment of the disclosure.
  • Fig. 1 is a schematic diagram 1 of a system structure suitable for a time synchronization processing method provided by an embodiment of the present disclosure.
  • The system may include a processor 11, two different sensors 12, and a memory 13; the two different sensors 12 each send their acquired information to the processor 11 for processing.
  • One of the two different sensors 12 may be a sensor that directly measures angular velocity, and the other sensor may be a sensor that indirectly measures angular velocity.
  • The sensor that indirectly measures angular velocity is a sensor capable of independently estimating its own rotational movement (that is, obtaining pose rotation matrix information).
  • the two different sensors 12 may also be sensors of other structures, and the embodiment of the present disclosure does not limit the structure of the two different sensors.
  • The two different sensors 12 are rigidly fixed, that is, their relative positions on the electronic device are fixed; during movement, the angular velocities measured by the different sensors at the same moment are therefore the same. The two different sensors 12 each input their acquired information into the processor 11, and the processor 11 performs time synchronization processing by executing the method provided in the embodiments of the present disclosure.
  • the electronic device may include a time synchronization processing device and two different sensors 12; the time synchronization processing device may include the aforementioned processor 11 and memory 13.
  • the system architecture of the time synchronization processing method may also be shown in FIGS. 2A and 2B.
  • The system includes a time synchronization processing device 20, a first sensor 22a, and a second sensor 22b. The first sensor 22a and the second sensor 22b are arranged on the electronic equipment of a vehicle; for example, the time synchronization processing device 20 may be on-board equipment in the vehicle. The first sensor 22a and the second sensor 22b may send the acquired information to the time synchronization processing device 20 through an electrical connection or a wireless communication connection.
  • the time synchronization processing device 20 has a processor and a memory.
  • The first sensor 22a and the second sensor 22b send the acquired information to the time synchronization processing device 20, whose processor processes it to implement the time synchronization method provided by the embodiments of the present disclosure, as shown in FIGS. 2A and 2B.
  • the first sensor 22a and the second sensor 22b may be arranged at different positions of the vehicle, or may be arranged at the same position of the vehicle.
  • For example, the first sensor 22a and the second sensor 22b may both be arranged on a tire of the vehicle; for another example, as shown in FIG. 2A, the first sensor 22a may be arranged on a tire of the vehicle while the second sensor 22b is arranged in another part of the vehicle, for example, in the vehicle-mounted device. The embodiments of the present disclosure do not limit the deployment positions of the first sensor 22a and the second sensor 22b.
  • The processor processes the information sent by different sensors in different ways. For example, when a sensor directly measures angular velocity, the processor can process the obtained angular velocity information directly; when a sensor indirectly measures angular velocity, the processor first obtains the pose rotation matrix information, then derives angular velocity information from the pose rotation matrix information, and finally processes the derived angular velocity information.
  • the time synchronization processing apparatus can be applied to various types of electronic devices with information processing capabilities during implementation.
  • the time synchronization processing device can obtain the data collected by two different sensors in real time in an online manner, and use the technical solutions of the embodiments of the present disclosure to perform time synchronization processing.
  • the time synchronization processing device may also obtain data collected by two different sensors in an offline manner, and use the technical solutions of the embodiments of the present disclosure to perform time synchronization processing.
  • obtaining the data collected by two different sensors in an offline manner may include: storing the data collected by two different sensors, and exporting the stored data when needed, so as to synchronize the respective measurement results of the different sensors;
  • Obtaining the data collected by two different sensors in an online manner may include: obtaining data collected by two different sensors in real time, so as to synchronize the respective measurement results of the different sensors.
  • the time synchronization processing device can be arranged in the electronic equipment or outside the electronic equipment; two different sensors are arranged in the electronic equipment.
  • the electronic device may be, for example, a mobile robot device, an unmanned device, or various types of mobile terminals and other devices.
  • the unmanned device may include, but is not limited to, a vehicle, an airplane, or a ship, which is not limited in the embodiment of the present disclosure.
  • The embodiments of the present disclosure provide a time synchronization processing method, which can solve the problems of low accuracy and high complexity in time delay calibration between different sensors.
  • the functions implemented by the time synchronization processing method can be implemented by the processor in the time synchronization processing device calling executable instructions.
  • The executable instructions can be stored in a storage medium of the memory; it can be seen that the time synchronization processing device at least includes a processor and a storage medium.
  • FIG. 3 is a schematic diagram 1 of the implementation process of a time synchronization processing method provided in an embodiment of the present disclosure, which is applied to a time synchronization processing device.
  • The time synchronization processing method includes the following steps.
  • S101: Obtain the angular velocity information collected by each of two different sensors.
  • two different sensors are rigidly connected, that is, the two different sensors are relatively fixed.
  • the angular velocity represented by the angular velocity information acquired by two different sensors at the same time is the same.
  • the two different sensors in the embodiments of the present disclosure include sensors with different structures, or sensors with the same structure but different deployment positions.
  • the two different sensors may include any two of the following structures: vision sensors, inertial sensors, magnetic sensors, lidar sensors, and wheel sensors. There is no restriction here.
  • the two different sensors may be vision sensors or magnetic sensors deployed in different positions of the electronic device, and the embodiments of the present disclosure are not limited here.
  • the angular velocity information may be the angular velocity information when the electronic device rotates in a preset period of time, which is obtained by two different sensors. In other words, when the electronic device rotates, it rotates by a certain angle, so that the angular velocity information collected by different sensors can be obtained within a preset time period.
  • the foregoing preset time period may be set by the user according to actual conditions.
  • the preset time period may be set to half an hour or fifteen minutes, etc., which is not limited in the embodiments of the present disclosure.
  • the above two different sensors can be classified based on acquisition methods such as direct acquisition of angular velocity information or indirect acquisition of angular velocity information.
  • one type of sensor is a gyro sensor, which can directly measure angular velocity information; the other type of sensor is a pose sensor, which can indirectly measure angular velocity information.
  • the above two different sensors can also be classified based on the structure of the sensor.
  • the gyro sensor and the attitude sensor are sensors with different structures.
  • When at least one of the two different sensors is a pose sensor, obtaining the angular velocity information collected by the two different sensors includes: obtaining the pose rotation matrix information collected by the pose sensor, and determining from the pose rotation matrix information the angular velocity information of the electronic device as it rotates.
  • When at least one of the two different sensors is a gyro sensor, obtaining the angular velocity information collected by the two different sensors includes: obtaining the angular velocity information collected by the gyro sensor.
  • acquiring angular velocity information when an electronic device undergoes a rotational movement through two different sensors includes the following application scenarios:
  • Scenario 1: The two different sensors are both gyro sensors deployed at different positions; the angular velocity information of the electronic device as it rotates is collected directly by the two gyro sensors.
  • Scenario 2: The two different sensors are both pose sensors deployed at different positions; the pose rotation matrix information of the electronic device as it rotates is obtained by the two pose sensors, and the angular velocity information is then calculated from the pose rotation matrix information.
  • Scenario 3: Of the two different sensors, one is a pose sensor and the other is a gyro sensor. On the one hand, the pose rotation matrix information of the electronic device as it rotates can be obtained by the pose sensor, and the angular velocity information is then calculated from the pose rotation matrix information; on the other hand, the angular velocity information of the electronic device as it rotates can be collected directly by the gyro sensor.
  • the gyro sensor includes any one of an inertial sensor, a magnetic sensor, and a wheel-type odometer sensor
  • the attitude sensor includes any one of a vision sensor and a lidar sensor.
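Scenarios 2 and 3 above require recovering angular velocity from successive pose rotation matrices. A minimal numpy sketch of one common way to do this (the helper names `so3_exp`, `so3_log`, and `angular_velocity` are illustrative assumptions, not names from the patent): the relative rotation between two adjacent poses is mapped to a rotation vector via the SO(3) logarithm and divided by the sampling interval.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues formula: rotation vector (axis * angle) -> rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def so3_log(R):
    """Inverse of so3_exp: rotation matrix -> rotation vector."""
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros(3)
    # Off-diagonal differences of R give the (scaled) rotation axis.
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def angular_velocity(R_prev, R_next, dt):
    """Mean body-frame angular velocity between two adjacent pose samples."""
    return so3_log(R_prev.T @ R_next) / dt
```

With pose samples at times t and t + dt, `angular_velocity(R_t, R_next, dt)` gives the mean angular velocity over that interval, which can then be aligned against a gyro sensor's direct measurements.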
  • S102: Perform alignment processing on the two pieces of angular velocity information obtained, and determine the delay time information between the two different sensors.
  • the two angular velocity information need to be aligned to determine the delay time information between the two different sensors.
  • The alignment processing of the two pieces of angular velocity information in the embodiments of the present disclosure makes the angular velocity information acquired by the two different sensors consistent in frequency, so that the delay time information between the two different sensors can be determined based on the two aligned angular velocity streams.
  • the two angular velocities can be interpolated to align the two angular velocity information.
  • the interpolation processing may adopt any interpolation processing method among linear interpolation, cubic spline interpolation, and spherical interpolation, which is not limited in the embodiment of the present disclosure.
  • S103: Perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information. It should be noted that time synchronization processing of the respective measurement results of two different sensors means determining the respective measurement results of the two sensors at the same moment, so as to solve the problem of low time synchronization accuracy caused by the trigger delays and transmission delays of different sensors.
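The patent does not prescribe a specific algorithm for determining the delay from the aligned streams. One common choice, shown here as a hedged sketch (the function names and the cross-correlation approach are assumptions for illustration, not the patent's method), is to locate the peak of the cross-correlation of the two aligned angular velocity sequences and then shift one sensor's timestamps by the recovered delay:

```python
import numpy as np

def estimate_delay(w_a, w_b, dt):
    """Estimate how far stream b lags stream a, in seconds.

    Both streams must already be aligned to the same sampling interval dt
    (e.g. by interpolation).  Returns a positive value when b is delayed
    relative to a.
    """
    a = w_a - w_a.mean()
    b = w_b - w_b.mean()
    corr = np.correlate(a, b, mode="full")
    # Offset of the correlation peak from the zero-lag index.
    lag = np.argmax(corr) - (len(b) - 1)
    return -lag * dt

def synchronize_timestamps(t_b, delay):
    """Map sensor b's timestamps onto sensor a's clock."""
    return t_b - delay
```

After `estimate_delay` returns the delay time information, `synchronize_timestamps` expresses sensor b's measurement times on sensor a's clock, so measurement results at the same corrected timestamp can be fused.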
  • the respective measurement results of the two different sensors are the measurement results obtained by the two different sensors respectively measuring the rotational movement of the electronic device when the electronic device undergoes rotational movement.
  • the measurement results obtained by the two different sensors may include at least one of angular velocity information, azimuth angle information, and acceleration information.
  • the measurement result of the inertial sensor may include angular velocity information and acceleration information of the rotational movement of the electronic device;
  • the measurement result of the magnetic sensor may include the azimuth angle information of the rotational movement of the electronic device.
  • At least three sensors may also be provided on the electronic device, with the at least three sensors rigidly connected; the angular velocity information collected by each of the at least three sensors is obtained, the pieces of angular velocity information are aligned to determine the delay time information between the sensors, and time synchronization processing is performed on the respective measurement results of the at least three sensors according to the delay time information.
  • The at least three sensors can be combined in pairs to obtain the delay time information between any two sensors; time synchronization processing is then performed on the respective measurement results of each sensor pair according to the corresponding delay time information, thereby realizing time synchronization among the at least three sensors.
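The pairwise scheme can be sketched as follows (the delay values are made-up numbers for illustration): taking sensor A as the reference, the pairwise estimates give the delays of B and C relative to A, the B-to-C delay follows by subtraction, and every stream is shifted onto A's clock.

```python
# Hypothetical pairwise delay estimates (seconds), with sensor A as reference;
# a positive value means the sensor lags A.
d_ab = 0.030   # delay of sensor B relative to A
d_ac = 0.012   # delay of sensor C relative to A

# The delay of C relative to B follows from the two pairwise estimates.
d_bc = d_ac - d_ab   # negative here: C leads B

def to_reference_clock(timestamp, delay):
    """Shift a sensor's timestamp onto the reference sensor's clock."""
    return timestamp - delay
```

Shifting each sensor's timestamps with `to_reference_clock` synchronizes all three sets of measurement results without estimating every pair of delays directly.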
  • For example, the delay time information between the visual sensor and the inertial sensor can be acquired, and based on that delay time information the measurement result of the visual sensor and the measurement result of the inertial sensor are processed in time synchronization, realizing time synchronization between the two sets of measurement results.
  • The delay time information between the vision sensor and the inertial sensor may be determined first, and their measurement results time synchronized based on it; the delay time information between the vision sensor and the lidar sensor is then determined, and their measurement results time synchronized based on it. In this way, through two time synchronization operations, the measurement results of the vision sensor, the inertial sensor, and the lidar sensor are all time synchronized.
  • In the following, the time synchronization processing of the embodiments of the present disclosure is described taking as an example the case where one of the two different sensors is a vision sensor, the other is an inertial sensor, and the measurement result is angular velocity information.
  • FIG. 4 is a schematic diagram showing the time delay between the angular velocity information of the visual sensor and the angular velocity information of the inertial sensor in an exemplary embodiment of the disclosure.
  • FIG. 5 is a schematic diagram after the angular velocity of the visual sensor and the angular velocity of the inertial sensor are synchronized in an exemplary embodiment of the disclosure.
  • the dotted line represents the angular velocity of the vision sensor
  • the solid line represents the angular velocity of the inertial sensor.
  • In FIG. 4, the angular velocity curve of the visual sensor is staggered from that of the inertial sensor, with the visual sensor's curve lagging.
  • In FIG. 5, the angular velocity curve of the visual sensor and that of the inertial sensor are aligned and no longer staggered; thus, according to the determined delay time information, time synchronization of the angular velocity information between the visual sensor and the inertial sensor is realized.
  • the respective measurement results of the two different sensors can be directly synchronized in time according to the delay time information.
  • The delay time information may also be stored; when the electronic device is in a non-moving state, the respective measurement results of the two different sensors are time synchronized according to the stored delay time information.
  • the technical solutions of the embodiments of the present disclosure can realize both online time synchronization processing and offline time synchronization processing, and the time synchronization processing is more flexible.
  • the time synchronization processing in the embodiments of the present disclosure does not need to use a calibration reference object, such as a checkerboard image, and the time synchronization is more convenient and simple, and has strong universality.
  • The embodiments of the present disclosure only require that the electronic device be capable of rotational movement; that is, the electronic device need only rotate around at least one rotation axis to calibrate the delay time information between the different sensors on it, without requiring movement around multiple axes.
  • This reduces the complexity of time synchronization between different sensors and adapts to the needs of different scenarios. The embodiments of the present disclosure can also calibrate the delay time based on rotational motion around multiple axes, in which case richer rotation information is obtained.
  • The embodiments of the present disclosure only require that each sensor can independently obtain angular velocity information in order to determine the delay time information between different sensors, and can therefore be widely used for time synchronization between multiple sensors.
  • The embodiments of the present disclosure realize time synchronization in software, without additional deployment of dedicated time synchronization hardware; they perform time synchronization based on delay time information obtained in real time and can therefore perform time synchronization processing online; and they determine the delay time information from the angular velocity information obtained as the electronic device rotates, instead of treating the delay time information as a constant, which improves the accuracy of time synchronization.
  • FIG. 6 is a second schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure.
  • Aligning the angular velocity information acquired by the two different sensors and determining the delay time information between them, namely S102, can include S102a and S102b, as follows:
  • S102a: Perform interpolation processing on at least one of the two pieces of angular velocity information to align the two pieces of angular velocity information.
  • the interpolation processing of sensors of different structures may be different from the processing of sensors of the same structure. Therefore, based on the different structures of the two sensors, interpolation processing can be performed on at least one of the two angular velocity information to align the two angular velocity information.
  • interpolation processing is performed on at least one of the obtained angular velocity information collected by the two gyroscopic sensors to align the two angular velocity information.
  • Interpolation processing is performed on at least one of the obtained pieces of pose rotation matrix information collected by the two pose sensors to align the two pieces of angular velocity information.
  • interpolation processing is performed on the pose rotation matrix information collected by the obtained pose sensor to align the two angular velocity information.
  • Performing interpolation processing on at least one of the two pieces of angular velocity information includes either performing interpolation processing on both pieces of angular velocity information or performing interpolation processing on one of the two.
  • performing interpolation processing on the two angular velocity information includes: selecting a data acquisition frequency as the standard data acquisition frequency, and performing interpolation processing on the two angular velocity information according to the standard data acquisition frequency.
  • the standard data acquisition frequency may be between the respective data acquisition frequencies of the two sensors or higher than the respective data acquisition frequencies of the two sensors.
  • the data acquisition frequency higher than the two sensors can be selected as the standard data acquisition frequency, for example, the standard data acquisition frequency is 17 Hz.
  • a frequency between the data acquisition frequencies of the two sensors can also be selected as the standard data acquisition frequency, for example, the standard data acquisition frequency is 13 Hz.
  • The angular velocity information acquired by the two sensors is interpolated according to the standard data acquisition frequency, so that the angular velocity acquisition frequency of the first sensor and that of the second sensor both match the standard acquisition frequency, for example both being 13 Hz. In this way, the deviations in data processing caused by inconsistent data acquisition frequencies of different sensors are avoided, which helps to obtain more accurate delay time information based on the aligned angular velocity information.
  • the interpolation processing includes: performing interpolation processing on the acquired angular velocity information collected by the second sensor according to the data acquisition frequency of the first sensor.
  • For example, if the data acquisition frequency of the first sensor is 15 Hz, the angular velocity information acquired by the second sensor can be interpolated according to that 15 Hz frequency, so that the angular velocity acquisition frequency of the first sensor and that of the second sensor are the same, both being 15 Hz. This avoids deviations in data processing due to inconsistent data acquisition frequencies of different sensors and helps to obtain more accurate delay time information based on the aligned angular velocity information.
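Both strategies above reduce to resampling a stream at new timestamps. A minimal linear-interpolation sketch (the variable names and the 15 Hz / 10 Hz figures are illustrative assumptions), resampling sensor B onto sensor A's acquisition grid:

```python
import numpy as np

def resample(t_src, w_src, t_target):
    """Linearly interpolate a scalar angular-velocity stream onto new timestamps.

    Each axis of a 3-D angular-velocity stream can be resampled the same way.
    """
    return np.interp(t_target, t_src, w_src)

# Sensor A samples at 15 Hz, sensor B at 10 Hz, over the same second of motion.
t_a = np.arange(0.0, 1.0, 1.0 / 15.0)
t_b = np.arange(0.0, 1.0, 1.0 / 10.0)
w_b = np.sin(2.0 * np.pi * t_b)   # stand-in for B's measured angular velocity

# Resample B onto A's 15 Hz grid so both streams share one frequency.
w_b_on_a = resample(t_b, w_b, t_a)
```

For higher accuracy the same interface can be backed by cubic spline interpolation (e.g. `scipy.interpolate.CubicSpline`) instead of `np.interp`.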
  • the obtained pose rotation matrix information collected by the pose sensor is subjected to interpolation processing to align the two pieces of angular velocity information.
  • it can include S01, S02 and S03, as follows:
  • it should be noted that the pose rotation matrix information collected by the pose sensor in the preset time period comprises multiple pieces of pose rotation matrix information; adjacent pose rotation matrix information may be obtained from these multiple pieces of pose rotation matrix information.
  • when the pose sensor is a vision sensor, the pose rotation matrix information of adjacent frame images can be obtained; when the pose sensor is a lidar sensor, the pose rotation matrix information can be obtained by the lidar sensor.
  • the pose rotation matrix information in the preset time period is used to estimate the rotation movement of the electronic device.
  • S02 Perform interpolation processing on adjacent pose rotation matrix information according to the frequency at which the gyro sensor obtains angular velocity information, and obtain interpolated rotation matrix information.
  • the interpolated rotation matrix information may be obtained according to the adjacent pose rotation matrix information.
  • an interpolation model can be constructed, and then the interpolated rotation matrix information can be obtained by using the interpolation model and the adjacent pose rotation matrix information.
  • the interpolation model is used to estimate the pose rotation information lying between adjacent pose rotation information, based on the values of the adjacent pose rotation information at a limited number of points.
  • the interpolation model may include a spherical linear interpolation (Spherical Linear Interpolation) model, a cubic spline interpolation (Cubic Spline Interpolation) model, and a nearest neighbor interpolation model.
  • the interpolated rotation matrix information may be differentiated geometrically to obtain the angular velocity information corresponding to the pose sensor.
  • a differential model can be constructed in the process of differentiating the interpolated rotation matrix information; the angular velocity information corresponding to the pose sensor is then obtained from the constructed differential model and the interpolated rotation matrix information.
  • the differential model satisfies formula (1): q(t)′ = ½ · q(t) ⊗ w, where q(t)′ is the differentiated pose rotation matrix information, w is the second sub-angular velocity information (written as a pure quaternion), and q(t) is the interpolated pose rotation matrix information.
  • the differential model in the embodiment of the present disclosure is a model used to characterize angular velocity information.
  • the differential model may be a quaternion differential model or a model in other mathematical expression forms, which is not limited in the embodiment of the present disclosure.
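As a concrete illustration of such a differential model, the following sketch assumes the standard quaternion kinematic relation q(t)′ = ½ q(t) ⊗ w and recovers the body-frame angular velocity from two pose quaternions by finite differencing. The function names and the finite-difference approximation are assumptions, not the patent's exact formulation:

```python
import math

def qmul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(q):
    """Conjugate (= inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def angular_velocity(q0, q1, dt):
    """Recover body-frame angular velocity from two unit quaternions
    sampled dt apart, via w = 2 * q^-1 * (dq/dt), taking the vector part.
    Illustrative sketch of the differential model."""
    dq = tuple((b - a) / dt for a, b in zip(q0, q1))  # finite difference
    w_quat = qmul(qconj(q0), dq)
    return tuple(2.0 * c for c in w_quat[1:])

# rotation about z at 1 rad/s: q(t) = (cos(t/2), 0, 0, sin(t/2))
dt = 1e-4
q0 = (1.0, 0.0, 0.0, 0.0)
q1 = (math.cos(0.5 * dt), 0.0, 0.0, math.sin(0.5 * dt))
w = angular_velocity(q0, q1, dt)
```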
  • S102b Determine the delay time information between two different sensors according to the aligned two angular velocity information.
  • the angular velocity information acquired by different sensors at the same moment is the same. Therefore, the delay time information between the two different sensors can be determined according to the two aligned pieces of angular velocity information.
  • an error model can be constructed first, and then the delay time information between the two different sensors can be determined based on the two aligned pieces of angular velocity information and the error model.
  • the error parameter between the two sensors is introduced, so that the delay time information determined by the error model can be more accurate.
  • the error model can be as shown in formula (2): f(Q 12 , td, bg) = w 1 (t) − Q 12 · w 2 (t + td) − bg, where Q 12 is the rotation parameter between the different coordinate axes corresponding to the two different sensors, td is the delay time information between the two different sensors, bg is the error parameter between the two different sensors, w 1 (t) and w 2 (t) are respectively the angular velocity information acquired by the two different sensors, and f(Q 12 , td, bg) is the error between the angular velocities.
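As an illustration, if the error model is assumed to take the common gyro-alignment form f(Q12, td, bg) = w1(t) − Q12·w2(t + td) − bg (the patent's formula image is not reproduced here, so this exact form is an assumption), the residual for one time sample can be evaluated as follows; the function name and the sample-index treatment of the delay are illustrative simplifications:

```python
def residual(Q12, td_samples, bg, w1, w2, t):
    """Error f = w1(t) - Q12 * w2(t + td) - bg for one time sample.
    w1, w2 are lists of 3-vectors on a common time grid; td_samples is
    the delay expressed in whole samples (illustrative simplification)."""
    v2 = w2[t + td_samples]
    # rotate the second sensor's angular velocity into the first frame
    rotated = [sum(Q12[i][j] * v2[j] for j in range(3)) for i in range(3)]
    return [w1[t][i] - rotated[i] - bg[i] for i in range(3)]

# identity rotation, zero delay, zero bias: identical streams give zero error
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
w = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]
r = residual(I3, 0, [0.0, 0.0, 0.0], w, w, 1)
```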
  • the method further includes: after the two pieces of angular velocity information are aligned, determining the external parameters between the two different sensors.
  • the external parameters include the rotation parameters between the different coordinate axes corresponding to the two different sensors and the error parameters between the two different sensors.
  • not only can the delay time information be obtained, but also the rotation parameters and error parameters between the two different sensors, so that the functions of the electronic device are richer.
  • FIG. 7 is a schematic diagram of the third implementation flow of a time synchronization processing method provided by an embodiment of the disclosure.
  • the delay time information between two different sensors is determined according to the aligned two angular velocity information, namely Step S102b may include S102b1, S102b2, and S102b3, as follows:
  • the sub-error equations corresponding to the two different sensors at different moments during the rotation of the electronic device can be determined according to the aligned two angular velocity information.
  • n different moments can be preset within the preset time period according to actual conditions; for example, 15 moments or 20 moments can be set within the preset time period, and the sub-error equation corresponding to each moment can then be obtained.
  • the angular velocity information collected by different sensors corresponding to time 1 are respectively w 1 (1) and w 2 (1+td); the angular velocity information collected by different sensors corresponding to time 2 are respectively w 1 (2) And w 2 (2+td), and so on, the angular velocity information collected by different sensors corresponding to time n is w 1 (n) and w 2 (n+td) respectively, then when the error model satisfies formula (2),
  • the sub-error equations corresponding to the different times 1 to n are then obtained by evaluating the error model at each time, using w 1 (k) and w 2 (k+td) for time k.
  • after the sub-error equations corresponding to each time are determined, the sub-error equations corresponding to different times are summed to obtain the final error equation.
  • the embodiment of the present disclosure obtains the delay time information from the errors accumulated at different times, and therefore sums the sub-error equations corresponding to different times to obtain the final error equation.
  • S102b3 Perform minimum processing on the final error equation to obtain delay time information.
  • the final error equation can be processed to a minimum value to obtain delay time information.
  • the above-mentioned minimum value processing is to minimize the final error equation value, and then estimate the delay time information in the error model, the rotation parameter, and the error parameter between the sensors.
  • the minimum value processing of the final error equation may be performed through a nonlinear optimization model or an iterative closest point model.
  • the process of performing minimum value processing on the final error equation to obtain the delay time information may include: performing iterative closest point processing on the final error equation to obtain the second minimization equation; and solving the second minimization equation until the preset second threshold is met, at which point the delay time information in the second minimization equation is acquired.
  • when iterative closest point processing is performed on the final error equation to obtain the second minimization equation, candidate delay times can be selected within the preset interval; for example, the golden section method can be used to select a delay time, which is then substituted into the error model.
  • the error term obtained at this time contains only two unknowns: the rotation parameter and the error parameter between the sensors.
  • it can be processed by iterative closest point. Then the rotation parameters and the error parameters between the sensors can be obtained.
  • in the iterative closest point processing, the error takes the form f(Q 12 , bg) = Σ i=1..n ‖w i1 (t) − Q 12 · w i2 (t) − bg‖, where f(Q 12 , bg) is the error between angular velocities, n is the number of nearest point pairs, Q 12 is the rotation matrix between the two sensors, bg is the error parameter between the two sensors, and w i1 (t) and w i2 (t) are respectively a point in the point clouds of the angular velocities corresponding to the two sensors.
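The golden-section selection of a candidate delay mentioned above can be sketched as follows. The quadratic cost used in the example is a hypothetical stand-in for the accumulated error, which must be unimodal over the search interval for this method to apply:

```python
def golden_section_min(cost, lo, hi, tol=1e-6):
    """Golden-section search for the delay td that minimizes cost(td)
    on [lo, hi]; cost must be unimodal there (illustrative sketch)."""
    phi = (5 ** 0.5 - 1) / 2            # ~0.618, the golden ratio minus 1
    a, b = lo, hi
    c = b - phi * (b - a)
    d = a + phi * (b - a)
    while b - a > tol:
        if cost(c) < cost(d):           # minimum lies in [a, d]
            b, d = d, c
            c = b - phi * (b - a)
        else:                           # minimum lies in [c, b]
            a, c = c, d
            d = a + phi * (b - a)
    return 0.5 * (a + b)

# hypothetical accumulated-error curve, minimal at td = 0.02 s
td = golden_section_min(lambda x: (x - 0.02) ** 2, -0.1, 0.1)
```

Each candidate delay returned this way would then be substituted into the error model, leaving only the rotation and error parameters to solve for.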
  • the process of performing minimum value processing on the final error equation to obtain the delay time information may further include: performing nonlinear optimization processing on the final error equation to obtain the first minimization equation; and solving the first minimization equation until the preset first threshold is met, at which point the delay time information in the first minimization equation is obtained.
  • since the final error equation is a nonlinear function, it needs to be Taylor-expanded. Minimizing the final error equation means finding the delay time information at which the preset first threshold is met, so that the final error equation drops to a minimum.
  • nonlinear optimization processing is performed on the final error equation, and a nonlinear optimization model can be constructed in the process of obtaining the first minimization equation. Based on the final error equation and the nonlinear optimization model, the first minimization equation is determined.
  • the preset nonlinear optimization model may include a Gauss-Newton algorithm model or a Levenberg-Marquardt algorithm model, and the first threshold can be set according to the actual needs of the user, such as 0.1 or 0.01; the embodiments of the present disclosure are not limited here.
  • obtaining the delay time information in the first minimization equation includes: when the delay time information is determined for the first time, obtaining the current variable value according to the preset initial variable value and the preset nonlinear optimization model; determining the current solution value of the minimization equation according to the initial variable value, the current variable value, and the first minimization equation; and, when the current solution value meets the preset first threshold, obtaining the delay time information in the first minimization equation.
  • the first minimization equation is composed of the final error equation corresponding to the adjacent variable values.
  • the first minimization equation can be formula (8), where x k+1 is the current variable value, x k is the initial variable value, and e is the current solution value of the minimization equation.
  • otherwise, the next variable value is obtained according to the current variable value and the preset nonlinear optimization model; the next solution value of the minimization equation is determined according to the next variable value, the current variable value, and the first minimization equation; and whether the next solution value meets the preset first threshold is determined in turn. When the next solution value meets the preset first threshold, the iteration ends, and the delay time information in the first minimization equation is obtained.
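The iterate-until-threshold loop described above can be illustrated with a scalar Newton-style update. This is a stand-in for the Gauss-Newton or Levenberg-Marquardt update the text refers to, applied to a hypothetical one-dimensional cost; the function names and the threshold criterion on the step size are assumptions:

```python
def iterate_to_threshold(grad, hess, x0, threshold=1e-8, max_iter=100):
    """Newton-style iteration x_{k+1} = x_k - grad(x_k)/hess(x_k),
    stopping once |x_{k+1} - x_k| falls below the preset threshold
    (a scalar stand-in for the Gauss-Newton/LM loop)."""
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x_next = x - step
        if abs(x_next - x) < threshold:   # solution value meets threshold
            return x_next
        x = x_next
    return x

# hypothetical cost E(td) = (td - 0.02)^2: grad = 2(td - 0.02), hess = 2
td = iterate_to_threshold(lambda x: 2.0 * (x - 0.02), lambda x: 2.0, 0.0)
```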
  • rotation parameters and sensor deviation parameters can also be obtained.
  • the obtained rotation parameters can transform measurement information from different coordinate systems into the same coordinate system.
  • after time synchronization processing is performed on the respective measurement results of the two different sensors according to the delay time information, the method may further include: performing fusion processing on the synchronized measurement results; and
  • performing at least one of the following operations according to the result of the fusion processing: positioning processing, ranging processing, target detection of the scene where the electronic device is located, and generating or updating a map.
  • performing fusion processing on two synchronized measurement results includes: analyzing and synthesizing two measurement results at the same time to obtain a reliable fusion processing result.
  • applying the result of the fusion processing to the positioning process can achieve precise positioning of the electronic device; applying the result of the fusion processing to the ranging process can improve the measurement accuracy; applying the result of the fusion processing to the In the target detection process of the scene where the electronic device is located, accurate target detection results can be obtained; when the fusion processing result is applied to the process of generating or updating a map, an accurate map can be obtained.
  • in the process of fusing the synchronized measurement results, a fusion algorithm model can be constructed to obtain accurate fusion processing results.
  • the fusion algorithm model may include a Kalman filter fusion algorithm model and a cluster analysis recognition algorithm model, which is not limited in the embodiment of the present disclosure.
  • the Kalman filter fusion algorithm model can be used to perform fusion processing on the synchronized measurement results, including: using the measurement results of the two different sensors set on the electronic device to perform state propagation, so that the pose estimate at the current moment is propagated from the first measurement result at the previous moment; the second measurement result at the current moment is then used as observation information to correct the preliminary pose estimate at the current moment, thereby obtaining an optimal pose estimate at the current moment.
  • this optimal estimate is the result of the fusion processing.
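A minimal scalar sketch of this predict/correct cycle is shown below; the one-dimensional state and the noise values are illustrative assumptions, not parameters from the patent:

```python
def kf_fuse(x_prev, P_prev, u, Q, z, R):
    """One predict/update cycle of a 1-D Kalman filter: propagate the
    pose with the first sensor's increment u (process noise Q), then
    correct with the second sensor's observation z (noise R)."""
    # predict: state propagation using the first measurement
    x_pred = x_prev + u
    P_pred = P_prev + Q
    # update: correct with the second measurement as observation
    K = P_pred / (P_pred + R)            # Kalman gain
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# prior pose 0.0 (variance 1.0), odometry increment 1.0, observation 1.2
x, P = kf_fuse(0.0, 1.0, 1.0, 0.1, 1.2, 0.5)
```

The corrected estimate lands between the propagated prediction and the observation, weighted by their uncertainties, and the posterior variance shrinks.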
  • the time synchronization processing method in the embodiment of the present disclosure can also be applied to more than three sensors.
  • the measurement results after the time synchronization processing of every two sensors are acquired, and the measurement results after the time synchronization processing are fused to obtain the result of the fusion processing.
  • more accurate measurement results can be obtained by fusing the measurement results corresponding to at least three sensors.
  • FIG. 8 is a fourth schematic diagram of the implementation process of a time synchronization processing method provided by an embodiment of the present disclosure.
  • two different sensors provided on an electronic device are a vision sensor and an inertial sensor.
  • the time synchronization processing method of this embodiment may include the following steps:
  • the inertial sensor on the electronic device can independently obtain angular velocity information. Since the inertial sensor is provided with a gyro unit, the angular velocity information when the electronic device rotates can be directly obtained through the gyro unit.
  • the inertial sensor is a sensor that measures the three-axis attitude angle (or angular rate) and acceleration.
  • the inertial sensor can also include an acceleration unit, where the acceleration unit can detect the three-axis acceleration information of the electronic device.
  • the visual sensor on the electronic device can independently obtain the pose rotation matrix information.
  • the vision sensor is provided with a pose estimation unit, which can obtain the pose rotation matrix information when the electronic device rotates.
  • the vision sensor is mainly composed of one or two image sensors, and may also be equipped with a light projector and other auxiliary equipment. It can acquire original images within a preset time period, store the acquired images in memory, and compare and analyze them against reference benchmarks, thereby calculating the pose rotation matrix information of the electronic device.
  • the rotational movement of the electronic device can be represented by a quaternion; the rotational movement of the electronic device can also be represented by a three-dimensional rotation group, which is not limited in the embodiment of the present disclosure.
  • S203 Determine the second angular velocity information according to the pose rotation matrix information, and align the second angular velocity information with the first angular velocity information.
  • the image frequency of the vision sensor is usually 10 Hz
  • the frequency of the inertial sensor is usually 100 Hz
  • At least one of the second angular velocity information and the first angular velocity information can be interpolated through an interpolation model.
  • for different representations of the rotational motion, the interpolation models used are also different. For example, for rotational motion represented by a quaternion, the pose rotation matrix can be interpolated through the spherical linear interpolation model; for rotational motion represented by a three-dimensional rotation group, the cubic spline interpolation model or the nearest neighbor interpolation model can be used to interpolate the pose rotation matrix information.
  • the rotation motion represented by a quaternion is used as an example to illustrate the interpolation processing process of the pose rotation matrix information of the vision sensor using a spherical linear interpolation model.
  • a quaternion is composed of a real part plus imaginary parts, as in formula (9): q = e 0 + e 1 i + e 2 j + e 3 k, where e 0 , e 1 , e 2 , and e 3 are real numbers, i, j, and k are mutually orthogonal imaginary units, and q is the pose rotation matrix information represented by a quaternion.
  • through the exponential mapping, the rotation can be written as q = exp(w t / 2), where w is the angular velocity, t is the time, and q is the pose rotation matrix information represented by the exponential mapping of the quaternion.
  • the constructed spherical interpolation model is formula (11): q(t) = [sin((1 − t)θ) · q 0 + sin(tθ) · q 1 ] / sin θ, where q 0 and q 1 are adjacent posture rotation matrices, θ is the angle between them, and q(t) is the posture rotation matrix information obtained by spherical interpolation of the posture rotation matrix.
  • the posture rotation matrix information after interpolation can be obtained.
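A minimal sketch of spherical linear interpolation between unit quaternions, using the standard slerp form q(t) = [sin((1 − t)θ)·q0 + sin(tθ)·q1] / sin θ (assumed here, since the patent's formula (11) image is not reproduced); the fallback to linear interpolation for nearly identical quaternions is a common numerical safeguard, not from the patent:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0, q1
    (tuples (w, x, y, z)); t in [0, 1]. Sketch of the standard formula."""
    dot = sum(a * b for a, b in zip(q0, q1))
    dot = max(-1.0, min(1.0, dot))
    theta = math.acos(dot)               # angle between the quaternions
    if theta < 1e-9:                     # nearly identical: fall back to lerp
        return tuple(a + t * (b - a) for a, b in zip(q0, q1))
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

# halfway between identity and a 90-degree rotation about z
q0 = (1.0, 0.0, 0.0, 0.0)
q1 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
qm = slerp(q0, q1, 0.5)                  # a 45-degree rotation about z
```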
  • FIG. 9 is a first schematic diagram of using a quaternion to represent the pose rotation matrix information measured by the pose sensor in an embodiment of the present disclosure.
  • the electronic device continuously moves from the previous moment to the current moment. That is, rotate from a solid circle to a dotted circle.
  • FIG. 10 is a second schematic diagram of the embodiment of the disclosure using a quaternion to represent the pose rotation matrix information measured by the pose sensor. As shown in FIG. 10, it is a plan view of the dotted frame extracted from FIG. 9.
  • in FIG. 10, q 0 and q 1 are the adjacent posture rotation matrices, q(t) is the posture rotation matrix information after spherical interpolation is performed on the posture rotation matrix, the angle of rotation is θ, and t is the time.
  • in the process of acquiring the second angular velocity information, the angular velocity information corresponding to the vision sensor can be determined according to the interpolated pose rotation matrix information and the differential model.
  • the differential model is as in formula (1).
  • the second angular velocity information can be obtained indirectly by interpolating the pose rotation matrix information first, and then differential processing. In this way, it can adapt to the time synchronization between more sensors and has universal applicability.
  • S204 Determine the delay time information between the visual sensor and the inertial sensor.
  • the delay time information between the visual sensor and the inertial sensor can be determined according to the first angular velocity information and the second angular velocity information.
  • the delay time information between the vision sensor and the inertial sensor can be solved, but also the rotation parameter and the error parameter between the sensors can be solved.
  • S205 Perform time synchronization processing on the measurement result of the visual sensor and the measurement result of the inertial sensor according to the delay time information.
  • the time synchronization processing method provided by the embodiments of the present disclosure can be applied to unmanned driving or mobile robot navigation.
  • the unmanned-driving electronic device or mobile robot electronic device can realize precise positioning through the time synchronization processing method provided in the embodiments of the present disclosure.
  • the embodiment of the present disclosure only uses a vision sensor and an inertial sensor as an example to illustrate the process of the electronic device implementing the time synchronization processing method.
  • the time synchronization processing method provided by the embodiment of the present disclosure is not limited to the vision sensor and the inertial sensor; it can also be applied between other sensors, as long as the other sensors can independently obtain angular velocity information and pose rotation matrix information.
  • the embodiments of the present disclosure can also be applied to electronic devices including more than three sensors, that is, the above-mentioned time synchronization method is also applicable to electronic devices with more than three sensors.
  • FIG. 11 is a schematic diagram of the composition structure of a time synchronization processing device provided by an embodiment of the disclosure.
  • the synchronization processing device 300 includes a first acquisition module 301, an alignment module 302, and a synchronization module 303, where:
  • the first acquisition module 301 is configured to obtain angular velocity information collected by two different sensors; the angular velocity information is the angular velocity information when the electronic device rotates; the two different sensors are both set on the electronic device And rigidly connected;
  • the alignment module 302 is configured to perform alignment processing on the two obtained angular velocity information, and determine the delay time information between the two different sensors;
  • the synchronization module 303 is configured to perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information.
  • the time synchronization processing device of the embodiment of the present disclosure only requires the electronic device to be capable of rotational motion; that is, the electronic device need only rotate around a rotation axis to calibrate the delay time information between different sensors on the electronic device, without requiring movement around multiple axes. This reduces the complexity of time synchronization of different sensors and can adapt to the needs of different scenarios. The embodiments of the present disclosure can also calibrate the delay time based on the rotational movement generated by multi-axis rotation of the electronic device, obtaining richer rotation information to improve the accuracy of time synchronization. The embodiments of the present disclosure only require sensors to independently obtain angular velocity information to determine the delay time information between different sensors, and can therefore be widely applied to time synchronization between a variety of sensors, with universal adaptability;
  • the embodiment of the present disclosure realizes time synchronization based on software, so no additional dedicated time synchronization hardware needs to be deployed;
  • the embodiment of the present disclosure performs time synchronization based on the delay time information obtained in real time, and can therefore realize online time synchronization processing;
  • the embodiments of the present disclosure determine the delay time information through the angular velocity information obtained when the electronic device rotates, instead of treating the delay time information as a constant, which improves the accuracy of time synchronization.
  • the first obtaining module 301 is configured to obtain the pose rotation matrix information collected by the pose sensor; the angular velocity information when the electronic device rotates is determined by the pose rotation matrix information, where: At least one of the two different sensors is a pose sensor.
  • the first obtaining module 301 is configured to obtain angular velocity information collected by a gyro sensor, wherein at least one of the two different sensors is a gyro sensor.
  • the alignment module 302 is configured to perform interpolation processing on at least one of the two pieces of angular velocity information to align them, and to determine the delay time information between the two different sensors according to the two aligned pieces of angular velocity information.
  • the alignment module 302 is configured to, when the two different sensors are both gyro sensors, perform interpolation processing on at least one of the two obtained pieces of angular velocity information collected by the two gyro sensors, so as to align the two pieces of angular velocity information.
  • the alignment module 302 is configured to, when the two different sensors are both pose sensors, perform interpolation processing on at least one of the two pieces of pose rotation matrix information obtained from the two pose sensors, so as to align the two pieces of angular velocity information.
  • the alignment module 302 is configured to, when the two different sensors are a pose sensor and a gyro sensor, perform interpolation processing on the acquired pose rotation matrix information collected by the pose sensor, so as to align the two pieces of angular velocity information.
  • the time synchronization processing device 300 further includes: a second acquisition module 304, configured to determine the external parameters between the two different sensors after the two pieces of angular velocity information are aligned by the alignment module 302.
  • the external parameters include rotation parameters between different coordinate axes corresponding to the two different sensors and error parameters between the two different sensors.
  • the time synchronization processing device 300 further includes: a storage module 305, configured to store the delay time information;
  • the synchronization module is configured to, when the electronic device is in a non-motion state, perform time synchronization processing on the respective measurement results of the two different sensors according to the delay time information stored by the storage module 305.
  • the alignment module 302 is configured to determine the sub-error equations corresponding to the two different sensors at different times according to the two aligned pieces of angular velocity information; sum the sub-error equations corresponding to different times to obtain the final error equation; and perform minimum value processing on the final error equation to obtain the delay time information.
  • the alignment module 302 is configured to perform nonlinear optimization processing on the final error equation to obtain the first minimization equation, and to solve the first minimization equation until the preset first threshold is met, at which point the delay time information in the first minimization equation is acquired.
  • the alignment module 302 is configured to perform iterative closest point processing on the final error equation to obtain the second minimization equation, and to solve the second minimization equation until the preset second threshold is met, at which point the delay time information in the second minimization equation is acquired.
  • the time synchronization processing apparatus 300 further includes:
  • the fusion module 307 is used to perform fusion processing on the synchronized measurement results
  • the execution module 308 is configured to perform at least one of the following operations according to the measurement result of the fusion processing: positioning processing, ranging processing, target detection of the scene where the electronic device is located, and generating or updating a map.
  • when the time synchronization processing device provided in the above embodiment performs time synchronization processing, the division into the above program modules is used only as an example for illustration. In actual applications, the above processing can be allocated to different program modules as needed; that is, the internal structure of the device can be divided into different program modules to complete all or part of the processing described above.
  • the time synchronization processing device provided in the foregoing embodiment and the time synchronization processing method embodiment belong to the same concept; the specific implementation process is detailed in the method embodiment and will not be repeated here.
  • FIG. 12 is a schematic diagram of the structure of the electronic device provided by the embodiment of the disclosure.
  • the electronic device at least includes a processor 21 and a memory 23 storing a computer program capable of running on the processor 21; the processor 21 is configured to execute the steps in the time synchronization processing method provided in the above embodiments when running the computer program.
  • the electronic device used to execute the steps in the time synchronization processing method provided in the foregoing embodiment may be the same as or different from the electronic device provided with the two different sensors.
  • the electronic device may further include a communication interface 24, which is used to obtain angular velocity information and pose rotation matrix information.
  • the various components in the electronic device are coupled together through the bus system 25. It can be understood that the bus system 25 is used to implement connection and communication between these components.
  • the bus system 25 also includes a power bus, a control bus, and a status signal bus. However, for the sake of clear description, various buses are marked as the bus system 25 in FIG. 12.
  • the memory 23 may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
  • the non-volatile memory can be a read-only memory (ROM, Read Only Memory), a programmable read-only memory (PROM, Programmable Read-Only Memory), an erasable programmable read-only memory (EPROM, Erasable Programmable Read-Only Memory), an electrically erasable programmable read-only memory (EEPROM, Electrically Erasable Programmable Read-Only Memory), a ferromagnetic random access memory (FRAM, Ferromagnetic Random Access Memory), a flash memory (Flash Memory), a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM, Compact Disc Read-Only Memory); the magnetic surface memory can be magnetic disk storage or magnetic tape storage.
  • the volatile memory may be random access memory (RAM, Random Access Memory), which is used as an external cache.
  • by way of example but not limitation, many forms of RAM are available, such as static random access memory (SRAM), synchronous static random access memory (SSRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), SyncLink dynamic random access memory (SLDRAM), and direct Rambus random access memory (DRRAM).
  • the memory 23 described in the embodiment of the present invention is intended to include, but is not limited to, these and any other suitable types of memory.
  • the method disclosed in the foregoing embodiment of the present invention may be applied to the processor 21 or implemented by the processor 21.
  • the processor 21 may be an integrated circuit chip with signal processing capability. In the implementation process, the steps of the foregoing method can be completed by an integrated logic circuit of hardware in the processor 21 or instructions in the form of software.
  • the aforementioned processor 21 may be a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the processor 21 may implement or execute various methods, steps, and logical block diagrams disclosed in the embodiments of the present invention.
  • the general-purpose processor may be a microprocessor or any conventional processor.
  • the steps of the method disclosed in the embodiments of the present invention can be directly embodied as being executed and completed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a storage medium, and the storage medium is located in the memory 23.
  • the processor 21 reads the information in the memory 23 and completes the steps of the foregoing method in combination with its hardware.
  • the embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by the foregoing processor, the steps of the time synchronization processing method in the foregoing embodiments are implemented.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
  • the units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the functional units in the embodiments of the present disclosure may all be integrated into one processing unit, each unit may serve separately as one unit, or two or more units may be integrated into one unit;
  • the integrated unit can be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
  • the foregoing program can be stored in a computer-readable storage medium; when the program is executed, the steps of the foregoing method embodiment are performed. The foregoing storage medium includes various media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.
  • when the aforementioned integrated unit of the present disclosure is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the computer software product is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the various embodiments of the present disclosure.
  • the aforementioned storage media include media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disc.

PCT/CN2020/076836 2019-06-21 2020-02-26 Time synchronization processing method, electronic device and storage medium WO2020253260A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021531851A 2019-06-21 2020-02-26 Time synchronization processing method, electronic device and storage medium
KR1020217017070A 2019-06-21 2020-02-26 Time synchronization processing method, electronic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910545218.8A 2019-06-21 2019-06-21 Time synchronization processing method, electronic device and storage medium
CN201910545218.8 2019-06-21

Publications (1)

Publication Number Publication Date
WO2020253260A1 (zh) 2020-12-24

Family

ID=73796638

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/076836 WO2020253260A1 (zh) 2019-06-21 2020-02-26 Time synchronization processing method, electronic device and storage medium

Country Status (4)

Country Link
JP (1) JP2022510418A (ja)
KR (1) KR20210084622A (ja)
CN (1) CN112113582A (ja)
WO (1) WO2020253260A1 (ja)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113177440A * 2021-04-09 2021-07-27 深圳市商汤科技有限公司 Image synchronization method and apparatus, electronic device and computer storage medium
CN113610136A * 2021-07-30 2021-11-05 深圳元戎启行科技有限公司 Sensor data synchronization method and apparatus, computer device and storage medium
CN114413878B * 2021-12-24 2024-02-13 苏州浪潮智能科技有限公司 Time calibration system and method, electronic device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103616710A * 2013-12-17 2014-03-05 靳文瑞 FPGA-based multi-sensor integrated navigation time synchronization system
CN104009833A * 2013-02-26 2014-08-27 赫克斯冈技术中心 Sensor synchronization method and sensor measuring system related thereto
CN104501817A * 2014-11-24 2015-04-08 李青花 Vehicle-mounted navigation system based on error elimination
CN108680196A * 2018-04-28 2018-10-19 远形时空科技(北京)有限公司 Time delay correction method and system, and computer-readable medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002090151A * 2000-09-14 2002-03-27 Japan Aviation Electronics Industry Ltd Sensor abnormality determination circuit and steering control device
CN107728617B * 2017-09-27 2021-07-06 速感科技(北京)有限公司 Multi-camera online calibration method, movable robot and system
CN108871311B * 2018-05-31 2021-01-19 北京字节跳动网络技术有限公司 Pose determination method and apparatus
CN109029433B * 2018-06-28 2020-12-11 东南大学 Method for calibrating extrinsic parameters and time sequence on a mobile platform based on vision and inertial navigation fusion SLAM
CN109186596B * 2018-08-14 2020-11-10 深圳清华大学研究院 IMU measurement data generation method, system, computer apparatus and readable storage medium
CN109506617B * 2018-12-28 2021-08-10 歌尔科技有限公司 Sensor data processing method, storage medium and electronic device


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113310505A * 2021-06-15 2021-08-27 苏州挚途科技有限公司 Extrinsic parameter calibration method and apparatus for a sensor system, and electronic device
CN113310505B * 2021-06-15 2024-04-09 苏州挚途科技有限公司 Extrinsic parameter calibration method and apparatus for a sensor system, and electronic device
CN113591015A * 2021-07-30 2021-11-02 北京小狗吸尘器集团股份有限公司 Time delay calculation method and apparatus, storage medium and electronic device
CN113848696A * 2021-09-15 2021-12-28 北京易航远智科技有限公司 Multi-sensor time synchronization method based on position information
CN113848696B * 2021-09-15 2022-09-16 北京易航远智科技有限公司 Multi-sensor time synchronization method based on position information
CN114217665A * 2021-12-21 2022-03-22 清华大学 Camera and lidar time synchronization method and apparatus, and storage medium
CN115235527A * 2022-07-20 2022-10-25 上海木蚁机器人科技有限公司 Sensor extrinsic parameter calibration method and apparatus, and electronic device
CN115451932A * 2022-09-16 2022-12-09 湖南航天机电设备与特种材料研究所 Multi-channel gyroscope data synchronous acquisition and calculation method and system
CN115451932B * 2022-09-16 2024-05-24 湖南航天机电设备与特种材料研究所 Multi-channel gyroscope data synchronous acquisition and calculation method and system
CN115979277A * 2023-02-22 2023-04-18 广州导远电子科技有限公司 Time synchronization method and apparatus, electronic device and computer-readable storage medium
CN115979277B * 2023-02-22 2023-06-02 广州导远电子科技有限公司 Time synchronization method and apparatus, electronic device and computer-readable storage medium
CN117034201A * 2023-10-08 2023-11-10 东营航空产业技术研究院 Multi-source real-time data fusion method

Also Published As

Publication number Publication date
CN112113582A (zh) 2020-12-22
KR20210084622A (ko) 2021-07-07
JP2022510418A (ja) 2022-01-26


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20827563; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021531851; Country of ref document: JP; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 20217017070; Country of ref document: KR; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20827563; Country of ref document: EP; Kind code of ref document: A1)
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 25-05-2022))