CN113890668A - Multi-sensor time synchronization method and device, electronic equipment and storage medium


Info

Publication number
CN113890668A
Authority
CN
China
Prior art keywords
sensor
acquisition time
data
acquisition
time
Prior art date
Legal status
Pending
Application number
CN202111488235.6A
Other languages
Chinese (zh)
Inventor
宋振伟
李岩
李成军
张海强
Current Assignee
Zhidao Network Technology Beijing Co Ltd
Original Assignee
Zhidao Network Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhidao Network Technology Beijing Co Ltd
Priority to CN202111488235.6A
Publication of CN113890668A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04J: MULTIPLEX COMMUNICATION
    • H04J3/00: Time-division multiplex systems
    • H04J3/02: Details
    • H04J3/06: Synchronising arrangements
    • H04J3/0635: Clock or time synchronisation in a network
    • H04J3/0638: Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0658: Clock or time synchronisation among packet nodes
    • H04J3/0661: Clock or time synchronisation among packet nodes using timestamps
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks


Abstract

The application discloses a multi-sensor time synchronization method and device, an electronic device, and a storage medium, wherein the method comprises the following steps: acquiring sensor data collected by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and the acquisition time of the frame data value; determining, according to the sensor data acquired by the first sensor, a first acquisition time of the first sensor and the frame data value of the first sensor corresponding to the first acquisition time; determining, according to the first acquisition time of the first sensor and the sensor data of the second sensor, a frame data estimation value of the second sensor corresponding to the first acquisition time; and completing time synchronization according to the frame data estimation value of the second sensor and the frame data value of the first sensor. This solves the problem that the timestamps of sensors with different frequencies cannot be aligned, and by estimating the data value of a sensor at a given moment, errors caused by imperfect time synchronization can be reduced.

Description

Multi-sensor time synchronization method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of automatic driving technologies, and in particular, to a method and an apparatus for time synchronization of multiple sensors, an electronic device, and a storage medium.
Background
At present, with the rapid development of automatic driving technology, a single sensor can no longer meet the needs of intelligent driving. Autonomous vehicles require more and more sensors, including a GPS (Global Positioning System), a Lidar (laser radar), an IMU (Inertial Measurement Unit), an odometer, a camera, and so on. Because these sensors operate at different frequencies, the data received from each of them is not time-synchronized.
In the prior art, time synchronization schemes are mainly implemented based on hardware, pulse signals, and the like; however, as the number of sensors grows, their real-time performance and accuracy become difficult to guarantee.
Disclosure of Invention
The embodiment of the application provides a multi-sensor time synchronization method and device, electronic equipment and a storage medium, so as to improve the accuracy of multi-sensor time synchronization.
The embodiment of the application adopts the following technical scheme:
In a first aspect, an embodiment of the present application provides a multi-sensor time synchronization method, the method comprising:
Acquiring sensor data acquired by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and acquisition time of the frame data value;
determining first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to sensor data acquired by the first sensor;
determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor;
and completing the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
Optionally, the determining, according to a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimation value of the second sensor corresponding to the first acquisition time includes:
determining sensor data corresponding to the first acquisition time in the sensor data of the second sensor according to the first acquisition time;
and determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time and the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
Optionally, the determining, according to the first acquisition time, sensor data corresponding to the first acquisition time in the sensor data of the second sensor includes:
determining two second acquisition times adjacent to the first acquisition time in the sensor data of the second sensor, the first acquisition time being between the two second acquisition times;
determining frame data values of a second sensor respectively corresponding to the two second acquisition times;
and taking the two second acquisition times and the corresponding frame data values of the second sensor as the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
Optionally, the determining, according to a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimation value of the second sensor corresponding to the first acquisition time includes:
determining the time interval between each of the two second acquisition times and the first acquisition time, to obtain a first time interval and a second time interval;
and determining a frame data estimation value of a second sensor corresponding to the first acquisition time according to the first time interval, the second time interval and frame data values of the second sensor corresponding to the two second acquisition times.
Optionally, the determining, according to the sensor data acquired by the first sensor, a first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time includes:
determining a signal state of the sensor data according to the sensor data of the first sensor;
determining available sensor data from a signal state of the sensor data;
and taking the acquisition time and the corresponding frame data value in the available sensor data as the first acquisition time of the first sensor and the frame data value of the first sensor corresponding to the first acquisition time.
Optionally, the acquisition frequency of the first sensor is greater than the acquisition frequency of the second sensor.
In a second aspect, an embodiment of the present application further provides a multi-sensor time synchronization apparatus, where the apparatus includes:
an acquisition unit, used for acquiring sensor data collected by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and the acquisition time of the frame data value;
the first determining unit is used for determining first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to sensor data acquired by the first sensor;
the second determining unit is used for determining a frame data estimation value of a second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor;
and the time synchronization unit is used for completing the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
Optionally, the second determining unit is specifically configured to:
determining sensor data corresponding to the first acquisition time in the sensor data of the second sensor according to the first acquisition time;
and determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time and the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform any of the methods described above.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform any of the methods described above.
The embodiment of the application adopts at least one technical scheme that can achieve the following beneficial effects: the multi-sensor time synchronization method of the embodiment of the application first obtains sensor data collected by at least two sensors, where the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and the acquisition time of the frame data value; then determines, according to the sensor data acquired by the first sensor, the first acquisition time of the first sensor and the frame data value of the first sensor corresponding to the first acquisition time; then determines, according to the first acquisition time of the first sensor and the sensor data of the second sensor, the frame data estimation value of the second sensor corresponding to the first acquisition time; and finally completes the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor. The multi-sensor time synchronization method solves the problem that the timestamps of sensors with different frequencies cannot be aligned in automatic driving and other scenarios, and by estimating a sensor's data value at a given moment it can greatly reduce the errors caused by imperfect time synchronization, thereby improving the accuracy of time synchronization.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic flowchart illustrating a method for time synchronization of multiple sensors according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a time synchronization algorithm in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a multi-sensor time synchronization apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
The multi-sensor time synchronization method of the embodiments of the present application can be applied to an automatic driving scenario. In such a scenario, to meet requirements such as real-time vehicle positioning, a vehicle is usually equipped with multiple sensors that collect data generated while the vehicle is driving; the data produced by these sensors is then fused to achieve real-time positioning of the vehicle.
However, different sensors acquire data at different frequencies. For example, if the Lidar acquires one frame every 0.1 s while the GPS acquires one frame every 0.33 s, then at many of the times when the Lidar acquires data the GPS has produced no corresponding data, so the two are not time-synchronized. In addition, to keep vehicle positioning real-time, as much data as possible must be obtained within a short time, so the data collected by the GPS, whose acquisition frequency is relatively low, needs to be compensated to some extent to meet the accuracy and real-time requirements of time synchronization.
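As a concrete illustration of this mismatch, the short Python sketch below (an editorial aid, not part of the original disclosure) generates the two timestamp grids from the example periods and shows that, apart from t = 0, they never coincide:

```python
# Timestamp grids for two sensors with different acquisition periods:
# 0.1 s per frame for the Lidar, 0.33 s per frame for the GPS.
lidar_times = [round(i * 0.10, 2) for i in range(12)]  # 10 Hz
gps_times = [round(i * 0.33, 2) for i in range(4)]     # ~3 Hz

print(lidar_times)  # [0.0, 0.1, 0.2, ..., 1.1]
print(gps_times)    # [0.0, 0.33, 0.66, 0.99]

# Apart from t = 0.0, no Lidar timestamp has a matching GPS timestamp,
# so the GPS value at each Lidar time must be estimated.
print(set(lidar_times) & set(gps_times))  # {0.0}
```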
Based on this, an embodiment of the present application provides a multi-sensor time synchronization method. Fig. 1 shows a schematic flow chart of the method, which includes at least the following steps S110 to S140:
step S110, obtaining sensor data collected by at least two sensors, where the at least two sensors include a first sensor and a second sensor, and the sensor data includes a frame data value and a collection time of the frame data value.
In the embodiment of the present application, before time synchronization among multiple sensors can be performed, the sensor data collected by at least two sensors must first be obtained; that is, there may be two or more sensors. According to the synchronization requirement, the sensors are divided into a first sensor and a second sensor: the first sensor can be regarded as the reference sensor, such as the Lidar mentioned above, and the second sensor as the sensor to be synchronized, such as the GPS mentioned above.
For ease of understanding, the following explanation takes the Lidar and GPS sensors as examples; this should not be taken as limiting the scope of the present application.
The sensor data specifically includes the frame data values acquired by each sensor at each acquisition time according to its own acquisition frequency; the acquisition times and the frame data values are in one-to-one correspondence.
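This one-to-one pairing can be represented by a simple record type; the Python sketch below is illustrative only, and its names (SensorFrame and the sample values) are editorial assumptions rather than anything defined in the patent:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One frame of sensor data: an acquisition timestamp paired with
    the frame data value acquired at that timestamp."""
    t: float      # acquisition time in seconds
    value: float  # frame data value, e.g. displacement along one axis

# Each sensor's output is then a time-ordered sequence of frames.
lidar_frames = [SensorFrame(0.0, 0.00), SensorFrame(0.1, 0.12)]
gps_frames = [SensorFrame(0.0, 0.00), SensorFrame(0.33, 0.40)]
```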
Step S120, determining a first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to the sensor data acquired by the first sensor.
After the sensor data collected by each sensor is obtained, the first acquisition time of the first sensor and the frame data value of the first sensor corresponding to that first acquisition time are determined, taking the sensor data collected by the first sensor as the reference. The first acquisition time can be regarded as the reference acquisition time: for example, if the Lidar collects the frame data value S_l1 at time T_l1, then T_l1 can be taken as the first acquisition time and S_l1 as the frame data value acquired at the first acquisition time.
Step S130, determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor.
After the first acquisition time is determined, the sensor data acquired by the second sensor is further combined to determine the frame data estimation value of the second sensor corresponding to the first acquisition time. It is called an "estimation value" because the acquisition frequency of the second sensor differs from that of the first sensor (for example, the acquisition frequency of the GPS is lower than that of the Lidar), so the second sensor did not actually acquire a frame data value at the first acquisition time. For the purpose of time synchronization, the frame data value of the second sensor at the first acquisition time is therefore estimated in the manner described below.
Step S140, completing time synchronization between the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
Based on the above process, at each first acquisition time of the first sensor, the corresponding frame data estimation value of the second sensor can be computed, thereby completing the time synchronization between the first sensor and the second sensor. This process can run in real time alongside data acquisition, which improves both the accuracy and the real-time performance of multi-sensor time synchronization.
The multi-sensor time synchronization method solves the problem that the timestamps of sensors with different frequencies cannot be aligned in automatic driving and other scenarios, and by estimating a sensor's data value at a given moment it can greatly reduce the errors caused by imperfect time synchronization, thereby improving the accuracy of time synchronization.
In one embodiment of the present application, the acquisition frequency of the first sensor is greater than the acquisition frequency of the second sensor.
As mentioned above, the at least two sensors of the embodiments of the present application have different acquisition frequencies. If the acquisition frequency of the first sensor is greater than that of the second sensor, the first sensor serves as the reference sensor, and the other second sensors are time-synchronized with it; for example, the first sensor may be the Lidar and the second sensor the GPS.
Of course, "first" and "second" here are only illustrative labels used to aid understanding of the technical solution of the present application. It can therefore be understood that, if the acquisition frequency of the second sensor were greater than that of the first sensor, the second sensor would serve as the reference sensor and the other first sensors would be time-synchronized with it.
In one embodiment of the present application, the determining, from a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimate for the second sensor corresponding to the first acquisition time comprises: determining sensor data corresponding to the first acquisition time in the sensor data of the second sensor according to the first acquisition time; and determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time and the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
In the embodiment of the present application, when determining the frame data estimation value of the second sensor corresponding to the first acquisition time, the sensor data corresponding to the first acquisition time is first located in the sensor data of the second sensor. For example, if the first acquisition time is T_l1, then the acquisition time T_g and frame data value S_g corresponding to T_l1 must be determined in the sensor data of the second sensor.
Then, from the first acquisition time T_l1 and the second sensor's T_g and S_g corresponding to T_l1, the frame data estimation value of the second sensor at time T_l1 is estimated.
In an embodiment of the present application, the determining, according to the first acquisition time, sensor data corresponding to the first acquisition time in the sensor data of the second sensor includes: determining two second acquisition times adjacent to the first acquisition time in the sensor data of the second sensor, the first acquisition time being between the two second acquisition times; determining frame data values of a second sensor respectively corresponding to the two second acquisition times; and taking the two second acquisition times and the corresponding frame data values of the second sensor as the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
When determining the above-mentioned T_g and S_g, the first acquisition time T_l1 can be taken as the reference, and the two time points closest to T_l1 are determined in the sensor data of the second sensor: T_g1 (the adjacent time before T_l1) and T_g2 (the adjacent time after T_l1), together with the frame data values S_g1 and S_g2 acquired by the second sensor at those two time points. The two data pairs (T_g1, S_g1) and (T_g2, S_g2) are taken as the sensor data corresponding to the first acquisition time T_l1 in the sensor data of the second sensor.
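Locating the two adjacent second acquisition times amounts to a bracketing search over the second sensor's time-ordered frames. A minimal Python sketch follows (an editorial aid; the helper name bracket and the tuple layout are assumptions):

```python
import bisect

def bracket(frames, t_l1):
    """frames: list of (acquisition_time, frame_value) sorted by time.
    Return the pair ((T_g1, S_g1), (T_g2, S_g2)) with T_g1 <= t_l1 <= T_g2
    bracketing the first acquisition time t_l1, or None when t_l1 falls
    outside the covered interval (no extrapolation is attempted)."""
    times = [t for t, _ in frames]
    i = bisect.bisect_right(times, t_l1)
    if 0 < i < len(times):
        return frames[i - 1], frames[i]
    if i == len(times) and len(times) >= 2 and t_l1 == times[-1]:
        return frames[-2], frames[-1]  # t_l1 exactly at the last frame
    return None

gps = [(0.0, 0.00), (0.33, 0.40), (0.66, 0.81)]
print(bracket(gps, 0.1))  # ((0.0, 0.0), (0.33, 0.4))
print(bracket(gps, 0.7))  # None: beyond the last GPS frame
```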
In one embodiment of the present application, the determining, from a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimate for the second sensor corresponding to the first acquisition time comprises: respectively determining the time intervals of the two second acquisition times and the first acquisition time to obtain a first time interval and a second time interval; and determining a frame data estimation value of a second sensor corresponding to the first acquisition time according to the first time interval, the second time interval and frame data values of the second sensor corresponding to the two second acquisition times.
The frame data estimation value of the second sensor at the first acquisition time can then be determined specifically as follows.
it should be noted that although the acquisition frequencies of the sensors are different, the time interval between two adjacent acquired frames of data is very short, so that the vehicle is considered to change at a constant speed in the time period, and then the first acquisition time T is usedl1For reference, a frame data estimate S for the second sensor may be calculatedg’Comprises the following steps:
Figure DEST_PATH_IMAGE001
, (1)
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE003
and
Figure DEST_PATH_IMAGE005
respectively representing a first acquisition time and acquisition times T of two adjacent GPSg1And Tg2The time interval between the start of the cycle,
Figure 746533DEST_PATH_IMAGE006
which represents the displacement of the GPS between two adjacent frames of data.
To facilitate understanding of equation (1), Fig. 2 provides a schematic diagram of the time synchronization algorithm in the embodiment of the present application. With reference to Fig. 2, suppose the Lidar acquires one frame every 0.1 s and the GPS acquires one frame every 0.33 s; two data axes are then obtained, one per sensor. On these two axes, the data segments of the Lidar and the GPS are intercepted with a given first acquisition time of the Lidar as the reference. On the intercepted segments, the Lidar's first acquisition time T_l1 lies between the two adjacent GPS acquisition times T_g1 and T_g2. Estimating the frame data value that the GPS would have acquired at T_l1, between T_g1 and T_g2, from the uniform-motion relation among time, velocity, and displacement yields equation (1) above. The frame data estimation value of the GPS corresponding to each Lidar acquisition time can thus be determined from equation (1), finally achieving time synchronization.
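Equation (1) is ordinary linear interpolation under the uniform-motion assumption. The Python sketch below (an editorial aid; the function name and sample values are assumptions) implements it and checks it on the time points discussed above:

```python
def estimate_frame(t_l1, g1, g2):
    """Estimate the second sensor's frame data value at the first
    acquisition time t_l1 from its two bracketing frames, per equation (1).
    g1 = (T_g1, S_g1) and g2 = (T_g2, S_g2), with T_g1 <= t_l1 <= T_g2."""
    (t_g1, s_g1), (t_g2, s_g2) = g1, g2
    dt1 = t_l1 - t_g1   # first time interval,  delta-t1
    dt2 = t_g2 - t_l1   # second time interval, delta-t2
    ds = s_g2 - s_g1    # displacement between the two adjacent frames
    # Uniform motion over the short interval: the displacement accrued
    # by t_l1 is proportional to dt1 / (dt1 + dt2).
    return s_g1 + ds * dt1 / (dt1 + dt2)

# A Lidar frame at t = 0.1 s falls between GPS frames at 0.0 s and 0.33 s;
# with S_g1 = 0.0 and S_g2 = 0.66, the estimate is 0.66 * 0.1 / 0.33.
print(estimate_frame(0.1, (0.0, 0.00), (0.33, 0.66)))  # ~0.2
```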
In an embodiment of the present application, the determining, according to the sensor data acquired by the first sensor, a first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time includes: determining a signal state of the sensor data according to the sensor data of the first sensor; determining available sensor data from a signal state of the sensor data; and taking the acquisition time and the corresponding frame data value in the available sensor data as the first acquisition time of the first sensor and the frame data value of the first sensor corresponding to the first acquisition time.
In a practical application scenario, the data collected by a sensor is not always a good signal, owing to the state of the sensor itself or to interference from the external environment during acquisition. Therefore, after the data collected by the first sensor is obtained, whether the currently received sensor signal is good can be judged from the signal state returned by the first sensor. If it is a good signal, it can be used for the subsequent time synchronization processing; if not (for example, the signal is inaccurate), the frame of data can be discarded, so as not to degrade the accuracy of the subsequent time synchronization, the positioning accuracy, and so on.
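This availability check can be implemented as a simple filter applied to each frame before it enters the synchronization pipeline. In the Python sketch below (an editorial aid), the concrete signal-state labels are assumptions, since the patent does not enumerate them:

```python
GOOD_STATES = {"valid", "rtk_fixed"}  # assumed labels; real states are sensor-specific

def available_frames(frames):
    """Keep only frames whose reported signal state marks them as usable.
    frames: iterable of (acquisition_time, frame_value, signal_state);
    bad frames are dropped so they cannot corrupt the synchronization."""
    return [(t, v) for t, v, state in frames if state in GOOD_STATES]

raw = [(0.0, 0.00, "valid"), (0.1, 0.12, "degraded"), (0.2, 0.25, "valid")]
print(available_frames(raw))  # [(0.0, 0.0), (0.2, 0.25)]
```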
An embodiment of the present application further provides a multi-sensor time synchronization apparatus 300. Fig. 3 shows a schematic structural diagram of the apparatus in the embodiment of the present application; the apparatus 300 includes at least: an obtaining unit 310, a first determining unit 320, a second determining unit 330, and a time synchronization unit 340, wherein:
an obtaining unit 310, configured to obtain sensor data collected by at least two sensors, where the at least two sensors include a first sensor and a second sensor, and the sensor data includes a frame data value and a collection time of the frame data value;
a first determining unit 320, configured to determine, according to sensor data acquired by the first sensor, a first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time;
a second determining unit 330, configured to determine, according to a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimation value of the second sensor corresponding to the first acquisition time;
a time synchronization unit 340, configured to complete time synchronization between the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
In an embodiment of the present application, the second determining unit 330 is specifically configured to: determining sensor data corresponding to the first acquisition time in the sensor data of the second sensor according to the first acquisition time; and determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time and the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
In an embodiment of the present application, the second determining unit 330 is specifically configured to: determining two second acquisition times adjacent to the first acquisition time in the sensor data of the second sensor, the first acquisition time being between the two second acquisition times; determining frame data values of a second sensor respectively corresponding to the two second acquisition times; and taking the two second acquisition times and the corresponding frame data values of the second sensor as the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
In an embodiment of the present application, the second determining unit 330 is specifically configured to: respectively determining the time intervals of the two second acquisition times and the first acquisition time to obtain a first time interval and a second time interval; and determining a frame data estimation value of a second sensor corresponding to the first acquisition time according to the first time interval, the second time interval and frame data values of the second sensor corresponding to the two second acquisition times.
In an embodiment of the present application, the first determining unit 320 is specifically configured to: determining a signal state of the sensor data according to the sensor data of the first sensor; determining available sensor data from a signal state of the sensor data; and taking the acquisition time and the corresponding frame data value in the available sensor data as the first acquisition time of the first sensor and the frame data value of the first sensor corresponding to the first acquisition time.
In one embodiment of the present application, the acquisition frequency of the first sensor is greater than the acquisition frequency of the second sensor.
It can be understood that the multi-sensor time synchronization apparatus can implement the steps of the multi-sensor time synchronization method provided in the foregoing embodiments, and the related explanations regarding the multi-sensor time synchronization method are applicable to the multi-sensor time synchronization apparatus, and are not described herein again.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to Fig. 4, at the hardware level, the electronic device includes a processor and, optionally, an internal bus, a network interface, and a memory. The memory may include an internal memory, such as a Random-Access Memory (RAM), and may further include a non-volatile memory, such as at least one disk storage. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 4, but that does not indicate only one bus or one type of bus.
The memory is used for storing programs. In particular, a program may include program code comprising computer operating instructions. The memory may include both internal memory and non-volatile storage, and provides instructions and data to the processor.
The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the multi-sensor time synchronization device on the logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
acquiring sensor data acquired by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and acquisition time of the frame data value;
determining first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to sensor data acquired by the first sensor;
determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor;
and completing the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
The method performed by the multi-sensor time synchronization apparatus according to the embodiment shown in Fig. 1 of the present application may be applied to or implemented by a processor. The processor may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed by it. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and so on. The steps of the method disclosed in connection with the embodiments of the present application may be implemented directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The electronic device may further execute the method executed by the time synchronization apparatus of multiple sensors in fig. 1, and implement the functions of the time synchronization apparatus of multiple sensors in the embodiment shown in fig. 1, which are not described herein again in this embodiment of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which, when executed by an electronic device including a plurality of application programs, enable the electronic device to perform the method performed by the multi-sensor time synchronization apparatus in the embodiment shown in fig. 1, and are specifically configured to perform:
acquiring sensor data acquired by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and acquisition time of the frame data value;
determining first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to sensor data acquired by the first sensor;
determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor;
and completing the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A method of time synchronization of multiple sensors, wherein the method comprises:
acquiring sensor data acquired by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and acquisition time of the frame data value;
determining first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to sensor data acquired by the first sensor;
determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor;
and completing the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
2. The method of claim 1, wherein said determining from a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimate for the second sensor corresponding to the first acquisition time comprises:
determining sensor data corresponding to the first acquisition time in the sensor data of the second sensor according to the first acquisition time;
and determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time and the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
3. The method of claim 2, wherein said determining, from the first acquisition time, sensor data of the second sensor corresponding to the first acquisition time comprises:
determining two second acquisition times adjacent to the first acquisition time in the sensor data of the second sensor, the first acquisition time being between the two second acquisition times;
determining frame data values of a second sensor respectively corresponding to the two second acquisition times;
and taking the two second acquisition times and the corresponding frame data values of the second sensor as the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
4. The method of claim 3, wherein the determining, from a first acquisition time of the first sensor and sensor data of the second sensor, a frame data estimate for the second sensor corresponding to the first acquisition time comprises:
respectively determining the time intervals of the two second acquisition times and the first acquisition time to obtain a first time interval and a second time interval;
and determining a frame data estimation value of a second sensor corresponding to the first acquisition time according to the first time interval, the second time interval and frame data values of the second sensor corresponding to the two second acquisition times.
5. The method of claim 1, wherein the determining, according to the sensor data acquired by the first sensor, a first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time comprises:
determining a signal state of the sensor data according to the sensor data of the first sensor;
determining available sensor data from a signal state of the sensor data;
and taking the acquisition time and the corresponding frame data value in the available sensor data as the first acquisition time of the first sensor and the frame data value of the first sensor corresponding to the first acquisition time.
6. The method according to any one of claims 1 to 5, wherein the acquisition frequency of the first sensor is greater than the acquisition frequency of the second sensor.
7. A multi-sensor time synchronization apparatus, wherein the apparatus comprises:
an acquisition unit, used for acquiring sensor data collected by at least two sensors, wherein the at least two sensors comprise a first sensor and a second sensor, and the sensor data comprises a frame data value and the acquisition time of the frame data value;
the first determining unit is used for determining first acquisition time of the first sensor and a frame data value of the first sensor corresponding to the first acquisition time according to sensor data acquired by the first sensor;
the second determining unit is used for determining a frame data estimation value of a second sensor corresponding to the first acquisition time according to the first acquisition time of the first sensor and the sensor data of the second sensor;
and the time synchronization unit is used for completing the time synchronization of the second sensor and the first sensor according to the frame data estimation value of the second sensor and the frame data value of the first sensor.
8. The apparatus of claim 7, wherein the second determining unit is specifically configured to:
determining sensor data corresponding to the first acquisition time in the sensor data of the second sensor according to the first acquisition time;
and determining a frame data estimation value of the second sensor corresponding to the first acquisition time according to the first acquisition time and the sensor data corresponding to the first acquisition time in the sensor data of the second sensor.
9. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions which, when executed, cause the processor to perform the method of any of claims 1 to 6.
10. A computer readable storage medium storing one or more programs which, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of any of claims 1-6.
CN202111488235.6A 2021-12-08 2021-12-08 Multi-sensor time synchronization method and device, electronic equipment and storage medium Pending CN113890668A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111488235.6A CN113890668A (en) 2021-12-08 2021-12-08 Multi-sensor time synchronization method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111488235.6A CN113890668A (en) 2021-12-08 2021-12-08 Multi-sensor time synchronization method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113890668A 2022-01-04

Family

ID=79015884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111488235.6A Pending CN113890668A (en) 2021-12-08 2021-12-08 Multi-sensor time synchronization method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113890668A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115191871A (en) * 2022-06-07 2022-10-18 深圳市倍思科技有限公司 Data time synchronization method and device, cleaning robot and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011174737A (en) * 2010-02-23 2011-09-08 Nippon Telegr & Teleph Corp <Ntt> Interpolation device, interpolation method and program
CN113610136A (en) * 2021-07-30 2021-11-05 深圳元戎启行科技有限公司 Sensor data synchronization method and device, computer equipment and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011174737A (en) * 2010-02-23 2011-09-08 Nippon Telegr & Teleph Corp <Ntt> Interpolation device, interpolation method and program
CN113610136A (en) * 2021-07-30 2021-11-05 深圳元戎启行科技有限公司 Sensor data synchronization method and device, computer equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115191871A (en) * 2022-06-07 2022-10-18 深圳市倍思科技有限公司 Data time synchronization method and device, cleaning robot and storage medium
CN115191871B (en) * 2022-06-07 2024-05-28 深圳市倍思科技有限公司 Method and device for data time synchronization, cleaning robot and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20220104