CN113327344A - Fusion positioning method, device, equipment, storage medium and program product - Google Patents

Fusion positioning method, device, equipment, storage medium and program product

Info

Publication number
CN113327344A
Authority
CN
China
Prior art keywords
fusion
data
time
sensors
target data
Prior art date
Legal status
Granted
Application number
CN202110585158.XA
Other languages
Chinese (zh)
Other versions
CN113327344B (en)
Inventor
邵晓东
曾清喻
张鹏
罗成
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110585158.XA priority Critical patent/CN113327344B/en
Publication of CN113327344A publication Critical patent/CN113327344A/en
Priority to PCT/CN2022/095349 priority patent/WO2022247915A1/en
Application granted granted Critical
Publication of CN113327344B publication Critical patent/CN113327344B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 - Registering or indicating the working of vehicles
    • G07C5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0808 - Diagnosing performance data
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The present disclosure provides a fusion positioning method, apparatus, electronic device, non-transitory computer-readable storage medium, and computer program product, which relate to the field of artificial intelligence and, in particular, to the field of intelligent driving. The method comprises the following steps: receiving data from a plurality of sensors; in response to obtaining a fusion positioning signal, determining a fusion estimation time according to the current time and a preset delay time; determining a plurality of target data from the data of the plurality of sensors according to the fusion estimation time; and performing data fusion on the plurality of target data to obtain fusion positioning information. The present disclosure achieves time alignment for multi-sensor fusion positioning.

Description

Fusion positioning method, device, equipment, storage medium and program product
Technical Field
The disclosed embodiments relate to the field of artificial intelligence technologies, and in particular, to a fusion positioning method, a fusion positioning device, an electronic device, a non-transitory computer-readable storage medium, and a computer program product, which can be used in the field of intelligent driving.
Background
A positioning system in an intelligent driving vehicle is used to determine the position, attitude, speed, and other states of the vehicle and is an integral part of the intelligent driving system. Owing to the performance limitations of individual sensors, positioning with a single sensor cannot meet the requirements of an intelligent driving vehicle; therefore, a multi-sensor fusion positioning method is usually adopted, fusing the data of a plurality of sensors to finally obtain the positioning information.
To ensure real-time, accurate positioning information with low consumption of system resources, the back end of multi-sensor fusion positioning mostly adopts an extended Kalman filtering estimation method. This method requires that the multi-sensor data be under the same time system; it does not record historical data, and it requires the timestamps of the sensor data at the fusion moment to be consistent. Hardware pulse-per-second signals or software system time can guarantee that the multi-sensor data are acquired under the same time system. However, after some sensors acquire data and stamp it with a timestamp, the data can participate in data fusion only after time-consuming preprocessing; that is, there is often a delay between the moment such sensor data reaches the data fusion stage and its acquisition moment, and this delay differs from sensor to sensor. Time alignment is therefore needed before data fusion is performed.
Disclosure of Invention
The present disclosure provides a fusion localization method, apparatus, electronic device, non-transitory computer-readable storage medium, and computer program product that enable time alignment for multi-sensor fusion localization.
According to an aspect of the present disclosure, there is provided a fusion localization method, including:
receiving data from a plurality of sensors;
in response to acquiring a fusion positioning signal, determining a fusion estimation time according to the current time and a preset delay time;
determining a plurality of target data from the data of the plurality of sensors according to the fusion estimation time;
and performing data fusion on the plurality of target data to obtain fusion positioning information.
According to another aspect of the present disclosure, there is provided a fusion positioning apparatus including:
the receiving module is used for receiving data of a plurality of sensors;
a delay module, configured to determine a fusion estimation time according to the current time and a preset delay time in response to acquiring the fusion positioning signal;
a determining module, configured to determine a plurality of target data from the data of the plurality of sensors according to the fusion estimation time;
and the fusion module is used for carrying out data fusion on the plurality of target data to obtain fusion positioning information.
According to still another aspect of the present disclosure, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to yet another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the first aspect described above.
According to yet another aspect of the present disclosure, there is provided a computer program product comprising a computer program stored in a readable storage medium, from which at least one processor of an electronic device can read the computer program; the at least one processor executes the computer program, causing the electronic device to perform the method of the first aspect.
According to the technical scheme of the disclosure, the time alignment of multi-sensor fusion positioning is realized.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is a schematic flow chart diagram of a fusion positioning method provided according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a fused position trigger provided in accordance with an embodiment of the present disclosure;
FIG. 3 is a schematic timing diagram of data acquisition by multiple sensors provided in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the arrival times of multi-sensor data provided in accordance with an embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a fusion positioning device provided in accordance with an embodiment of the present disclosure;
fig. 6 is a schematic block diagram of an electronic device for implementing the fusion positioning method of the embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
A positioning system in an intelligent driving vehicle is used to determine the position, attitude, speed, and other states of the vehicle and is an integral part of the intelligent driving system. Owing to the performance limitations of individual sensors, positioning with a single sensor cannot meet the requirements of an intelligent driving vehicle; therefore, a multi-sensor fusion positioning method is usually adopted, fusing the data of a plurality of sensors to finally obtain the positioning information.
For example, the positioning system generally performs fusion positioning using data from sensors such as a Global Navigation Satellite System (GNSS), an Inertial Measurement Unit (IMU), an image sensor, and a laser radar (lidar).
To ensure real-time, accurate positioning information with low consumption of system resources, the back end of multi-sensor fusion positioning mostly adopts a fusion estimation method based on extended Kalman filtering. This method requires that the multi-sensor data be under the same time system; it does not record historical data, and it requires the timestamps of the sensor data at the fusion time to be consistent. In general, methods such as hardware pulse-per-second signals or software system time can ensure that the data collected by multiple sensors are in the same time system. However, after some sensors collect data and stamp it with a timestamp, the data can participate in data fusion only after time-consuming preprocessing. For example, after the image sensor collects image data, image feature extraction must be performed before the image data can participate in data fusion. The time when such sensor data reaches the data fusion stage therefore often lags the collection time, and the delay differs from sensor to sensor. The data of the multiple sensors thus need time alignment in the fusion stage, ensuring that the timestamps are consistent, before data fusion can be carried out.
At present, time alignment in the fusion stage is generally performed by taking the data of the sensor with the largest delay among the plurality of sensors as a key frame: every time a key frame is received, all sensor data between the latest key frame and the previous key frame are packed, and data fusion is performed sequentially in time order to obtain fusion positioning information. However, this method depends on the key frame: if a key frame is lost, the other sensor data at the corresponding moments cannot be fused, so the robustness is poor and the positioning accuracy is easily degraded.
To solve the above problem, an embodiment of the present disclosure provides a fusion positioning method in which data fusion does not depend on the key frame of the most delayed sensor; instead, data fusion is triggered by a fusion positioning signal. Each time a fusion positioning signal is obtained, a fusion estimation time corresponding to this round of data fusion is determined from the current time and a preset delay time, which ensures that delayed sensor data can still participate in fusion and achieves time alignment for multi-sensor fusion positioning. In addition, the method does not need to wait for a key frame to trigger data fusion and does not depend on any specific sensor, which improves the flexibility and robustness of fusion positioning and preserves the positioning accuracy.
Hereinafter, the fusion localization method provided by the present disclosure is described in detail through specific embodiments. The embodiments below may be combined with one another, and the same or similar concepts or processes may not be repeated in some of them.
Fig. 1 is a schematic flow chart of a fusion positioning method according to an embodiment of the present disclosure. As shown in fig. 1, the method includes:
and S101, receiving data of a plurality of sensors.
The embodiment of the present disclosure does not limit the types of the plurality of sensors; for example, they may be a GNSS, an IMU, an image sensor, a lidar, and the like, determined according to the actual configuration of the vehicle. A fusion estimator in the positioning system receives the sensor data sent by the plurality of sensors. The data collected by the plurality of sensors are under the same time system, but each sensor may collect data at a different frequency; for example, the frequency at which the IMU acquires data is typically higher than that of the image sensor. The fusion estimator may be implemented in software and/or hardware and performs fusion estimation on the multi-sensor data. Because some sensor data must undergo necessary preprocessing before being transmitted to the fusion estimator for data fusion, data collected by different sensors at the same moment may reach the fusion estimator at different times.
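The buffering described above can be sketched in a few lines of Python (the class and method names below are illustrative, not taken from the disclosure):

```python
from collections import defaultdict


class FusionEstimator:
    """Sketch of a fusion estimator front end that buffers sensor data.

    All sensors are assumed to stamp their data under one shared time
    system; arrival at the estimator may lag acquisition.
    """

    def __init__(self):
        # One container per sensor, each holding (timestamp, measurement)
        # pairs in arrival order.
        self.buffers = defaultdict(list)

    def receive(self, sensor_name, timestamp, measurement):
        self.buffers[sensor_name].append((timestamp, measurement))
```

A real estimator would additionally bound these buffers, e.g. by dropping data older than the previous fusion estimation time.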
And S102, responding to the obtained fusion positioning signal, and determining fusion estimation time according to the current time and preset delay time.
And S103, determining a plurality of target data from the data of the plurality of sensors according to the fusion estimation time.
The fusion positioning signal is used to trigger the fusion estimator to perform data fusion and may be issued at a preset frequency or at preset times. After acquiring the fusion positioning signal, the fusion estimator starts data fusion. To determine which data need to be fused this time, the fusion estimator first determines a fusion estimation time according to the current time and a preset delay time. The preset delay time may be set in advance according to the delay times of the plurality of sensors, so as to ensure that delayed sensor data can participate in data fusion. After the fusion estimation time is determined, the fusion estimator determines, from the data of the plurality of sensors, the plurality of target data to be fused this time.
And S104, performing data fusion on the plurality of target data to obtain fusion positioning information.
The specific method of data fusion in this step is not limited, and any fusion method disclosed in the related art may be used for data fusion.
In the embodiment of the disclosure, a fusion positioning signal is used to trigger data fusion, and when the fusion positioning signal is obtained each time, a fusion estimation time corresponding to the data fusion is determined, where the fusion estimation time is determined according to the current time and a preset delay time, so that delayed sensor data can participate in fusion, and time alignment of multi-sensor fusion positioning is realized.
On the basis of the above embodiments, the fusion positioning signal, the fusion estimation time, the target data, and the like are further explained.
In addition, in the embodiment of the present disclosure, a system clock may be set. As shown in fig. 2, the system clock triggers the fusion positioning signal at a preset frequency, and using the system clock ensures the stability of triggering the fusion positioning signal.
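As a hedged sketch of this triggering scheme (the function name and the millisecond units are our own choices), the instants at which such a clock fires can be enumerated as:

```python
def clock_triggers(start_ms, period_ms, count):
    """List the instants (in milliseconds) at which the system clock
    fires the fusion positioning signal at a preset frequency.

    Illustrative only: a real system clock fires asynchronously, and
    each firing starts one round of data fusion.
    """
    return [start_ms + i * period_ms for i in range(count)]
```

For instance, a 10 Hz clock started at 0 fires at 0 ms, 100 ms, 200 ms, and so on.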
After obtaining the fusion positioning signal, the fusion estimator subtracts a preset delay time td from the current time tc to obtain the fusion estimation time te, acquires from the data of the plurality of sensors the data between the previous fusion estimation time and the current fusion estimation time, and determines these data as the plurality of target data. Because the fusion estimation time is obtained by subtracting the preset delay time from the current time and data fusion is performed on target data before the fusion estimation time, this avoids the problem that sensor data with larger delay could not participate in data fusion if data fusion were performed at the current time.
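The computation just described, te = tc - td followed by selecting the data timestamped in the window between the previous te and the current te, might look as follows (the names mirror the text above; the buffer layout is an assumption):

```python
def select_target_data(buffers, current_time, preset_delay, last_te):
    """Compute the fusion estimation time te = tc - td and pick, from
    every sensor buffer, the data timestamped in (last_te, te].

    Illustrative sketch: `buffers` maps a sensor name to a list of
    (timestamp, measurement) pairs, as in the buffering sketch earlier.
    """
    te = current_time - preset_delay
    targets = []
    for sensor, samples in buffers.items():
        for ts, value in samples:
            if last_te < ts <= te:
                targets.append((ts, sensor, value))
    return te, targets
```

Data timestamped exactly at the previous fusion estimation time is excluded, since it was already fused in the previous round.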
Taking the image sensor and the IMU as an example, as shown in fig. 3, the acquisition frequency of the IMU is twice that of the image sensor. Fig. 4 compares the arrival of the image sensor data and the IMU data at the fusion estimator: IMU data typically has no delay or a negligible delay, while the image sensor has a larger delay. For example, the data acquired by the IMU at time t2 has already reached the fusion estimator, while the data acquired by the image sensor at time t1 has not yet arrived.
To ensure that the data of the sensor with larger delay can participate in data fusion, in the embodiment of the present application the preset delay time td may be set to be greater than the maximum delay time of the most delayed sensor among the plurality of sensors. As shown in fig. 4, the image sensor is the most delayed sensor, and its maximum delay time is tm. Assume the system triggers the fusion positioning signal at time t7, that is, the current time tc is t7; subtracting td from t7 gives the fusion estimation time te. Since td is greater than tm, te falls between t5 and t6, as shown in fig. 4. Assuming the fusion estimation time of the previous fusion positioning is time tn shown in fig. 4, the target data determined by the fusion estimator this time are the data of the image sensor at time t5 and the data of the IMU at times t4 and t5.
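The scenario of the figure can be replayed numerically (the figures are not reproduced here, so the concrete timestamps below are invented for illustration): let the image sensor sample at t1, t3, t5 and the IMU, at twice that rate, at t1 through t5, with the previous fusion estimation time tn between t3 and t4 and te between t5 and t6.

```python
# Illustrative replay of the figure's scenario; timestamps are invented.
image_data = [(1, "img@t1"), (3, "img@t3"), (5, "img@t5")]  # half the IMU rate
imu_data = [(1, "imu@t1"), (2, "imu@t2"), (3, "imu@t3"),
            (4, "imu@t4"), (5, "imu@t5")]

tn = 3.5  # previous fusion estimation time (between t3 and t4)
te = 5.5  # current fusion estimation time (between t5 and t6)

# Target data: everything timestamped in (tn, te]
targets = sorted(
    (ts, v) for ts, v in image_data + imu_data if tn < ts <= te
)
```

The selected target data are then the image data at t5 plus the IMU data at t4 and t5, matching the description above.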
In the embodiment of the present application, the preset delay time td is set to be greater than the maximum delay time of the most delayed sensor among the plurality of sensors, so that the fusion estimation time never runs ahead of the timestamp of the most delayed sensor's data that has already reached the fusion estimator. This avoids the following situation: at the fusion estimation time, the data of the most delayed sensor for some earlier moment has not yet reached the fusion estimator; the data of the other sensors for that moment are fused in the current fusion positioning; and when the delayed data finally arrives, it can no longer participate in data fusion, which would degrade the positioning accuracy.
It should be noted that, when receiving the data of the plurality of sensors, the fusion estimator may store them in a plurality of corresponding containers, which facilitates management and access of the multi-sensor data. Correspondingly, after the fusion estimation time is determined, the plurality of target data are taken out of the plurality of containers according to the fusion estimation time, sorted in time order, and fused sequentially in that order to obtain the fusion positioning information. Because the sensors may collect data at different frequencies, the data of some sensors may include several acquisition moments between the current fusion estimation time and the previous one; data fusion therefore needs to be performed sequentially in time order to guarantee the accuracy of the fusion result. After the fusion positioning information is obtained, pose recursion can be performed by combining it with the IMU data after the fusion estimation time, ensuring the real-time performance and accuracy of the vehicle positioning result.
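The sequencing requirement above can be sketched as a fold over the time-sorted target data (here `fuse_step` stands in for one extended Kalman filter update; the disclosure leaves the concrete fusion method open, so this shows only the ordering):

```python
def fuse_in_order(target_data, fuse_step):
    """Sort target data by timestamp and apply a fusion step to each.

    `target_data` is a list of (timestamp, sensor, value) triples;
    `fuse_step(state, ts, sensor, value)` returns the updated state.
    """
    state = None
    for ts, sensor, value in sorted(target_data, key=lambda d: d[0]):
        state = fuse_step(state, ts, sensor, value)
    return state
```

Sorting by timestamp alone is a stable sort in Python, so samples sharing a timestamp keep their original relative order.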
Fig. 5 is a schematic structural diagram of a fusion positioning device provided according to an embodiment of the present disclosure. As shown in fig. 5, the fusion positioning device 500 includes:
a receiving module 501, configured to receive data of a plurality of sensors;
a delay module 502, configured to determine a fusion estimation time according to a current time and a preset delay time in response to acquiring the fusion positioning signal;
a determining module 503, configured to determine a plurality of target data from the data of the plurality of sensors according to the fusion estimation time;
and the fusion module 504 is configured to perform data fusion on the multiple target data to obtain fusion positioning information.
In one embodiment, the determining module 503 includes:
a first determination unit configured to acquire a plurality of data between the fusion estimation time and the last fusion estimation time from the data of the plurality of sensors, and determine the plurality of data as a plurality of target data.
In one embodiment, the delay module 502 includes:
and the delay unit is used for subtracting the preset delay time from the current time to obtain the fusion estimation time.
In one embodiment, the fusion locator 500 further comprises:
and the setting module is used for setting a system clock, and the system clock triggers the fusion positioning signal at a preset frequency.
In one embodiment, the preset delay time is greater than the maximum delay time of the most delayed sensor among the plurality of sensors.
In one embodiment, the fusion module 504 includes:
and the fusion unit is used for sorting the plurality of target data in time order and performing data fusion sequentially in that order.
In one embodiment, the fusion locator 500 further comprises:
the storage unit is used for storing the data of the sensors into a plurality of containers respectively;
the determination module 503 includes:
a second determining unit configured to determine a plurality of target data from the plurality of containers based on the fusion estimation time.
The apparatus of the embodiment of the present disclosure may be configured to execute the fusion positioning method in the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
The present disclosure also provides an electronic device and a non-transitory computer-readable storage medium storing computer instructions, according to embodiments of the present disclosure.
According to an embodiment of the present disclosure, the present disclosure also provides a computer program product comprising: a computer program, stored in a readable storage medium, from which at least one processor of the electronic device can read the computer program, the at least one processor executing the computer program causing the electronic device to perform the solution provided by any of the embodiments described above.
Fig. 6 is a schematic block diagram of an electronic device for implementing the fusion positioning method of the embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the electronic device 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a Read-Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 can also be stored. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The calculation unit 601 performs the respective methods and processes described above, such as the fusion localization method. For example, in some embodiments, the fusion localization method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the fusion localization method described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the fusion localization method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), system on a chip (SOCs), load programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in the cloud computing service system and remedies the drawbacks of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server that incorporates a blockchain.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in a different order; no limitation is imposed herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (17)

1. A fusion positioning method, comprising:
receiving data from a plurality of sensors;
in response to acquiring a fusion positioning signal, determining a fusion estimation time according to a current time and a preset delay time;
determining a plurality of target data from the data of the plurality of sensors according to the fusion estimation time; and
performing data fusion on the plurality of target data to obtain fusion positioning information.
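Read as an algorithm, the four steps of claim 1 can be sketched in a few lines of Python. Everything below — the class and method names, and the plain-averaging fusion rule — is an illustrative assumption, not the claimed implementation; a real system would typically run a Kalman-style filter update per measurement.

```python
import time


class FusionLocalizer:
    """Minimal sketch of the claimed fusion positioning method.

    Names and the averaging fusion rule are illustrative assumptions,
    not the patented design.
    """

    def __init__(self, preset_delay_s):
        # Per claim 5, the preset delay should exceed the latency of the
        # most-delayed sensor, so all relevant data has arrived by fusion time.
        self.preset_delay_s = preset_delay_s
        self.buffer = []            # (timestamp, measurement) pairs, all sensors
        self.last_fusion_time = 0.0

    def receive(self, timestamp, measurement):
        # Step 1: receive data from a plurality of sensors.
        self.buffer.append((timestamp, measurement))

    def on_fusion_signal(self, now=None):
        now = time.time() if now is None else now
        # Step 2 (claim 3): fusion estimation time = current time - preset delay.
        fusion_time = now - self.preset_delay_s
        # Step 3 (claim 2): target data lies between the previous and the
        # current fusion estimation time. Older data is simply skipped here;
        # a real system would also prune the buffer.
        target = [(t, m) for t, m in self.buffer
                  if self.last_fusion_time < t <= fusion_time]
        # Step 4 (claim 6): sort chronologically, then fuse sequentially.
        target.sort(key=lambda tm: tm[0])
        self.last_fusion_time = fusion_time
        return self._fuse(target)

    def _fuse(self, target):
        # Placeholder fusion rule: plain averaging of the measurements.
        if not target:
            return None
        return sum(m for _, m in target) / len(target)
```

For example, with a 100 ms preset delay, a fusion signal at t = 0.2 s fuses all measurements stamped in (0, 0.1].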
2. The fusion positioning method of claim 1, wherein the determining a plurality of target data from the data of the plurality of sensors according to the fusion estimation time comprises:
acquiring, from the data of the plurality of sensors, a plurality of data between the fusion estimation time and a previous fusion estimation time, and determining the plurality of data as the plurality of target data.
3. The fusion positioning method of claim 1 or 2, wherein the determining the fusion estimation time according to the current time and the preset delay time comprises:
subtracting the preset delay time from the current time to obtain the fusion estimation time.
4. The fusion positioning method of any one of claims 1-3, further comprising:
setting a system clock, wherein the system clock triggers the fusion positioning signal at a preset frequency.
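A minimal sketch of the system clock of claim 4, using a background timer thread to emit the fusion positioning signal at a preset frequency; the function name and the threading-based design are assumptions for illustration, not the claimed implementation:

```python
import threading


def start_fusion_clock(period_s, on_fusion_signal):
    """Trigger the fusion positioning signal at a preset frequency
    (1 / period_s). Returns an Event; call .set() to halt the clock."""
    stop = threading.Event()

    def _tick():
        # Event.wait(timeout) returns False on timeout, True once set,
        # so the loop runs every period_s until stop.set() is called.
        while not stop.wait(period_s):
            on_fusion_signal()  # emit the fusion positioning signal

    threading.Thread(target=_tick, daemon=True).start()
    return stop
```

A 10 Hz fusion rate, for instance, would use `period_s=0.1`, with `on_fusion_signal` bound to the delay module's handler.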
5. The fusion positioning method of any one of claims 1-4, wherein the preset delay time is greater than the delay time of the most-delayed sensor among the plurality of sensors.
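The constraint in claim 5 amounts to a simple rule for choosing the preset delay time; the safety margin added below is an illustrative assumption, not part of the claim:

```python
def choose_preset_delay(sensor_delays_s, margin_s=0.01):
    """Pick a preset delay strictly greater than the latency of the
    most-delayed sensor (claim 5), plus an assumed safety margin."""
    return max(sensor_delays_s) + margin_s
```

With sensor latencies of 20 ms, 50 ms, and 30 ms, this yields a 60 ms preset delay, guaranteeing every sensor's data has arrived before its fusion estimation time is processed.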
6. The fusion positioning method of any one of claims 1-5, wherein the performing data fusion on the plurality of target data comprises:
sorting the plurality of target data in chronological order, and performing data fusion sequentially in the chronological order.
7. The fusion positioning method of any one of claims 1-6, further comprising:
storing the data of the plurality of sensors into a plurality of containers, respectively;
wherein the determining a plurality of target data from the data of the plurality of sensors according to the fusion estimation time comprises:
determining the plurality of target data from the plurality of containers according to the fusion estimation time.
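Claims 6 and 7 together suggest per-sensor containers from which target data is drawn and then time-ordered before fusion. A sketch, assuming deque containers and a half-open selection window (previous fusion estimation time, fusion estimation time]; the class and method names are illustrative:

```python
from collections import deque


class SensorContainers:
    """One container per sensor (claim 7); selection plus chronological
    sorting (claim 6). Container type and names are assumptions."""

    def __init__(self, sensor_names):
        self.containers = {name: deque() for name in sensor_names}

    def store(self, sensor, timestamp, measurement):
        # Each sensor's data is stored into its own container.
        self.containers[sensor].append((timestamp, measurement))

    def take_targets(self, last_fusion_time, fusion_time):
        # Drain each container up to the fusion estimation time, keeping
        # only data newer than the previous fusion estimation time, then
        # sort so the fusion step consumes measurements in time order.
        targets = []
        for sensor, buf in self.containers.items():
            while buf and buf[0][0] <= fusion_time:
                t, m = buf.popleft()
                if t > last_fusion_time:
                    targets.append((t, sensor, m))
        targets.sort(key=lambda x: x[0])
        return targets
```

Because each container is drained only up to the fusion estimation time, a late-arriving measurement from a slow sensor stays queued for the next fusion cycle rather than being lost.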
8. A fusion positioning apparatus, comprising:
a receiving module, configured to receive data from a plurality of sensors;
a delay module, configured to determine, in response to acquiring a fusion positioning signal, a fusion estimation time according to a current time and a preset delay time;
a determining module, configured to determine a plurality of target data from the data of the plurality of sensors according to the fusion estimation time; and
a fusion module, configured to perform data fusion on the plurality of target data to obtain fusion positioning information.
9. The fusion positioning apparatus of claim 8, wherein the determining module comprises:
a first determining unit, configured to acquire, from the data of the plurality of sensors, a plurality of data between the fusion estimation time and a previous fusion estimation time, and to determine the plurality of data as the plurality of target data.
10. The fusion positioning apparatus of claim 8 or 9, wherein the delay module comprises:
a delay unit, configured to subtract the preset delay time from the current time to obtain the fusion estimation time.
11. The fusion positioning apparatus of any one of claims 8-10, further comprising:
a setting module, configured to set a system clock, wherein the system clock triggers the fusion positioning signal at a preset frequency.
12. The fusion positioning apparatus of any one of claims 8-11, wherein the preset delay time is greater than the delay time of the most-delayed sensor among the plurality of sensors.
13. The fusion positioning apparatus of any one of claims 8-12, wherein the fusion module comprises:
a fusion unit, configured to sort the plurality of target data in chronological order and to perform data fusion sequentially in the chronological order.
14. The fusion positioning apparatus of any one of claims 8-13, further comprising:
a storage unit, configured to store the data of the plurality of sensors into a plurality of containers, respectively;
wherein the determining module comprises:
a second determining unit, configured to determine the plurality of target data from the plurality of containers according to the fusion estimation time.
15. An electronic device, comprising:
at least one processor; and a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-7.
17. A computer program product comprising a computer program which, when executed by a processor, implements the method of any one of claims 1-7.
CN202110585158.XA 2021-05-27 2021-05-27 Fusion positioning method, device, equipment, storage medium and program product Active CN113327344B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110585158.XA CN113327344B (en) 2021-05-27 2021-05-27 Fusion positioning method, device, equipment, storage medium and program product
PCT/CN2022/095349 WO2022247915A1 (en) 2021-05-27 2022-05-26 Fusion positioning method and apparatus, device, storage medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110585158.XA CN113327344B (en) 2021-05-27 2021-05-27 Fusion positioning method, device, equipment, storage medium and program product

Publications (2)

Publication Number Publication Date
CN113327344A true CN113327344A (en) 2021-08-31
CN113327344B CN113327344B (en) 2023-03-21

Family

ID=77421747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110585158.XA Active CN113327344B (en) 2021-05-27 2021-05-27 Fusion positioning method, device, equipment, storage medium and program product

Country Status (2)

Country Link
CN (1) CN113327344B (en)
WO (1) WO2022247915A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116707746B (en) * 2023-08-07 2024-03-22 长春航盛艾思科电子有限公司 Measuring method of array sensor
CN117499887B (en) * 2024-01-02 2024-03-19 江西机电职业技术学院 Data acquisition method and system based on multi-sensor fusion technology

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298635A (en) * 2011-09-13 2011-12-28 苏州大学 Method and system for fusing event information
CN102390351A (en) * 2011-10-13 2012-03-28 大连理工大学 Multi-sensor fused comprehensive car alarm system
CN102540902A (en) * 2011-12-27 2012-07-04 西安电子科技大学 Single-platform multi-sensor information integration processor and experimental system
CN105049921A (en) * 2015-06-26 2015-11-11 中兴通讯股份有限公司 Data processing method and device
CN105115097A (en) * 2015-07-06 2015-12-02 沈阳工业大学 Variable blast volume air-conditioning end intelligence control system and method based on wireless sensor network
CN106840085A (en) * 2016-12-20 2017-06-13 长安大学 A kind of unmanned plane based on fusion of multi-layer information surveys method high
CN108010144A (en) * 2017-10-24 2018-05-08 武汉米风通信技术有限公司 A kind of system and method for collecting congestion expense based on Internet of Things
CN108445885A (en) * 2018-04-20 2018-08-24 鹤山东风新能源科技有限公司 A kind of automated driving system and its control method based on pure electric vehicle logistic car
CN109035872A (en) * 2018-08-08 2018-12-18 湖北河海科技发展有限公司 Weather information and Track Fusion display system and method
CN109871385A (en) * 2019-02-28 2019-06-11 北京百度网讯科技有限公司 Method and apparatus for handling data
CN111756593A (en) * 2019-03-28 2020-10-09 北京米文动力科技有限公司 Self-testing method and testing method for synchronization precision of time synchronization system
CN112747754A (en) * 2019-10-30 2021-05-04 北京初速度科技有限公司 Fusion method, device and system of multi-sensor data
CN112817301A (en) * 2019-10-30 2021-05-18 北京初速度科技有限公司 Fusion method, device and system of multi-sensor data

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109543703B (en) * 2017-09-22 2023-06-23 华为技术有限公司 Sensor data processing method and device
WO2019198076A1 (en) * 2018-04-11 2019-10-17 Ionterra Transportation And Aviation Technologies Ltd. Real-time raw data- and sensor fusion
US10816358B2 (en) * 2018-07-10 2020-10-27 Rohde & Schwarz Gmbh & Co. Kg Method and test system for sensor fusion positioning testing
CN111721299B (en) * 2020-06-30 2022-07-19 上海汽车集团股份有限公司 Real-time positioning time synchronization method and device
CN112689234B (en) * 2020-12-28 2023-10-17 北京爱笔科技有限公司 Indoor vehicle positioning method, device, computer equipment and storage medium
CN113327344B (en) * 2021-05-27 2023-03-21 北京百度网讯科技有限公司 Fusion positioning method, device, equipment, storage medium and program product

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022247915A1 (en) * 2021-05-27 2022-12-01 北京百度网讯科技有限公司 Fusion positioning method and apparatus, device, storage medium and program product
CN114964270A (en) * 2022-05-17 2022-08-30 驭势科技(北京)有限公司 Fusion positioning method and device, vehicle and storage medium
CN114964270B (en) * 2022-05-17 2024-04-26 驭势科技(北京)有限公司 Fusion positioning method, device, vehicle and storage medium

Also Published As

Publication number Publication date
WO2022247915A1 (en) 2022-12-01
CN113327344B (en) 2023-03-21

Similar Documents

Publication Publication Date Title
CN113327344B (en) Fusion positioning method, device, equipment, storage medium and program product
CN110617825B (en) Vehicle positioning method and device, electronic equipment and medium
CN111664844B (en) Navigation method, navigation device and electronic equipment
CN113029129B (en) Method and device for determining positioning information of vehicle and storage medium
CN112802325A (en) Vehicle queuing length detection method and device
CN115685249A (en) Obstacle detection method and device, electronic equipment and storage medium
CN113392794A (en) Vehicle over-line identification method and device, electronic equipment and storage medium
CN115139792A (en) Vehicle display control system, method, device, equipment and medium
CN114047760B (en) Path planning method and device, electronic equipment and automatic driving vehicle
CN113219505A (en) Method, device and equipment for acquiring GPS coordinates for vehicle-road cooperative tunnel scene
CN113177980A (en) Target object speed determination method and device for automatic driving and electronic equipment
CN115628754A (en) Odometer initialization method and device, electronic equipment and automatic driving vehicle
CN113984072B (en) Vehicle positioning method, device, equipment, storage medium and automatic driving vehicle
CN115168527A (en) Real-time track data processing method, device and system and electronic equipment
CN114861725A (en) Post-processing method, device, equipment and medium for perception and tracking of target
CN113869439A (en) Data fusion method and device and electronic equipment
CN113203423A (en) Map navigation simulation method and device
CN113970773B (en) Positioning method and device and electronic equipment
CN112824936A (en) Method and device for determining height of ground object, electronic equipment and medium
CN115096304B (en) Delay error correction method, device, electronic equipment and storage medium
CN117031407A (en) Time synchronization method and device for radar
CN116382298A (en) Task processing system, method, electronic device and storage medium
CN114359513A (en) Method and device for determining position of obstacle and electronic equipment
CN116484631A (en) Road simulation scene generation method, device, equipment and storage medium
CN117911469A (en) Registration method, device and equipment of point cloud data and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant