CN114091562A - Multi-sensing data fusion method, device, system, equipment and storage medium


Info

Publication number
CN114091562A
Authority
CN
China
Prior art keywords
sensor
data
fused
sensing
sensors
Prior art date
Legal status
Pending
Application number
CN202010777951.5A
Other languages
Chinese (zh)
Inventor
刘建超
王邓江
束然
邓永强
Current Assignee
Beijing Wanji Technology Co Ltd
Original Assignee
Beijing Wanji Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Wanji Technology Co Ltd filed Critical Beijing Wanji Technology Co Ltd
Priority to CN202010777951.5A
Publication of CN114091562A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data


Abstract

The application relates to a multi-sensing data fusion method, apparatus, system, device and storage medium. Applied to a multi-sensing system comprising a plurality of sensors, the method includes: acquiring current environment data of the multi-sensing system and obtaining a weight correction value of each sensor according to the environment data; acquiring the initial weight corresponding to each sensor and correspondingly correcting the initial weight of each sensor with its weight correction value to obtain the target weight of each sensor; acquiring the data to be fused of each sensor, where the data to be fused is obtained by performing space-time synchronization processing on the sensing data of the corresponding sensor, and the sensing data of the sensors is sensed at the same time and in the same scene; and performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result. With this method, the characteristics of different sensors can be fully combined and the reliability of sensor data perception improved.

Description

Multi-sensing data fusion method, device, system, equipment and storage medium
Technical Field
The present application relates to the field of sensor technologies, and in particular, to a method, an apparatus, a system, a device, and a storage medium for fusing multi-sensor data.
Background
A sensor is a detection device that converts sensed information, according to a certain rule, into an electrical signal or another required form of information and outputs it. With the advent of the information age, sensors have become the primary means of obtaining information from the natural world and from production processes.
For example, in the field of intelligent transportation, more and more roads are equipped with sensors for data perception, and the perceived data is analyzed to monitor road conditions, traffic safety and the like. At present, lidar sensors, millimeter-wave radar sensors and cameras are the main sensors in the intelligent transportation field, and they are often installed on a cross bar, a vertical bar or a gantry at a road intersection to perceive data on the road. Different sensors have different characteristics: for example, in rain and snow, data sensed by a lidar sensor contains more noise interference, while at night or under insufficient illumination the imaging of a camera is seriously affected, so the data it senses deviates greatly from reality.
Therefore, how to fully combine the characteristics of different sensors to improve the reliability of sensor-perceived data is an urgent problem to be solved.
Disclosure of Invention
In view of the above, it is necessary to provide a multi-sensor data fusion method, apparatus, system, device and storage medium that can sufficiently combine the characteristics of different sensors and improve the reliability of sensor sensing data.
In a first aspect, an embodiment of the present application provides a multi-sensing data fusion method, which is applied to a multi-sensing system including multiple sensors, and the method includes:
acquiring current environment data of the multi-sensing system, and acquiring a weight correction value of each sensor according to the environment data;
acquiring initial weights corresponding to the sensors, and correspondingly correcting the initial weights of the sensors by using the weight correction values of the sensors to obtain target weights of the sensors;
acquiring data to be fused of each sensor, wherein the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene;
and performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensor data fusion result.
In one embodiment, the performing fusion processing on each to-be-fused data according to the target weight of each sensor to obtain a multi-sensor data fusion result includes:
and performing weighted summation calculation on the data to be fused by using the target weight of each sensor to obtain the multi-sensing data fusion result.
In one embodiment, the obtaining an initial weight corresponding to each sensor includes:
acquiring historical sensing data of each sensor, wherein the historical sensing data of each sensor is sensed by each sensor at the same historical moment and under the same scene;
performing space-time synchronization processing on each historical sensing data to obtain synchronized data corresponding to each historical sensing data;
calculating a variance of each of the synchronized data, and calculating an initial weight of the corresponding sensor according to the variance of each of the synchronized data.
In one embodiment, the calculating the variance of each synchronized data includes:
for each sensor, grouping the synchronized data of the sensors to obtain a plurality of groups of data;
calculating the variance of each group of data in the plurality of groups of data of the sensor, and calculating the variance of the synchronized data of the sensor according to the variance of each group of data.
In one embodiment, the acquiring data to be fused for each sensor includes:
acquiring a plurality of historical data to be fused of each sensor in a preset historical time period, wherein the historical data to be fused is obtained by performing space-time synchronization processing on perception data of the corresponding sensor in the preset historical time period;
calculating an average value of a plurality of historical data to be fused corresponding to each sensor, and determining the average value corresponding to each sensor as the data to be fused of each sensor.
In one embodiment, the acquiring data to be fused for each sensor includes:
acquiring perception data perceived by each sensor at the same time and under the same scene;
and converting the sensing data corresponding to each sensor into a target coordinate system by adopting a preset time-space synchronization method to obtain the data to be fused corresponding to each sensor.
In one embodiment, the sensor comprises a lidar sensor, a camera, or a millimeter-wave radar sensor.
In one embodiment, the acquiring current environmental data of the multi-sensing system includes:
the current environmental data of the multi-sensing system is acquired through an auxiliary sensor, wherein the auxiliary sensor comprises at least one of a high-precision photoelectric sensor and a temperature and humidity sensor.
In a second aspect, an embodiment of the present application provides a multi-sensor data fusion apparatus, where the apparatus includes:
the first acquisition module is used for acquiring current environmental data of the multi-sensing system and acquiring a weight correction value of each sensor according to the environmental data;
the correction module is used for acquiring the initial weight corresponding to each sensor, and correspondingly correcting the initial weight of each sensor by using the weight correction value of each sensor to obtain the target weight of each sensor;
the second acquisition module is used for acquiring data to be fused of each sensor, wherein the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene;
and the processing module is used for carrying out fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result.
In a third aspect, an embodiment of the present application provides a multi-sensing system, where the system includes a processor, an auxiliary sensor, and a sensor, where the sensor and the auxiliary sensor are connected to the processor, the sensor includes at least two of a laser radar sensor, a camera, and a millimeter-wave radar sensor, and the auxiliary sensor includes a high-precision photoelectric sensor and/or a temperature and humidity sensor;
the auxiliary sensor is used for acquiring the current environmental data of the multi-sensor system and sending the environmental data to the processor;
the processor is used for receiving the environment data, obtaining a weight correction value of each sensor according to the environment data, obtaining an initial weight of each sensor, and correspondingly correcting the initial weight of each sensor by using the weight correction value of each sensor to obtain a target weight of each sensor;
the processor is further configured to obtain data to be fused of each sensor, and perform fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensor data fusion result, where the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene.
In a fourth aspect, an embodiment of the present application provides a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the method according to the first aspect as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the multi-sensing data fusion method, the device, the system, the equipment and the storage medium acquire the current environment data of the multi-sensing system and acquire the weight correction value of each sensor according to the environment data, because different sensors have different characteristics in the same environment, for example, the data perception effect of a camera is poor at night or under the condition of insufficient illumination, and the data perception effect of a laser radar sensor and a millimeter wave radar sensor is good, the weight correction value of each sensor is acquired according to the current environment data, after the initial weight corresponding to each sensor is acquired, the initial weight of each sensor is correspondingly corrected by using the weight correction value of each sensor, and the target weight of each sensor is acquired, so that the initial weight of each sensor can be dynamically adjusted according to the environment data, and the characteristics of different sensors can be fully combined, if the current environment is at night, the initial weight of the camera can be reduced through the weight correction value of the camera, the initial weight of the laser radar sensor is improved through the weight correction value of the laser radar sensor, and the initial weight of the millimeter wave radar sensor is improved through the weight correction value of the millimeter wave radar sensor, so that the weight of the sensor with performance advantage in the current environment is improved; and then, after the data to be fused of each sensor is obtained, the data to be fused are fused according to the corrected weight of each sensor, namely the target weight of each sensor, so that a multi-sensing data fusion result is obtained, and the reliability of sensing data of the sensors is improved by fully combining the characteristics of different sensors.
Drawings
FIG. 1 is a schematic flow chart diagram illustrating a multi-sensor data fusion method according to one embodiment;
FIG. 2 is a schematic diagram of a partial refinement of step S200 in another embodiment;
FIG. 3 is a diagram illustrating a detailed step of step S300 in another embodiment;
FIG. 4 is a diagram illustrating a detailed step of step S300 in another embodiment;
FIG. 5 is a block diagram showing the structure of a multi-sensor data fusion apparatus according to an embodiment;
FIG. 6 is a block diagram showing the structure of a multi-sensor data fusion apparatus according to an embodiment;
FIG. 7 is a block diagram showing the structure of a multi-sensor data fusion apparatus according to an embodiment;
FIG. 8 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It should be noted that, in the multi-sensor data fusion method provided in the embodiment of the present application, an execution subject may be a multi-sensor data fusion device, and the multi-sensor data fusion device may be implemented as part or all of a computer device by software, hardware, or a combination of software and hardware. In the following method embodiments, the execution subject is a computer device, which may be a server; it can be understood that the multi-sensing data fusion method provided by the following method embodiments may also be applied to a terminal, may also be applied to a system including a terminal and a server, and is implemented through interaction between the terminal and the server.
In one embodiment, as shown in fig. 1, there is provided a multi-sensor data fusion method applied to a multi-sensor system including a plurality of sensors, including the following steps:
and step S100, acquiring current environment data of the multi-sensing system, and acquiring a weight correction value of each sensor according to the environment data.
In the embodiment of the application, the computer device may acquire current environmental data of the multi-sensing system through an auxiliary sensor in the multi-sensing system, where the auxiliary sensor includes at least one of a high-precision photoelectric sensor and a temperature and humidity sensor, the high-precision photoelectric sensor is configured to measure an optical signal in an environment where the multi-sensing system is currently located, and the temperature and humidity sensor is configured to measure a temperature and a humidity in the environment where the multi-sensing system is currently located.
In the embodiment of the present application, the sensors in the multi-sensing system are of different types; as an implementation, the multi-sensing system may include at least two of a lidar sensor, a camera, and a millimeter-wave radar sensor. Taking the intelligent transportation field as an example, lidar sensors, millimeter-wave radar sensors and cameras are the main sensors in that field, and they are often installed on a cross bar, a vertical bar or a gantry at a road intersection for roadside data acquisition.
After the computer device obtains the current environment data of the multi-sensing system, it can determine the weight correction value corresponding to each sensor under that environment data by looking up a table or by adopting a mapping model or the like; that is, each sensor has a corresponding weight correction value under each kind of environment data.
Different sensors have different characteristics in the same environment: for example, at night or under insufficient illumination, the data acquisition effect of a camera is poor while that of the lidar sensor and the millimeter-wave radar sensor is good. The initial weight of each sensor in the current environment can therefore be corrected by the sensor's weight correction value, so that the weight of a sensor with a performance advantage in the current environment is increased.
And step S200, acquiring the initial weight corresponding to each sensor, and correspondingly correcting the initial weight of each sensor by using the weight correction value of each sensor to obtain the target weight of each sensor.
In the embodiment of the application, each sensor is provided with an initial weight, the sum of the initial weights of the sensors is 1, and the initial weight can be determined according to historical sensing data of each sensor. The computer equipment acquires the initial weight corresponding to each sensor, and for each sensor, the computer equipment corrects the initial weight of the sensor by adopting the weight correction value corresponding to the sensor to obtain the target weight of the sensor, so that the target weight of each sensor is obtained.
In a possible embodiment, the computer device correspondingly corrects the initial weight of a sensor by using the weight correction value of the sensor, and may add the weight correction value of the sensor and the initial weight of the sensor to obtain the target weight of the sensor.
For example, when the current environment data of the multi-sensing system indicates that the current environment is night, the weight correction value the computer device obtains for the camera is negative while those for the lidar sensor and the millimeter-wave radar sensor are positive, because the data acquisition effect of the camera is poor at night while that of the lidar and millimeter-wave radar sensors is good. The computer device adds the weight correction value of each sensor to the corresponding initial weight, which reduces the initial weight of the camera and raises the initial weights of the lidar and millimeter-wave radar sensors. The weight of a sensor with a performance advantage in the current environment is thereby increased, the initial weight of each sensor is dynamically adjusted according to the environment data, and the advantageous characteristics of different sensors can be fully combined.
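To make the correction step concrete, here is a minimal sketch in Python. It assumes the additive correction described above; the table of correction values, the sensor names, and the final renormalization (so the corrected weights still sum to 1, which the text does not specify) are all illustrative assumptions, not details from the patent.

```python
# Hypothetical weight correction values for a "night" environment.
CORRECTIONS = {
    "camera": -0.15,      # camera perceives poorly at night -> negative correction
    "lidar": +0.10,       # lidar is largely unaffected by darkness
    "mmw_radar": +0.05,   # millimeter-wave radar is unaffected by darkness
}

def target_weights(initial: dict[str, float], corrections: dict[str, float]) -> dict[str, float]:
    """Add each sensor's correction value to its initial weight, then renormalize.

    The patent describes the addition step; clamping at zero and renormalizing
    are assumptions made here so the fused result stays a convex combination.
    """
    corrected = {s: max(initial[s] + corrections.get(s, 0.0), 0.0) for s in initial}
    total = sum(corrected.values())
    return {s: w / total for s, w in corrected.items()}

initial = {"camera": 0.40, "lidar": 0.35, "mmw_radar": 0.25}
print(target_weights(initial, CORRECTIONS))
# camera weight drops; lidar and radar weights rise
```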
And step S300, acquiring data to be fused of each sensor.
In the embodiment of the application, the data to be fused is obtained by performing space-time synchronization processing on the sensing data of the corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene.
For example, at the same time and in the same scene, the sensing data acquired by the camera is an image, the sensing data acquired by the lidar sensor is point cloud data, and the sensing data acquired by the millimeter-wave radar sensor is electromagnetic waves. The computer device can respectively perform classification, coordinate extraction, angle calculation and other processing on the image, the point cloud data and the electromagnetic waves to obtain the target perception data of each sensor. The target perception data of the camera can include the category of the target in the image and the coordinates of the target's position frame in the image; the target perception data of the lidar sensor can include the category of the target extracted from the point cloud data, the center-point coordinates of the target and the heading angle of the target; and the target perception data of the millimeter-wave radar sensor can include the category of the target extracted from the electromagnetic waves, the coordinates of the target and the speed of the target.
In one possible embodiment, the computer device may set the standard target perception data to include six types of information: category, lateral coordinate, longitudinal coordinate, lateral velocity, longitudinal velocity and angle. For information that a sensor cannot acquire, the computer device marks the corresponding field as "0". For example, the target perception data of the camera includes the category of the target in the image and the coordinates of the target's position frame in the image (corresponding to the lateral and longitudinal coordinates), but the lateral velocity, longitudinal velocity and angle cannot be obtained by the camera, so the computer device marks these three fields as 0 in the camera's target perception data. The target perception data of the lidar sensor includes the category of the target extracted from the point cloud data, the center-point coordinates of the target (corresponding to the lateral and longitudinal coordinates) and the heading angle of the target (corresponding to the angle), but the lateral and longitudinal velocities cannot be obtained by the lidar sensor, so the computer device marks these two fields as 0 in the lidar sensor's target perception data. The target perception data of the millimeter-wave radar sensor includes the category of the target extracted from the electromagnetic waves, the coordinates of the target (corresponding to the lateral and longitudinal coordinates) and the speed of the target (corresponding to the lateral and longitudinal velocities), but the angle cannot be obtained by the millimeter-wave radar sensor, so the computer device marks that field as 0 in its target perception data. After the computer device completes the data of each sensor in this way, the target perception data of each sensor is obtained.
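As an illustration of the zero-padding just described, the sketch below builds the standardized six-field record for each sensor type; the dataclass, field names and constructor names are hypothetical, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class TargetRecord:
    """Standardized target perception record: category, lateral/longitudinal
    coordinate, lateral/longitudinal velocity, angle. A 0 marks a field the
    sensor cannot provide."""
    category: int
    x: float       # lateral coordinate
    y: float       # longitudinal coordinate
    vx: float      # lateral velocity
    vy: float      # longitudinal velocity
    angle: float   # heading angle

def from_camera(category: int, x: float, y: float) -> TargetRecord:
    # Camera yields category + position-frame coordinates; velocities and angle unavailable.
    return TargetRecord(category, x, y, 0.0, 0.0, 0.0)

def from_lidar(category: int, x: float, y: float, heading: float) -> TargetRecord:
    # Lidar yields category, center-point coordinates and heading; velocities unavailable.
    return TargetRecord(category, x, y, 0.0, 0.0, heading)

def from_mmw_radar(category: int, x: float, y: float, vx: float, vy: float) -> TargetRecord:
    # Millimeter-wave radar yields category, coordinates and velocities; angle unavailable.
    return TargetRecord(category, x, y, vx, vy, 0.0)
```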
Because different sensors sense data based on different original coordinate systems, for example, a camera senses data based on a camera coordinate system, and a millimeter wave radar sensor senses data based on a millimeter wave radar coordinate system, the computer equipment converts target sensing data of the camera, target sensing data of a laser radar sensor and target sensing data of the millimeter wave radar sensor to the same target coordinate system by adopting a space-time synchronization method, so that the data of different sensors can be fused by using the same coordinate system as a reference.
As an implementation manner, the time-space synchronization method may be a calibration method, that is, the computer device calibrates the target sensing data of the camera, the target sensing data of the lidar sensor, and the target sensing data of the millimeter-wave radar sensor to the same target coordinate system by using a calibration algorithm. In the embodiment of the application, the target coordinate system may be a pixel coordinate system, and the computer device obtains the data to be fused corresponding to each sensor after performing space-time synchronization on the target sensing data of each sensor.
In the embodiment of the application, the sensors of the multi-sensing system may be deployed in the same scene, and the computer device may set the same sampling frequency for each sensor, for example sampling every 0.1 second, thereby ensuring that the sensors sense the sensing data at the same time and in the same scene.
And S400, performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensor data fusion result.
The computer device performs fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result. In one possible implementation, the computer device may implement step S400 by performing step A as follows:
and step A, performing weighted summation calculation on each data to be fused by using the target weight of each sensor to obtain a multi-sensor data fusion result.
That is, for each sensor, the computer device may multiply the data to be fused of the sensor by the target weight of the sensor, and then add the multiplication results of the sensors to obtain a multi-sensing data fusion result.
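A minimal sketch of step A, assuming each sensor's data to be fused has already been standardized into an equal-length numeric vector (such as the six-field record above) in the same target coordinate system; treating the category field as just another number to be weighted is a simplification a real system would handle differently.

```python
import numpy as np

def fuse(data: dict[str, np.ndarray], weights: dict[str, float]) -> np.ndarray:
    """Weighted-summation fusion: multiply each sensor's to-be-fused vector
    by its target weight and sum the products over all sensors."""
    return sum(weights[s] * data[s] for s in data)

# Illustrative values only: (category, x, y, vx, vy, angle) per sensor.
data = {
    "camera":    np.array([1.0, 10.2, 55.1, 0.0, 0.0, 0.00]),
    "lidar":     np.array([1.0, 10.4, 54.8, 0.0, 0.0, 1.57]),
    "mmw_radar": np.array([1.0, 10.3, 55.0, 0.1, 8.2, 0.00]),
}
weights = {"camera": 0.25, "lidar": 0.45, "mmw_radar": 0.30}
print(fuse(data, weights))
```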
In this embodiment, the computer device acquires the current environment data of the multi-sensing system and obtains the weight correction value of each sensor according to the environment data. Different sensors have different characteristics in the same environment: for example, at night or under insufficient illumination, the data acquisition effect of a camera is poor while that of the lidar sensor and the millimeter-wave radar sensor is good. Therefore the weight correction value of each sensor is obtained according to the current environment data; after the initial weight corresponding to each sensor is acquired, the initial weight of each sensor is correspondingly corrected with its weight correction value to obtain the target weight of each sensor. The initial weight of each sensor can thus be dynamically adjusted in combination with the environment data, fully combining the characteristics of different sensors. For example, when the current environment is night, the initial weight of the camera can be reduced through the camera's weight correction value while the initial weights of the lidar sensor and the millimeter-wave radar sensor are raised through their weight correction values, increasing the weight of the sensors that hold a performance advantage in the current environment. Then, after the data to be fused of each sensor is acquired, the data to be fused are fused according to the corrected weights, that is, the target weights, of the sensors to obtain a multi-sensing data fusion result. The characteristics of different sensors are thus fully combined, and the reliability of sensor-perceived data is improved.
In one embodiment, referring to fig. 2, the embodiment is based on the embodiment shown in fig. 1, and this embodiment relates to a process of how a computer device obtains initial weights corresponding to sensors. As shown in fig. 2, the computer device obtains the initial weight corresponding to each sensor, and can implement the following steps S201, S202, and S203:
step S201, historical sensing data of each sensor is acquired.
The historical perception data of each sensor is perceived by each sensor at the same historical moment and under the same scene. For example, the computer device may obtain historical sensing data of each sensor in a preset historical time period before the current time, and for each sampling time in the preset historical time period, each sensor senses the historical sensing data.
Step S202, performing space-time synchronization processing on each historical sensing data to obtain synchronized data corresponding to each historical sensing data.
After the computer device performs classification, coordinate extraction, angle calculation and other processing on each historical sensing data, it performs space-time synchronization processing to the same target coordinate system, obtaining the synchronized data of each sensor.
Step S203, calculating the variance of each synchronized data, and calculating the initial weight of the corresponding sensor according to the variance of each synchronized data.
In the embodiment of the application, suppose the multi-sensing system contains n sensors, and let $W_1, W_2, \ldots, W_n$ represent the initial weights of the n sensors respectively; then
$$\sum_{i=1}^{n} W_i = 1$$
and the total variance after fusion of the sensors is
$$\delta^2 = \sum_{i=1}^{n} W_i^2 \delta_i^2$$
where $\delta_i^2$ represents the variance of the synchronized data of sensor i.
The computer device minimizes the total variance after fusion under this constraint, obtaining
$$W_i = \frac{1/\delta_i^2}{\sum_{j=1}^{n} 1/\delta_j^2} \qquad \text{(formula 1)}$$
Formula 1 is the calculation formula of the initial weight corresponding to each sensor: after the computer device calculates the variance of the synchronized data of each sensor, it substitutes the variance corresponding to each sensor into formula 1 to obtain the initial weight corresponding to each sensor.
In one possible embodiment, the computer device calculates the variance of the synchronized data of the sensors by performing the following steps: for each sensor, grouping the synchronized data of the sensors to obtain a plurality of groups of data; and calculating the variance of each group of data in the multiple groups of data of the sensor, and calculating the variance of the synchronized data of the sensor according to the variance of each group of data.
In the embodiment of the application, the synchronized data of a sensor is obtained by the computer device processing the historical sensing data sensed by that sensor at a plurality of sampling moments within a historical time period. If the historical time period is 3 minutes and the sensor senses data every 0.1 second, the synchronized data of the sensor within those 3 minutes is obtained from the historical sensing data sensed at 1800 sampling moments, and processing the historical sensing data at each sampling moment yields the synchronized data of the sensor for that sampling moment.
In this embodiment of the present application, the synchronized data at each sampling time includes six types of information, including a category, a horizontal coordinate, a vertical coordinate, a horizontal speed, a vertical speed, and an angle, where for information that cannot be acquired by a certain sensor in the six types of information, the computer device marks the type of information as "0" in the synchronized data at each sampling time corresponding to the sensor.
As an embodiment, the computer device may obtain two groups of data corresponding to each sensor by dividing the synchronized data at sampling moments with odd sampling indices into one group and the synchronized data at sampling moments with even sampling indices into the other group. For example, the first group of data corresponding to sensor i is $Z_i(1), Z_i(3), Z_i(5), \ldots, Z_i(p)$ and the corresponding second group is $Z_i(2), Z_i(4), Z_i(6), \ldots, Z_i(q)$, where p and q are positive integers, p is odd, q is even, and the value of p + q is the total number of samplings of sensor i.
The computer device calculates the arithmetic mean $\bar{Z}_{i1}$ of the first group of data corresponding to sensor i and the arithmetic mean $\bar{Z}_{i2}$ of the second group. Then, the computer device calculates the mean square error $\delta_{i1}$ of the first group using formula 2 below, where $m_1$ is the number of samples in the first group:
$$\delta_{i1} = \sqrt{\frac{1}{m_1} \sum_{k=1}^{m_1} \left( Z_i(2k-1) - \bar{Z}_{i1} \right)^2} \qquad \text{(formula 2)}$$
The computer device calculates the mean square error $\delta_{i2}$ of the second group using formula 3 below, where $m_2$ is the number of samples in the second group:
$$\delta_{i2} = \sqrt{\frac{1}{m_2} \sum_{k=1}^{m_2} \left( Z_i(2k) - \bar{Z}_{i2} \right)^2} \qquad \text{(formula 3)}$$
Thus, having obtained the mean square error $\delta_{i1}$ of the first group of data corresponding to sensor i and the mean square error $\delta_{i2}$ of the second group, the computer device takes the variance of the synchronized data corresponding to sensor i as
$$\delta_i^2 = \delta_{i1} \, \delta_{i2}$$
After the computer equipment calculates the variance of the synchronized data of each sensor, the variance corresponding to each sensor is substituted into formula 1 to obtain the initial weight corresponding to each sensor.
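The following sketch ties steps S201-S203 together for one scalar channel per sensor: odd/even grouping, the group mean square errors of formulas 2 and 3, the combined variance (using the product form of the reconstruction above), and the inverse-variance weights of formula 1. Treating each sensor's synchronized data as a one-dimensional array is a simplifying assumption.

```python
import numpy as np

def sensor_variance(samples: np.ndarray) -> float:
    """Group one sensor's synchronized samples by odd/even sampling index,
    compute each group's mean square error, and combine them into the
    sensor's variance via delta_i1 * delta_i2 (the reconstruction above)."""
    odd, even = samples[0::2], samples[1::2]
    d1 = np.sqrt(np.mean((odd - odd.mean()) ** 2))    # formula 2
    d2 = np.sqrt(np.mean((even - even.mean()) ** 2))  # formula 3
    return float(d1 * d2)

def initial_weights(variances: dict[str, float]) -> dict[str, float]:
    """Formula 1: W_i = (1 / delta_i^2) / sum_j (1 / delta_j^2).
    Assumes every variance is strictly positive."""
    inv = {s: 1.0 / v for s, v in variances.items()}
    total = sum(inv.values())
    return {s: w / total for s, w in inv.items()}

rng = np.random.default_rng(0)
variances = {
    "camera": sensor_variance(rng.normal(55.0, 0.8, 1800)),
    "lidar": sensor_variance(rng.normal(55.0, 0.3, 1800)),
    "mmw_radar": sensor_variance(rng.normal(55.0, 0.5, 1800)),
}
print(initial_weights(variances))  # lower variance -> higher initial weight
```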
In the embodiment of the application, the computer device acquires the historical sensing data of each sensor, where the historical sensing data is sensed by the sensors at the same historical moment and in the same scene; performs space-time synchronization processing on each historical sensing data to obtain the corresponding synchronized data; and calculates the variance of each synchronized data and, from it, the initial weight of the corresponding sensor. The initial weight of each sensor is thus set; after the weight correction values turn the initial weights into target weights, the data to be fused of the sensors can be fused, which avoids the large errors that arise when only a single sensor senses data and ensures the reliability of the multi-sensor data fusion result.
In one embodiment, referring to fig. 3, the embodiment is related to a process of how a computer device acquires data to be fused of each sensor based on the embodiment shown in fig. 1. As shown in fig. 3, step S300 of the present embodiment includes step S301 and step S302:
step S301, acquiring a plurality of historical data to be fused of each sensor in a preset historical time period.
The historical data to be fused is obtained by performing space-time synchronization processing on sensing data of the corresponding sensor in a preset historical time period by the computer equipment. After the computer equipment carries out classification, coordinate extraction, angle calculation and other processing on the sensing data sensed by each sensor at a plurality of sampling moments before the current moment, the sensing data are calibrated to the same target coordinate system, and a plurality of historical data to be fused of each sensor are obtained.
Step S302, calculating an average value of a plurality of historical data to be fused corresponding to each sensor, and determining the average value corresponding to each sensor as the data to be fused of each sensor.
The computer device calculates the average value of the plurality of historical data to be fused corresponding to each sensor, and determines the average value corresponding to each sensor as the data to be fused of that sensor.
In the embodiment of the application, the computer device determines the average value corresponding to each sensor as the data to be fused of that sensor. Thus, even if a sensor currently senses no data, for example because of a sensor fault, the computer device can still obtain that sensor's data to be fused from the plurality of historical data to be fused. The data to be fused corresponding to each sensor can then be fused, the multi-sensing data fusion result can still be output normally, and the reliability of the output data is improved.
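A brief sketch of this fallback, assuming each historical to-be-fused record is a fixed-length numeric vector; the function name is illustrative.

```python
import numpy as np

def data_to_fuse_from_history(history: list[np.ndarray]) -> np.ndarray:
    """Average a sensor's historical to-be-fused data over the preset
    historical time period; the mean serves as the sensor's current data
    to be fused, e.g. when the sensor itself currently delivers nothing."""
    return np.mean(np.stack(history), axis=0)
```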
In one embodiment, referring to fig. 4, on the basis of the embodiment shown in fig. 1, this embodiment relates to a process of how a computer device obtains data to be fused corresponding to each sensor through space-time synchronization. As shown in fig. 4, step S300 of the present embodiment includes step S303 and step S304:
step S303, obtaining the sensing data sensed by each sensor at the same time and under the same scene.
In the embodiment of the application, the computer device obtains the sensing data sensed by each sensor at the same time and in the same scene; the same time may be a real-time acquisition time. For example, the sensing data sensed by the camera is an image, the sensing data sensed by the lidar sensor is point cloud data, and the sensing data sensed by the millimeter-wave radar sensor is electromagnetic waves; the computer device may respectively perform classification, coordinate extraction, angle calculation and other processing on the image, the point cloud data and the electromagnetic waves to obtain the target sensing data of each sensor.
And step S304, converting the sensing data corresponding to each sensor to a target coordinate system by adopting a preset time-space synchronization method to obtain the data to be fused corresponding to each sensor.
Because different sensors sense data based on different coordinate systems (for example, a camera senses data based on the camera coordinate system, and a millimeter-wave radar sensor senses data based on the millimeter-wave radar coordinate system), the computer device calibrates the target sensing data of the camera, the lidar sensor and the millimeter-wave radar sensor to the same target coordinate system by a preset space-time synchronization method, for example a calibration algorithm, so that the data of the different sensors can be fused; here the target coordinate system is a pixel coordinate system.
In one possible implementation, step S304 may include steps a1 and a2:
step a1, acquiring initial rotational freedom parameters and initial translational freedom parameters corresponding to each sensor.
Step a2, for each sensor, converting target perception data corresponding to the sensor to a target coordinate system by using the initial rotational degree of freedom parameter of the sensor, the initial translational degree of freedom parameter of the sensor and a preset calibration algorithm, and obtaining data to be fused corresponding to each sensor.
In the embodiment of the present application, the initial rotational degree of freedom parameter and the initial translational degree of freedom parameter corresponding to each sensor may be manually set, and the computer device obtains the parameters.
Taking the example where the target coordinate system is a pixel coordinate system, the computer device may first convert the coordinates of the target sensing data of each sensor to a reference coordinate system, which may be a world coordinate system, and then convert the coordinates of each sensor in the reference coordinate system to the pixel coordinate system.
As an embodiment, the computer device may convert the coordinates of the target perception data of each sensor to the reference coordinate system using the following formula 4:
$$\begin{bmatrix} x_w \\ y_w \\ z_w \end{bmatrix} = R \begin{bmatrix} x_s \\ y_s \\ z_s \end{bmatrix} + t \qquad \text{(formula 4)}$$
where $(x_s, y_s, z_s)^T$ is the coordinate of any point in the target sensing data of a sensor, expressed in the original coordinate system of that sensor, $(x_w, y_w, z_w)^T$ is the corresponding coordinate in the reference coordinate system, R is the rotation matrix corresponding to the initial rotational degree-of-freedom parameter of the sensor, and t is the initial translational degree-of-freedom parameter of the sensor.
After the computer device converts the coordinates of the target sensing data of each sensor to the reference coordinate system through formula 4, it adopts the following formula 5 to convert the coordinates in the reference coordinate system to the pixel coordinate system:
$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} r & t \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad \text{(formula 5)}$$
where (u, v) is the pixel coordinate obtained by converting the world coordinate $(x_w, y_w, z_w)^T$ into the pixel coordinate system, K is the 3 x 3 internal reference (intrinsic) matrix of the camera, and r and t are the rotational and translational degree-of-freedom parameters corresponding to each sensor.
As an implementation manner, the computer device may further correct the calibration result by using a distortion coefficient of the camera, so as to obtain data to be fused corresponding to each sensor.
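A sketch of the two conversions, under the reconstruction of formulas 4 and 5 above (NumPy, points stored row-wise). The intrinsic values are illustrative, and the distortion correction mentioned in the text is omitted.

```python
import numpy as np

def to_reference(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Formula 4: rotate then translate sensor-frame points (N x 3) into
    the reference (world) coordinate system."""
    return points @ R.T + t

def to_pixels(points_w: np.ndarray, K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Formula 5: project world points (N x 3) into the pixel coordinate
    system using the camera intrinsic matrix K and the extrinsics (R, t)."""
    cam = points_w @ R.T + t           # world -> camera frame
    uvw = cam @ K.T                    # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]    # perspective division -> (u, v)

K = np.array([[1000.0, 0.0, 960.0],    # hypothetical intrinsic matrix
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
R_id, t0 = np.eye(3), np.array([0.0, 0.0, 5.0])
pts = np.array([[1.0, 0.5, 0.0], [-1.0, 0.2, 0.5]])
print(to_pixels(to_reference(pts, R_id, np.zeros(3)), K, R_id, t0))
```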
In one possible implementation, step S304 may include steps a3, a4, and a5:
step a3, acquiring, for at least two sensors, a plurality of calibrated data converted to the target coordinate system in the historical calibration process, and acquiring a plurality of historical rotational degree-of-freedom parameters and a plurality of historical translational degree-of-freedom parameters of each sensor.
In this embodiment of the application, the historical calibration process may be a process in which the computer device converts sensing data sensed by the camera, the laser radar sensor, and the millimeter wave radar sensor at a historical time into a target coordinate system, and the computer device obtains the converted plurality of calibrated data and obtains a plurality of historical rotational degree of freedom parameters and a plurality of historical translational degree of freedom parameters of each sensor used in the calibration process.
Step a4, selecting a target rotational degree of freedom parameter and a target historical translational degree of freedom parameter corresponding to each sensor from a plurality of historical rotational degree of freedom parameters and a plurality of historical translational degree of freedom parameters of each sensor according to a plurality of calibrated data of each sensor.
In the embodiment of the application, the computer device acquires the position information of the fusion area of the sensors, where the fusion area can be the overlapping part of the coverage areas of the sensors. For the plurality of calibrated data of the lidar sensor in the fusion area, the computer device calculates the minimum circumscribed rectangular frame of the point coordinates of each target, then calculates the overlapping area between each target's minimum circumscribed rectangular frame and the image recognition result target frame of that target, calculates for each target the ratio Rl of the overlapping area to the image recognition result target frame, and finally calculates the mean value Ml of the ratios Rl over all targets. For the millimeter-wave radar sensor, the computer device draws a rectangular frame centered on each target's pixel in the calibrated data with radius p, where p can be set according to the type of the target; it then calculates the overlapping area between each target's rectangular frame and the image recognition result target frame, calculates the ratio Rr of the overlapping area to the rectangular frame, and finally calculates the mean value Mr of the ratios Rr over all targets.
The computer device determines the maximum values of Ml and Mr, and determines the historical rotational degree-of-freedom parameter and historical translational degree-of-freedom parameter corresponding to those maxima as the target rotational degree-of-freedom parameter and the target historical translational degree-of-freedom parameter.
Step a5, for each sensor, converting target sensing data corresponding to the sensor into a target coordinate system by adopting the target rotational freedom parameter of the sensor, the target historical translational freedom parameter of the sensor and a preset calibration algorithm, and obtaining data to be fused corresponding to each sensor.
The computer device converts the target sensing data corresponding to each sensor into the target coordinate system by adopting the sensor's target rotational degree-of-freedom parameter, the sensor's target historical translational degree-of-freedom parameter and a preset calibration algorithm, such as Zhang's calibration algorithm, so as to obtain the data to be fused corresponding to each sensor.
Therefore, the target rotational freedom parameter and the target historical translational freedom parameter with the best calibration effect are selected from the plurality of historical rotational freedom parameters and the plurality of historical translational freedom parameters of each sensor, and the target sensing data corresponding to each sensor are converted into the target coordinate system by adopting the target rotational freedom parameter and the target historical translational freedom parameter, so that the data accuracy of the data to be fused of the sensors is improved, and the data reliability is improved.
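To illustrate the selection criterion, here is a hedged sketch: boxes are assumed to be axis-aligned (x1, y1, x2, y2) rectangles, targets are assumed already matched pairwise, and the score is the mean overlap ratio Ml or Mr described above. The function names and the triple layout of the candidates are illustrative.

```python
def overlap_ratio(box, ref_box):
    """Ratio of the overlapping area of `box` and `ref_box` to the area of
    `ref_box` (the image recognition result target frame)."""
    ix1, iy1 = max(box[0], ref_box[0]), max(box[1], ref_box[1])
    ix2, iy2 = min(box[2], ref_box[2]), min(box[3], ref_box[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    ref_area = (ref_box[2] - ref_box[0]) * (ref_box[3] - ref_box[1])
    return inter / ref_area if ref_area > 0 else 0.0

def mean_overlap(projected_boxes, image_boxes):
    """Mean ratio over all matched targets (Ml for lidar, Mr for radar)."""
    ratios = [overlap_ratio(p, g) for p, g in zip(projected_boxes, image_boxes)]
    return sum(ratios) / len(ratios) if ratios else 0.0

def pick_calibration(candidates):
    """`candidates` is an iterable of (params, projected_boxes, image_boxes)
    triples; keep the (rotation, translation) pair whose projection scores
    the highest mean overlap ratio."""
    best_params, _ = max(
        ((params, mean_overlap(boxes, imgs)) for params, boxes, imgs in candidates),
        key=lambda item: item[1],
    )
    return best_params
```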
It should be understood that although the various steps in the flow charts of fig. 1-4 are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in fig. 1-4 may include multiple steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, which are not necessarily performed in sequence, but may be performed in turn or alternately with other steps or at least some of the other steps.
In one embodiment, as shown in fig. 5, there is provided a multi-sensory data fusion apparatus including:
the first obtaining module 10 is configured to obtain current environmental data of the multi-sensing system, and obtain a weight correction value of each sensor according to the environmental data;
a correction module 20, configured to obtain an initial weight corresponding to each sensor, and correspondingly correct the initial weight of each sensor by using a weight correction value of each sensor, so as to obtain a target weight of each sensor;
a second obtaining module 30, configured to obtain to-be-fused data of each sensor, where the to-be-fused data is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and in the same scene;
and the processing module 40 is configured to perform fusion processing on the data to be fused according to the target weight of each sensor, so as to obtain a multi-sensor data fusion result.
In an embodiment, the processing module 40 is specifically configured to perform weighted summation calculation on each to-be-fused data by using a target weight of each sensor, so as to obtain the multi-sensor data fusion result.
In one embodiment, the modification module 20 is specifically configured to, when obtaining the initial weight corresponding to each sensor, obtain historical sensing data of each sensor, where the historical sensing data of each sensor is sensed by each sensor at the same historical time and under the same scene; performing space-time synchronization processing on each historical sensing data to obtain synchronized data corresponding to each historical sensing data; calculating a variance of each of the synchronized data, and calculating an initial weight of the corresponding sensor according to the variance of each of the synchronized data.
In one embodiment, when calculating the variance of each synchronized data, the modification module 20 is specifically configured to group the synchronized data of each sensor to obtain multiple groups of data, calculate the variance of each group of data among the multiple groups of data of the sensor, and calculate the variance of the synchronized data of the sensor according to the variance of each group of data.
In an embodiment, based on the embodiment shown in fig. 5, as shown in fig. 6, the second obtaining module 30 includes:
a first obtaining unit 301, configured to obtain multiple pieces of historical data to be fused of each sensor in a preset historical time period, where the historical data to be fused is obtained by performing time-space synchronization processing on sensing data of a corresponding sensor in the preset historical time period;
a calculating unit 302, configured to calculate an average value of a plurality of historical data to be fused corresponding to each sensor, and determine the average value corresponding to each sensor as the data to be fused of each sensor.
In an embodiment, based on the embodiment shown in fig. 5, as shown in fig. 7, the second obtaining module 30 includes:
a second obtaining unit 303, configured to obtain perception data perceived by each of the sensors at the same time and in the same scene;
and the time-space synchronization unit 304 is configured to convert the sensing data corresponding to each sensor into a target coordinate system by using a preset time-space synchronization method, so as to obtain data to be fused corresponding to each sensor.
In one embodiment, the sensor comprises a lidar sensor, a camera, or a millimeter-wave radar sensor.
In an embodiment, the first obtaining module 10 is specifically configured to obtain the current environmental data of the multi-sensing system through an auxiliary sensor, where the auxiliary sensor includes at least one of a high-precision photoelectric sensor and a temperature and humidity sensor.
For specific limitations of the multi-sensor data fusion device, reference may be made to the above limitations of the multi-sensor data fusion method, which are not described herein again. The modules in the multi-sensor data fusion device can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a multi-sensing system is provided, the system comprising a processor, an auxiliary sensor and a sensor, the sensor and the auxiliary sensor being connected to the processor, the sensor comprising at least two of a lidar sensor, a camera and a millimeter-wave radar sensor, the auxiliary sensor comprising a high-precision photosensor and/or a temperature and humidity sensor;
the auxiliary sensor is used for acquiring the current environmental data of the multi-sensor system and sending the environmental data to the processor;
the processor is used for receiving the environment data, obtaining a weight correction value of each sensor according to the environment data, obtaining an initial weight of each sensor, and correspondingly correcting the initial weight of each sensor by using the weight correction value of each sensor to obtain a target weight of each sensor;
the processor is further configured to obtain data to be fused of each sensor, and perform fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensor data fusion result, where the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene.
In an embodiment, the processor is further configured to perform the steps of the method in any of the above embodiments.
For specific limitations of the processor of the multi-sensing system, reference may be made to the limitations of the multi-sensing data fusion method in the foregoing embodiments, and details are not repeated here.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 8. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer equipment is used for storing data of the multi-sensing data fusion method. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a multi-sensory data fusion method.
Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring current environment data of the multi-sensing system, and acquiring a weight correction value of each sensor according to the environment data;
acquiring initial weights corresponding to the sensors, and correspondingly correcting the initial weights of the sensors by using the weight correction values of the sensors to obtain target weights of the sensors;
acquiring data to be fused of each sensor, wherein the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene;
and performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
performing weighted summation on the data to be fused by using the target weight of each sensor to obtain the multi-sensing data fusion result.
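By way of illustration and not limitation, a minimal Python sketch of this weighted-summation step follows, assuming each sensor's data to be fused is an estimate already expressed in the common target coordinate system; the sensor names and values are illustrative only.

    import numpy as np

    def fuse(data_to_fuse, weights):
        """Fusion result = sum over sensors of (target weight x data to be fused)."""
        return sum(weights[s] * np.asarray(data_to_fuse[s]) for s in data_to_fuse)

    data = {
        "lidar":  [10.02, 4.98],   # (x, y) estimates of one target, in metres
        "camera": [10.10, 5.05],
        "mmwave": [ 9.95, 4.90],
    }
    weights = {"lidar": 0.5, "camera": 0.2, "mmwave": 0.3}
    print(fuse(data, weights))   # single fused estimate of the target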
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring historical sensing data of each sensor, wherein the historical sensing data of each sensor is sensed by each sensor at the same historical moment and under the same scene;
performing space-time synchronization processing on each historical sensing data to obtain synchronized data corresponding to each historical sensing data;
calculating a variance of each of the synchronized data, and calculating an initial weight of the corresponding sensor according to the variance of each of the synchronized data.
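The disclosure states that the initial weight is computed from the variance of the synchronized data but does not fix the formula. By way of illustration, the sketch below uses inverse-variance weighting, a common choice in which a sensor with steadier (lower-variance) historical data receives a larger initial weight; this specific formula is an assumption.

    import numpy as np

    def initial_weights(synced_history):
        """Inverse-variance weighting: weight_i proportional to 1 / variance_i."""
        variances = {s: float(np.var(np.asarray(d))) for s, d in synced_history.items()}
        inverse = {s: 1.0 / (v + 1e-12) for s, v in variances.items()}  # guard zero variance
        total = sum(inverse.values())
        return {s: w / total for s, w in inverse.items()}

    history = {   # synchronized measurements of the same quantity
        "lidar":  [10.01, 10.02, 9.99, 10.00],
        "camera": [10.20, 9.80, 10.15, 9.90],
        "mmwave": [10.05, 9.95, 10.10, 9.92],
    }
    print(initial_weights(history))  # the steadier lidar gets the largest weight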
In one embodiment, the processor, when executing the computer program, further performs the steps of:
for each sensor, grouping the synchronized data of the sensors to obtain a plurality of groups of data;
calculating the variance of each group of data in the plurality of groups of data of the sensor, and calculating the variance of the synchronized data of the sensor according to the variance of each group of data.
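By way of illustration, a short sketch of this grouped computation follows. The disclosure only says the overall variance is computed according to the group variances; combining them with a simple mean, and the group size of 10, are assumptions.

    import numpy as np

    def grouped_variance(samples, group_size=10):
        """Split the synchronized data into fixed-size groups, compute each
        group's variance, and combine the group variances (assumed: mean)."""
        arr = np.asarray(samples, dtype=float)
        n_groups = len(arr) // group_size          # drop any incomplete tail group
        groups = arr[: n_groups * group_size].reshape(n_groups, group_size)
        return float(groups.var(axis=1).mean())

    samples = np.random.default_rng(0).normal(10.0, 0.1, 100)
    print(grouped_variance(samples))   # close to the true variance of 0.01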
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring a plurality of historical data to be fused of each sensor in a preset historical time period, wherein the historical data to be fused is obtained by performing space-time synchronization processing on the sensing data of the corresponding sensor in the preset historical time period;
calculating an average value of a plurality of historical data to be fused corresponding to each sensor, and determining the average value corresponding to each sensor as the data to be fused of each sensor.
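By way of illustration, this averaging step can be as simple as the sketch below; the window of five samples is an assumed value standing in for the preset historical time period.

    import numpy as np

    def windowed_mean(synced_samples, window=5):
        """Average the last `window` synchronized samples of one sensor; the
        mean becomes that sensor's data to be fused."""
        return float(np.mean(np.asarray(synced_samples, dtype=float)[-window:]))

    print(windowed_mean([10.02, 10.05, 9.97, 10.01, 10.03, 9.99]))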
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring sensing data sensed by each sensor at the same time and under the same scene;
and converting the sensing data corresponding to each sensor into a target coordinate system by using a preset space-time synchronization method to obtain the data to be fused corresponding to each sensor.
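By way of illustration, the spatial half of this synchronization can be written as a homogeneous coordinate transform, as sketched below; the rotation and translation values are placeholders standing in for calibrated extrinsics, and the temporal half (interpolating each sensor's data to a common timestamp) is omitted for brevity.

    import numpy as np

    def to_target_frame(point_xyz, extrinsic):
        """Transform a point from a sensor's frame into the target coordinate
        system using a 4x4 homogeneous extrinsic matrix."""
        p = np.append(np.asarray(point_xyz, dtype=float), 1.0)  # homogeneous form
        return (extrinsic @ p)[:3]

    # Example extrinsic: 90-degree yaw plus a 0.5 m lateral offset (assumed).
    T = np.array([[0.0, -1.0, 0.0, 0.5],
                  [1.0,  0.0, 0.0, 0.0],
                  [0.0,  0.0, 1.0, 0.0],
                  [0.0,  0.0, 0.0, 1.0]])
    print(to_target_frame([10.0, 5.0, 0.2], T))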
In one embodiment, each of the sensors is a lidar sensor, a camera, or a millimeter-wave radar sensor.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
the current environmental data of the multi-sensing system is acquired through an auxiliary sensor, wherein the auxiliary sensor comprises at least one of a high-precision photoelectric sensor and a temperature and humidity sensor.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring current environment data of the multi-sensing system, and acquiring a weight correction value of each sensor according to the environment data;
acquiring initial weights corresponding to the sensors, and correspondingly correcting the initial weights of the sensors by using the weight correction values of the sensors to obtain target weights of the sensors;
acquiring data to be fused of each sensor, wherein the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene;
and performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result.
In one embodiment, the computer program when executed by the processor further performs the steps of:
performing weighted summation on the data to be fused by using the target weight of each sensor to obtain the multi-sensing data fusion result.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring historical sensing data of each sensor, wherein the historical sensing data of each sensor is sensed by each sensor at the same historical moment and under the same scene;
performing space-time synchronization processing on each historical sensing data to obtain synchronized data corresponding to each historical sensing data;
calculating a variance of each of the synchronized data, and calculating an initial weight of the corresponding sensor according to the variance of each of the synchronized data.
In one embodiment, the computer program when executed by the processor further performs the steps of:
for each sensor, grouping the synchronized data of the sensors to obtain a plurality of groups of data;
calculating the variance of each group of data in the plurality of groups of data of the sensor, and calculating the variance of the synchronized data of the sensor according to the variance of each group of data.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring a plurality of historical data to be fused of each sensor in a preset historical time period, wherein the historical data to be fused is obtained by performing space-time synchronization processing on the sensing data of the corresponding sensor in the preset historical time period;
calculating an average value of a plurality of historical data to be fused corresponding to each sensor, and determining the average value corresponding to each sensor as the data to be fused of each sensor.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring sensing data sensed by each sensor at the same time and under the same scene;
and converting the sensing data corresponding to each sensor into a target coordinate system by using a preset space-time synchronization method to obtain the data to be fused corresponding to each sensor.
In one embodiment, each of the sensors is a lidar sensor, a camera, or a millimeter-wave radar sensor.
In one embodiment, the computer program when executed by the processor further performs the steps of:
the current environmental data of the multi-sensing system is acquired through an auxiliary sensor, wherein the auxiliary sensor comprises at least one of a high-precision photoelectric sensor and a temperature and humidity sensor.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments may be implemented by a computer program instructing relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to the memory, storage, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory may include random access memory (RAM) or an external cache memory. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. A multi-sensing data fusion method is applied to a multi-sensing system comprising a plurality of sensors, and is characterized by comprising the following steps:
acquiring current environment data of the multi-sensing system, and acquiring a weight correction value of each sensor according to the environment data;
acquiring initial weights corresponding to the sensors, and correspondingly correcting the initial weights of the sensors by using the weight correction values of the sensors to obtain target weights of the sensors;
acquiring data to be fused of each sensor, wherein the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene;
and performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result.
2. The method according to claim 1, wherein the performing fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result comprises:
performing weighted summation on the data to be fused by using the target weight of each sensor to obtain the multi-sensing data fusion result.
3. The method of claim 1, wherein obtaining an initial weight for each of the sensors comprises:
acquiring historical sensing data of each sensor, wherein the historical sensing data of each sensor is sensed by each sensor at the same historical moment and under the same scene;
performing space-time synchronization processing on each historical sensing data to obtain synchronized data corresponding to each historical sensing data;
calculating a variance of each of the synchronized data, and calculating an initial weight of the corresponding sensor according to the variance of each of the synchronized data.
4. The method of claim 3, wherein said calculating a variance of each of said synchronized data comprises:
for each sensor, grouping the synchronized data of the sensors to obtain a plurality of groups of data;
calculating the variance of each group of data in the plurality of groups of data of the sensor, and calculating the variance of the synchronized data of the sensor according to the variance of each group of data.
5. The method of claim 1, wherein the acquiring the data to be fused for each sensor comprises:
acquiring a plurality of historical data to be fused of each sensor in a preset historical time period, wherein the historical data to be fused is obtained by performing space-time synchronization processing on the sensing data of the corresponding sensor in the preset historical time period;
calculating an average value of a plurality of historical data to be fused corresponding to each sensor, and determining the average value corresponding to each sensor as the data to be fused of each sensor.
6. The method of claim 1, wherein the acquiring the data to be fused for each sensor comprises:
acquiring sensing data sensed by each sensor at the same time and under the same scene;
and converting the sensing data corresponding to each sensor into a target coordinate system by using a preset space-time synchronization method to obtain the data to be fused corresponding to each sensor.
7. The method of any one of claims 1-6, wherein each of the sensors is a lidar sensor, a camera, or a millimeter-wave radar sensor.
8. The method according to any one of claims 1-6, wherein said acquiring current environmental data of said multi-sensing system comprises:
the current environmental data of the multi-sensing system is acquired through an auxiliary sensor, wherein the auxiliary sensor comprises at least one of a high-precision photoelectric sensor and a temperature and humidity sensor.
9. A multi-sensing data fusion apparatus, the apparatus comprising:
the first acquisition module is used for acquiring current environmental data of the multi-sensing system and acquiring a weight correction value of each sensor according to the environmental data;
the correction module is used for acquiring the initial weight corresponding to each sensor, and correspondingly correcting the initial weight of each sensor by using the weight correction value of each sensor to obtain the target weight of each sensor;
the second acquisition module is used for acquiring data to be fused of each sensor, wherein the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene;
and the processing module is used for carrying out fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensing data fusion result.
10. A multi-sensing system, characterized by comprising a processor, an auxiliary sensor and sensors, wherein the sensors and the auxiliary sensor are connected with the processor, the sensors comprise at least two of a lidar sensor, a camera and a millimeter-wave radar sensor, and the auxiliary sensor comprises a high-precision photoelectric sensor and/or a temperature and humidity sensor;
the auxiliary sensor is used for acquiring the current environmental data of the multi-sensing system and sending the environmental data to the processor;
the processor is used for receiving the environment data, obtaining a weight correction value of each sensor according to the environment data, obtaining an initial weight of each sensor, and correspondingly correcting the initial weight of each sensor by using the weight correction value of each sensor to obtain a target weight of each sensor;
the processor is further configured to obtain data to be fused of each sensor, and perform fusion processing on the data to be fused according to the target weight of each sensor to obtain a multi-sensor data fusion result, where the data to be fused is obtained by performing space-time synchronization processing on sensing data of corresponding sensors, and the sensing data of each sensor is sensed by each sensor at the same time and under the same scene.
11. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 8.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 8.
CN202010777951.5A 2020-08-05 2020-08-05 Multi-sensing data fusion method, device, system, equipment and storage medium Pending CN114091562A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010777951.5A CN114091562A (en) 2020-08-05 2020-08-05 Multi-sensing data fusion method, device, system, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114091562A true CN114091562A (en) 2022-02-25

Family

ID=80295149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010777951.5A Pending CN114091562A (en) 2020-08-05 2020-08-05 Multi-sensing data fusion method, device, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114091562A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114593710A (en) * 2022-03-04 2022-06-07 沃飞长空科技(成都)有限公司 Unmanned aerial vehicle measuring method, system, electronic equipment and medium
CN114593710B (en) * 2022-03-04 2024-02-06 四川傲势科技有限公司 Unmanned aerial vehicle measurement method, unmanned aerial vehicle measurement system, electronic equipment and medium
CN115018014A (en) * 2022-07-27 2022-09-06 东南大学 Machine learning-assisted communication scene classification method based on multi-source information
CN115018014B (en) * 2022-07-27 2024-05-10 东南大学 Machine learning-assisted communication scene classification method based on multi-source information

Similar Documents

Publication Publication Date Title
CN111583663B (en) Monocular perception correction method and device based on sparse point cloud and storage medium
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
CN110570449B (en) Positioning and mapping method based on millimeter wave radar and visual SLAM
CN109919893B (en) Point cloud correction method and device and readable storage medium
CN114091562A (en) Multi-sensing data fusion method, device, system, equipment and storage medium
CN110033492B (en) Camera calibration method and terminal
US11263774B2 (en) Three-dimensional position estimation device and program
CN113447923A (en) Target detection method, device, system, electronic equipment and storage medium
CN114111775B (en) Multi-sensor fusion positioning method and device, storage medium and electronic equipment
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN112991465A (en) Camera calibration method and device, electronic equipment and computer readable medium
US11580668B2 (en) Automatic correction method for onboard camera and onboard camera device
CN112270320A (en) Power transmission line tower coordinate calibration method based on satellite image correction
CN115451948A (en) Agricultural unmanned vehicle positioning odometer method and system based on multi-sensor fusion
CN114782548B (en) Global image-based radar data calibration method, device, equipment and medium
CN113720428A (en) Vehicle speed perception dynamic weighing compensation method based on artificial intelligence and computer vision
CN110910432A (en) Remote sensing image matching method and device, electronic equipment and readable storage medium
CN110992463A (en) Three-dimensional reconstruction method and system for sag of power transmission conductor based on trinocular vision
CN113140002B (en) Road condition detection method and system based on binocular stereo camera and intelligent terminal
CN111582296B (en) Remote sensing image comprehensive matching method and device, electronic equipment and storage medium
CN111639662A (en) Remote sensing image bidirectional matching method and device, electronic equipment and storage medium
CN116819561A (en) Point cloud data matching method, system, electronic equipment and storage medium
CN115908551A (en) Vehicle distance measuring method and device, electronic equipment and storage medium
CN114429515A (en) Point cloud map construction method, device and equipment
CN113203424A (en) Multi-sensor data fusion method and device and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination