CN113514806B - Obstacle determination method and device in automatic driving process and electronic equipment


Info

Publication number
CN113514806B
CN113514806B (application CN202110364712.1A)
Authority
CN
China
Prior art keywords
obstacle
target
data
information
current
Prior art date
Legal status
Active
Application number
CN202110364712.1A
Other languages
Chinese (zh)
Other versions
CN113514806A (en)
Inventor
史院平
韩志华
杨福威
吴宏升
王冠男
张旭
刘雨晨
Current Assignee
Suzhou Zhitu Technology Co Ltd
Original Assignee
Suzhou Zhitu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Zhitu Technology Co Ltd filed Critical Suzhou Zhitu Technology Co Ltd
Priority to CN202110364712.1A
Publication of CN113514806A
Application granted
Publication of CN113514806B
Legal status: Active

Classifications

    • G01S 7/41: details of radar systems (G01S 13/00) using analysis of the echo signal for target characterisation; target signature; target cross-section
    • G01S 7/4802: details of lidar systems (G01S 17/00) using analysis of the echo signal for target characterisation; target signature; target cross-section
    • G06F 18/25: pattern recognition; analysing; fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a method, a device and electronic equipment for determining obstacles during automatic driving. The method comprises the following steps: when current obstacle data uploaded by a current sensor is monitored, performing position compensation on the current tracking list to obtain a compensated tracking list; performing association matching between the tracking targets in the compensated tracking list and the target obstacles corresponding to the current obstacle data to obtain a set of association matching pairs; correcting the obstacle data corresponding to the target obstacle in each association matching pair according to the motion scene and the type of the current sensor to obtain obstacle correction data; and fusing the obstacle target data of the target obstacle in each association matching pair of the set, so that the target obstacle is determined from the fused information. This improves the perception accuracy of obstacles and the safety of the vehicle during automatic driving, and has good practical value.

Description

Obstacle determination method and device in automatic driving process and electronic equipment
Technical Field
The invention relates to the technical field of automatic driving, in particular to a method and a device for determining an obstacle in an automatic driving process and electronic equipment.
Background
In recent years, automatic driving has become a topic of intense interest, and the driverless car is one of the products in which automatic driving technology is applied. A driverless car senses its surroundings mainly through a perception system and controls steering and speed according to the perceived road, vehicle position and obstacle information, so that the vehicle can travel safely and reliably on the road. Most perception systems rely on the characteristics of a single sensor, yet because of those characteristics each sensor perceives obstacles differently in different scenes, so data from different types of sensors must be fused to improve the perception accuracy of obstacles.
The existing sensor data fusion method is synchronous fusion with a fixed period: within a fixed period, the data collected by multiple sensors are first fused synchronously and then fused with the tracking targets in a tracking list. Although this method can determine obstacles, the transmission speeds of different sensor types are inconsistent, so the sensor data within a fixed period lag. As a result, when position compensation is synchronised by sensor frequency, the compensation is inaccurate, the perception accuracy of obstacles decreases, and the safety of the autonomous vehicle is reduced.
Disclosure of Invention
In view of the above, the present invention aims to provide a method, a device and electronic equipment for determining obstacles during automatic driving, so as to alleviate the above problems, improve the perception accuracy of obstacles and thereby the safety of the vehicle during automatic driving, with good practical value.
In a first aspect, an embodiment of the present invention provides a method for determining an obstacle during automatic driving, applied to a processor of a vehicle sensing system, the processor further being connected to sensors. The method includes: monitoring obstacle data uploaded by each sensor, where the sensors include at least one of a laser radar, a millimeter wave radar and a vision sensor, and the obstacle data includes obstacle position information and obstacle speed information; when current obstacle data uploaded by a current sensor is monitored, performing position compensation on the current tracking list based on the current obstacle data to obtain a compensated tracking list, where the current tracking list is generated from previous obstacle data and includes a plurality of tracking targets; performing association matching between the tracking targets in the compensated tracking list and the target obstacles corresponding to the current obstacle data to obtain a set of association matching pairs containing at least one association matching pair; acquiring the motion scene, and correcting the obstacle data corresponding to the target obstacle in each association matching pair according to the motion scene and the type of the current sensor to obtain the obstacle correction data of each association matching pair; filtering the obstacle correction data by adaptive Kalman filtering to obtain the obstacle target data of the target obstacle in each association matching pair; and fusing the obstacle target data of the target obstacle in each association matching pair of the set, and determining the target obstacle from the fused information.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where the step of performing position compensation on the current tracking list based on the current obstacle data includes: calculating a time difference from the time stamp of the current obstacle data and the generation time stamp of the current tracking list; calculating a compensation position for each tracking target based on the time difference and the speed of each tracking target; and performing position compensation on the tracking targets based on the compensation positions to obtain the target position of each tracking target.
With reference to the first possible implementation manner of the first aspect, the embodiment of the present invention provides a second possible implementation manner of the first aspect, where the step of performing association matching on a tracking target in the compensation tracking list and a target obstacle corresponding to current obstacle data includes: calculating to obtain a distance matrix according to the target position of the tracking target and the barrier position information of the target barrier; wherein the distance matrix includes a plurality of distance values for characterizing a distance between each tracking target and the target obstacle; and performing association matching on the distance matrix based on a global nearest neighbor algorithm to obtain an association matching pair set.
With reference to the first aspect, the embodiment of the present invention provides a third possible implementation manner of the first aspect, where the processor is further connected to a combined navigation system provided on the vehicle; the step of obtaining the motion scene comprises the following steps: acquiring vehicle information acquired by a combined navigation system; the vehicle information comprises vehicle vertical angular velocity information, vehicle speed information and vehicle position information; calculating to obtain vehicle course angle information based on the vehicle information; determining a motion scene according to the heading angle information of the vehicle; wherein the motion scene comprises a vehicle straight running scene or a vehicle turning scene.
With reference to the third possible implementation manner of the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, where the step of determining a motion scene according to the heading angle information of the vehicle includes: calculating the difference value between the current course angle and a preset reference course angle; the preset reference course angle is an average course angle of the vehicle in a preset time before the current time; judging whether the difference value meets a preset threshold range or not; if yes, the current motion scene is a vehicle straight-running scene; if not, the current motion scene is a vehicle turning scene.
With reference to the first aspect, an embodiment of the present invention provides a fifth possible implementation manner of the first aspect, where the processor is further connected to a camera disposed on a vehicle, and the step of performing fusion processing on obstacle target data of the target obstacle in each of the association matching pairs in the association matching pair set includes: acquiring obstacle information acquired by a camera; the obstacle information comprises characteristic information of a plurality of obstacles in a preset range in the running direction of the vehicle, and the characteristic information comprises obstacle type information and obstacle size information; and carrying out fusion processing on the obstacle target data of the target obstacle and the characteristic information of the target obstacle in each association matching relation in the association matching pair set to obtain fusion information.
With reference to the first aspect, an embodiment of the present invention provides a sixth possible implementation manner of the first aspect, where the sensor is a laser radar, and the step of monitoring obstacle data uploaded by each sensor includes: acquiring laser scanning point cloud data transmitted by the laser radar; generating environment point cloud information from the laser scanning point cloud data; and generating obstacle data from the environment point cloud information.
In a second aspect, an embodiment of the present invention further provides an apparatus for determining an obstacle during automatic driving, applied to a processor of a vehicle sensing system, the processor further being connected to sensors. The apparatus includes: a monitoring module for monitoring obstacle data uploaded by each sensor, where the sensors include at least one of a laser radar, a millimeter wave radar and a vision sensor, and the obstacle data includes obstacle position information and obstacle speed information; a position compensation module for performing, when current obstacle data uploaded by a current sensor is monitored, position compensation on the current tracking list based on the current obstacle data to obtain a compensated tracking list, where the current tracking list is generated from previous obstacle data and includes a plurality of tracking targets; an association matching module for performing association matching between the tracking targets in the compensated tracking list and the target obstacles corresponding to the current obstacle data to obtain a set of association matching pairs containing at least one association matching pair; a correction module for acquiring the motion scene and correcting the obstacle data corresponding to the target obstacle in each association matching pair according to the motion scene and the type of the current sensor, to obtain the obstacle correction data of each association matching pair; a filtering module for filtering the obstacle correction data by adaptive Kalman filtering to obtain the obstacle target data of the target obstacle in each association matching pair; and a fusion module for fusing the obstacle target data of the target obstacle in each association matching pair of the set and determining the target obstacle from the fused information.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present invention also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect described above.
The embodiment of the invention has the following beneficial effects:
The embodiment of the invention provides an obstacle determination method and device for the automatic driving process, and electronic equipment. When current obstacle data uploaded by a current sensor is monitored, position compensation is performed on the current tracking list to obtain a compensated tracking list, and the tracking targets in the compensated tracking list are association-matched with the target obstacles corresponding to the current obstacle data to obtain a set of association matching pairs. Following the first-come, first-served principle, the tracking list is compensated directly in the order in which obstacle data arrive, without waiting. The obstacle data corresponding to the target obstacle in each association matching pair are corrected according to the motion scene and the type of the current sensor to obtain obstacle correction data; obstacle target data are then obtained by adaptive Kalman filtering and fused, so that the target obstacle is determined from the fused information. This avoids the filter being sensitive to the sensor's measurement noise, improves the perception accuracy of obstacles, ensures the safety and stability of the vehicle during automatic driving, and has good practical value.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and drawings.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of an autonomous vehicle according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for determining an obstacle in an automatic driving process according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an adaptive Kalman filter according to an embodiment of the present invention;
FIG. 4 is a flowchart of another method for determining an obstacle during automatic driving according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an obstacle determining device in an automatic driving process according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In existing automatic driving technology, an autonomous vehicle is generally equipped with sensors such as laser radar and ultrasonic radar. Each sensor is responsible for sensing a different area and works relatively independently, so the overall sensing range is small, the blind zones around the vehicle body are large, positioning is inaccurate, safety during automatic driving is poor, and cost is high; this cannot meet the driverless vehicle's need for comprehensive monitoring of the road environment. The information collected by different sensors therefore needs to be fused to improve overall data accuracy and obtain more comprehensive, more accurate road environment information.
The existing fusion method is mainly a synchronous fusion method with a fixed period, and although the obstacle can be determined, the perception precision of the obstacle is reduced, so that the safety of an automatic driving vehicle is reduced. Based on the above, the embodiment of the invention provides the method, the device and the electronic equipment for determining the obstacle in the automatic driving process, so that the problems are alleviated, the perception precision of the obstacle is improved, the safety of the vehicle in the automatic driving process is further improved, and the method and the device have good practical value.
For the sake of understanding the present embodiment, referring to a schematic diagram of an autonomous vehicle shown in fig. 1, the autonomous vehicle is configured with a sensing system including a processor (not shown), and a plurality of acquisition devices connected to the processor, as shown in fig. 1, including a laser radar 11, a millimeter wave radar 12, a camera 13, and a combined navigation system 14, etc., to perform information acquisition on road conditions, obstacles, etc. in the driving direction of the autonomous vehicle, so that the vehicle avoids the obstacles according to the acquired information, and safely travels to a target location, thereby implementing an autonomous driving function of the vehicle. It should be noted that the above-mentioned collecting device may also include other types of collecting devices, and embodiments of the present invention are not limited herein.
Based on the processor of the sensing system, the embodiment of the invention provides a method for determining an obstacle in an automatic driving process, as shown in fig. 2, comprising the following steps:
step S202, monitoring obstacle data uploaded by each sensor;
wherein the sensor comprises at least one of: laser radar, millimeter wave radar and vision sensor; the setting can be performed according to actual conditions. In practical application, in order to improve the perception precision of an obstacle, an autonomous vehicle is generally configured with three sensors for sensing the obstacle, namely, a laser radar, a millimeter wave radar and a vision sensor, and a processor is connected with each sensor and monitors obstacle data uploaded by each sensor in real time.
The obstacle data includes obstacle position information and obstacle speed information. For a laser radar, the obstacle data may be acquired as follows: the processor acquires laser scanning point cloud data transmitted by the laser radar, generates environment point cloud information from the laser scanning point cloud data, and generates obstacle data from the environment point cloud information. For example, the laser radar may laser-scan a preset range around the autonomous vehicle and send the laser scanning point cloud data to the processor, so that the processor obtains the obstacle data within that preset range from the point cloud. Similarly, the millimeter wave radar can transmit millimeter waves towards an obstacle and derive obstacle data from the reflected waves it returns; and the vision sensor captures images within a preset range around the autonomous vehicle to obtain obstacle data. It should be noted that, because road conditions differ, the obstacle data collected by each sensor may describe the same obstacle or may include data on multiple obstacles; this may be set according to the actual situation and is not limited by the embodiments of the present invention.
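As an illustration of how lidar point clouds might be reduced to obstacle data, the sketch below clusters 2-D points by proximity and reports cluster centroids as obstacles. This is a minimal example under stated assumptions: the patent does not specify a clustering method, and the function names, data layout and 0.5 m gap threshold are all invented for illustration.

```python
import math

def cluster_point_cloud(points, max_gap=0.5):
    """Group 2-D lidar points into obstacle clusters by Euclidean proximity.

    Naive single-linkage clustering: a point joins a cluster if it lies
    within max_gap metres of any point already in that cluster; clusters
    bridged by the new point are merged.
    """
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any(math.dist(p, q) <= max_gap for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:
                    # p bridges two clusters: fold this one into the first
                    merged.extend(c)
                    c.clear()
        if merged is None:
            clusters.append([p])
        clusters = [c for c in clusters if c]  # drop emptied clusters
    return clusters

def clusters_to_obstacles(clusters):
    """Reduce each cluster to an obstacle record (centroid position)."""
    return [
        {"x": sum(p[0] for p in c) / len(c),
         "y": sum(p[1] for p in c) / len(c),
         "num_points": len(c)}
        for c in clusters
    ]
```

A real perception stack would work on 3-D points, remove the ground plane first, and use a grid or k-d tree instead of the quadratic scan above; the structure of the output (position per detected obstacle) is the point being illustrated.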
In practical application, the laser radar can accurately realize the position sensing of the obstacle; the millimeter wave radar can effectively sense the position and the speed of the obstacle, but the transverse position and the speed of the millimeter wave radar are inaccurate; the vision sensor can accurately identify the obstacle, but has larger error in the ranging aspect than that of a laser radar and a millimeter wave radar. In addition, due to the characteristics of the sensors, the sensing capability of the sensors on the obstacles in different scenes is different, for example, in a close-range turning scene, the transverse position and the speed of the laser radar on the obstacles are more accurate than those of the millimeter wave radar and the vision sensor; in a long-distance scene, the distance measurement and the speed measurement of the millimeter wave radar are more accurate than those of a laser radar and a vision sensor. Therefore, the obstacle is sensed by the sensors of different types, and the sensing precision of the obstacle can be effectively improved.
Step S204, when current obstacle data uploaded by a current sensor is monitored, performing position compensation on a current tracking list based on the current obstacle data to obtain a compensated tracking list;
Specifically, because the frequency of each sensor differs, the embodiment of the invention processes obstacle data on a first-come, first-served basis: when the processor monitors the current obstacle data uploaded by the current sensor in the present period, it directly performs position compensation on the current tracking list based on the current obstacle data. Compared with the existing synchronous fusion method based on a fusion period, this avoids the wasted resources of the processor waiting for obstacle data within the fusion period and avoids delays in obstacle data processing, improving resource utilisation and the timeliness of obstacle data processing, and thereby the accuracy of the obstacle data.
The current tracking list is generated according to the previous obstacle data and comprises a plurality of tracking targets; the above-mentioned position compensation process is as follows: calculating to obtain a time difference according to the time stamp of the current obstacle data and the generated time stamp of the current tracking list; calculating to obtain a compensation position of each tracking target based on the time difference and the speed of each tracking target; and carrying out position compensation on the tracking targets based on the compensation positions to obtain target positions of each tracking target.
Specifically, under the first-come, first-served principle, a time difference dt can be calculated from the time stamp of the current obstacle data and the time stamp at which the fusion center generated the current tracking list from the previous obstacle data. Because the speed of each tracking target in the current tracking list differs, a compensation displacement ds can be obtained for each tracking target from the time difference dt and the target's speed v. The position of each tracking target in the current tracking list is then compensated by its corresponding ds, giving the target position of each tracking target and hence the compensated tracking list.
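The compensation step above can be sketched as follows. The dictionary-based track layout and the constant-velocity extrapolation model are illustrative assumptions; the displacement ds = v * dt is applied per target as described.

```python
def compensate_tracking_list(tracks, obstacle_timestamp, list_timestamp):
    """Propagate each tracked target forward by dt = t_obstacle - t_list.

    Each track is a dict with position (x, y) in metres and velocity
    (vx, vy) in m/s; compensation is a constant-velocity extrapolation,
    ds = v * dt, applied independently per track.
    """
    dt = obstacle_timestamp - list_timestamp
    return [
        {**t, "x": t["x"] + t["vx"] * dt, "y": t["y"] + t["vy"] * dt}
        for t in tracks
    ]
```

Note that dt comes from the data's own time stamps rather than from an assumed sensor period, which is exactly the point the paragraph above makes against fixed-period compensation.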
Compared with the prior art, this position compensation process compensates the positions of the tracked targets according to the time stamp at which the processor received the obstacle data and the speed of each tracked target. This avoids the inaccurate position compensation that arises when compensation relies on the sensor's frequency, improves the position accuracy of each tracked target in the compensated tracking list, and improves the association accuracy when the compensated tracking list is matched against the target obstacles corresponding to the current obstacle data.
Step S206, performing association matching on the tracking target in the compensation tracking list and a target obstacle corresponding to the current obstacle data to obtain an association matching pair set;
specifically, after the above-mentioned position compensation, the correlation matching may be performed according to the following procedure: according to the target position of each tracking target in the compensation tracking list and the barrier position information of the target barrier in the current barrier data, a distance matrix can be calculated; wherein the distance matrix includes a plurality of distance values for characterizing a distance between each tracking target and the target obstacle; then, carrying out association matching on the distance matrix based on a global nearest neighbor algorithm to obtain an association matching pair set; the association matching pair set comprises at least one association matching relation pair; each of the association matching relation pairs includes one tracking target and one target obstacle with association matching relation, and since the number of target obstacles corresponding to the current obstacle data may be one or more, the same tracking target may form an association matching relation pair with a plurality of target obstacles, and each target obstacle may also form an association matching relation pair with a plurality of tracking targets, which may be specifically set according to practical situations, and embodiments of the present invention are not limited herein.
By matching the compensated tracking list against the obstacles in this way, invalid data in the sensor's obstacle data can be filtered out, where invalid data means data on obstacles that have no association matching relation with any tracking target in the compensated tracking list, or that match only a few (e.g. one) tracking targets. Only the obstacle data of the target obstacles within the association matching pairs are then corrected; compared with the existing method of correcting the data of all obstacles collected by the sensor, this improves correction efficiency and hence the efficiency of obstacle perception.
Step S208, acquiring a motion scene, and correcting obstacle data corresponding to a target obstacle in the association and matching relation pairs according to the types of the motion scene and the current sensor to obtain obstacle correction data of each association and matching relation pair;
In practical applications, the autonomous vehicle is further configured with an integrated navigation system connected to the processor, which collects vehicle information of the autonomous vehicle so that the motion scene can be determined. The motion scene is determined as follows: acquire the vehicle information collected by the integrated navigation system; calculate the vehicle heading angle information from the vehicle information; and determine the motion scene from the vehicle heading angle information, where the motion scene is either a straight-driving scene or a turning scene. The integrated navigation system can acquire vehicle angular velocity and acceleration information through an IMU (inertial measurement unit); common inertial sensors include accelerometers and gyroscopes. Vehicle speed information can also be obtained from a speed sensor, and vehicle position information from a GNSS (global navigation satellite system) receiver or other positioning sensor; vehicle position and attitude information can also be obtained via differential GPS. The vehicle information therefore includes, but is not limited to, vehicle vertical angular velocity information, vehicle speed information and vehicle position information; the specific integrated navigation system and the vehicle information it collects may be set according to the actual situation, and the embodiments of the present invention are not limited in this respect.
The vehicle heading angle information can be calculated from the vehicle information: for example, it can be calculated from the vehicle angular velocity information, or obtained from the vehicle information and a preset model; the specific calculation method of the vehicle heading angle may be set according to actual conditions. The motion scene is then determined according to the vehicle heading angle information; the motion scene is either a vehicle straight-driving scene or a vehicle turning scene. It should be noted that here the vehicle straight-driving scene covers motion in which the vehicle travels in the same direction as, or the opposite direction to, an obstacle.
The obstacle data corresponding to the target obstacle in each association matching relation pair is then corrected based on the determined motion scene and the type of the current sensor, yielding the obstacle correction data of each association matching relation pair; that is, the sensor measurement value (the obstacle data) of the target obstacle is corrected into a pseudo-measurement value of the target obstacle. This improves the accuracy of the obstacle correction data and, in turn, the accuracy of the target obstacle determined from that data.
Step S210, filtering the obstacle correction data based on the adaptive Kalman filtering to obtain obstacle target data of the target obstacle in each association matching relation pair;
Specifically, the obstacle correction data of each association matching relation pair is filtered by adaptive Kalman filtering. This avoids a problem of conventional Kalman filtering, where differing sensor characteristics and data accuracies cause large frame-to-frame differences in the measurement data entering the filter. Instead, the noise of the sensor obstacle data is adjusted adaptively when the automatic driving scene changes abruptly, improving the perception accuracy and robustness for obstacles and giving the method good practical value.
Step S212, carrying out fusion processing on the barrier target data of the target barrier in each association matching relation in the association matching pair set, and determining the target barrier according to the fusion information.
Specifically, in one possible fusion mode, the obstacle target data of the target obstacle in each association matching relationship is fused directly to obtain the fusion information of the same target obstacle, and the target obstacle is determined from that fusion information. In another possible fusion mode, as shown in fig. 1, the autonomous vehicle is further provided with a camera connected to the processor, and the fusion may proceed as follows: acquire the obstacle information collected by the camera, where the obstacle information includes characteristic information of a plurality of obstacles within a preset range in the vehicle traveling direction, the characteristic information including obstacle type information and obstacle size information; then fuse the obstacle target data of the target obstacle with the characteristic information of the target obstacle in each association matching relationship of the association matching pair set to obtain the fusion information. In practice, an image within the preset range in the traveling direction can be captured directly by the camera and the characteristic information of the obstacles extracted from it; alternatively, a video within that range can be captured by the camera and the characteristic information of the obstacles extracted from the video; this may be set according to actual conditions. The characteristic information of an obstacle may also include other characteristics, set according to the specific situation of the obstacle.
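The second fusion mode above can be sketched as a simple record merge, combining an obstacle's filtered target data with camera-derived feature information for the same associated target; all field names here are illustrative, not from the patent:

```python
def fuse_obstacle(target_data, camera_features):
    """Combine filtered kinematic data (position/speed) with camera
    features (type/size) into one fused record for a target obstacle."""
    fused = dict(target_data)          # position and speed from filtering
    fused.update(camera_features)      # type and size from the camera
    return fused

info = fuse_obstacle(
    {"x": 12.3, "y": -1.8, "vx": 4.0, "vy": 0.1},
    {"type": "truck", "length": 12.0, "width": 2.5},
)
```

The fused record then carries type, size, position and speed together, which is the enriched information the following paragraph describes.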
Therefore, through the above fusion mode, the target obstacle can be determined more accurately. For example, if the obstacle data uploaded by the millimeter wave radar and the vision sensor are corrected and filtered and then fused with the information collected by the integrated navigation system and the camera, the information of the target obstacle obtained from the fusion information includes, but is not limited to, its type information, size information, position information and speed information. Compared with perceiving the target obstacle through a single sensor alone, this enriches the information of the target obstacle and thus improves its perception accuracy; the vehicle can then conveniently perform path planning and control according to that information, improving the accuracy and safety of automatic driving on the basis of realizing the automatic driving function of the vehicle.
In addition, track management can be performed for the vehicle according to the target obstacle. Track management means deleting a target obstacle in time after it leaves the detection range of the sensors, and creating a new tracking target in the tracking list when a new target obstacle appears in the detection range, so that the vehicle is better managed and the safety of the automatic driving process is ensured.
According to the obstacle determining method in the automatic driving process described above, when current obstacle data uploaded by the current sensor is monitored, position compensation is performed on the current tracking list to obtain a compensated tracking list, and the tracking targets in the compensated tracking list are association-matched with the target obstacles corresponding to the current obstacle data to obtain an association matching pair set. Following the first-come, first-processed principle, the tracking list is compensated directly in the order in which obstacle data arrives, without waiting. The obstacle data corresponding to the target obstacle in each association matching relation pair is then corrected according to the motion scene and the type of the current sensor to obtain obstacle correction data; obstacle target data is obtained after adaptive Kalman filtering, and fusion processing determines the target obstacle from the fusion information. This keeps the filter from being sensitive to the measurement noise of any single sensor, improves the perception accuracy of obstacles, and ensures the safety and stability of the vehicle during automatic driving, giving the method good practical value.
Preferably, the adaptive Kalman filter is a Sage-Husa-based adaptive Kalman filter, which introduces a time-varying noise estimator on top of the original Kalman filter and mainly comprises the following steps:
(1) Establish the state one-step prediction equation:

\hat{X}_{k|k-1} = F_{k,k-1} \hat{X}_{k-1}    (1)

where \hat{X}_{k|k-1} represents the system state at time k predicted from the optimal estimate of the system state at time k-1, F_{k,k-1} represents the state transition matrix, and \hat{X}_{k-1} represents the optimal estimate of the system state at time k-1.
(2) Calculate the covariance matrix at time k:

P_{k|k-1} = F_{k,k-1} P_{k-1} F_{k,k-1}^T + \hat{Q}_{k-1}    (2)

where P_{k|k-1} represents the measurement update state covariance at time k, P_{k-1} represents the measurement update state covariance at time k-1, F_{k,k-1} represents the state transition matrix, and \hat{Q}_{k-1} represents the system noise matrix at time k-1.
(3) Calculate the Kalman gain matrix at time k:

K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + \hat{R}_k)^{-1}    (3)

where K_k represents the Kalman gain matrix at time k, P_{k|k-1} represents the measurement update state covariance at time k, H_k represents the measurement matrix at time k, and \hat{R}_k represents the measurement noise covariance matrix at time k.
(4) Calculate the innovation vector at time k:

e_k = Z_k - H_k \hat{X}_{k|k-1}    (4)

where e_k represents the innovation vector at time k, Z_k represents the observed value at time k, H_k represents the measurement matrix at time k, and \hat{X}_{k|k-1} represents the system state at time k predicted from the optimal estimate of the system state at time k-1.
(5) Update the state at time k according to:

\hat{X}_k = \hat{X}_{k|k-1} + K_k e_k    (5)

where \hat{X}_k represents the optimal estimate of the system state at time k, e_k represents the innovation vector at time k, \hat{X}_{k|k-1} represents the system state at time k predicted from the optimal estimate of the system state at time k-1, and K_k represents the Kalman gain matrix at time k.
(6) Update the state covariance at time k according to:

P_k = (I - K_k H_k) P_{k|k-1}    (6)

where P_k represents the state covariance at time k, P_{k|k-1} represents the measurement update state covariance at time k, K_k represents the Kalman gain matrix at time k, H_k represents the measurement matrix at time k, and I represents an identity matrix of the same dimension as K_k H_k.
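The six steps above form one predict/update cycle, which can be sketched in a few lines; this is a generic linear-Kalman sketch under the reconstructed equations, and the 1-D example model and all numeric values are illustrative, not from the patent:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle following equations (1)-(6).
    x: prior state estimate, P: its covariance, z: new measurement."""
    x_pred = F @ x                                            # (1) state prediction
    P_pred = F @ P @ F.T + Q                                  # (2) covariance prediction
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)    # (3) Kalman gain
    e = z - H @ x_pred                                        # (4) innovation
    x_new = x_pred + K @ e                                    # (5) state update
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred             # (6) covariance update
    return x_new, P_new

# 1-D constant-position example: noisy measurements of a value near 1.0
F = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[1e-4]]); R = np.array([[0.5]])
x = np.array([0.0]); P = np.array([[1.0]])
for z in [1.1, 0.9, 1.05, 0.95]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
```

After a few measurements the estimate moves toward the measured value and the covariance shrinks, which is the behavior the filter period described next relies on.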
Equations (1)-(6) above constitute the filter equations; in one filter period, the Kalman filter performs a time update process and an observation update process. On this basis, the Sage-Husa-based adaptive Kalman filter introduces a time-varying noise estimator, as follows:
the time-varying noise estimator comprises a time-varying measurement noise matrix and a time-varying system noise matrix. The time-varying measurement noise matrix is calculated according to:

\hat{R}_k = (1 - d_k) \hat{R}_{k-1} + d_k (e_k e_k^T - H_k P_{k|k-1} H_k^T)    (7)

where \hat{R}_k represents the time-varying measurement noise covariance matrix, i.e. the measurement noise covariance matrix at time k, d_k represents an intermediate variable, H_k represents the measurement matrix at time k, \hat{R}_{k-1} represents the measurement noise covariance matrix at time k-1, e_k represents the innovation vector at time k, and P_{k|k-1} represents the measurement update state covariance at time k.
The time-varying system noise matrix is calculated according to:

\hat{Q}_k = (1 - d_k) \hat{Q}_{k-1} + d_k (K_k e_k e_k^T K_k^T + P_k - F_{k,k-1} P_{k-1} F_{k,k-1}^T)    (8)

where \hat{Q}_k represents the time-varying system noise matrix, d_k represents the intermediate variable, e_k represents the innovation vector at time k, K_k represents the Kalman gain matrix at time k, P_k represents the state covariance at time k, F_{k,k-1} represents the state transition matrix, P_{k-1} represents the measurement update state covariance at time k-1, and \hat{Q}_{k-1} represents the time-varying system noise matrix at time k-1.
The intermediate variable is calculated according to:

d_k = (1 - b) / (1 - b^{k+1})    (9)

where d_k represents the intermediate variable and b represents a forgetting factor with 0 < b < 1. In practical application, the choice of b must balance the tracking performance of the time-varying fading parameters against insensitivity to noise; the value is generally in the range 0.95-0.99.
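The behavior of the intermediate variable in equation (9) is easy to check numerically: d_k starts at 1 (new information dominates during start-up) and decays toward the steady-state weight 1 - b. A short sketch, with b chosen inside the recommended 0.95-0.99 range:

```python
# d_k = (1 - b) / (1 - b**(k + 1)) from equation (9); b is illustrative.
b = 0.96  # forgetting factor within the recommended 0.95-0.99 range
d = [(1 - b) / (1 - b ** (k + 1)) for k in range(200)]
# d[0] == 1.0 (only new information at start-up); d[k] -> 1 - b for large k
```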
Simplifying the Sage-Husa adaptive filtering algorithm by treating \hat{Q} as a constant, the time-varying measurement noise matrix can be calculated according to:

\hat{R}_k = (1 - d_k) \hat{R}_{k-1} + d_k e_k e_k^T    (10)

where \hat{R}_k represents the time-varying measurement noise covariance matrix, i.e. the measurement noise covariance matrix at time k, d_k represents the intermediate variable, \hat{R}_{k-1} represents the measurement noise covariance matrix at time k-1, and e_k represents the innovation vector at time k.
As can be seen from equation (10), this adaptive filtering algorithm gains filtering stability at the cost of some filtering accuracy. To further improve the stability of the filtering and suppress divergence, whether the filtering is normal can be judged by:

e_k^T e_k \ge \gamma \cdot tr(H_k P_{k|k-1} H_k^T + \hat{R}_{k-1})    (11)

where e_k represents the innovation vector at time k, \gamma represents a reserve coefficient, tr denotes the trace of a matrix, H_k represents the measurement matrix at time k, P_{k|k-1} represents the measurement update state covariance at time k, and \hat{R}_{k-1} represents the time-varying measurement noise covariance matrix at time k-1.
It should be noted that the reserve coefficient γ satisfies γ ≥ 1, with γ = 1 giving the strictest convergence criterion; experiments use γ = 1.5, which may be set according to the actual situation and is not limited in the embodiment of the present invention.
Therefore, in the actual filtering process, if equation (11) is satisfied, the kth filtering is abnormal and the original filtering model is not suitable for the current filtering; in that case equation (10) is not used, and \hat{R}_{k-1} is used in place of \hat{R}_k. If equation (11) is not satisfied, the kth filtering is normal and no such substitution is needed; \hat{R}_k is estimated by equation (10) as usual.
Furthermore, as can be seen from equations (7) and (8), the weighting coefficients of the old information and the new information in the noise estimate are (1 - d_k) and d_k respectively, with steady-state values b and 1 - b, where the forgetting factor b satisfies:

\lim_{k \to \infty} d_k = 1 - b    (12)

Thus, as filtering proceeds, the old information has less and less effect on the noise estimate while the new information has more and more; when the filtering reaches steady state, the estimate is driven mainly by the innovation terms e_k e_k^T (or K_k e_k e_k^T K_k^T). The adaptive Kalman filter in the embodiment of the invention therefore necessarily changes with the actual environment, realizing the noise adaptation effect and further improving the perception accuracy of obstacles.
In practical application, the adaptive Kalman filtering mainly comprises a prediction stage and an updating stage; as shown in fig. 3, it mainly includes the following processes:
(31) First, initialize the filtering parameters, such as the state transition matrix F, the measurement update state covariance P, the system noise matrix Q, the Kalman gain matrix K and the measurement noise covariance matrix R, and set the forgetting factor b; then enter the prediction stage.
(32) In the prediction stage, first predict the global state X_k at time k; then predict the covariance matrix P_{k|k-1} at time k; and, with the forgetting factor b set, calculate the intermediate variable d_k at time k.
(33) In the updating stage, first judge whether the filtering diverges and update the time-varying measurement noise covariance matrix \hat{R}_k at time k; calculate the Kalman gain matrix K_k and the innovation vector e_k at time k; update the global state X_k and the state covariance P_k at time k; then return to the prediction stage and perform filtering at the next moment, repeating until the obstacle target data of the target obstacle in each association matching relation pair is obtained. In the above calculation and update process, the calculations may be performed according to the foregoing formulas, which the embodiments of the present invention do not describe again here.
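Processes (31)-(33) can be sketched as one adaptive cycle combining the simplified estimator (10) with the divergence safeguard (11); this is a sketch under the reconstructed equations, and the 1-D demo model and all numeric values are illustrative, not from the patent:

```python
import numpy as np

def sage_husa_step(x, P, R_prev, z, k, F, H, Q, b=0.96, gamma=1.5):
    """One adaptive filter cycle: predict, test divergence with (11),
    update the time-varying measurement noise with (10), correct the state."""
    d = (1 - b) / (1 - b ** (k + 1))               # intermediate variable (9)
    x_pred = F @ x                                  # state prediction (1)
    P_pred = F @ P @ F.T + Q                        # covariance prediction (2)
    e = z - H @ x_pred                              # innovation (4)
    # divergence test (11): abnormal when innovation energy is too large
    normal = float(e @ e) <= gamma * np.trace(H @ P_pred @ H.T + R_prev)
    R = (1 - d) * R_prev + d * np.outer(e, e) if normal else R_prev  # (10)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)           # (3)
    x_new = x_pred + K @ e                          # state update (5)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred       # covariance update (6)
    return x_new, P_new, R

# 1-D demo: repeated measurements of a constant value
F = H = np.eye(1); Q = np.array([[1e-4]])
x, P, R = np.zeros(1), np.eye(1), np.eye(1)
for k, z in enumerate([1.0] * 20):
    x, P, R = sage_husa_step(x, P, R, np.array([z]), k, F, H, Q)
```

As the innovations shrink, the estimated measurement noise R decays with them, which is the adaptive effect described above.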
Thus, introducing adaptive Kalman filtering into the field of multi-sensor asynchronous fusion makes the filter insensitive to the measurement noise of any sensor, realizes adaptive adjustment of sensor measurement data noise when the scene changes abruptly, improves the accuracy of motion tracking information, and further improves the perception accuracy of obstacles.
On the basis of fig. 2, the embodiment of the invention provides another method for determining an obstacle in an automatic driving process, which mainly describes a process of correcting obstacle data corresponding to a target obstacle in an association and matching relation pair according to a motion scene and a type of a current sensor, as shown in fig. 4, and comprises the following steps:
step S402, monitoring obstacle data uploaded by each sensor;
step S404, when current obstacle data uploaded by a current sensor is monitored, performing position compensation on a current tracking list based on the current obstacle data to obtain a compensated tracking list;
step S406, performing association matching on the tracking target in the compensation tracking list and a target obstacle corresponding to the current obstacle data to obtain an association matching pair set;
the steps S402 to S406 may refer to the foregoing embodiments, and the embodiments of the present invention are not described in detail herein.
Step S408, judging whether the current motion scene is a straight-going scene of the vehicle; if yes, go to step S410-S424; if not, executing steps S426-S440;
specifically, the difference between the current heading angle calculated from the vehicle information and a preset reference heading angle is computed; the preset reference heading angle is the average heading angle of the vehicle over a preset period before the current time. For example, the historical heading angles shortly before the current time may be averaged to give the preset reference heading angle, or the historical heading angles over the preset period may be filtered and smoothed and one of the processed values selected; this may be set according to actual conditions. It is then judged whether the difference satisfies the preset threshold condition, i.e. stays below the preset threshold range: if so, the current motion scene is a vehicle straight-driving scene; if not, it is a vehicle turning scene. For example, with the current heading angle yaw and the preset threshold range [0.4, PI/2], a difference that does not reach the range indicates a vehicle straight-driving scene, i.e. the obstacle moves in the same or the opposite direction relative to the vehicle; otherwise the current motion scene is a vehicle turning scene, i.e. the vehicle is turning relative to the obstacle, or the vehicle and the obstacle are turning simultaneously, and so on. PI denotes the circumference ratio.
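The heading-based decision of step S408 can be sketched with a moving-average reference heading; the function name, window contents and threshold default are illustrative, not from the patent:

```python
def is_straight(heading_history, yaw, low=0.4):
    """Compare the current heading angle yaw with the average of recent
    headings; a deviation below the threshold means the vehicle is
    driving straight, otherwise it is turning."""
    ref = sum(heading_history) / len(heading_history)  # reference heading
    return abs(yaw - ref) < low
```

The 0.4 default mirrors the lower bound of the threshold range mentioned above.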
In addition, the historical track information of the obstacle may be stored, and the state of motion relative to the vehicle, i.e. the current motion scene, determined from it. Let the speed direction of the obstacle be θ and the current position information of the vehicle be (x, y); using two successive positions (x_1, y_1) and (x_2, y_2) from the obstacle's historical track, θ can be obtained as:

\theta = \arctan2(y_2 - y_1, x_2 - x_1)    (13)

The angle difference is then calculated according to the following formula:
diff_angle=fabs(θ-yaw) (14)
where diff_angle represents the angle difference, θ represents the speed direction of the obstacle, and yaw represents the current heading angle.
If the angle difference obtained from equations (13) and (14) satisfies PI/2 < diff_angle ≤ PI, it is folded as diff_angle = PI - diff_angle; if PI < diff_angle ≤ 3PI/2, then diff_angle = diff_angle - PI; and if 3PI/2 < diff_angle ≤ 2PI, then diff_angle = 2PI - diff_angle. Finally, it is judged whether the folded angle difference reaches the preset threshold range [0.4, PI/2]: if so, the obstacle and the vehicle are in a relative turning state; otherwise they move in the same or opposite directions.
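The case analysis above (folding the angle difference of equation (14) into [0, PI/2] and comparing it with the threshold range) can be sketched as follows; the function name and threshold defaults are illustrative:

```python
import math

def classify_motion(theta, yaw, low=0.4, high=math.pi / 2):
    """Fold |theta - yaw| into [0, pi/2] per the case analysis around
    equation (14), then classify the relative motion of obstacle and
    vehicle as turning or straight (same/opposite direction)."""
    diff = math.fabs(theta - yaw) % (2 * math.pi)   # equation (14)
    if math.pi / 2 < diff <= math.pi:
        diff = math.pi - diff
    elif math.pi < diff <= 3 * math.pi / 2:
        diff = diff - math.pi
    elif diff > 3 * math.pi / 2:
        diff = 2 * math.pi - diff
    return "turning" if low <= diff <= high else "straight"
```

Note that an obstacle moving exactly opposite the vehicle (diff_angle = PI) folds to 0 and is correctly classified as straight-line relative motion.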
In practical application, when the obstacle and the vehicle move in the same or opposite directions, i.e. the vehicle drives straight, the speed measurement accuracy of the millimeter wave radar is superior to that of the lidar. However, when the obstacle turns relative to the vehicle, the millimeter wave radar and the vision sensor, due to their characteristics, detect the position and speed of the obstacle with lower accuracy, whereas the lidar can detect the position of the obstacle accurately and obtain a fairly accurate speed by position differencing.
Step S410, judging whether the current sensor is a laser radar or a vision sensor; if not, then step S412 is performed; if yes, go to step S414-S422;
step S412, judging whether the current sensor is a millimeter wave radar; if yes, go to step S424;
step S414, judging whether millimeter wave radar history frame measurement values closest to the obstacle speed information exist; if yes, go to step S416-S422; otherwise, step S424 is performed;
step S416, calculating the difference between the obstacle speed information and the nearest millimeter wave radar historical frame measurement value; wherein the difference comprises a speed direction difference and a speed magnitude difference;
step S418, judging whether the speed direction difference value is smaller than a first preset threshold value; if yes, go to step S420; if not, then step S422 is performed;
step S420, judging whether the speed difference value is smaller than a second preset threshold value; if yes, step S424 is performed, and if no, step S422 is performed;
step S422, taking the closest millimeter wave radar history frame measurement value as the obstacle correction speed information in the corresponding obstacle correction data;
step S424, taking the speed value of the current sensor as obstacle correction speed information in the corresponding obstacle correction data; here, the current sensor is a lidar or vision sensor.
It should be noted that, the first preset threshold value and the second preset threshold value may be set according to actual situations, which is not limited by the embodiment of the present invention.
Step S426, judging whether the current sensor is a millimeter wave radar or a vision sensor; if not, then step S428 is performed; if yes, go to step S430-S438;
step S428, judging whether the current sensor is a laser radar; if yes, go to step S440;
step S430, judging whether a laser radar history frame measured value closest to the obstacle speed information exists; if yes, go to step S432-S438; otherwise, step S440 is performed;
step S432, calculating the difference between the obstacle speed information and the nearest laser radar history frame measured value; similarly, the difference comprises a speed direction difference and a speed magnitude difference;
step S434, judging whether the speed direction difference is smaller than a third preset threshold; if yes, go to step S436; if not, go to step S438;
step S436, judging whether the speed difference value is smaller than a fourth preset threshold value; if yes, go to step S440, if no, go to step S438;
it should be noted that the third preset threshold and the fourth preset threshold may be set according to actual situations, which is not limited by the embodiment of the present invention.
Step S438, taking the closest laser radar history frame measured value as obstacle correction speed information in the corresponding obstacle correction data;
step S440, taking the speed value of the current sensor as obstacle correction speed information in the corresponding obstacle correction data; here, the current sensor is a millimeter wave radar or a vision sensor.
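The decision flow of steps S408-S440 can be condensed into one sketch: in a straight-driving scene the nearest millimeter-wave-radar history frame is the trusted reference for lidar/vision speeds, and in a turning scene the nearest lidar history frame is the reference for millimeter-wave/vision speeds. All names and the (direction, magnitude) tuple layout are illustrative, not from the patent:

```python
def correct_speed(scene, sensor, v_meas, mmw_hist, lidar_hist, dir_th, mag_th):
    """Return the corrected (direction, magnitude) speed for one obstacle.
    scene: "straight" or "turning"; sensor: "lidar", "vision" or "mmw_radar";
    *_hist: nearest history-frame speed of that sensor, or None (S414/S430).
    dir_th/mag_th map to the first..fourth preset thresholds."""
    if scene == "straight":                       # steps S410-S424
        check, hist = sensor in ("lidar", "vision"), mmw_hist
    else:                                         # steps S426-S440
        check, hist = sensor in ("mmw_radar", "vision"), lidar_hist
    if not check or hist is None:
        return v_meas                             # keep the sensor's own value
    if abs(v_meas[0] - hist[0]) < dir_th and abs(v_meas[1] - hist[1]) < mag_th:
        return v_meas                             # agrees with trusted history
    return hist                                   # substitute the history frame
```

In words: the sensor's own value survives only when it is the trusted sensor for the current scene, no trusted history exists, or it agrees with that history within both thresholds.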
Step S442, filtering the obstacle correction data based on the adaptive Kalman filtering to obtain obstacle target data of the target obstacle in each association matching relation pair;
step S444, carrying out fusion processing on the obstacle target data of the target obstacle in each association matching relation in the association matching pair set, and determining the target obstacle according to the fusion information.
Therefore, the obstacle determining method in the automatic driving process fully exploits the respective strengths and weaknesses of the sensors, accounts for the complexity of automatic driving application scenarios, and provides accurate and stable obstacle positions and speeds for autonomous-vehicle perception, ensuring the safe and stable operation of the autonomous vehicle.
In summary, the method for determining the obstacle in the automatic driving process provided by the embodiment of the invention mainly has the following advantages: (1) the obstacle data uploaded by the sensors is fused with the tracking list in arrival order according to the first-come, first-processed principle, without waiting, avoiding wasted central-processor resources; (2) when position compensation is performed before association matching, it uses the timestamp of the sensor data and the speed of the tracking target, avoiding the position compensation error caused by inaccurate sensor speed when existing methods compensate with the sensor-measured speed; (3) the obstacle data is corrected according to the motion scene and the sensor type to obtain a pseudo-measurement value, and the resulting obstacle correction data is fed into the adaptive Kalman filter, so the accuracy of the input data is ensured before it enters the Kalman filter, facilitating the filter's data processing; (4) through adaptive Kalman filtering, the filter is kept from being sensitive to sensor measurement noise, the noise of sensor measurement data is adjusted adaptively when the scene changes abruptly, and the increased measurement noise caused by sudden scene changes such as bumps or climbs in the road is suppressed, so the perceived motion and position information is smoother, the accuracy of obstacle tracking information is improved, the perception accuracy and real-time performance for obstacles are improved, and the safety of the vehicle during automatic driving is guaranteed, giving the method high practical value.
Corresponding to the above method embodiment, the embodiment of the present invention further provides an obstacle determining device in an automatic driving process, which is applied to a processor of a vehicle sensing system, and the processor is further connected with a sensor, as shown in fig. 5, and the device includes a monitoring module 51, a position compensation module 52, an association matching module 53, a correction module 54, a filtering processing module 55 and a fusion processing module 56, which are sequentially connected; wherein, the functions of each module are as follows:
A monitoring module 51, configured to monitor obstacle data uploaded by each sensor; wherein the sensor comprises at least one of: laser radar, millimeter wave radar and vision sensor; the obstacle data includes obstacle position information and obstacle speed information;
the position compensation module 52 is configured to, when detecting current obstacle data uploaded by the current sensor, perform position compensation on the current tracking list based on the current obstacle data, so as to obtain a compensated tracking list; the current tracking list is generated according to the previous obstacle data and comprises a plurality of tracking targets;
the association matching module 53 is configured to perform association matching on a tracking target in the compensation tracking list and a target obstacle corresponding to the current obstacle data, so as to obtain an association matching pair set; the association matching pair set comprises at least one association matching relation pair;
the correction module 54 is configured to obtain a motion scene, correct obstacle data corresponding to a target obstacle in the association and matching relation pair according to the motion scene and the type of the current sensor, and obtain obstacle correction data of each association and matching relation pair;
the filtering processing module 55 is configured to perform filtering processing on the obstacle correction data based on adaptive kalman filtering, so as to obtain obstacle target data of the target obstacle in each association matching relationship;
And the fusion processing module 56 is configured to perform fusion processing on the obstacle target data of the target obstacle in each association matching relationship in the association matching pair set, and determine the target obstacle according to the fusion information.
According to the obstacle determining device in the automatic driving process described above, when current obstacle data uploaded by the current sensor is monitored, position compensation is performed on the current tracking list to obtain a compensated tracking list, and the tracking targets in the compensated tracking list are association-matched with the target obstacles corresponding to the current obstacle data to obtain an association matching pair set. Following the first-come, first-processed principle, the tracking list is compensated directly in the order in which the obstacle data arrives, without waiting. The obstacle data corresponding to the target obstacle in each association matching relation pair is corrected according to the motion scene and the type of the current sensor to obtain obstacle correction data; obstacle target data is obtained after adaptive Kalman filtering, and fusion processing determines the target obstacle from the fusion information. This keeps the filter from being sensitive to sensor measurement noise, improves the perception accuracy of obstacles, and ensures the safety and stability of the vehicle during automatic driving, giving the device good practical value.
In one possible embodiment, the performing the position compensation on the current tracking list based on the current obstacle data includes: calculating to obtain a time difference according to the time stamp of the current obstacle data and the generated time stamp of the current tracking list; calculating to obtain a compensation position of each tracking target based on the time difference and the speed of each tracking target; and carrying out position compensation on the tracking targets based on the compensation positions to obtain target positions of each tracking target.
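The compensation described above amounts to a dead-reckoning shift of every tracked target; a minimal sketch, assuming each track stores planar position and velocity (the field names are illustrative, not from the patent):

```python
def compensate_tracks(tracks, data_ts, list_ts):
    """Shift each tracking target along its own velocity by the time
    difference between the new sensor data's timestamp and the timestamp
    at which the current tracking list was generated."""
    dt = data_ts - list_ts  # time difference between data and list
    return [
        {**t, "x": t["x"] + t["vx"] * dt, "y": t["y"] + t["vy"] * dt}
        for t in tracks
    ]
```

Because the shift uses each tracking target's own (filtered) velocity rather than the raw sensor-measured speed, the compensation error described in advantage (2) is avoided.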
In another possible embodiment, the association matching module 53 is further configured to: calculating to obtain a distance matrix according to the target position of the tracking target and the barrier position information of the target barrier; wherein the distance matrix includes a plurality of distance values for characterizing a distance between each tracking target and the target obstacle; and performing association matching on the distance matrix based on a global nearest neighbor algorithm to obtain an association matching pair set.
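One simple way to realize the distance-matrix association is a greedy nearest-neighbour pass with a gate; this is only one possible reading of the "global nearest neighbor algorithm" mentioned above, and the gate value is illustrative:

```python
import numpy as np

def gnn_match(track_pos, obs_pos, gate=5.0):
    """Associate compensated track positions with observed obstacle
    positions: build the distance matrix, then repeatedly pair off the
    smallest remaining distance that lies under the gate."""
    t = np.asarray(track_pos, float)
    o = np.asarray(obs_pos, float)
    if t.size == 0 or o.size == 0:
        return []
    # distance matrix: rows are tracks, columns are observed obstacles
    d = np.linalg.norm(t[:, None, :] - o[None, :, :], axis=2)
    pairs = []
    while np.isfinite(d.min()) and d.min() <= gate:
        i, j = np.unravel_index(np.argmin(d), d.shape)
        pairs.append((int(i), int(j)))
        d[i, :] = np.inf        # each track matched at most once
        d[:, j] = np.inf        # each obstacle matched at most once
    return pairs
```

Unmatched observations (those outside the gate) would then seed new tracking targets, per the track-management behavior described earlier.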
In another possible embodiment, the processor is further connected with a combination navigation system arranged on the vehicle; the acquiring the motion scene includes: acquiring vehicle information acquired by a combined navigation system; the vehicle information comprises vehicle vertical angular velocity information, vehicle speed information and vehicle position information; calculating to obtain vehicle course angle information based on the vehicle information; determining a motion scene according to the heading angle information of the vehicle; wherein the motion scene comprises a vehicle straight running scene or a vehicle turning scene.
In another possible embodiment, the determining a motion scene according to the heading angle information of the vehicle includes: calculating the difference value between the current course angle and a preset reference course angle; the preset reference course angle is an average course angle of the vehicle in a preset time before the current time; judging whether the difference value meets a preset threshold range or not; if yes, the current motion scene is a vehicle straight-running scene; if not, the current motion scene is a vehicle turning scene.
In another possible embodiment, the processor is further connected to a camera disposed on the vehicle, and the fusion processing module 56 is further configured to: acquire obstacle information collected by the camera, where the obstacle information includes feature information of a plurality of obstacles within a preset range in the driving direction of the vehicle, and the feature information includes obstacle type information and obstacle size information; and fuse the obstacle target data of the target obstacle in each association matching relation pair in the association matching pair set with the feature information of the target obstacle to obtain fusion information.
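One way to picture this fusion step (an illustrative sketch only; the keying by a shared obstacle identifier and the dictionary layout are assumptions, not the patent's data model) is a merge of the tracked kinematic data with the camera-derived type and size features:

```python
def fuse(obstacle_target_data, camera_features):
    """Merge tracked kinematic data with camera-derived type/size features,
    keyed by a shared obstacle identifier (hypothetical)."""
    fused = {}
    for obs_id, data in obstacle_target_data.items():
        fused[obs_id] = dict(data)                         # filtered position/speed data
        if obs_id in camera_features:
            fused[obs_id].update(camera_features[obs_id])  # add type and size info
    return fused
```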
In another possible embodiment, the above-mentioned listening module 51 is further configured to: acquire laser scanning point cloud data transmitted by the laser radar; generate environment light point cloud information according to the laser scanning point cloud data; and generate the obstacle data from the environment light point cloud information.
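The patent does not detail how obstacle data is derived from the point cloud; as one hedged sketch, obstacle candidates can be extracted by grouping nearby lidar points and taking cluster centroids as obstacle positions. The single-link clustering below and its `eps` radius are assumptions for illustration:

```python
import numpy as np

def cluster_point_cloud(points, eps=0.7):
    """Group 2-D lidar points into obstacle candidates by single-link clustering.

    `eps` (metres) is a made-up neighborhood radius; returns per-point cluster
    labels and one centroid per cluster as crude obstacle positions.
    """
    points = np.asarray(points, dtype=float)
    labels = np.full(len(points), -1, dtype=int)
    cluster = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = cluster
        stack = [i]
        while stack:                      # flood-fill all points within eps
            j = stack.pop()
            near = np.where(np.linalg.norm(points - points[j], axis=1) <= eps)[0]
            for k in near:
                if labels[k] == -1:
                    labels[k] = cluster
                    stack.append(int(k))
        cluster += 1
    centroids = [points[labels == c].mean(axis=0) for c in range(cluster)]
    return labels, centroids
```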
The obstacle determination device in the automatic driving process provided by the embodiment of the present invention has the same technical features as the obstacle determination method in the automatic driving process provided by the above embodiment, so it can solve the same technical problems and achieve the same technical effects.
An embodiment of the present invention further provides an electronic device comprising a processor and a memory, where the memory stores machine-executable instructions executable by the processor, and the processor executes the machine-executable instructions to implement the above obstacle determination method during automatic driving.
Referring to fig. 6, the electronic device includes a processor 60 and a memory 61, the memory 61 storing machine executable instructions that can be executed by the processor 60, the processor 60 executing the machine executable instructions to implement the above-described obstacle determining method during automatic driving.
Further, the electronic device shown in fig. 6 further includes a bus 62 and a communication interface 63, and the processor 60, the communication interface 63, and the memory 61 are connected by the bus 62.
The memory 61 may include a high-speed Random Access Memory (RAM), and may further include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is achieved via at least one communication interface 63 (which may be wired or wireless), and may use the internet, a wide area network, a local area network, a metropolitan area network, etc. The bus 62 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be classified into address buses, data buses, control buses, and so on. For ease of illustration, only one bi-directional arrow is shown in fig. 6, but this does not mean that there is only one bus or only one type of bus.
The processor 60 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 60 or by instructions in the form of software. The processor 60 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logical blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly performed by a hardware decoding processor, or performed by a combination of the hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or register. The storage medium is located in the memory 61; the processor 60 reads the information in the memory 61 and completes the steps of the method of the foregoing embodiments in combination with its hardware.
The present embodiment also provides a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the above-described obstacle determination method during automatic driving.
The computer program product of the obstacle determination method and device during automatic driving and the electronic device provided by the embodiments of the present invention includes a computer-readable storage medium storing program code; the instructions included in the program code can be used to execute the method described in the foregoing method embodiments, and for specific implementation, reference may be made to the method embodiments, which will not be repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
In addition, in the description of embodiments of the present invention, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; and it may be a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art on a case-by-case basis.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In the description of the present invention, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above examples are only specific embodiments of the present invention, used to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may still modify or easily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions of some technical features thereof, within the technical scope disclosed by the present invention; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method of obstacle determination during automatic driving, characterized in that it is applied to a processor of a vehicle perception system, the processor further being connected with sensors, the method comprising:
monitoring obstacle data uploaded by each sensor; wherein the sensor comprises at least one of: laser radar, millimeter wave radar and vision sensor; the obstacle data includes obstacle position information and obstacle speed information;
When current obstacle data uploaded by a current sensor are monitored, performing position compensation on a current tracking list based on the current obstacle data to obtain a compensation tracking list; the current tracking list is generated according to the previous obstacle data and comprises a plurality of tracking targets;
performing association matching on the tracking target in the compensation tracking list and a target obstacle corresponding to the current obstacle data to obtain an association matching pair set; wherein the association matching pair set comprises at least one association matching relation pair;
acquiring a motion scene, and correcting obstacle data corresponding to a target obstacle in the association and matching relation pair according to the motion scene and the type of the current sensor to obtain obstacle correction data of each association and matching relation pair;
filtering the obstacle correction data based on adaptive Kalman filtering to obtain obstacle target data of the target obstacle in each association matching relation pair;
and carrying out fusion processing on the obstacle target data of the target obstacle in each association matching relation pair in the association matching pair set, and determining the target obstacle according to fusion information.
2. The method of claim 1, wherein the step of compensating for the current tracking list based on the current obstacle data comprises:
calculating to obtain a time difference according to the time stamp of the current obstacle data and the generation time stamp of the current tracking list;
calculating a compensation position of each tracking target based on the time difference and the speed of each tracking target;
and carrying out position compensation on the tracking targets based on the compensation positions to obtain target positions of each tracking target.
3. The method according to claim 2, wherein the step of associatively matching the tracking target in the compensation tracking list with the target obstacle corresponding to the current obstacle data includes:
calculating a distance matrix according to the target position of the tracking target and the obstacle position information of the target obstacle; wherein the distance matrix comprises a plurality of distance values for characterizing the distance between each of the tracking targets and the target obstacle;
and carrying out association matching on the distance matrix based on a global nearest neighbor algorithm to obtain an association matching pair set.
4. The method of claim 1, wherein the processor is further coupled to an integrated navigation system disposed on the vehicle; the step of obtaining the motion scene comprises the following steps:
acquiring vehicle information acquired by the integrated navigation system; wherein the vehicle information includes vehicle vertical angular velocity information, vehicle speed information, and vehicle position information;
calculating to obtain vehicle course angle information based on the vehicle information;
determining the motion scene according to the vehicle course angle information; wherein the motion scene comprises a vehicle straight running scene or a vehicle turning scene.
5. The method of claim 4, wherein the step of determining the motion scene from the vehicle heading angle information comprises:
calculating the difference value between the current course angle and a preset reference course angle; the preset reference course angle is an average course angle of the vehicle in a preset time before the current time;
judging whether the difference value meets a preset threshold range or not;
if yes, the current motion scene is the vehicle straight-going scene;
if not, the current motion scene is the vehicle turning scene.
6. The method of claim 1, wherein the processor is further connected to a camera disposed on the vehicle, and the step of fusing the obstacle target data of the target obstacle for each of the associative matching pairs in the associative matching pair set comprises:
acquiring obstacle information acquired by the camera; the obstacle information comprises characteristic information of a plurality of obstacles in a preset range in the running direction of the vehicle, and the characteristic information comprises obstacle type information and obstacle size information;
and carrying out fusion processing on the obstacle target data of the target obstacle and the characteristic information of the target obstacle in each association matching relation in the association matching pair set to obtain fusion information.
7. The method of claim 1, wherein the sensor is the laser radar, and the step of listening for obstacle data uploaded by each sensor comprises:
acquiring laser scanning point cloud data transmitted by the laser radar;
generating environment light point cloud information according to the laser scanning point cloud data;
and generating the obstacle data according to the environment light point cloud information.
8. An obstacle determination device during automatic driving, characterized in that it is applied to a processor of a vehicle perception system, the processor further being connected with sensors, the device comprising:
the monitoring module is used for monitoring the obstacle data uploaded by each sensor; wherein the sensor comprises at least one of: laser radar, millimeter wave radar and vision sensor; the obstacle data includes obstacle position information and obstacle speed information;
The position compensation module is used for carrying out position compensation on the current tracking list based on the current obstacle data when the current obstacle data uploaded by the current sensor is monitored, so as to obtain a compensated tracking list; the current tracking list is generated according to the previous obstacle data and comprises a plurality of tracking targets;
the association matching module is used for carrying out association matching on the tracking target in the compensation tracking list and the target obstacle corresponding to the current obstacle data to obtain an association matching pair set; wherein the association matching pair set comprises at least one association matching relation pair;
the correction module is used for acquiring a motion scene, correcting obstacle data corresponding to a target obstacle in the association matching relation pair according to the motion scene and the type of the current sensor, and obtaining obstacle correction data of each association matching relation pair;
the filtering processing module is used for carrying out filtering processing on the obstacle correction data based on adaptive Kalman filtering to obtain obstacle target data of the target obstacle in each association matching relation pair;
and the fusion processing module is used for carrying out fusion processing on the obstacle target data of the target obstacle in each association matching relation pair in the association matching pair set, and determining the target obstacle according to fusion information.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any one of claims 1-7 when executing the computer program.
10. A computer-readable storage medium, characterized in that it has stored thereon a computer program which, when executed by a processor, performs the steps of the method according to any of the preceding claims 1-7.
CN202110364712.1A 2021-04-02 2021-04-02 Obstacle determination method and device in automatic driving process and electronic equipment Active CN113514806B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110364712.1A CN113514806B (en) 2021-04-02 2021-04-02 Obstacle determination method and device in automatic driving process and electronic equipment

Publications (2)

Publication Number Publication Date
CN113514806A CN113514806A (en) 2021-10-19
CN113514806B true CN113514806B (en) 2023-12-19

Family

ID=78062257


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114488065A (en) * 2022-01-27 2022-05-13 中国第一汽车股份有限公司 Track data processing method, device, vehicle and medium
CN114671380B (en) * 2022-03-23 2023-12-29 湖南星邦智能装备股份有限公司 Multi-sensor data fusion-based anti-collision method and system for overhead working truck
CN114419604B (en) * 2022-03-28 2022-06-28 禾多科技(北京)有限公司 Obstacle information generation method and device, electronic equipment and computer readable medium
CN114858200B (en) * 2022-04-19 2023-06-27 合众新能源汽车股份有限公司 Method and device for evaluating quality of object detected by vehicle sensor
CN114529886B (en) * 2022-04-24 2022-08-02 苏州挚途科技有限公司 Method, device and system for determining obstacle
CN116321072B (en) * 2023-03-13 2024-01-23 阿里云计算有限公司 Data compensation method and device based on perception failure
CN116299300B (en) * 2023-05-15 2023-08-08 北京集度科技有限公司 Determination method and device for drivable area, computer equipment and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
KR20150086065A (en) * 2014-01-17 2015-07-27 전남대학교산학협력단 System and method for path planning for autonomous navigation of driverless ground vehicle
CN110775052A (en) * 2019-08-29 2020-02-11 浙江零跑科技有限公司 Automatic parking method based on fusion of vision and ultrasonic perception
US10634793B1 (en) * 2018-12-24 2020-04-28 Automotive Research & Testing Center Lidar detection device of detecting close-distance obstacle and method thereof
CN112033429A (en) * 2020-09-14 2020-12-04 吉林大学 Target-level multi-sensor fusion method for intelligent automobile
CN112285714A (en) * 2020-09-08 2021-01-29 苏州挚途科技有限公司 Obstacle speed fusion method and device based on multiple sensors

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN109059902B (en) * 2018-09-07 2021-05-28 百度在线网络技术(北京)有限公司 Relative pose determination method, device, equipment and medium
US11352010B2 (en) * 2019-09-30 2022-06-07 Baidu Usa Llc Obstacle perception calibration system for autonomous driving vehicles

Non-Patent Citations (3)

Title
"A Kalman nearest-neighbor multi-target tracking algorithm for intelligent vehicle sensors" (一种Kalman最近邻的智能汽车传感器多目标跟踪算法); Huang Wenkui, Luo Feng, Hu Fengjian; Mechatronics (机电一体化), No. 12 (full text) *
"Dynamic obstacle detection and tracking method based on three-dimensional lidar" (基于三维激光雷达的动态障碍物检测和追踪方法); Zou Bin, Liu Kang, Wang Kewei; Automobile Technology (汽车技术), No. 08 (full text) *
"Obstacle detection method for intelligent vehicles based on information fusion" (基于信息融合的智能车障碍物检测方法); Lu Feng, Xu Youchun, Li Yongle, Wang Deyu, Xie Desheng; Journal of Computer Applications (计算机应用), No. S2 (full text) *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant