CN113466822B - Method and device for detecting obstacles - Google Patents


Info

Publication number
CN113466822B
CN113466822B (application CN202110882862.1A)
Authority
CN
China
Prior art keywords
obstacle
information
identifier
obstacle information
error
Prior art date
Legal status: Active (status is an assumption, not a legal conclusion)
Application number
CN202110882862.1A
Other languages
Chinese (zh)
Other versions
CN113466822A (en)
Inventor
惠育江
王军
王亮
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110882862.1A
Publication of CN113466822A
Application granted
Publication of CN113466822B
Legal status: Active


Classifications

    • G: Physics
    • G01: Measuring; Testing
    • G01S: Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02: Details of systems according to group G01S 13/00
    • G01S 7/41: Using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/88: Radar or analogous systems specially adapted for specific applications
    • G01S 13/93: Radar or analogous systems specially adapted for anti-collision purposes
    • G01S 13/931: Radar or analogous anti-collision systems for land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a method and a device for detecting an obstacle. One embodiment of the method comprises: acquiring point cloud data and millimeter wave radar data; processing the point cloud data and the millimeter wave radar data respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; determining, according to the first obstacle information and the second obstacle information, the weights each occupies during fusion; fusing the first obstacle information and the second obstacle information according to the weights; and determining the obstacle according to the fused obstacle information. The method makes full use of the point cloud data obtained by the laser radar and the data obtained by the millimeter wave radar, improving both the accuracy and the sensitivity of obstacle detection.

Description

Method and device for detecting obstacles
The application is a divisional application of Chinese patent application No. CN201710541847.4, filed on July 4, 2017 and entitled "Method and device for detecting obstacles".
Technical Field
The application relates to the field of vehicle driving, in particular to the field of obstacle detection, and particularly relates to a method and a device for detecting an obstacle.
Background
Millimeter wave radar and lidar are widely used in fields such as autonomous driving and ADAS (Advanced Driver Assistance System). Lidar can sense the shape of an obstacle accurately, while millimeter wave radar provides effective measurements of an obstacle's position and speed and resists interference from rain and snow. Because different sensors have complementary advantages, effectively fusing their data can improve the safety of automatic driving.
Existing fusion methods suffer from low precision and large estimation errors.
Disclosure of Invention
The object of the present application is to propose a method and a device for detecting obstacles, which solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides a method for detecting an obstacle, where the method includes: acquiring point cloud data and millimeter wave radar data; respectively processing the point cloud data and the millimeter wave radar data to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; determining weights occupied by the first obstacle information and the second obstacle information in fusion according to the first obstacle information and the second obstacle information; according to the weights, fusing the first obstacle information and the second obstacle information; determining an obstacle according to the fused obstacle information; the fusing the first obstacle information and the second obstacle information according to the weights includes: converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, wherein the obstacle information fusion list includes information of historical obstacles; and fusing the matched first obstacle information and the matched second obstacle information; the matching of the converted first obstacle information and the converted second obstacle information includes: updating the historical obstacles; and/or determining whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
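As an illustrative sketch of the coordinate-conversion step, the obstacle position reported by one sensor can be mapped into a common coordinate system with a homogeneous transformation matrix. The function name and the matrix values below are assumptions for illustration, not taken from the patent:

```python
def to_common_frame(position, transform):
    # Multiply a 4x4 homogeneous transform by [x, y, z, 1] and keep x, y, z.
    x, y, z = position
    p = (x, y, z, 1.0)
    return [sum(transform[r][c] * p[c] for c in range(4)) for r in range(3)]

# Hypothetical extrinsic matrix: the lidar mounted 1.2 m ahead of the vehicle origin.
LIDAR_TO_VEHICLE = [
    [1.0, 0.0, 0.0, 1.2],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]
```

A second matrix would map the millimeter wave radar frame the same way, after which obstacle positions from both sensors are directly comparable.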
In some embodiments, the processing the point cloud data and the millimeter wave radar data respectively includes: clustering and tracking the point cloud data to obtain the first obstacle information; and filtering the millimeter wave radar data to obtain the second obstacle information.
In some embodiments, the first obstacle information includes a first position of the obstacle, a first speed, an error of the first position, and an error of the first speed, and the second obstacle information includes a second position of the obstacle, a second speed, an error of the second position, and an error of the second speed; and determining weights occupied by the first obstacle information and the second obstacle information during fusion, including: determining a first integrated error based on the error of the first position and the error of the first speed; determining a second integrated error based on the error in the second position and the error in the second speed; and determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first integrated error and the second integrated error.
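A minimal sketch of the integrated-error step above, assuming the position error and speed error are combined by weighted superposition (the 0.5/0.5 split and the numeric error values are illustrative assumptions, not fixed by the patent):

```python
def integrated_error(position_error, speed_error, position_weight=0.5):
    # Weighted superposition of the two error terms; a larger result means noisier data.
    return position_weight * position_error + (1.0 - position_weight) * speed_error

# Hypothetical per-sensor errors: lidar position is precise but its speed is noisy,
# while the radar shows the opposite pattern.
first_integrated = integrated_error(position_error=0.1, speed_error=0.3)
second_integrated = integrated_error(position_error=0.4, speed_error=0.1)
```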
In some embodiments, fusing the first obstacle information and the second obstacle information according to the weights includes: converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list; and fusing the matched first obstacle information and the matched second obstacle information.
In some embodiments, the obstacle information fusion list includes a historical obstacle identifier and historical motion information corresponding to the historical obstacle identifier, the first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier; and matching the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list, including: determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; responding to the first obstacle identifier included in the obstacle information fusion list, and updating historical motion information corresponding to the first obstacle identifier in the historical obstacle motion information by adopting motion information corresponding to the first obstacle identifier; and in response to the second obstacle identifier being included in the obstacle information fusion list, updating historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information by using motion information corresponding to the second obstacle identifier.
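As a hedged sketch of the update path above, the obstacle information fusion list can be modeled as a dictionary keyed by historical obstacle identifier; all field names and values here are illustrative assumptions:

```python
# Historical obstacle identifier -> latest motion information for that obstacle.
fusion_list = {
    "obs_7": {"position": (10.0, 3.2), "speed": 4.5},
}

def update_history(fusion_list, obstacle_id, motion_info):
    # If the identifier is already tracked, overwrite its historical
    # motion information with the newly observed motion information.
    if obstacle_id in fusion_list:
        fusion_list[obstacle_id] = motion_info
        return True   # matched an existing historical obstacle
    return False      # unknown identifier; falls through to the matching step
```

The same update is applied for the first obstacle identifier and the second obstacle identifier independently.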
In some embodiments, the matching the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list includes: determining a matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier in response to the first obstacle identifier or the second obstacle identifier not being included in the obstacle information fusion list; and determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle in response to the matching degree being greater than or equal to a preset threshold, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
In some embodiments, the matching the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list includes: and adding the first obstacle information or the second obstacle information in the obstacle information fusion list in response to the matching degree being smaller than a preset threshold.
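The matching-degree branch described in the two embodiments above can be sketched as follows. The patent does not fix a particular similarity metric or threshold value, so the distance-based score and the 0.8 cutoff below are assumptions:

```python
import math

MATCH_THRESHOLD = 0.8  # illustrative preset threshold

def matching_degree(info_a, info_b):
    # Toy similarity that decays with the distance between reported positions;
    # a real system might also compare speed, heading, and size.
    (xa, ya), (xb, yb) = info_a["position"], info_b["position"]
    return math.exp(-math.hypot(xa - xb, ya - yb))

def match_or_insert(fusion_list, lidar_id, lidar_info, radar_id, radar_info):
    degree = matching_degree(lidar_info, radar_info)
    if degree >= MATCH_THRESHOLD:
        fusion_list[lidar_id] = lidar_info   # same obstacle: update its history
        return "same obstacle"
    fusion_list[radar_id] = radar_info       # below threshold: track as a new obstacle
    return "new obstacle"
```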
In some embodiments, fusing the first obstacle information and the second obstacle information according to the weights includes: filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some embodiments, the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier; the method further comprises the following steps: for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset duration; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset duration.
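The time-based cleanup above can be sketched like this; the two-second `MAX_AGE_SECONDS` and the `generated_at` field name are illustrative assumptions:

```python
import time

MAX_AGE_SECONDS = 2.0  # illustrative "preset duration"

def prune_stale(fusion_list, now=None):
    # Delete motion information whose generation time is older than the preset duration.
    now = time.time() if now is None else now
    stale = [oid for oid, info in fusion_list.items()
             if now - info["generated_at"] > MAX_AGE_SECONDS]
    for oid in stale:
        del fusion_list[oid]
    return stale
```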
In a second aspect, an embodiment of the present application provides an apparatus for detecting an obstacle, the apparatus including: a data acquisition unit for acquiring point cloud data and millimeter wave radar data; a data processing unit for respectively processing the point cloud data and the millimeter wave radar data to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; a weight determining unit for determining the weights occupied by the first obstacle information and the second obstacle information in fusion according to the first obstacle information and the second obstacle information; an information fusion unit for fusing the first obstacle information and the second obstacle information according to the weights; and an obstacle determining unit for determining an obstacle according to the fused obstacle information. The information fusion unit includes: an information conversion module for converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; an information matching module for matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, wherein the obstacle information fusion list includes information of historical obstacles; and an information fusion module for fusing the matched first obstacle information and the matched second obstacle information. The information matching module is further configured to: update the historical obstacles; and/or determine whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
In some embodiments, the data processing unit is further configured to: cluster and track the point cloud data to obtain the first obstacle information; and filter the millimeter wave radar data to obtain the second obstacle information.
In some embodiments, the first obstacle information includes a first position of the obstacle, a first speed, an error of the first position, and an error of the first speed, and the second obstacle information includes a second position of the obstacle, a second speed, an error of the second position, and an error of the second speed; the weight determining unit includes: the first error determining module is used for determining a first comprehensive error according to the error of the first position and the error of the first speed; the second error determining module is used for determining a second comprehensive error according to the error of the second position and the error of the second speed; the weight determining module is used for determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first integrated error and the second integrated error.
In some embodiments, the information fusion unit includes: the information conversion module is used for converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; the information matching module is used for matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list; and the information fusion module is used for fusing the matched first obstacle information and the matched second obstacle information.
In some embodiments, the obstacle information fusion list includes a historical obstacle identifier and historical motion information corresponding to the historical obstacle identifier, the first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier; the above information matching module is further configured to: determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; responding to the first obstacle identifier included in the obstacle information fusion list, and updating historical motion information corresponding to the first obstacle identifier in the historical obstacle motion information by adopting motion information corresponding to the first obstacle identifier; and in response to the second obstacle identifier being included in the obstacle information fusion list, updating historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information by using motion information corresponding to the second obstacle identifier.
In some embodiments, the above information matching module is further configured to: determining a matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier in response to the first obstacle identifier or the second obstacle identifier not being included in the obstacle information fusion list; and determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle in response to the matching degree being greater than or equal to a preset threshold, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
In some embodiments, the above information matching module is further configured to: and adding the first obstacle information or the second obstacle information in the obstacle information fusion list in response to the matching degree being smaller than a preset threshold.
In some embodiments, the above information fusion unit is further configured to: filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some embodiments, the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier; and the apparatus further comprises an information deletion unit configured to: for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset duration; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset duration.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method described in any of the above embodiments.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which when executed by a processor implements a method as described in any of the above embodiments.
According to the method and device for detecting an obstacle provided by the embodiments of the application, after the point cloud data and the millimeter wave radar data are acquired, the two are processed respectively to obtain the first obstacle information corresponding to the point cloud data and the second obstacle information corresponding to the millimeter wave radar data. The weights of the first obstacle information and the second obstacle information during fusion are then determined from that information, the two are fused according to the obtained weights, and finally the obstacle is determined from the fused information. The method makes full use of the point cloud data obtained by the laser radar and the data obtained by the millimeter wave radar, improving both the accuracy and the sensitivity of obstacle detection.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 is a flow chart of one embodiment of a method for detecting an obstacle according to the present application;
FIG. 2 is an exemplary system architecture diagram in which the present application may be applied;
Fig. 3 is a schematic view of an application scenario of the method for detecting an obstacle according to the present application;
FIG. 4 is a flow chart of one embodiment of fusing first obstacle information and second obstacle information in a method for detecting an obstacle in accordance with the present application;
FIG. 5 is a schematic diagram of an embodiment of an apparatus for detecting an obstacle according to the present application;
fig. 6 is a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 shows a flow 100 of one embodiment of a method for detecting an obstacle according to the application. The method for detecting an obstacle of the present embodiment includes the steps of:
And step 101, acquiring point cloud data and millimeter wave radar data.
In this embodiment, the point cloud data may be from a laser radar, and the millimeter wave radar data may be from a millimeter wave radar. The point cloud data and the millimeter wave radar data may include multi-frame data, each of which has different generation time, and each of which includes information of an obstacle included in a driving environment where the vehicle is located. The above-described lidar and millimeter-wave radar may be mounted on a vehicle, for example, on an unmanned vehicle, an autonomous vehicle, or the like. The obstacle may be another vehicle, a pedestrian or any object that impedes the travel of the vehicle.
The method for detecting an obstacle of the present embodiment is generally performed by a terminal or a server, which can be communicatively connected to a vehicle. When acquiring the point cloud data and the millimeter wave radar data, the terminal or the server may acquire the point cloud data and the millimeter wave radar data directly from a storage device connected to each sensor (laser radar or millimeter wave radar) of the vehicle, or may acquire locally stored data. It will be appreciated that the terminal may be installed on a vehicle when the method of the present embodiment is performed by the terminal.
When the method of the present embodiment is executed by a server, the server needs to acquire point cloud data and millimeter wave radar data from a vehicle, and its corresponding system architecture diagram is shown in fig. 2. In fig. 2, a system architecture 200 may include a vehicle 201, a network 202, and a server 203. The network 202 is the medium used to provide a communication link between the vehicle 201 and the server 203. The network 202 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The vehicle 201 may be mounted thereon with a laser radar and a millimeter wave radar, and point cloud data and millimeter wave radar data of obstacles in the running environment of the vehicle may be collected.
The server 203 may be a server that provides various services, such as a background server that processes point cloud data and millimeter wave radar data of the vehicle 201. The background server may acquire point cloud data and millimeter wave radar data of the vehicle 201, and analyze the point cloud data and the millimeter wave radar data to obtain an obstacle in a driving environment where the vehicle 201 is located.
It should be noted that, the method for detecting an obstacle provided by the embodiment of the present application is generally performed by the server 203, and accordingly, the device for detecting an obstacle is generally disposed in the server 203.
It should be understood that the number of vehicles, networks, and servers in fig. 2 are merely illustrative. There may be any number of vehicles, networks, and servers, as desired for implementation.
Returning to fig. 1, in step 102, the point cloud data and the millimeter wave radar data are respectively processed to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data.
In this embodiment, after the point cloud data and the millimeter wave radar data are acquired, the two kinds of data may be processed respectively to obtain the first obstacle information indicated by the point cloud data and the second obstacle information indicated by the millimeter wave radar data. The processing applied to the point cloud data and to the millimeter wave radar data may be the same or different. For example, both may be filtered, or the point cloud data may be clustered while the millimeter wave radar data is filtered. The first obstacle information and the second obstacle information may likewise be the same as or different from each other. In this embodiment, both may include the position and speed of the obstacle together with the errors of the position and the speed, and the first obstacle information may additionally include classification information, contour information, and the like.
In some optional implementations of the present embodiment, the processing of the point cloud data and the millimeter wave radar data in step 102 may be implemented by the following steps, which are not shown in fig. 1: clustering and tracking the point cloud data to obtain the first obstacle information; and filtering the millimeter wave radar data to obtain the second obstacle information.
In this implementation, the first obstacle information can be obtained by clustering the point cloud data (dividing and classifying the points to determine the obstacles in the driving environment) and then tracking (determining the position and speed of each detected obstacle). The second obstacle information can be obtained by filtering the millimeter wave radar data; the filtering may include, but is not limited to, Kalman filtering, extended Kalman filtering, and unscented Kalman filtering (UKF).
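As a hedged illustration of the filtering mentioned above, a minimal one-dimensional constant-velocity Kalman filter over successive radar position readings might look like this; the state layout, noise values, and position-only observation model are all simplifying assumptions:

```python
def kalman_1d(measurements, dt=0.1, meas_var=0.5, process_var=0.01):
    # State: [position, velocity]; only position is observed.
    x, v = measurements[0], 0.0
    p = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
    for z in measurements[1:]:
        # Predict with a constant-velocity model.
        x, v = x + dt * v, v
        p = [[p[0][0] + dt * (p[0][1] + p[1][0]) + dt * dt * p[1][1] + process_var,
              p[0][1] + dt * p[1][1]],
             [p[1][0] + dt * p[1][1],
              p[1][1] + process_var]]
        # Update with the new measurement z.
        s = p[0][0] + meas_var          # innovation variance
        k0, k1 = p[0][0] / s, p[1][0] / s
        y = z - x                       # innovation
        x, v = x + k0 * y, v + k1 * y
        p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
             [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
    return x, v
```

In practice a library implementation with full matrix handling and tuned noise models would be used; this sketch only shows the predict/update structure.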
Step 103, determining weights occupied by the first obstacle information and the second obstacle information in fusion according to the first obstacle information and the second obstacle information.
In this embodiment, the error of the position and the error of the speed may be used as the sensing noise of the hardware device. For example, the error in position and the error in velocity in the first obstacle information may be used as the sensing noise of the laser radar, and the error in position and the error in velocity in the second obstacle information may be used as the sensing noise of the millimeter wave radar. The terminal or the server may determine weights of the first obstacle information and the second obstacle information according to the above-described sensing noise when merging the two.
In some alternative implementations of the present embodiment, the first obstacle information includes a first position of the obstacle, a first speed, an error of the first position, and an error of the first speed, and the second obstacle information includes a second position of the obstacle, a second speed, an error of the second position, and an error of the second speed. The above step 103 may be implemented specifically by the following steps not shown in fig. 1: determining a first integrated error based on the error of the first position and the error of the first speed; determining a second integrated error based on the error in the second position and the error in the second speed; and determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first integrated error and the second integrated error.
In this implementation, in order to comprehensively account for the influence of the position error and the speed error on the fusion result when the first obstacle information and the second obstacle information are fused, a first integrated error may first be determined according to the error of the first position and the error of the first speed, and a second integrated error may be determined according to the error of the second position and the error of the second speed. In determining an integrated error, the position error and the speed error may be weighted and summed. Finally, the first weight of the first obstacle information and the second weight of the second obstacle information are determined according to the first integrated error and the second integrated error. It can be understood that in this implementation the weight of each piece of obstacle information is inversely related to its error, that is, the larger the error, the smaller the weight, which is beneficial to improving the accuracy of the fused obstacle information.
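A minimal sketch of this weighting scheme follows; the mixing coefficients `alpha` and `beta` and the specific inverse-proportional form are assumptions, since the embodiment does not fix them:

```python
def fusion_weights(pos_err1, vel_err1, pos_err2, vel_err2, alpha=0.5, beta=0.5):
    """Each source's integrated error is a weighted sum of its position and
    speed errors; its fusion weight is inversely related to that error."""
    e1 = alpha * pos_err1 + beta * vel_err1  # first integrated error (e.g. lidar)
    e2 = alpha * pos_err2 + beta * vel_err2  # second integrated error (e.g. radar)
    w1 = e2 / (e1 + e2)  # larger error -> smaller weight
    w2 = e1 / (e1 + e2)
    return w1, w2
```

For example, if the lidar errors are 0.1 and the radar errors are 0.3, the lidar receives weight 0.75 and the radar 0.25, and the weights sum to 1.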
Step 104, according to the weight, fusing the first obstacle information and the second obstacle information.
After determining the weight of the first obstacle information and the weight of the second obstacle information, the terminal or the server may fuse the first obstacle information and the second obstacle information according to the weight.
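Given the two weights, the fusion itself can be a per-field weighted average; this sketch assumes position and speed are fused component-wise, which the embodiment does not state explicitly:

```python
import numpy as np

def fuse_states(pos1, vel1, pos2, vel2, w1, w2):
    """Weighted component-wise fusion of two obstacle state estimates."""
    fused_pos = w1 * np.asarray(pos1) + w2 * np.asarray(pos2)
    fused_vel = w1 * np.asarray(vel1) + w2 * np.asarray(vel2)
    return fused_pos, fused_vel
```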
Step 105, determining the obstacle according to the information of the obstacle after fusion.
After the first obstacle information and the second obstacle information are fused, the obstacle can be determined according to the fused obstacle information. It can be understood that the fused obstacle information may include specific information such as the classification, contour, position, speed, and driving direction of the obstacle, and the terminal or the server may determine the obstacle after obtaining the above information, so as to formulate a driving strategy.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for detecting an obstacle according to the present embodiment. In the application scenario of fig. 3, a laser radar 311 and a millimeter wave radar 312 are installed on a vehicle 31, where the laser radar 311 may obtain point cloud data by scanning the driving environment where the vehicle 31 is located, and the millimeter wave radar 312 may obtain millimeter wave radar data. The vehicle 31 transmits the point cloud data and the millimeter wave radar data to the server 33, and the server 33 processes the data to obtain first obstacle information and second obstacle information, then obtains a first weight and a second weight, and finally fuses the information and determines the obstacle. The determined obstacle information is transmitted to the vehicle 31, and the vehicle 31 determines that a pedestrian 32 is in front of it by analyzing the information transmitted from the server 33, thereby formulating a driving strategy to avoid collision with the pedestrian 32.
According to the method for detecting an obstacle provided by the embodiment of the application, after the point cloud data and the millimeter wave radar data are acquired, they are processed respectively to obtain the first obstacle information corresponding to the point cloud data and the second obstacle information corresponding to the millimeter wave radar data. The weights of the first obstacle information and the second obstacle information during fusion are then determined according to the two pieces of information, the first obstacle information and the second obstacle information are fused according to the obtained weights, and finally the obstacle is determined according to the fused information. The method makes full use of the point cloud data obtained by the laser radar and the millimeter wave radar data obtained by the millimeter wave radar, and improves the accuracy and sensitivity of obstacle detection.
With continued reference to fig. 4, a flow 400 of fusing first obstacle information and second obstacle information in a method for detecting an obstacle in accordance with the present application is illustrated. As shown in fig. 4, in the present embodiment, the fusion of the first obstacle information and the second obstacle information may be achieved by:
Step 401, converting the first obstacle information and the second obstacle information to the same coordinate system according to a preset first conversion matrix and/or second conversion matrix.
In this embodiment, the first obstacle information is obtained by processing point cloud data, and the point cloud data is collected by the laser radar, so that the first obstacle information is located in a laser radar coordinate system. Similarly, the second obstacle information is located in the millimeter wave radar coordinate system. When the first obstacle information and the second obstacle information are fused, the first obstacle information and the second obstacle information need to be first converted into the same coordinate system, for example, the first obstacle information can be converted into a millimeter wave radar coordinate system, the second obstacle information can be converted into a laser radar coordinate system, and the first obstacle information and the second obstacle information can be both converted into a world coordinate system. The first conversion matrix may be a conversion matrix between a laser radar coordinate system and a world coordinate system, or may be a conversion matrix between a laser radar coordinate system and a millimeter wave radar coordinate system. Similarly, the second transformation matrix may be a transformation matrix between the millimeter wave radar coordinate system and the laser radar coordinate system, or may be a transformation matrix between the millimeter wave radar coordinate system and the world coordinate system.
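A conversion of this kind can be expressed with 4x4 homogeneous transformation matrices; the 1.5 m lidar mounting offset below is purely an assumed example of a first conversion matrix:

```python
import numpy as np

def to_world(points, T):
    """Apply a 4x4 homogeneous transform T to an (N, 3) array of points."""
    homo = np.hstack([points, np.ones((len(points), 1))])  # append w = 1
    return (T @ homo.T).T[:, :3]

# Assumed example: lidar frame offset 1.5 m along x from the world origin, no rotation.
T_lidar_to_world = np.eye(4)
T_lidar_to_world[0, 3] = 1.5

pts = np.array([[0.0, 0.0, 0.0], [2.0, 1.0, 0.0]])  # obstacle positions in lidar frame
world_pts = to_world(pts, T_lidar_to_world)
```

Converting radar detections into the lidar frame, or both into the world frame, only changes which transform matrix is applied.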
Step 402, matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list.
In this embodiment, after the first obstacle information and the second obstacle information are converted into the same coordinate system, the converted first obstacle information and the converted second obstacle information may be matched by combining with a preset obstacle information fusion list. The above-mentioned obstacle information fusion list may include information of history obstacles detected by the laser radar and the millimeter wave radar. Matching the converted first obstacle information and the converted second obstacle information with the obstacle information fusion list can update the historical obstacle on one hand, and can determine whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle or not on the other hand.
In some optional implementations of this embodiment, the obstacle information fusion list includes a historical obstacle identifier and historical motion information corresponding to the historical obstacle identifier. The first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier. The second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier. When the obstacle information fusion list comprises a plurality of historical obstacle identifiers, each historical obstacle identifier corresponds to one piece of historical motion information. Similarly, when there are a plurality of first obstacle identifiers, each first obstacle identifier corresponds to one piece of motion information. The above step 402 may be implemented specifically by the following steps not shown in fig. 4:
Determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; responding to the first obstacle identifier included in the obstacle information fusion list, and updating historical motion information corresponding to the first obstacle identifier in the historical obstacle motion information by adopting motion information corresponding to the first obstacle identifier; and in response to the second obstacle identifier being included in the obstacle information fusion list, updating historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information by using motion information corresponding to the second obstacle identifier.
The obstacle identifiers are used to distinguish different obstacles: the same radar sensor assigns different obstacle identifiers to different obstacles, while different radar sensors assign different obstacle identifiers to the same obstacle. In this implementation, according to the historical obstacle identifier, the first obstacle identifier, and the second obstacle identifier, it may be determined whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list. When the first obstacle identifier is included in the obstacle information fusion list, it indicates that the obstacle indicated by the first obstacle identifier has been detected by the laser radar before, and at this time the historical motion information corresponding to the first obstacle identifier in the historical obstacle motion information is updated with the motion information corresponding to the first obstacle identifier. When the second obstacle identifier is included in the obstacle information fusion list, it indicates that the obstacle indicated by the second obstacle identifier has been detected by the millimeter wave radar before, and at this time the historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information is updated with the motion information corresponding to the second obstacle identifier. The historical motion information may include information such as the historical position and historical speed of the obstacle.
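A minimal sketch of this update step, assuming the fusion list is a dictionary keyed by (sensor, identifier) pairs — a representation chosen here for illustration, not prescribed by the embodiment:

```python
def update_fusion_list(fusion_list, sensor, obstacle_id, motion):
    """If the identifier is already in the fusion list, overwrite its
    historical motion information; return whether it was already known."""
    key = (sensor, obstacle_id)
    if key in fusion_list:
        fusion_list[key] = motion  # known obstacle: update history
        return True
    return False
```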
In some alternative implementations of the present embodiment, the step 402 may further include the following steps not shown in fig. 4:
Determining a matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier in response to the first obstacle identifier or the second obstacle identifier not being included in the obstacle information fusion list; and determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle in response to the matching degree being greater than or equal to a preset threshold, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
When the first obstacle identifier or the second obstacle identifier is not included in the obstacle information fusion list, it is determined that the obstacle indicated by the first obstacle identifier has not been detected by the laser radar before, or that the obstacle indicated by the second obstacle identifier has not been detected by the millimeter wave radar before. At this time, the matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier may be determined. The motion information may include position information and speed information of the obstacle, and the matching degree may be represented by the Euclidean distance, Manhattan distance, or Mahalanobis distance between the two positions, or by the distance between the two velocity vectors. When the matching degree is represented by the distance between the two positions, the smaller the distance, the higher the matching degree can be considered. When the matching degree is greater than or equal to the preset threshold, the first obstacle identifier and the second obstacle identifier are considered to indicate the same obstacle, and the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier may be used to update the historical motion information corresponding to the first obstacle identifier or the second obstacle identifier in the obstacle information fusion list.
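One way to realize this (purely illustrative: the mapping from distance to matching degree and the threshold value are assumptions) is to map the Euclidean distance between the two positions into a score in (0, 1] and compare it against the preset threshold:

```python
import math

def matching_degree(pos1, pos2):
    """Map the Euclidean distance between two positions into (0, 1]:
    the smaller the distance, the higher the matching degree."""
    return 1.0 / (1.0 + math.dist(pos1, pos2))

MATCH_THRESHOLD = 0.5  # assumed: degree >= 0.5 corresponds to distance <= 1 m

def same_obstacle(pos1, pos2, threshold=MATCH_THRESHOLD):
    """Two detections are considered the same obstacle when the matching
    degree is greater than or equal to the preset threshold."""
    return matching_degree(pos1, pos2) >= threshold
```

A Mahalanobis distance using the sensors' error covariances could replace `math.dist` without changing the threshold logic.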
In some optional implementations of the present embodiment, the step 402 may further include the following steps not shown in fig. 4:
and adding the first obstacle information or the second obstacle information in the obstacle information fusion list in response to the matching degree being smaller than a preset threshold.
When the matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier is smaller than the preset threshold, the obstacle indicated by the first obstacle identifier, or the obstacle indicated by the second obstacle identifier, is a newly appearing obstacle, and the obstacle information corresponding to the identifier not yet included needs to be added to the obstacle information fusion list. That is, if the first obstacle identifier is not included in the obstacle information fusion list, the first obstacle information is added; and if the second obstacle identifier is not included, the second obstacle information is added.
Step 403, fusing the matched first obstacle information and the matched second obstacle information.
After matching the converted first obstacle information with the converted second obstacle information, it is possible to determine which obstacles were previously detected and which are newly appearing, that is, the obstacle information can be determined accurately. The obstacles detected by the laser radar and those detected by the millimeter wave radar are then fused, so that more accurate obstacle detection can be achieved.
In some alternative implementations of the present embodiment, the step 403 may be implemented specifically by the following steps not shown in fig. 4:
Filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In this implementation manner, the second obstacle information may be further filtered, so that noise in the second obstacle information may be removed. And then fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
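The extra filtering pass could be as simple as exponential smoothing of the radar values; this stand-in is an assumption for brevity, since the embodiment does not name the filter used here:

```python
def smooth(measurements, alpha=0.4):
    """Exponentially smooth a sequence of scalar measurements.

    alpha close to 1 trusts new measurements; close to 0 suppresses noise.
    """
    estimate = measurements[0]
    smoothed = []
    for z in measurements:
        estimate = alpha * z + (1 - alpha) * estimate
        smoothed.append(estimate)
    return smoothed
```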
In some optional implementations of this embodiment, the obstacle information fusion list includes a generation time of the motion information corresponding to the historical obstacle identifier. The above method may further comprise the following steps, not shown in fig. 4:
For the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset duration; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset duration.
In this implementation, the motion information corresponding to "expired" historical obstacle identifiers in the obstacle information fusion list can be deleted in a timely manner. The judgment may be made according to the time difference between the generation time of each piece of motion information and the current time: when the time difference is greater than the preset duration, the motion information is considered "outdated" and may be deleted.
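Assuming each fusion-list entry stores its generation timestamp alongside the motion information (a representation chosen for illustration), the pruning step can be sketched as:

```python
def prune_expired(fusion_list, now, max_age):
    """Delete entries whose generation time is more than max_age seconds old.

    fusion_list maps an obstacle identifier to (generation_time, motion_info).
    """
    expired = [key for key, (gen_time, _motion) in fusion_list.items()
               if now - gen_time > max_age]
    for key in expired:
        del fusion_list[key]
    return fusion_list
```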
The method for detecting the obstacle provided by the embodiment of the application can accurately determine whether each obstacle is a historical obstacle, so that the obstacle can be detected better and more accurately by fusing detection results of the laser radar and the millimeter wave radar.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for detecting an obstacle, which corresponds to the method embodiment shown in fig. 1, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the apparatus 500 for detecting an obstacle of the present embodiment includes: a data acquisition unit 501, a data processing unit 502, a weight determination unit 503, an information fusion unit 504, and an obstacle determination unit 505.
The data acquisition unit 501 is configured to acquire point cloud data and millimeter wave radar data.
The data processing unit 502 is configured to process the point cloud data and the millimeter wave radar data respectively, and obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data.
The weight determining unit 503 is configured to determine weights occupied by the first obstacle information and the second obstacle information during fusion according to the first obstacle information and the second obstacle information.
And an information fusion unit 504, configured to fuse the first obstacle information and the second obstacle information according to the weights.
An obstacle determining unit 505 is configured to determine an obstacle according to the fused obstacle information.
In some optional implementations of this embodiment, the data processing unit 502 may be further configured to: cluster and track the point cloud data to obtain the first obstacle information; and filter the millimeter wave radar data to obtain the second obstacle information.
In some alternative implementations of the present embodiment, the first obstacle information includes a first position of the obstacle, a first speed, an error of the first position, and an error of the first speed, and the second obstacle information includes a second position of the obstacle, a second speed, an error of the second position, and an error of the second speed. The weight determining unit 503 may further include a first error determining module, a second error determining module, and a weight determining module, which are not shown in fig. 5.
The first error determining module is used for determining a first comprehensive error according to the error of the first position and the error of the first speed.
And the second error determining module is used for determining a second comprehensive error according to the error of the second position and the error of the second speed.
The weight determining module is used for determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first integrated error and the second integrated error.
In some optional implementations of this embodiment, the information fusion unit 504 may further include an information conversion module, an information matching module, and an information fusion module, which are not shown in fig. 5.
The information conversion module is used for converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or second conversion matrix.
The information matching module is used for matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list.
And the information fusion module is used for fusing the matched first obstacle information and the matched second obstacle information.
In some optional implementations of this embodiment, the above-mentioned obstacle information fusion list includes historical obstacle identifiers and historical motion information corresponding to each of the historical obstacle identifiers, the first obstacle information includes first obstacle identifiers and motion information corresponding to each of the first obstacle identifiers, and the second obstacle information includes second obstacle identifiers and motion information corresponding to each of the second obstacle identifiers. The above information matching module may further be configured to: determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; responding to the fact that a first obstacle identifier is included in the obstacle information fusion list, and updating historical motion information corresponding to the first obstacle identifier in historical obstacle motion information by using motion information corresponding to the first obstacle identifier; and in response to the second obstacle identifier being included in the obstacle information fusion list, updating historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information by using motion information corresponding to the second obstacle identifier.
In some optional implementations of this embodiment, the information matching module may further be configured to: determining the matching degree between the motion information corresponding to each first obstacle identifier and the motion information corresponding to each second obstacle identifier in response to the first obstacle identifier or the second obstacle identifier not being included in the obstacle information fusion list; and determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle in response to the matching degree being greater than or equal to a preset threshold, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the historical obstacle motion information according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
In some optional implementations of this embodiment, the information matching module may further be configured to: and adding the first obstacle information or the second obstacle information in the obstacle information fusion list in response to the matching degree being smaller than a preset threshold.
In some optional implementations of this embodiment, the information fusion unit 504 may be further configured to: filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some optional implementations of this embodiment, the obstacle information fusion list includes a generation time of the motion information corresponding to each historical obstacle identifier. The apparatus 500 may further include an information deletion unit, not shown in fig. 5, which may be configured to: for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset duration; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset duration.
According to the apparatus for detecting an obstacle provided by the embodiment of the application, after the point cloud data and the millimeter wave radar data are acquired, they are processed respectively to obtain the first obstacle information corresponding to the point cloud data and the second obstacle information corresponding to the millimeter wave radar data. The weights of the first obstacle information and the second obstacle information during fusion are then determined according to the two pieces of information, the first obstacle information and the second obstacle information are fused according to the obtained weights, and finally the obstacle is determined according to the fused information. The apparatus of this embodiment makes full use of the point cloud data obtained by the laser radar and the millimeter wave radar data obtained by the millimeter wave radar, and improves the accuracy and sensitivity of obstacle detection.
It should be understood that the units 501 to 505 described in the apparatus 500 for detecting an obstacle correspond to the respective steps in the method described with reference to fig. 1, respectively. Thus, the operations and features described above with respect to the method for detecting an obstacle are equally applicable to the apparatus 500 and the units contained therein, and are not described in detail herein. The corresponding elements of the apparatus 500 may cooperate with elements in a terminal or server to implement aspects of embodiments of the present application.
Referring now to FIG. 6, there is illustrated a schematic diagram of a computer system 600 suitable for use in implementing a terminal device or server in accordance with an embodiment of the present application. The terminal device/server shown in fig. 6 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a machine-readable medium, the computer program comprising program code for performing the method shown in the flow diagrams. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 609, and/or installed from the removable medium 611. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium described in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor including a data acquisition unit, a data processing unit, a weight determination unit, an information fusion unit, and an obstacle determination unit. The names of these units do not, in some cases, constitute limitations on the units themselves; for example, the data acquisition unit may also be described as "a unit that acquires point cloud data and millimeter wave radar data".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist alone without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire point cloud data and millimeter wave radar data; process the point cloud data and the millimeter wave radar data respectively, to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; determine, according to the first obstacle information and the second obstacle information, the weights occupied by the first obstacle information and the second obstacle information in fusion; fuse the first obstacle information and the second obstacle information according to the weights; and determine the obstacle according to the fused obstacle information.
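The program flow described above can be sketched end to end. Everything below is an illustrative stand-in, not the patented implementation: the function names, data shapes, and error values are assumptions made for the example.

```python
# Hypothetical end-to-end sketch of the described flow: process each sensor,
# derive fusion weights from per-sensor errors, fuse, and report the obstacle.
# All names and numeric values here are illustrative assumptions.

def process_point_cloud(points):
    # Stand-in for clustering and tracking: centroid of the returned points
    n = len(points)
    return {"x": sum(p[0] for p in points) / n,
            "y": sum(p[1] for p in points) / n,
            "error": 0.2}  # assumed lidar position error

def process_radar(detections):
    # Stand-in for filtering the millimeter wave radar returns
    x, y = detections[0]
    return {"x": x, "y": y, "error": 0.5}  # assumed radar position error

def fuse_obstacles(first, second):
    # Inverse-error weighting: the lower-error source contributes more
    w1 = second["error"] / (first["error"] + second["error"])
    w2 = 1.0 - w1
    return {"x": w1 * first["x"] + w2 * second["x"],
            "y": w1 * first["y"] + w2 * second["y"]}

def detect_obstacle(points, detections):
    first = process_point_cloud(points)    # first obstacle information
    second = process_radar(detections)     # second obstacle information
    return fuse_obstacles(first, second)   # fused obstacle estimate
```

With the assumed errors, the lidar estimate dominates the fused position, which is the intended effect of weighting by source accuracy.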
The above description is merely an illustration of the preferred embodiments of the present application and of the principles of the technology employed. Persons skilled in the art will appreciate that the scope of the invention referred to in the present application is not limited to technical solutions formed by the specific combinations of the technical features described above, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with technical features disclosed in the present application (but not limited thereto) having similar functions.

Claims (18)

1. A method for detecting an obstacle, the method comprising:
acquiring point cloud data and millimeter wave radar data;
processing the point cloud data and the millimeter wave radar data respectively, to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data;
determining the weights occupied by the first obstacle information and the second obstacle information in fusion according to the errors of the position and the speed in the first obstacle information and the errors of the position and the speed in the second obstacle information;
converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix;
matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, wherein the obstacle information fusion list comprises information of historical obstacles;
fusing the matched first obstacle information and the matched second obstacle information according to the weights;
determining an obstacle according to the fused obstacle information;
wherein the matching of the converted first obstacle information and the converted second obstacle information comprises:
updating the historical obstacles; and/or
determining whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
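The coordinate conversion recited in claim 1 can be illustrated with a planar homogeneous transform. The matrix form and parameter values below are assumptions for illustration; the patent does not disclose the concrete form of the preset conversion matrices.

```python
import math

def make_conversion_matrix(tx, ty, theta):
    # 2D homogeneous transform: rotate by theta, then translate by (tx, ty).
    # Illustrative assumption for a "preset conversion matrix".
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx],
            [s,  c, ty],
            [0.0, 0.0, 1.0]]

def convert(matrix, point):
    # Apply the transform to an obstacle position (x, y)
    x, y = point
    h = (x, y, 1.0)
    return tuple(sum(matrix[i][j] * h[j] for j in range(3)) for i in range(2))
```

Converting the lidar-derived and radar-derived obstacle positions with their respective matrices places both in one common frame before matching.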
2. The method of claim 1, wherein the processing the point cloud data and the millimeter wave radar data, respectively, comprises:
clustering and tracking the point cloud data to obtain the first obstacle information;
and filtering the millimeter wave radar data to obtain the second obstacle information.
3. The method of claim 1, wherein the first obstacle information comprises a first position of the obstacle, a first speed, an error of the first position, and an error of the first speed, and the second obstacle information comprises a second position of the obstacle, a second speed, an error of the second position, and an error of the second speed; and
The determining the weights occupied by the first obstacle information and the second obstacle information in fusion comprises:
determining a first integrated error based on the error of the first position and the error of the first speed;
determining a second integrated error based on the error of the second position and the error of the second speed;
and determining, according to the first integrated error and the second integrated error, a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused.
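One plausible reading of claim 3, combining each source's position and speed errors into an integrated error and then weighting the sources inversely to their errors, can be sketched as follows. The root-sum-square combination and the inverse-error weighting rule are assumptions; the claim does not fix the formulas.

```python
def integrated_error(position_error, speed_error):
    # Assumed combination: root-sum-square of the two error terms
    return (position_error ** 2 + speed_error ** 2) ** 0.5

def fusion_weights(first_error, second_error):
    # Assumed rule: each source's weight is proportional to the OTHER
    # source's integrated error, so the more accurate source dominates;
    # the two weights sum to 1.
    total = first_error + second_error
    return second_error / total, first_error / total
```

For example, a lidar integrated error of 1.0 against a radar integrated error of 3.0 gives the lidar estimate three quarters of the fused weight.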
4. The method of claim 1, wherein the obstacle information fusion list includes a historical obstacle identifier and historical motion information corresponding to the historical obstacle identifier, the first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier; and
the matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list comprises:
determining, according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier, whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list;
in response to the first obstacle identifier being included in the obstacle information fusion list, updating the historical motion information corresponding to the first obstacle identifier in the obstacle information fusion list by adopting the motion information corresponding to the first obstacle identifier;
and in response to the second obstacle identifier being included in the obstacle information fusion list, updating the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list by adopting the motion information corresponding to the second obstacle identifier.
5. The method of claim 4, wherein matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list comprises:
determining a matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier in response to the first obstacle identifier or the second obstacle identifier not being included in the obstacle information fusion list;
and determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle in response to the matching degree being greater than or equal to a preset threshold, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
6. The method of claim 5, wherein matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list comprises:
and in response to the matching degree being smaller than the preset threshold, adding the first obstacle information or the second obstacle information to the obstacle information fusion list.
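Claims 4 to 6 together describe a three-way outcome when incoming obstacle information is matched against the fusion list: update a known identifier, merge two identifiers whose matching degree clears the threshold, or add a new entry. A dictionary-based sketch follows; the list structure, the similarity metric, and the threshold value are all assumptions made for illustration.

```python
MATCH_THRESHOLD = 0.8  # hypothetical preset threshold

def match_score(m1, m2):
    # Toy matching degree: 1 at zero position distance, falling to 0
    d = ((m1["x"] - m2["x"]) ** 2 + (m1["y"] - m2["y"]) ** 2) ** 0.5
    return max(0.0, 1.0 - d)

def match_and_update(fusion_list, obstacle_id, motion):
    if obstacle_id in fusion_list:
        # Known identifier: refresh its historical motion information
        fusion_list[obstacle_id] = motion
        return "updated"
    for known_id, known_motion in fusion_list.items():
        if match_score(motion, known_motion) >= MATCH_THRESHOLD:
            # Same physical obstacle reported under a different identifier
            fusion_list[known_id] = motion
            return "merged"
    # No identifier and no motion match: record a new obstacle
    fusion_list[obstacle_id] = motion
    return "added"
```

The return value distinguishes the three claimed branches, which is convenient when unit-testing the matching logic in isolation.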
7. The method of claim 3, wherein the fusing the first obstacle information and the second obstacle information according to the weights comprises:
filtering the second obstacle information to obtain filtered second obstacle information;
and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
8. The method of claim 1, wherein the obstacle information fusion list includes the generation time of the motion information corresponding to each historical obstacle identifier; and
the method further comprises:
for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset duration;
and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than a preset duration.
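The timeout deletion of claim 8 amounts to pruning fusion-list entries whose generation time is older than a preset duration. The field name and the duration value below are illustrative assumptions.

```python
import time

PRESET_DURATION_S = 2.0  # hypothetical preset duration

def prune_stale_entries(fusion_list, now=None):
    # Delete motion information whose generation time is too far in the past
    now = time.time() if now is None else now
    stale_ids = [obstacle_id for obstacle_id, info in fusion_list.items()
                 if now - info["generated_at"] > PRESET_DURATION_S]
    for obstacle_id in stale_ids:
        del fusion_list[obstacle_id]
    return stale_ids
```

Passing `now` explicitly keeps the function deterministic in tests; in production it would default to the wall clock.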
9. An apparatus for detecting an obstacle, the apparatus comprising:
a data acquisition unit, configured to acquire point cloud data and millimeter wave radar data;
a data processing unit, configured to process the point cloud data and the millimeter wave radar data respectively, to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data;
a weight determination unit, configured to determine the weights occupied by the first obstacle information and the second obstacle information in fusion according to the errors of the position and the speed in the first obstacle information and the errors of the position and the speed in the second obstacle information;
an information fusion unit comprising:
an information conversion module, configured to convert the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix;
an information matching module, configured to match the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, wherein the obstacle information fusion list comprises information of historical obstacles; and
an information fusion module, configured to fuse the matched first obstacle information and the matched second obstacle information; and
an obstacle determination unit, configured to determine an obstacle according to the fused obstacle information;
wherein the information matching module is further configured to:
update the historical obstacles; and/or
determine whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
10. The apparatus of claim 9, wherein the data processing unit is further configured to:
cluster and track the point cloud data to obtain the first obstacle information;
and filter the millimeter wave radar data to obtain the second obstacle information.
11. The apparatus of claim 9, wherein the first obstacle information comprises a first position of the obstacle, a first speed, an error of the first position, and an error of the first speed, and the second obstacle information comprises a second position of the obstacle, a second speed, an error of the second position, and an error of the second speed; and
The weight determination unit includes:
a first error determination module, configured to determine a first integrated error based on the error of the first position and the error of the first speed;
a second error determination module, configured to determine a second integrated error based on the error of the second position and the error of the second speed;
and a weight determination module, configured to determine, according to the first integrated error and the second integrated error, a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused.
12. The apparatus of claim 9, wherein the obstacle information fusion list comprises historical obstacle identifiers and historical motion information corresponding to the historical obstacle identifiers, the first obstacle information comprises a first obstacle identifier and motion information corresponding to the first obstacle identifier, and the second obstacle information comprises a second obstacle identifier and motion information corresponding to the second obstacle identifier; and
The information matching module is further configured to:
determine, according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier, whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list;
in response to the first obstacle identifier being included in the obstacle information fusion list, update the historical motion information corresponding to the first obstacle identifier in the obstacle information fusion list by adopting the motion information corresponding to the first obstacle identifier;
and in response to the second obstacle identifier being included in the obstacle information fusion list, update the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list by adopting the motion information corresponding to the second obstacle identifier.
13. The apparatus of claim 12, wherein the information matching module is further configured to:
determine a matching degree between the motion information corresponding to the first obstacle identifier and the motion information corresponding to the second obstacle identifier in response to the first obstacle identifier or the second obstacle identifier not being included in the obstacle information fusion list;
and determine that the first obstacle identifier and the second obstacle identifier indicate the same obstacle in response to the matching degree being greater than or equal to a preset threshold, and update the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
14. The apparatus of claim 13, wherein the information matching module is further configured to:
and in response to the matching degree being smaller than the preset threshold, add the first obstacle information or the second obstacle information to the obstacle information fusion list.
15. The apparatus of claim 11, wherein the information fusion unit is further configured to:
filter the second obstacle information to obtain filtered second obstacle information;
and fuse the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
16. The apparatus of claim 9, wherein the obstacle information fusion list includes the generation time of the motion information corresponding to each historical obstacle identifier; and
The apparatus further includes an information deletion unit configured to:
determine, for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, whether the time difference between the generation time and the current time is greater than a preset duration;
and delete the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset duration.
17. An electronic device, comprising:
one or more processors;
a storage device, configured to store one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-8.
18. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-8.
CN202110882862.1A 2017-07-04 2017-07-04 Method and device for detecting obstacles Active CN113466822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110882862.1A CN113466822B (en) 2017-07-04 2017-07-04 Method and device for detecting obstacles

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110882862.1A CN113466822B (en) 2017-07-04 2017-07-04 Method and device for detecting obstacles
CN201710541847.4A CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201710541847.4A Division CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Publications (2)

Publication Number Publication Date
CN113466822A CN113466822A (en) 2021-10-01
CN113466822B true CN113466822B (en) 2024-06-25

Family

ID=64993117

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710541847.4A Active CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles
CN202110882862.1A Active CN113466822B (en) 2017-07-04 2017-07-04 Method and device for detecting obstacles

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201710541847.4A Active CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Country Status (1)

Country Link
CN (2) CN109212532B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109738884B (en) * 2018-12-29 2022-03-11 百度在线网络技术(北京)有限公司 Object detection method and device and computer equipment
CN109901183A (en) * 2019-03-13 2019-06-18 电子科技大学中山学院 Method for improving all-weather distance measurement precision and reliability of laser radar
CN111923898B (en) * 2019-05-13 2022-05-06 广州汽车集团股份有限公司 Obstacle detection method and device
CN110517483B (en) * 2019-08-06 2021-05-18 新奇点智能科技集团有限公司 Road condition information processing method and digital rail side unit
CN110658531B (en) * 2019-08-23 2022-03-29 畅加风行(苏州)智能科技有限公司 Dynamic target tracking method for port automatic driving vehicle
CN110531376B (en) * 2019-08-23 2022-04-22 畅加风行(苏州)智能科技有限公司 Obstacle detection and tracking method for port unmanned vehicle
CN110502019A (en) * 2019-09-06 2019-11-26 北京云迹科技有限公司 A kind of barrier-avoiding method and device of Indoor Robot
CN110796705B (en) * 2019-10-23 2022-10-11 北京百度网讯科技有限公司 Model error elimination method, device, equipment and computer readable storage medium
CN110866544B (en) * 2019-10-28 2022-04-15 杭州飞步科技有限公司 Sensor data fusion method and device and storage medium
CN110794406B (en) * 2019-11-12 2022-08-02 北京经纬恒润科技股份有限公司 Multi-source sensor data fusion system and method
WO2021051726A1 (en) * 2020-01-06 2021-03-25 深圳市速腾聚创科技有限公司 Method and apparatus for processing point cloud data, storage medium, and lidar system
CN112001287B (en) * 2020-08-17 2023-09-12 禾多科技(北京)有限公司 Point cloud information generation method and device for obstacle, electronic equipment and medium
CN112378050B (en) * 2020-11-10 2021-09-14 珠海格力电器股份有限公司 Control method and device for air conditioning equipment, electronic equipment and storage medium
CN113296120B (en) * 2021-05-24 2023-05-12 福建盛海智能科技有限公司 Obstacle detection method and terminal

Citations (2)

Publication number Priority date Publication date Assignee Title
CN105109484A (en) * 2015-08-21 2015-12-02 奇瑞汽车股份有限公司 Target-barrier determining method and device
CN106291736A (en) * 2016-08-16 2017-01-04 张家港长安大学汽车工程研究院 Pilotless automobile track dynamic disorder object detecting method

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
JP3669205B2 (en) * 1999-05-17 2005-07-06 日産自動車株式会社 Obstacle recognition device
JP2005175603A (en) * 2003-12-08 2005-06-30 Suzuki Motor Corp Method and system for displaying obstacle using radar
US7138938B1 (en) * 2005-05-06 2006-11-21 Ford Global Technologies, Llc System and method for preemptively sensing an object and selectively operating both a collision countermeasure system and a parking assistance system aboard an automotive vehicle
CN101975951B (en) * 2010-06-09 2013-03-20 北京理工大学 Field environment barrier detection method fusing distance and image information
CN103176185B (en) * 2011-12-26 2015-01-21 上海汽车集团股份有限公司 Method and system for detecting road barrier
JP5991332B2 (en) * 2014-02-05 2016-09-14 トヨタ自動車株式会社 Collision avoidance control device
JP6190758B2 (en) * 2014-05-21 2017-08-30 本田技研工業株式会社 Object recognition device and vehicle
JP2016008922A (en) * 2014-06-25 2016-01-18 株式会社東芝 Sensor information fusion device
JP6303975B2 (en) * 2014-10-22 2018-04-04 株式会社デンソー Obstacle alarm device
DE102015112443A1 (en) * 2015-07-30 2017-02-02 Connaught Electronics Ltd. Method for determining a movement of a motor vehicle by means of fusion of odometry data, driver assistance system and motor vehicle
US9718404B2 (en) * 2015-10-01 2017-08-01 Ford Global Technologies, LLCS Parking obstruction locator and height estimator
US10229363B2 (en) * 2015-10-19 2019-03-12 Ford Global Technologies, Llc Probabilistic inference using weighted-integrals-and-sums-by-hashing for object tracking
CN105629985B (en) * 2016-03-20 2018-09-04 北京工业大学 The three-dimensional obstacle avoidance system of 360 ° of indoor quadrotor drone
CN106908783B (en) * 2017-02-23 2019-10-01 苏州大学 Based on obstacle detection method combined of multi-sensor information

Also Published As

Publication number Publication date
CN113466822A (en) 2021-10-01
CN109212532A (en) 2019-01-15
CN109212532B (en) 2021-08-20

Similar Documents

Publication Publication Date Title
CN113466822B (en) Method and device for detecting obstacles
CN109212530B (en) Method and apparatus for determining velocity of obstacle
CN110687549B (en) Obstacle detection method and device
EP3875985B1 (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN109188438B (en) Yaw angle determination method, device, equipment and medium
CN110654381B (en) Method and device for controlling a vehicle
CN110696826B (en) Method and device for controlling a vehicle
CN112630799B (en) Method and apparatus for outputting information
CN114111774B (en) Vehicle positioning method, system, equipment and computer readable storage medium
CN115339453B (en) Vehicle lane change decision information generation method, device, equipment and computer medium
CN112578781B (en) Data processing method, device, chip system and medium
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN113119999A (en) Method, apparatus, device, medium, and program product for determining automatic driving characteristics
CN112558036B (en) Method and device for outputting information
CN115512336B (en) Vehicle positioning method and device based on street lamp light source and electronic equipment
CN114724116B (en) Vehicle traffic information generation method, device, equipment and computer readable medium
CN112526477B (en) Method and device for processing information
CN116168366B (en) Point cloud data generation method, model training method, target detection method and device
CN114620055B (en) Road data processing method and device, electronic equipment and automatic driving vehicle
CN115848358B (en) Vehicle parking method, device, electronic equipment and computer readable medium
WO2023283469A1 (en) Multi-channel object matching
CN116229418A (en) Information fusion method, device, equipment and storage medium
CN117622118A (en) Method, device, equipment, medium and vehicle for determining obstacle orientation
CN114527758A (en) Path planning method and device, equipment, medium and product
CN114049615A (en) Traffic object fusion association method and device in driving environment and edge computing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant