CN113466822A - Method and apparatus for detecting obstacles - Google Patents

Method and apparatus for detecting obstacles

Info

Publication number
CN113466822A
CN113466822A (application CN202110882862.1A)
Authority
CN
China
Prior art keywords
obstacle
information
identifier
obstacle information
error
Prior art date
Legal status
Granted
Application number
CN202110882862.1A
Other languages
Chinese (zh)
Other versions
CN113466822B (en)
Inventor
惠育江
王军
王亮
Current Assignee
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd filed Critical Baidu Online Network Technology Beijing Co Ltd
Priority to CN202110882862.1A
Publication of CN113466822A
Application granted
Publication of CN113466822B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

Methods and apparatus for detecting obstacles are disclosed. One embodiment of the method comprises: acquiring point cloud data and millimeter wave radar data; processing the point cloud data and the millimeter wave radar data respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; determining, according to the first obstacle information and the second obstacle information, the weights the two pieces of information carry during fusion; fusing the first obstacle information and the second obstacle information according to the weights; and determining the obstacle according to the fused obstacle information. This embodiment makes full use of both the point cloud data obtained by the laser radar and the data obtained by the millimeter wave radar, improving the accuracy and the sensitivity of obstacle detection.

Description

Method and apparatus for detecting obstacles
This application is a divisional of Chinese patent application No. CN201710541847.4, filed on July 4, 2017 and entitled 'Method and device for detecting obstacles'.
Technical Field
The present application relates to the field of vehicle driving, in particular to the field of obstacle detection, and more particularly to a method and apparatus for detecting an obstacle.
Background
Millimeter wave radar and laser radar have been widely used in automatic driving, ADAS (Advanced Driver Assistance Systems), and related fields. Laser radar can perceive the shape of an obstacle accurately, while millimeter wave radar provides effective sensing data on an obstacle's position and speed and is resistant to interference from rain and snow. Because different sensors have complementary advantages, fusing their data effectively can improve the safety of automatic driving.
Existing fusion methods suffer from low precision and large estimation errors.
Disclosure of Invention
The present application aims to provide a method and apparatus for detecting obstacles to solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides a method for detecting an obstacle, where the method includes: acquiring point cloud data and millimeter wave radar data; respectively processing the point cloud data and the millimeter wave radar data to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; determining the weight occupied by the first obstacle information and the second obstacle information during fusion according to the first obstacle information and the second obstacle information; fusing the first obstacle information and the second obstacle information according to the weight; determining an obstacle according to the fused obstacle information; the fusing the first obstacle information and the second obstacle information according to the weight includes: converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, wherein the obstacle information fusion list comprises information of historical obstacles; fusing the matched first obstacle information and the matched second obstacle information; the matching the converted first obstacle information and the converted second obstacle information includes: updating the historical obstacle; and/or determining whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
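The sequence of steps in this first aspect can be illustrated end to end with a toy sketch. Every function body, data value, and error figure below is an invented stand-in for illustration; the application does not prescribe these implementations:

```python
import numpy as np

def process_point_cloud(frames):
    # Stand-in for clustering + tracking the lidar returns:
    # simply average them into one obstacle state.
    state = np.mean(frames, axis=0)
    return {"state": state, "error": 0.2}   # error value is invented

def process_radar(frames):
    # Stand-in for filtering the millimeter-wave radar track.
    state = np.mean(frames, axis=0)
    return {"state": state, "error": 0.8}

def fuse(first, second):
    # Weight each source inversely to its error, then mix the states.
    w1 = second["error"] / (first["error"] + second["error"])
    return w1 * first["state"] + (1.0 - w1) * second["state"]

lidar = [[10.0, 2.0], [10.2, 2.0]]     # toy [x, y] lidar returns
radar = [[10.6, 2.2], [10.6, 2.2]]     # toy [x, y] radar returns
fused = fuse(process_point_cloud(lidar), process_radar(radar))
```

Because the lidar's (invented) error is smaller, the fused state lies closer to the lidar estimate than to the radar estimate.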
In some embodiments, the processing the point cloud data and the millimeter wave radar data respectively includes: clustering and tracking the point cloud data to obtain first obstacle information; and filtering the millimeter wave radar data to obtain second obstacle information.
In some embodiments, the first obstacle information includes a first position, a first velocity, an error of the first position, and an error of the first velocity of the obstacle, and the second obstacle information includes a second position, a second velocity, an error of the second position, and an error of the second velocity of the obstacle; and the determining the weight occupied by the first obstacle information and the second obstacle information during fusion includes: determining a first composite error according to the error of the first position and the error of the first speed; determining a second composite error according to the error of the second position and the error of the second speed; and determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first comprehensive error and the second comprehensive error.
In some embodiments, the fusing the first obstacle information and the second obstacle information according to the weight includes: converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list; and fusing the matched first obstacle information and the matched second obstacle information.
In some embodiments, the obstacle information fusion list includes historical obstacle identifiers and historical movement information corresponding to the historical obstacle identifiers, the first obstacle information includes a first obstacle identifier and movement information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and movement information corresponding to the second obstacle identifier; and the matching of the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list includes: determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; in response to the obstacle information fusion list including the first obstacle identifier, updating historical movement information corresponding to the first obstacle identifier in the historical obstacle movement information by adopting movement information corresponding to the first obstacle identifier; and in response to the obstacle information fusion list including the second obstacle identifier, updating the historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information by adopting the movement information corresponding to the second obstacle identifier.
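The identifier-matching branch described above might look as follows in a minimal sketch; the dictionary layout, field names, and replace-on-match update rule are assumptions for illustration, not details taken from the application:

```python
def update_fusion_list(fusion_list, obstacle_id, movement):
    """If the identifier is already tracked, overwrite its historical
    movement information with the newly observed movement information."""
    if obstacle_id in fusion_list:
        fusion_list[obstacle_id] = movement
        return True   # matched a historical obstacle
    return False      # identifier not yet in the list

# Hypothetical fusion list keyed by obstacle identifier.
fusion_list = {"lidar-7": {"pos": (3.0, 1.0), "vel": (4.0, 0.0)}}
matched = update_fusion_list(
    fusion_list, "lidar-7", {"pos": (3.4, 1.0), "vel": (4.1, 0.0)}
)
```

The same routine would be applied once with the first obstacle identifier and once with the second, as the claim describes.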
In some embodiments, the matching the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list includes: in response to the obstacle information fusion list not including the first obstacle identifier or the second obstacle identifier, determining a matching degree between the movement information corresponding to the first obstacle identifier and the movement information corresponding to the second obstacle identifier; and in response to the matching degree being greater than or equal to a preset threshold value, determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle, and updating historical movement information corresponding to the first obstacle identifier or historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information according to the movement information corresponding to the first obstacle identifier or the movement information corresponding to the second obstacle identifier.
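When neither identifier is in the fusion list, matching falls back to comparing the movement information itself. The application does not specify how the matching degree is computed; one plausible stand-in is an inverse function of the Euclidean distance between the two motion states:

```python
import math

def matching_degree(m1, m2):
    """Map the Euclidean distance between two [x, y, vx, vy] states
    to a score in (0, 1]; identical states score 1.0."""
    dist = math.dist(m1, m2)
    return 1.0 / (1.0 + dist)

THRESHOLD = 0.5  # hypothetical preset threshold

a = [10.0, 2.0, 5.0, 0.0]   # movement info for the first identifier
b = [10.2, 2.0, 5.1, 0.0]   # movement info for the second identifier
same_obstacle = matching_degree(a, b) >= THRESHOLD
```

A score at or above the threshold would mean the two identifiers indicate the same obstacle; below it, a new entry would be added to the list.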
In some embodiments, the matching of the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list includes: in response to the matching degree being smaller than the preset threshold value, adding the first obstacle information or the second obstacle information to the obstacle information fusion list.
In some embodiments, the fusing the first obstacle information and the second obstacle information according to the weight includes: filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some embodiments, the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier; and the above method further comprises: determining whether the time difference between the generation time and the current time is greater than a preset time for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset time.
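The age-based cleanup described above can be sketched as follows; the `generated_at` field name and the dictionary layout are hypothetical:

```python
def prune_fusion_list(fusion_list, now, max_age):
    """Drop entries whose motion information is older than max_age
    seconds, mirroring the time-difference check described above."""
    return {
        obstacle_id: entry
        for obstacle_id, entry in fusion_list.items()
        if now - entry["generated_at"] <= max_age
    }

fusion_list = {
    "obs-1": {"generated_at": 100.0, "vel": (4.0, 0.0)},
    "obs-2": {"generated_at": 95.0, "vel": (1.0, 0.5)},
}
kept = prune_fusion_list(fusion_list, now=101.0, max_age=2.0)
```

With these toy timestamps, the entry last updated 6 seconds ago is deleted and the entry updated 1 second ago survives, so stale obstacles do not linger in the list.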
In a second aspect, an embodiment of the present application provides an apparatus for detecting an obstacle, where the apparatus includes: a data acquisition unit for acquiring point cloud data and millimeter wave radar data; a data processing unit for respectively processing the point cloud data and the millimeter wave radar data to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; a weight determination unit configured to determine a weight occupied by the first obstacle information and the second obstacle information when the first obstacle information and the second obstacle information are fused, based on the first obstacle information and the second obstacle information; an information fusion unit configured to fuse the first obstacle information and the second obstacle information according to the weight; an obstacle determining unit for determining an obstacle according to the fused obstacle information; the information fusion unit includes: an information conversion module for converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; an information matching module for matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, where the obstacle information fusion list comprises information of historical obstacles; an information fusion module for fusing the matched first obstacle information and the matched second obstacle information; the information matching module is further configured to: update the historical obstacle; and/or determine whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
In some embodiments, the data processing unit is further configured to: clustering and tracking the point cloud data to obtain first obstacle information; and filtering the millimeter wave radar data to obtain second obstacle information.
In some embodiments, the first obstacle information includes a first position, a first velocity, an error of the first position, and an error of the first velocity of the obstacle, and the second obstacle information includes a second position, a second velocity, an error of the second position, and an error of the second velocity of the obstacle; and the weight determination unit includes: the first error determination module is used for determining a first comprehensive error according to the error of the first position and the error of the first speed; a second error determination module for determining a second composite error according to the error of the second position and the error of the second speed; and the weight determining module is used for determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first comprehensive error and the second comprehensive error.
In some embodiments, the information fusion unit includes: the information conversion module is used for converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix; the information matching module is used for matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list; and the information fusion module is used for fusing the matched first obstacle information and the matched second obstacle information.
In some embodiments, the obstacle information fusion list includes historical obstacle identifiers and historical movement information corresponding to the historical obstacle identifiers, the first obstacle information includes a first obstacle identifier and movement information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and movement information corresponding to the second obstacle identifier; and the information matching module is further configured to: determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; in response to the obstacle information fusion list including the first obstacle identifier, updating historical movement information corresponding to the first obstacle identifier in the historical obstacle movement information by adopting movement information corresponding to the first obstacle identifier; and in response to the obstacle information fusion list including the second obstacle identifier, updating the historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information by adopting the movement information corresponding to the second obstacle identifier.
In some embodiments, the information matching module is further configured to: in response to the obstacle information fusion list not including the first obstacle identifier or the second obstacle identifier, determining a matching degree between the movement information corresponding to the first obstacle identifier and the movement information corresponding to the second obstacle identifier; and in response to the matching degree being greater than or equal to a preset threshold value, determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle, and updating historical movement information corresponding to the first obstacle identifier or historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information according to the movement information corresponding to the first obstacle identifier or the movement information corresponding to the second obstacle identifier.
In some embodiments, the information matching module is further configured to: in response to the matching degree being smaller than the preset threshold value, add the first obstacle information or the second obstacle information to the obstacle information fusion list.
In some embodiments, the information fusion unit is further configured to: filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some embodiments, the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier; and the apparatus further comprises an information deleting unit configured to: determining whether the time difference between the generation time and the current time is greater than a preset time for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset time.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a storage device, configured to store one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the method described in any of the above embodiments.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method described in any of the above embodiments.
According to the method and apparatus for detecting obstacles provided above, after the point cloud data and the millimeter wave radar data are acquired, the two are processed respectively to obtain first obstacle information corresponding to the point cloud data and second obstacle information corresponding to the millimeter wave radar data; the weights the first obstacle information and the second obstacle information carry during fusion are then determined from the two pieces of information; the first obstacle information and the second obstacle information are fused according to those weights; and finally the obstacle is determined from the fused information. The method makes full use of both the point cloud data obtained by the laser radar and the data obtained by the millimeter wave radar, improving the accuracy and the sensitivity of obstacle detection.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a flow diagram of one embodiment of a method for detecting obstacles according to the present application;
FIG. 2 is an exemplary system architecture diagram to which the present application may be applied;
FIG. 3 is a schematic diagram of one application scenario of a method for detecting obstacles according to the present application;
FIG. 4 is a flow diagram of one embodiment of fusing first obstacle information and second obstacle information in a method for detecting obstacles according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of an apparatus for detecting obstacles according to the present application;
FIG. 6 is a schematic structural diagram of a computer system suitable for implementing the terminal device or the server according to the embodiments of the present application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein serve only to illustrate the invention and do not limit it. It should also be noted that, for convenience of description, only the portions related to the invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows a flow 100 of one embodiment of a method for detecting an obstacle according to the present application. The method for detecting an obstacle of the present embodiment includes the steps of:
step 101, point cloud data and millimeter wave radar data are obtained.
In this embodiment, the point cloud data may come from a laser radar and the millimeter wave radar data from a millimeter wave radar. Each data stream may comprise multiple frames; each frame has its own generation time and contains information on the obstacles present in the driving environment of the vehicle. The laser radar and millimeter wave radar may be mounted on a vehicle, such as an unmanned or autonomous vehicle. An obstacle may be another vehicle, a pedestrian, or any other object obstructing the vehicle's travel.
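The multi-frame layout described here might be represented as in the following sketch; the class and field names are illustrative only, not part of the application:

```python
from dataclasses import dataclass, field

@dataclass
class SensorFrame:
    """One frame of lidar or millimeter-wave radar output."""
    generated_at: float                            # per-frame generation time (seconds)
    obstacles: list = field(default_factory=list)  # raw obstacle returns in this frame

# Frames from the two sensors arrive with their own timestamps.
lidar_frames = [SensorFrame(0.00, [(10.0, 2.0)]),
                SensorFrame(0.10, [(10.5, 2.0)])]
radar_frames = [SensorFrame(0.05, [(10.2, 2.1)])]
```

Keeping a per-frame timestamp is what later allows the fusion list to discard motion information that has grown stale.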
The method for detecting an obstacle of the present embodiment is generally performed by a terminal or a server that may be communicatively connected to a vehicle. When acquiring the point cloud data and the millimeter wave radar data, the terminal or server may obtain them directly from a storage device connected to each sensor (the laser radar or the millimeter wave radar) of the vehicle, or may use locally stored data. It is understood that when the method of this embodiment is performed by a terminal, the terminal may be mounted on the vehicle.
When the method of the present embodiment is executed by a server, the server needs to acquire point cloud data and millimeter wave radar data from a vehicle, and the corresponding system architecture diagram is shown in fig. 2. In fig. 2, the system architecture 200 may include a vehicle 201, a network 202, and a server 203. Network 202 serves as a medium for providing a communication link between vehicle 201 and server 203. Network 202 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The vehicle 201 may be mounted with a laser radar and a millimeter wave radar, and may collect point cloud data and millimeter wave radar data of obstacles in the running environment of the vehicle.
The server 203 may be a server that provides various services, such as a background server that processes point cloud data and millimeter wave radar data of the vehicle 201. The background server may obtain the point cloud data and the millimeter wave radar data of the vehicle 201, and analyze the point cloud data and the millimeter wave radar data to obtain the obstacle in the driving environment where the vehicle 201 is located.
It should be noted that the method for detecting an obstacle provided in the embodiment of the present application is generally performed by the server 203, and accordingly, the apparatus for detecting an obstacle is generally disposed in the server 203.
It should be understood that the number of vehicles, networks, and servers in FIG. 2 is merely illustrative. There may be any number of vehicles, networks, and servers, as desired for implementation.
Returning to FIG. 1, in step 102, the point cloud data and the millimeter wave radar data are processed respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data.
In this embodiment, after the point cloud data and the millimeter wave radar data are obtained, the two may be processed separately to obtain the first obstacle information indicated by the point cloud data and the second obstacle information indicated by the millimeter wave radar data. The processing applied to each may be the same or different: for example, both streams may be filtered, or the point cloud data may be clustered while the millimeter wave radar data is filtered. The contents of the first obstacle information and the second obstacle information may likewise be the same or different. In this embodiment, both may include the obstacle's position, speed, position error, and speed error, and the first obstacle information may additionally include the obstacle's classification and contour information.
In some optional implementation manners of this embodiment, the processing of the point cloud data and the millimeter wave radar data in step 102 may be implemented by the following steps that are not shown in fig. 1: clustering and tracking the point cloud data to obtain first obstacle information; and filtering the millimeter wave radar data to obtain second obstacle information.
In this implementation, the point cloud data can be clustered (segmenting and classifying the point cloud to determine the obstacles in the driving environment) and tracked (determining the position and speed of each detected obstacle) to obtain the first obstacle information, and the millimeter wave radar data can be filtered to obtain the second obstacle information. The filtering may include, but is not limited to, Kalman filtering, extended Kalman filtering, and unscented Kalman filtering (Unscented Kalman Filter, UKF).
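As a deliberately simplified illustration of the filtering step, a scalar Kalman filter smoothing a noisy one-dimensional radar range track might look like this. The noise parameters and measurements are invented for the example; a real system would track a multi-dimensional state, and would use the extended or unscented variants for nonlinear motion models:

```python
def kalman_1d(measurements, q=0.01, r=0.5):
    """Scalar Kalman filter: smooth a noisy 1-D range track.
    q is process noise, r is measurement noise (both assumed)."""
    x, p = measurements[0], 1.0     # initial state estimate and variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                      # predict: variance grows by process noise
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update toward the new measurement
        p *= (1.0 - k)              # variance shrinks after the update
        estimates.append(x)
    return estimates

smoothed = kalman_1d([10.0, 10.6, 9.7, 10.3, 10.1])
```

Each update is a convex combination of the previous estimate and the new measurement, so the smoothed track always stays within the range of the raw readings.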
Step 103, determining the weight occupied by the first obstacle information and the second obstacle information during fusion according to the first obstacle information and the second obstacle information.
In this embodiment, the position error and the velocity error may be treated as the sensing noise of the corresponding hardware device. For example, the position and velocity errors in the first obstacle information may be regarded as the sensing noise of the laser radar, and those in the second obstacle information as the sensing noise of the millimeter wave radar. When fusing the first obstacle information and the second obstacle information, the terminal or server may determine their weights from this sensing noise.
In some optional implementations of this embodiment, the first obstacle information includes a first position, a first velocity, an error of the first position, and an error of the first velocity of the obstacle, and the second obstacle information includes a second position, a second velocity, an error of the second position, and an error of the second velocity of the obstacle. The above step 103 can be specifically realized by the following steps not shown in fig. 1: determining a first composite error according to the error of the first position and the error of the first speed; determining a second composite error according to the error of the second position and the error of the second speed; and determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first comprehensive error and the second comprehensive error.
In this embodiment, in order to comprehensively consider the influence of the position and velocity errors on the fusion result when fusing the first obstacle information and the second obstacle information, a first composite error may be determined from the error of the first position and the error of the first velocity, and a second composite error may be determined from the error of the second position and the error of the second velocity. In determining a composite error, the error of the position and the error of the velocity may be weighted and superimposed. Finally, a first weight of the first obstacle information and a second weight of the second obstacle information are determined according to the first composite error and the second composite error. It can be understood that, in this implementation, the weight of a piece of obstacle information is inversely proportional to its own error, that is, the larger the error, the smaller the weight that piece of information occupies, which is beneficial to improving the accuracy of the fused obstacle information.
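One way to realize this inverse-error weighting can be sketched as follows. The equal weighting of position and velocity errors inside each composite error, and the specific error values, are illustrative assumptions; the patent only requires some weighted superposition.

```python
# Sketch of step 103: composite error per sensor, then weights inversely
# proportional to composite error, normalized to sum to 1.

def fusion_weights(pos_err1, vel_err1, pos_err2, vel_err2):
    e1 = 0.5 * pos_err1 + 0.5 * vel_err1   # first composite error (e.g. lidar)
    e2 = 0.5 * pos_err2 + 0.5 * vel_err2   # second composite error (e.g. radar)
    # larger composite error -> smaller weight
    w1 = (1.0 / e1) / (1.0 / e1 + 1.0 / e2)
    return w1, 1.0 - w1

# illustrative errors: the lidar estimate is less noisy than the radar's
w1, w2 = fusion_weights(0.2, 0.1, 0.5, 0.4)
```

Here the lidar's smaller errors give it the larger share of the fused result, as described above.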
And step 104, fusing the first obstacle information and the second obstacle information according to the weight.
The terminal or the server may fuse the first obstacle information and the second obstacle information according to the weight after determining the weight of the first obstacle information and the weight of the second obstacle information.
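The fusion of step 104 can be sketched as a per-component weighted combination of the two sensors' state estimates. The `(position, velocity)` tuple layout and the concrete weights are illustrative assumptions.

```python
# Step 104 sketch: element-wise weighted average of two state estimates.
# state = (position, velocity); w1 and w2 would come from step 103.

def fuse(state1, state2, w1, w2):
    return tuple(w1 * a + w2 * b for a, b in zip(state1, state2))

# illustrative: lidar estimate (10.0 m, 2.0 m/s), radar (10.4 m, 1.6 m/s)
fused = fuse((10.0, 2.0), (10.4, 1.6), 0.75, 0.25)
```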
And step 105, determining the obstacle according to the fused obstacle information.
After the first obstacle information and the second obstacle information are fused, the obstacle may be determined based on the fused obstacle information. It is understood that the merged obstacle information may include specific information such as classification, contour, position, speed, driving direction of the obstacle, and the terminal or the server may specifically determine the obstacle after obtaining the information, so as to provide a driving strategy.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for detecting an obstacle according to the present embodiment. In the application scenario of fig. 3, a laser radar 311 and a millimeter wave radar 312 are installed on the vehicle 31, wherein the laser radar 311 may obtain point cloud data by scanning the driving environment in which the vehicle 31 is located, and the millimeter wave radar 312 may obtain millimeter wave radar data. The vehicle 31 sends the point cloud data and the millimeter wave radar data to the server 33; the server 33 processes the data to obtain first obstacle information and second obstacle information, determines a first weight and a second weight, fuses the two pieces of obstacle information accordingly, and determines the obstacle. The determined obstacle information is sent to the vehicle 31, and by analyzing the information sent by the server 33, the vehicle 31 determines that a pedestrian 32 is ahead, so that a driving strategy is formulated to avoid colliding with the pedestrian 32.
According to the method for detecting an obstacle provided by this embodiment of the application, after the point cloud data and the millimeter wave radar data are acquired, they are processed separately to obtain first obstacle information corresponding to the point cloud data and second obstacle information corresponding to the millimeter wave radar data. The weights that the first obstacle information and the second obstacle information occupy during fusion are then determined from the two pieces of information, the information is fused according to these weights, and finally the obstacle is determined from the fused information. The method makes full use of both the point cloud data obtained by the laser radar and the millimeter wave radar data obtained by the millimeter wave radar, improving the accuracy and the sensitivity of obstacle detection.
With continued reference to fig. 4, a flow 400 of fusing first obstacle information and second obstacle information in a method for detecting obstacles according to the present application is shown. As shown in fig. 4, in this embodiment, fusing the first obstacle information and the second obstacle information may be implemented by the following steps:
step 401, converting the first obstacle information and the second obstacle information to the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix.
In this embodiment, the first obstacle information is obtained by processing point cloud data, the point cloud data is collected by a laser radar, and the first obstacle information is located in a laser radar coordinate system. Similarly, the second obstacle information is located in the millimeter wave radar coordinate system. When the first obstacle information and the second obstacle information are fused, the first obstacle information and the second obstacle information need to be firstly converted into the same coordinate system, for example, the first obstacle information may be converted into a millimeter wave radar coordinate system, the second obstacle information may be converted into a laser radar coordinate system, or both the first obstacle information and the second obstacle information may be converted into a world coordinate system. The first transformation matrix may be a transformation matrix between a laser radar coordinate system and a world coordinate system, or may be a transformation matrix between a laser radar coordinate system and a millimeter wave radar coordinate system. Similarly, the second transformation matrix may be a transformation matrix between the millimeter-wave radar coordinate system and the laser radar coordinate system, or may be a transformation matrix between the millimeter-wave radar coordinate system and the world coordinate system.
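A conversion between such coordinate systems is commonly expressed as a homogeneous transformation matrix. The sketch below shows a 2-D version; the rotation angle and translation are hypothetical calibration values, not figures from the patent, and a real system would typically use 3-D transforms.

```python
# Step 401 sketch: homogeneous 2-D transform taking a point from one
# sensor's coordinate system into another's.
import math

def make_transform(theta, tx, ty):
    # rotation by theta plus translation (tx, ty), as a 3x3 matrix
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def apply_transform(m, point):
    x, y = point
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

# hypothetical calibration: radar frame rotated 90 degrees and offset
# 1 m along x relative to the lidar frame
T = make_transform(math.pi / 2, 1.0, 0.0)
p = apply_transform(T, (2.0, 0.0))
```

The "first conversion matrix" and "second conversion matrix" of the patent would be fixed matrices of this kind, obtained by extrinsic calibration of the two sensors.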
And step 402, matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list.
In this embodiment, after the first obstacle information and the second obstacle information are converted into the same coordinate system, the converted first obstacle information and the converted second obstacle information may be matched in combination with a preset obstacle information fusion list. The obstacle information fusion list may include information of historical obstacles detected by the laser radar and the millimeter wave radar. Matching the converted first obstacle information and the converted second obstacle information with the obstacle information fusion list may update the historical obstacles on the one hand, and may determine whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle on the other hand.
In some optional implementations of the present embodiment, the obstacle information fusion list includes historical obstacle identifiers and historical movement information corresponding to the historical obstacle identifiers. The first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier. The second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier. When a plurality of historical obstacle identifiers are included in the obstacle information fusion list, each historical obstacle identifier corresponds to one piece of historical movement information. Similarly, when there are a plurality of first obstacle identifiers, each first obstacle identifier corresponds to one piece of motion information. The step 402 can be implemented by the following steps not shown in fig. 4:
determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; in response to the obstacle information fusion list including the first obstacle identifier, updating historical movement information corresponding to the first obstacle identifier in the historical obstacle movement information by adopting movement information corresponding to the first obstacle identifier; and in response to the obstacle information fusion list including the second obstacle identifier, updating the historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information by adopting the movement information corresponding to the second obstacle identifier.
The obstacle identifiers are used to distinguish different obstacles; that is, the same radar sensor assigns different identifiers to different obstacles, and different radar sensors assign different identifiers to the same obstacle. In this implementation, whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list can be determined based on the historical obstacle identifiers, the first obstacle identifier, and the second obstacle identifier. If the obstacle information fusion list includes the first obstacle identifier, the obstacle indicated by the first obstacle identifier has been detected by the laser radar before; in this case, the historical movement information corresponding to the first obstacle identifier is updated with the movement information corresponding to the first obstacle identifier. Similarly, if the obstacle information fusion list includes the second obstacle identifier, the obstacle indicated by the second obstacle identifier has been detected by the millimeter wave radar before, and the historical movement information corresponding to the second obstacle identifier is updated with the movement information corresponding to the second obstacle identifier. The historical movement information may include the historical position, historical speed, and the like of the obstacle.
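The identifier-matching branch of step 402 can be sketched as a lookup-and-refresh on the fusion list. Representing the list as a dict keyed by obstacle identifier is an assumption made for illustration.

```python
# Sketch: if an incoming identifier already appears in the fusion list,
# its historical movement information is refreshed with the new reading.

def update_fusion_list(fusion_list, obstacle_id, motion):
    if obstacle_id in fusion_list:
        fusion_list[obstacle_id] = motion   # known obstacle: update history
        return True
    return False                            # unseen identifier: handled by
                                            # the matching-degree branch

# hypothetical entry previously written by the lidar pipeline
fusion_list = {"lidar_7": {"pos": (9.8, 0.1), "vel": (1.9, 0.0)}}
seen = update_fusion_list(fusion_list, "lidar_7",
                          {"pos": (10.0, 0.1), "vel": (2.0, 0.0)})
```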
In some optional implementations of this embodiment, the step 402 may further include the following steps not shown in fig. 4:
in response to the obstacle information fusion list not including the first obstacle identifier or the second obstacle identifier, determining a matching degree between the movement information corresponding to the first obstacle identifier and the movement information corresponding to the second obstacle identifier; and in response to the matching degree being greater than or equal to a preset threshold value, determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle, and updating historical movement information corresponding to the first obstacle identifier or historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information according to the movement information corresponding to the first obstacle identifier or the movement information corresponding to the second obstacle identifier.
When the first obstacle identifier or the second obstacle identifier is not included in the obstacle information fusion list, the obstacle indicated by the first obstacle identifier has not been detected by the laser radar before, or the obstacle indicated by the second obstacle identifier has not been detected by the millimeter wave radar before. In this case, a matching degree between the movement information corresponding to the first obstacle identifier and the movement information corresponding to the second obstacle identifier may be determined. The movement information may include position information and velocity information of the obstacle, and the matching degree may be characterized by the Euclidean distance, Manhattan distance, or Mahalanobis distance between the two positions, or by the distance between the two velocity vectors. When the matching degree is characterized by the distance between the two positions, a smaller distance corresponds to a higher matching degree. When the matching degree is greater than or equal to the preset threshold, the first obstacle identifier and the second obstacle identifier are considered to indicate the same obstacle, and the historical movement information corresponding to the first obstacle identifier or the second obstacle identifier in the obstacle information fusion list may be updated with the movement information corresponding to the first obstacle identifier or the movement information corresponding to the second obstacle identifier.
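A minimal version of this matching test follows, using the Euclidean distance between the two reported positions. The conversion `1 / (1 + d)`, which turns a smaller distance into a higher matching degree, and the threshold value are illustrative choices, not specified in the patent.

```python
# Sketch of the matching-degree test between a lidar detection and a
# radar detection whose identifiers are absent from the fusion list.
import math

def matching_degree(pos1, pos2):
    d = math.dist(pos1, pos2)    # Euclidean distance between the positions
    return 1.0 / (1.0 + d)       # smaller distance -> higher matching degree

THRESHOLD = 0.5                  # hypothetical preset threshold

# two nearby detections: likely the same obstacle
same_obstacle = matching_degree((10.0, 0.1), (10.3, 0.2)) >= THRESHOLD
```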
In some optional implementations of this embodiment, the step 402 may further include the following steps not shown in fig. 4:
and in response to the matching degree being smaller than the preset threshold, adding the first obstacle information or the second obstacle information to the obstacle information fusion list.
When the matching degree between the movement information corresponding to the first obstacle identifier and the movement information corresponding to the second obstacle identifier is smaller than the preset threshold, this indicates that the obstacle indicated by the first obstacle identifier, or the obstacle indicated by the second obstacle identifier, is a newly appearing obstacle, and the obstacle information corresponding to the identifier not included in the obstacle information fusion list needs to be added. That is, if the first obstacle identifier is not included in the obstacle information fusion list, the first obstacle information is added; and if the second obstacle identifier is not included, the second obstacle information is added.
And step 403, fusing the matched first obstacle information and the matched second obstacle information.
After the converted first obstacle information is matched with the converted second obstacle information, it can be determined which obstacles have been detected before and which are newly appearing, that is, the information of each obstacle can be accurately determined. The obstacles detected by the laser radar and those detected by the millimeter wave radar are then fused, so that obstacles can be detected more accurately.
In some optional implementations of this embodiment, the step 403 may be specifically implemented by the following steps not shown in fig. 4:
filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In this implementation manner, the second obstacle information may be further subjected to filtering processing, so that noise in the second obstacle information may be removed. And then fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some optional implementations of the embodiment, the obstacle information fusion list includes a generation time of motion information corresponding to the historical obstacle identifier. The above method may further comprise the following steps not shown in fig. 4:
determining whether the time difference between the generation time and the current time is greater than a preset time for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset time.
In this implementation, the motion information corresponding to "expired" historical obstacle identifiers in the obstacle information fusion list can be deleted in time. Each piece of motion information can be judged according to the time difference between its generation time and the current time; when the time difference is greater than the preset time, the piece of motion information is considered expired and can be deleted.
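This expiry rule can be sketched as a pass over the fusion list that drops entries older than a preset lifetime. The plain-seconds timestamps and the 0.5 s lifetime are illustrative assumptions.

```python
# Sketch: delete fusion-list entries whose motion information was
# generated more than max_age seconds before the current time.

def prune_expired(fusion_list, now, max_age=0.5):
    expired = [oid for oid, entry in fusion_list.items()
               if now - entry["timestamp"] > max_age]
    for oid in expired:
        del fusion_list[oid]    # expired history no longer affects matching
    return expired

fusion_list = {"a": {"timestamp": 100.0},   # 0.6 s old -> expired
               "b": {"timestamp": 100.4}}   # 0.2 s old -> kept
dropped = prune_expired(fusion_list, now=100.6)
```

Pruning keeps the list small and prevents stale detections from being matched against fresh ones.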
The method for detecting obstacles provided by this embodiment of the application can accurately determine whether each obstacle is a historical obstacle, so that the detection results of the laser radar and the millimeter wave radar can be fused to detect obstacles more accurately.
With further reference to fig. 5, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for detecting an obstacle, which corresponds to the method embodiment shown in fig. 1, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the apparatus 500 for detecting an obstacle of the present embodiment includes: a data acquisition unit 501, a data processing unit 502, a weight determination unit 503, an information fusion unit 504, and an obstacle determination unit 505.
The data obtaining unit 501 is configured to obtain point cloud data and millimeter wave radar data.
The data processing unit 502 is configured to process the point cloud data and the millimeter wave radar data respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data.
A weight determination unit 503, configured to determine, based on the first obstacle information and the second obstacle information, a weight that the first obstacle information and the second obstacle information occupy when fused.
An information fusion unit 504 for fusing the first obstacle information and the second obstacle information according to the weight.
And an obstacle determining unit 505, configured to determine an obstacle according to the fused obstacle information.
In some optional implementations of this embodiment, the data processing unit 502 may further be configured to: clustering and tracking the point cloud data to obtain first obstacle information; and filtering the millimeter wave radar data to obtain second obstacle information.
In some optional implementations of this embodiment, the first obstacle information includes a first position, a first velocity, an error of the first position, and an error of the first velocity of the obstacle, and the second obstacle information includes a second position, a second velocity, an error of the second position, and an error of the second velocity of the obstacle. The weight determination unit 503 may further include a first error determination module, a second error determination module, and a weight determination module, which are not shown in fig. 5.
The first error determination module is used for determining a first composite error according to the error of the first position and the error of the first speed.
And the second error determination module is used for determining a second composite error according to the error of the second position and the error of the second speed.
And the weight determining module is used for determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first composite error and the second composite error.
In some optional implementations of this embodiment, the information fusion unit 504 may further include an information conversion module, an information matching module, and an information fusion module, which are not shown in fig. 5.
The information conversion module is used for converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix.
And the information matching module is used for matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list.
And the information fusion module is used for fusing the matched first obstacle information and the matched second obstacle information.
In some optional implementations of the embodiment, the obstacle information fusion list includes historical obstacle identifiers and historical movement information corresponding to the historical obstacle identifiers, the first obstacle information includes first obstacle identifiers and movement information corresponding to the first obstacle identifiers, and the second obstacle information includes second obstacle identifiers and movement information corresponding to the second obstacle identifiers. The information matching module may be further configured to: determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier and the second obstacle identifier; in response to the obstacle information fusion list including a first obstacle identifier, updating historical movement information corresponding to the first obstacle identifier in historical obstacle movement information by adopting movement information corresponding to the first obstacle identifier; and in response to the obstacle information fusion list including the second obstacle identifier, updating the historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information by adopting the movement information corresponding to the second obstacle identifier.
In some optional implementation manners of this embodiment, the information matching module may be further configured to: in response to the obstacle information fusion list not including the first obstacle identifier or the second obstacle identifier, determining a matching degree between the motion information corresponding to each first obstacle identifier and the motion information corresponding to each second obstacle identifier; and in response to the matching degree being greater than or equal to a preset threshold value, determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle, and updating historical movement information corresponding to the first obstacle identifier or historical movement information corresponding to the second obstacle identifier in the historical obstacle movement information according to the movement information corresponding to the first obstacle identifier or the movement information corresponding to the second obstacle identifier.
In some optional implementation manners of this embodiment, the information matching module may be further configured to: in response to the matching degree being smaller than the preset threshold, add the first obstacle information or the second obstacle information to the obstacle information fusion list.
In some optional implementations of the present embodiment, the information fusion unit 504 may be further configured to: filtering the second obstacle information to obtain filtered second obstacle information; and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
In some optional implementations of the embodiment, the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier. The apparatus 500 may further include an information deleting unit not shown in fig. 5, and the information deleting unit may be configured to: determining whether the time difference between the generation time and the current time is greater than a preset time for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list; and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset time.
According to the apparatus for detecting an obstacle provided by this embodiment of the application, after the point cloud data and the millimeter wave radar data are acquired, they are processed separately to obtain first obstacle information corresponding to the point cloud data and second obstacle information corresponding to the millimeter wave radar data. The weights that the first obstacle information and the second obstacle information occupy during fusion are then determined from the two pieces of information, the information is fused according to these weights, and finally the obstacle is determined from the fused information. The apparatus makes full use of both the point cloud data obtained by the laser radar and the millimeter wave radar data obtained by the millimeter wave radar, improving the accuracy and the sensitivity of obstacle detection.
It should be understood that units 501 to 505, respectively, recited in the apparatus 500 for detecting obstacles correspond to the respective steps in the method described with reference to fig. 1. Thus, the operations and features described above for the method for detecting obstacles apply equally to the apparatus 500 and the units contained therein, and are not described in detail here. The corresponding elements of the apparatus 500 may cooperate with elements in a terminal or server to implement aspects of embodiments of the present application.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a terminal device or server of an embodiment of the present application. The terminal device/server shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a machine-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor including a data acquisition unit, a data processing unit, a weight determination unit, an information fusion unit, and an obstacle determination unit. The names of these units do not, in some cases, limit the units themselves; for example, the data acquisition unit may also be described as a "unit that acquires point cloud data and millimeter wave radar data".
As another aspect, the present application also provides a computer readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire point cloud data and millimeter wave radar data; process the point cloud data and the millimeter wave radar data respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data; determine, according to the first obstacle information and the second obstacle information, the weight occupied by the first obstacle information and the second obstacle information during fusion; fuse the first obstacle information and the second obstacle information according to the weight; and determine an obstacle according to the fused obstacle information.
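The program steps above can be sketched as the following pipeline. This is an illustrative outline only, not part of the disclosure: the names (`ObstacleInfo`, `detect_obstacles`) and the injected callables are assumptions standing in for the sensor-specific processing the application describes.

```python
from dataclasses import dataclass

@dataclass
class ObstacleInfo:
    identifier: int
    position: tuple   # (x, y) in a common coordinate frame
    velocity: tuple   # (vx, vy)
    pos_error: float  # reported position error
    vel_error: float  # reported velocity error

def detect_obstacles(point_cloud, radar_data,
                     process_lidar, process_radar, compute_weights, fuse):
    """Skeleton of the claimed pipeline: process each sensor stream into
    obstacle information, derive fusion weights from that information,
    then fuse the two estimates into one obstacle description."""
    first_info = process_lidar(point_cloud)    # first obstacle information
    second_info = process_radar(radar_data)    # second obstacle information
    w1, w2 = compute_weights(first_info, second_info)
    return fuse(first_info, second_info, w1, w2)
```

Any concrete clustering, filtering, weighting, and fusion routines can be plugged in through the callable parameters; the skeleton only fixes the order of the claimed steps.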
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (18)

1. A method for detecting an obstacle, the method comprising:
acquiring point cloud data and millimeter wave radar data;
processing the point cloud data and the millimeter wave radar data respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data;
determining the weight occupied by the first obstacle information and the second obstacle information during fusion according to the first obstacle information and the second obstacle information;
fusing the first obstacle information and the second obstacle information according to the weight;
determining an obstacle according to the fused obstacle information;
wherein the fusing the first obstacle information and the second obstacle information according to the weight includes:
converting the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix;
matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, wherein the obstacle information fusion list comprises information of historical obstacles; and
fusing the matched first obstacle information and the matched second obstacle information;
wherein the matching the converted first obstacle information and the converted second obstacle information includes:
updating the information of the historical obstacles; and/or
determining whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
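The conversion into the same coordinate system recited in claim 1 is conventionally done with a homogeneous transformation matrix. A minimal sketch follows; the particular matrix values are hypothetical, since the claim only requires that the conversion matrices be preset.

```python
def to_common_frame(point, transform):
    """Map a 2-D point through a 3x3 homogeneous transformation matrix
    (a 'preset conversion matrix') into the common coordinate system."""
    x, y = point
    xh = transform[0][0] * x + transform[0][1] * y + transform[0][2]
    yh = transform[1][0] * x + transform[1][1] * y + transform[1][2]
    w  = transform[2][0] * x + transform[2][1] * y + transform[2][2]
    return (xh / w, yh / w)

# Hypothetical preset matrix: the radar frame is mounted 1.5 m ahead of
# and 0.2 m to the left of the lidar frame (values illustrative only).
T_RADAR_TO_LIDAR = [[1.0, 0.0, 1.5],
                    [0.0, 1.0, 0.2],
                    [0.0, 0.0, 1.0]]
converted = to_common_frame((4.0, 1.0), T_RADAR_TO_LIDAR)
```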
2. The method of claim 1, wherein the processing the point cloud data and the millimeter wave radar data respectively comprises:
clustering and tracking the point cloud data to obtain the first obstacle information;
and filtering the millimeter wave radar data to obtain the second obstacle information.
3. The method of claim 1, wherein the first obstacle information includes a first position, a first velocity, an error in the first position, and an error in the first velocity of the obstacle, and wherein the second obstacle information includes a second position, a second velocity, an error in the second position, and an error in the second velocity of the obstacle; and
the determining the weight occupied by the first obstacle information and the second obstacle information during fusion includes:
determining a first composite error according to the error of the first position and the error of the first speed;
determining a second composite error according to the error of the second position and the error of the second speed;
and determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first comprehensive error and the second comprehensive error.
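One plausible reading of claim 3 is a root-sum-square composite error with weights inversely proportional to each sensor's composite error. The claim does not fix either formula, so the following is an assumption used for illustration.

```python
import math

def composite_error(pos_err, vel_err):
    """Root-sum-square of the position error and the speed error
    (one plausible composite; the claim does not fix the formula)."""
    return math.hypot(pos_err, vel_err)

def fusion_weights(e1, e2):
    """First and second weights, inversely proportional to the first
    and second composite errors and normalized to sum to 1, so the
    lower-error sensor dominates the fusion."""
    w1, w2 = 1.0 / e1, 1.0 / e2
    return w1 / (w1 + w2), w2 / (w1 + w2)

e1 = composite_error(0.3, 0.4)   # first composite error  -> 0.5
e2 = composite_error(0.6, 0.8)   # second composite error -> 1.0
w1, w2 = fusion_weights(e1, e2)  # lidar earns twice the radar weight
```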
4. The method according to claim 1, wherein the obstacle information fusion list includes historical obstacle identifiers and historical motion information corresponding to the historical obstacle identifiers, the first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier; and
the matching of the converted first obstacle information and the converted second obstacle information according to the preset obstacle information fusion list includes:
determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier, and the second obstacle identifier;
in response to the obstacle information fusion list including the first obstacle identifier, updating the historical motion information corresponding to the first obstacle identifier in the obstacle information fusion list with the motion information corresponding to the first obstacle identifier; and
in response to the obstacle information fusion list including the second obstacle identifier, updating the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list with the motion information corresponding to the second obstacle identifier.
5. The method according to claim 4, wherein the matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list comprises:
in response to the obstacle information fusion list not including the first obstacle identifier or the second obstacle identifier, determining a degree of matching between motion information corresponding to the first obstacle identifier and motion information corresponding to the second obstacle identifier;
and in response to the matching degree being greater than or equal to a preset threshold value, determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
6. The method according to claim 5, wherein the matching the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list comprises:
and in response to the matching degree being smaller than the preset threshold value, adding the first obstacle information or the second obstacle information to the obstacle information fusion list.
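Claims 4 to 6 together describe a dictionary-style update of the obstacle information fusion list. A minimal sketch, assuming the list maps an identifier to its latest motion information and that a similarity score with a preset threshold decides whether two tracks are the same obstacle; all names and the threshold value are illustrative.

```python
def update_fusion_list(fusion_list, obs_id, motion, peer_id, peer_motion,
                       similarity, threshold=0.8):
    """Sketch of claims 4-6. fusion_list maps an obstacle identifier to
    its most recent motion information; 'similarity' scores how well two
    motion records match (higher is better)."""
    if obs_id in fusion_list:
        # Claim 4: a known identifier simply refreshes its history.
        fusion_list[obs_id] = motion
    elif similarity(motion, peer_motion) >= threshold:
        # Claim 5: unknown identifier whose motion matches the other
        # sensor's track -- treat both as the same obstacle and refresh
        # the history stored under the peer identifier.
        fusion_list[peer_id] = motion
    else:
        # Claim 6: no match -- add it to the list as a new obstacle.
        fusion_list[obs_id] = motion
```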
7. The method of claim 3, wherein said fusing the first obstacle information and the second obstacle information according to the weight comprises:
filtering the second obstacle information to obtain filtered second obstacle information;
and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
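Claim 7 filters the second (radar-derived) obstacle information before the weighted fusion but does not name a specific filter. A simple recursive low-pass (exponential) filter is used below purely as a stand-in for whatever filter an implementation chooses.

```python
def exponential_filter(measurements, alpha=0.3):
    """Recursive low-pass filter: each new estimate blends the latest
    measurement with the previous estimate, damping radar noise.
    alpha is an assumed smoothing factor, not taken from the claim."""
    filtered, estimate = [], None
    for z in measurements:
        estimate = z if estimate is None else alpha * z + (1 - alpha) * estimate
        filtered.append(estimate)
    return filtered

# A noisy radar range track is smoothed toward the underlying value.
smoothed = exponential_filter([10.0, 10.4, 9.8, 10.2])
```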
8. The method according to claim 1, wherein the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier; and
the method further comprises:
for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset time;
and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset time.
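Claim 8's age-based pruning of the fusion list can be sketched as follows. Representing each list entry as an identifier mapped to a (generation time, motion information) pair is an assumption for illustration; the claim only requires that stale entries be deleted.

```python
def prune_stale_entries(fusion_list, now, max_age):
    """Delete motion information whose generation time is more than a
    preset window behind the current time, per claim 8. fusion_list
    maps identifier -> (generation_time, motion_information)."""
    stale = [obstacle_id for obstacle_id, (generated, _) in fusion_list.items()
             if now - generated > max_age]
    for obstacle_id in stale:
        del fusion_list[obstacle_id]
    return fusion_list

tracks = {7: (0.0, "old track"), 8: (9.5, "fresh track")}
prune_stale_entries(tracks, now=10.0, max_age=5.0)  # drops identifier 7
```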
9. An apparatus for detecting an obstacle, the apparatus comprising:
a data acquisition unit configured to acquire point cloud data and millimeter wave radar data;
a data processing unit configured to process the point cloud data and the millimeter wave radar data respectively to obtain first obstacle information indicated by the point cloud data and second obstacle information indicated by the millimeter wave radar data;
a weight determination unit configured to determine, based on the first obstacle information and the second obstacle information, a weight occupied by the first obstacle information and the second obstacle information when fused;
an information fusion unit configured to fuse the first obstacle information and the second obstacle information according to the weight; and
an obstacle determination unit configured to determine an obstacle according to the fused obstacle information;
wherein the information fusion unit includes:
an information conversion module configured to convert the first obstacle information and the second obstacle information into the same coordinate system according to a preset first conversion matrix and/or a preset second conversion matrix;
an information matching module configured to match the converted first obstacle information and the converted second obstacle information according to a preset obstacle information fusion list, the obstacle information fusion list comprising information of historical obstacles; and
an information fusion module configured to fuse the matched first obstacle information and the matched second obstacle information;
the information matching module is further configured to:
updating the information of the historical obstacles; and/or
determining whether the converted first obstacle information and the converted second obstacle information indicate the same obstacle.
10. The apparatus of claim 9, wherein the data processing unit is further configured to:
clustering and tracking the point cloud data to obtain the first obstacle information;
and filtering the millimeter wave radar data to obtain the second obstacle information.
11. The apparatus of claim 9, wherein the first obstacle information includes a first position, a first velocity, an error of the first position, and an error of the first velocity of the obstacle, and the second obstacle information includes a second position, a second velocity, an error of the second position, and an error of the second velocity of the obstacle; and
the weight determination unit includes:
a first error determination module, configured to determine a first composite error according to the error of the first position and the error of the first speed;
a second error determination module, configured to determine a second combined error according to the error of the second position and the error of the second speed;
and the weight determining module is used for determining a first weight of the first obstacle information and a second weight of the second obstacle information when the first obstacle information and the second obstacle information are fused according to the first comprehensive error and the second comprehensive error.
12. The apparatus according to claim 9, wherein the obstacle information fusion list includes a historical obstacle identifier and historical motion information corresponding to the historical obstacle identifier, the first obstacle information includes a first obstacle identifier and motion information corresponding to the first obstacle identifier, and the second obstacle information includes a second obstacle identifier and motion information corresponding to the second obstacle identifier; and
the information matching module is further configured to:
determining whether the first obstacle identifier and the second obstacle identifier are included in the obstacle information fusion list according to the historical obstacle identifier, the first obstacle identifier, and the second obstacle identifier;
in response to the obstacle information fusion list including the first obstacle identifier, updating the historical motion information corresponding to the first obstacle identifier in the obstacle information fusion list with the motion information corresponding to the first obstacle identifier; and
in response to the obstacle information fusion list including the second obstacle identifier, updating the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list with the motion information corresponding to the second obstacle identifier.
13. The apparatus of claim 12, wherein the information matching module is further configured to:
in response to the obstacle information fusion list not including the first obstacle identifier or the second obstacle identifier, determining a degree of matching between motion information corresponding to the first obstacle identifier and motion information corresponding to the second obstacle identifier;
and in response to the matching degree being greater than or equal to a preset threshold value, determining that the first obstacle identifier and the second obstacle identifier indicate the same obstacle, and updating the historical motion information corresponding to the first obstacle identifier or the historical motion information corresponding to the second obstacle identifier in the obstacle information fusion list according to the motion information corresponding to the first obstacle identifier or the motion information corresponding to the second obstacle identifier.
14. The apparatus of claim 13, wherein the information matching module is further configured to:
and in response to the matching degree being smaller than the preset threshold value, adding the first obstacle information or the second obstacle information to the obstacle information fusion list.
15. The apparatus of claim 11, wherein the information fusion unit is further configured to:
filtering the second obstacle information to obtain filtered second obstacle information;
and fusing the first obstacle information and the filtered second obstacle information according to the first weight and the second weight.
16. The apparatus according to claim 9, wherein the obstacle information fusion list includes a generation time of motion information corresponding to each historical obstacle identifier; and
the apparatus further comprises an information deletion unit configured to:
for the motion information corresponding to each historical obstacle identifier in the obstacle information fusion list, determining whether the time difference between the generation time and the current time is greater than a preset time;
and deleting the motion information corresponding to the historical obstacle identifier in response to the time difference being greater than the preset time.
17. An electronic device, comprising:
one or more processors;
a storage device storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-8.
18. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 8.
CN202110882862.1A 2017-07-04 2017-07-04 Method and device for detecting obstacles Active CN113466822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110882862.1A CN113466822B (en) 2017-07-04 2017-07-04 Method and device for detecting obstacles

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110882862.1A CN113466822B (en) 2017-07-04 2017-07-04 Method and device for detecting obstacles
CN201710541847.4A CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201710541847.4A Division CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Publications (2)

Publication Number Publication Date
CN113466822A true CN113466822A (en) 2021-10-01
CN113466822B CN113466822B (en) 2024-06-25

Family

ID=64993117

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110882862.1A Active CN113466822B (en) 2017-07-04 2017-07-04 Method and device for detecting obstacles
CN201710541847.4A Active CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201710541847.4A Active CN109212532B (en) 2017-07-04 2017-07-04 Method and apparatus for detecting obstacles

Country Status (1)

Country Link
CN (2) CN113466822B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109738884B (en) * 2018-12-29 2022-03-11 百度在线网络技术(北京)有限公司 Object detection method and device and computer equipment
CN109901183A (en) * 2019-03-13 2019-06-18 电子科技大学中山学院 Method for improving all-weather distance measurement precision and reliability of laser radar
CN111923898B (en) * 2019-05-13 2022-05-06 广州汽车集团股份有限公司 Obstacle detection method and device
CN112180910B (en) * 2019-06-18 2024-07-19 北京京东乾石科技有限公司 Mobile robot obstacle sensing method and device
CN110517483B (en) * 2019-08-06 2021-05-18 新奇点智能科技集团有限公司 Road condition information processing method and digital rail side unit
CN110531376B (en) * 2019-08-23 2022-04-22 畅加风行(苏州)智能科技有限公司 Obstacle detection and tracking method for port unmanned vehicle
CN110658531B (en) * 2019-08-23 2022-03-29 畅加风行(苏州)智能科技有限公司 Dynamic target tracking method for port automatic driving vehicle
CN110502019A (en) * 2019-09-06 2019-11-26 北京云迹科技有限公司 A kind of barrier-avoiding method and device of Indoor Robot
CN110796705B (en) * 2019-10-23 2022-10-11 北京百度网讯科技有限公司 Model error elimination method, device, equipment and computer readable storage medium
CN110866544B (en) * 2019-10-28 2022-04-15 杭州飞步科技有限公司 Sensor data fusion method and device and storage medium
CN110794406B (en) * 2019-11-12 2022-08-02 北京经纬恒润科技股份有限公司 Multi-source sensor data fusion system and method
CN113366341B (en) * 2020-01-06 2024-04-26 深圳市速腾聚创科技有限公司 Point cloud data processing method and device, storage medium and laser radar system
CN112001287B (en) * 2020-08-17 2023-09-12 禾多科技(北京)有限公司 Point cloud information generation method and device for obstacle, electronic equipment and medium
CN112378050B (en) * 2020-11-10 2021-09-14 珠海格力电器股份有限公司 Control method and device for air conditioning equipment, electronic equipment and storage medium
CN113296120B (en) * 2021-05-24 2023-05-12 福建盛海智能科技有限公司 Obstacle detection method and terminal

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000329852A (en) * 1999-05-17 2000-11-30 Nissan Motor Co Ltd Obstacle recognition device
JP2005175603A (en) * 2003-12-08 2005-06-30 Suzuki Motor Corp Method and system for displaying obstacle using radar
CN101975951A (en) * 2010-06-09 2011-02-16 北京理工大学 Field environment barrier detection method fusing distance and image information
US20150217765A1 (en) * 2014-02-05 2015-08-06 Toyota Jidosha Kabushiki Kaisha Collision prevention control apparatus
US20150338516A1 (en) * 2014-05-21 2015-11-26 Honda Motor Co., Ltd. Object recognition apparatus and vehicle
CN105109484A (en) * 2015-08-21 2015-12-02 奇瑞汽车股份有限公司 Target-barrier determining method and device
JP2016008922A (en) * 2014-06-25 2016-01-18 株式会社東芝 Sensor information fusion device
CN106291736A (en) * 2016-08-16 2017-01-04 张家港长安大学汽车工程研究院 Pilotless automobile track dynamic disorder object detecting method
DE102015112443A1 (en) * 2015-07-30 2017-02-02 Connaught Electronics Ltd. Method for determining a movement of a motor vehicle by means of fusion of odometry data, driver assistance system and motor vehicle
US20170096102A1 (en) * 2015-10-01 2017-04-06 Ford Global Technologies, Llc Parking Obstruction Locator and Height Estimator
CN106796758A (en) * 2014-10-22 2017-05-31 株式会社电装 Obstacle alert device
CN106908783A (en) * 2017-02-23 2017-06-30 苏州大学 Obstacle detection method based on multi-sensor information fusion

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7138938B1 (en) * 2005-05-06 2006-11-21 Ford Global Technologies, Llc System and method for preemptively sensing an object and selectively operating both a collision countermeasure system and a parking assistance system aboard an automotive vehicle
CN103176185B (en) * 2011-12-26 2015-01-21 上海汽车集团股份有限公司 Method and system for detecting road barrier
US10229363B2 (en) * 2015-10-19 2019-03-12 Ford Global Technologies, Llc Probabilistic inference using weighted-integrals-and-sums-by-hashing for object tracking
CN105629985B (en) * 2016-03-20 2018-09-04 北京工业大学 The three-dimensional obstacle avoidance system of 360 ° of indoor quadrotor drone


Also Published As

Publication number Publication date
CN109212532A (en) 2019-01-15
CN113466822B (en) 2024-06-25
CN109212532B (en) 2021-08-20

Similar Documents

Publication Publication Date Title
CN109212532B (en) Method and apparatus for detecting obstacles
CN109212530B (en) Method and apparatus for determining velocity of obstacle
US11579307B2 (en) Method and apparatus for detecting obstacle
EP3875985B1 (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN113240909B (en) Vehicle monitoring method, equipment, cloud control platform and vehicle road cooperative system
CN110654381B (en) Method and device for controlling a vehicle
CN110696826B (en) Method and device for controlling a vehicle
CN115339453B (en) Vehicle lane change decision information generation method, device, equipment and computer medium
KR20230036960A (en) Obstacle detection method and device, automatic driving vehicle, equipment and storage medium
CN112578781A (en) Data processing method, device, chip system and medium
CN112558036B (en) Method and device for outputting information
CN114724116B (en) Vehicle traffic information generation method, device, equipment and computer readable medium
CN115512336B (en) Vehicle positioning method and device based on street lamp light source and electronic equipment
CN115534935B (en) Vehicle travel control method, apparatus, electronic device, and computer-readable medium
CN114495049A (en) Method and device for identifying lane line
CN111427037A (en) Obstacle detection method and device, electronic equipment and vehicle-end equipment
CN112526477A (en) Method and apparatus for processing information
CN116168366B (en) Point cloud data generation method, model training method, target detection method and device
CN114620055B (en) Road data processing method and device, electronic equipment and automatic driving vehicle
CN115906001A (en) Multi-sensor fusion target detection method, device and equipment and automatic driving vehicle
CN116229418A (en) Information fusion method, device, equipment and storage medium
CN114120255A (en) Target identification method and device based on laser radar speed measurement
CN114049615A (en) Traffic object fusion association method and device in driving environment and edge computing equipment
CN113721235A (en) Object state determination method and device, electronic equipment and storage medium
CN117622118A (en) Method, device, equipment, medium and vehicle for determining obstacle orientation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant