CN112712040B - Method, device, equipment and storage medium for calibrating lane line information based on radar - Google Patents

Method, device, equipment and storage medium for calibrating lane line information based on radar

Info

Publication number
CN112712040B
CN112712040B (application CN202011641279.3A)
Authority
CN
China
Prior art keywords
lane line
information
road
radar
line information
Prior art date
Legal status
Active
Application number
CN202011641279.3A
Other languages
Chinese (zh)
Other versions
CN112712040A (en)
Inventor
代尚猛
于永基
宫永玉
谢小忠
Current Assignee
Weichai Power Co Ltd
Original Assignee
Weichai Power Co Ltd
Priority date
Filing date
Publication date
Application filed by Weichai Power Co Ltd filed Critical Weichai Power Co Ltd
Priority to CN202011641279.3A
Publication of CN112712040A
Application granted
Publication of CN112712040B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a method, a device, equipment and a storage medium for calibrating lane line information based on a radar. The method comprises the following steps: acquiring stationary target information on both sides of the road through radar equipment installed on the vehicle; clustering the stationary target information and performing curve fitting on the classified clusters; and performing feature matching between the fitted curve and the lane line information identified by the camera, executing the normal vehicle control instruction if the matching is successful, and calibrating the lane line information according to the fitted curve if the matching is unsuccessful. Because the method extracts the road information with a millimeter wave radar, that information can be cross-checked against the lane line information identified by the camera without adding any sensor to the system, which greatly improves the accuracy of lane line recognition.

Description

Method, device, equipment and storage medium for calibrating lane line information based on radar
Technical Field
The application relates to the technical field of intelligent automobiles, in particular to a method, a device, equipment and a storage medium for calibrating lane line information based on a radar.
Background
Accurately identifying the lane line information of a road is an important technical problem in the field of intelligent driving. Current LKA (Lane Keeping Assist) systems are mainly based on a single-camera scheme: a lane line is identified by a single camera mounted on the vehicle, the deviation between the vehicle and the lane center is calculated, and steering is controlled to reduce that deviation so that the vehicle drives centered in the lane. However, because this function senses the lane with only a single forward camera, blurred or damaged lane markings on a real road strongly disturb the camera and occasionally cause misidentification, which is very dangerous in high-speed scenarios. A scheme based on dual-camera recognition increases cost, and because both cameras identify the same lane line information, a damaged lane line can make both detections inaccurate, so the two cameras cannot provide a good redundancy design.
Disclosure of Invention
The embodiment of the disclosure provides a method, a device, equipment and a storage medium for calibrating lane line information based on a radar. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key/critical elements nor delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
In a first aspect, an embodiment of the present disclosure provides a method for calibrating lane line information based on a radar, including:
acquiring static target information on two sides of a road through radar equipment installed on a vehicle;
clustering the static target information, and performing curve fitting on the classified clusters;
and performing feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve.
In one embodiment, before acquiring the stationary target information on both sides of the road through the radar equipment mounted on the vehicle, the method further comprises:
and acquiring lane line information identified by the vehicle camera and confidence information of the lane line.
In one embodiment, after obtaining the lane line information identified by the vehicle camera and the confidence information of the lane line, the method further comprises:
judging the road type according to the lane line information;
if the road type is a straight road, the lane line information is not calibrated, and the vehicle control instruction is directly executed;
and if the road type is a curved road, acquiring static target information on two sides of the road through radar equipment installed on the vehicle.
In one embodiment, clustering stationary target information and curve fitting the classified clusters includes:
clustering the static target information by a clustering algorithm based on density growth;
and performing curve fitting on each classified cluster to obtain a fitted cubic function curve.
In one embodiment, after curve fitting is performed on the classified clusters, the method further comprises:
determining the confidence coefficient of the fitted curve according to the distance between the vehicle and the fitted curve and the number of stationary targets in the cluster where the fitted curve is located;
when the confidence coefficient is larger than or equal to a preset first threshold value, performing feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve;
and when the confidence coefficient is smaller than a preset first threshold value, extracting barrier information output by the radar and the camera, determining a track to be driven by the vehicle according to a control instruction to be executed, executing the control instruction if no barrier exists in the track within a preset time period, otherwise, not executing the control instruction, and sending alarm information.
In one embodiment, performing feature matching on the fitted curve and lane line information identified by the camera includes:
determining a first curvature and a first curvature change rate of a road according to lane line information identified by a camera;
determining a second curvature and a second curvature change rate of the road according to the fitted curve;
and if the difference value between the first curvature and the second curvature is smaller than a preset second threshold value and the difference value between the first curvature change rate and the second curvature change rate is smaller than a preset third threshold value, determining that the matching is successful.
In one embodiment, calibrating lane line information according to the fitted curve includes:
obtaining the confidence coefficient of the fitting curve and the confidence coefficient of the lane line recognized by the camera;
and correcting the characteristic information of the lane line in a weighted average mode according to the confidence degrees of the two.
In a second aspect, an embodiment of the present disclosure provides an apparatus for calibrating lane line information based on a radar, including:
the acquisition module is used for acquiring static target information on two sides of a road through radar equipment installed on a vehicle;
the curve fitting module is used for clustering the static target information and performing curve fitting on the classified clusters;
and the calibration module is used for carrying out feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve.
In a third aspect, an embodiment of the present disclosure provides a device for calibrating lane line information based on a radar, including a processor and a memory storing program instructions, the processor being configured to execute the method for calibrating lane line information based on the radar provided in the above embodiment when executing the program instructions.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium having computer-readable instructions stored thereon that are executable by a processor to implement a method of calibrating lane line information based on a radar provided by the above embodiments.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
according to the method for calibrating the lane line information, the millimeter wave radar is adopted to extract the static target information on two sides of the road, the road guardrail, the tree and other information are utilized to extract the road characteristics, the road characteristics and the lane line information identified by the camera are mutually verified, and the verification accuracy and the system safety are greatly improved by adopting different road information for verification, so that the problem that in the prior art, the lane line identification of the camera is interfered due to the conditions of fuzzy lane lines, damage and the like of an actual road, and the inaccurate lane line identification is caused is solved. In addition, the method in the embodiment of the disclosure does not increase a system sensor, and reduces the cost.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart of a method for calibrating lane line information based on a radar, according to an example embodiment;
FIG. 2 is a flow chart of a method for calibrating lane line information based on a radar, according to an example embodiment;
FIG. 3 is a schematic diagram illustrating a clustering of stationary objects according to an example embodiment;
FIG. 4 is a schematic diagram illustrating a curve fit to clustered clusters in accordance with an exemplary embodiment;
FIG. 5 is a schematic diagram illustrating a configuration of an apparatus for calibrating lane line information based on a radar according to an exemplary embodiment;
FIG. 6 is a schematic diagram illustrating a configuration of an electronic device for calibrating lane line information based on a radar according to an exemplary embodiment;
fig. 7 is a schematic diagram of a computer storage medium shown according to an example embodiment.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the application to enable those skilled in the art to practice them.
It should be understood that the described embodiments are merely some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of systems and methods that are consistent with aspects of the application as detailed in the accompanying claims.
In the description of the present application, it should be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art. Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" generally indicates that the associated objects are in an "or" relationship.
The embodiments of the disclosure provide a method for assisting the verification of camera lane line detection: stationary target information on both sides of the road is extracted by the vehicle's forward millimeter wave radar, the stationary targets are fitted to a cubic equation from the extracted information, and the lane line information recognized by the camera is checked against that equation. If the two sources of information match, the system is allowed to apply large-angle steering control to the vehicle; otherwise only small-range control of the vehicle is allowed. Because the millimeter wave radar extracts the road information, it can be cross-checked against the camera without adding a sensor, which greatly improves the safety of the system and removes the system's functional-safety bottleneck on the sensor side.
The method for calibrating lane line information based on the radar according to the embodiment of the present application will be described in detail with reference to fig. 1 to fig. 4, and the method specifically includes the following steps:
s101 acquires stationary target information on both sides of a road by radar devices mounted on the vehicle.
In one possible implementation, before step S101 is performed, the method further includes obtaining lane line information identified by a forward camera mounted on the vehicle; the identified lane line information and the confidence level of each lane line can be output by that camera.
Further, the road type is determined from the identified lane line. For example, the curvature of the road is calculated from the identified lane line information; the larger the curvature, the more curved the road. When the curvature of the road is larger than a preset curvature threshold, the road is determined to be a curved road. The curvature threshold in the embodiments of the disclosure can be set by a person skilled in the art.
If the road type is judged to be a straight road, the road condition is simple, the lane line information does not need to be further checked, and the control instruction of the vehicle is executed directly. If the road type is judged to be a curved road, the road condition is complex, and the radar equipment on the vehicle is introduced to further check the lane line information and improve the accuracy of its identification.
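A small sketch of this pre-check, assuming the camera reports each lane line as cubic coefficients in the vehicle frame; the curvature formula is the standard one for a plane curve y(x), and the threshold value is an assumption, since the patent leaves it to the practitioner:

```python
CURVATURE_THRESHOLD = 1e-3  # assumed value in 1/m; set by the practitioner

def curvature_at_origin(a, b, c, d):
    """Curvature of y = a*x**3 + b*x**2 + c*x + d at x = 0:
    kappa = |y''| / (1 + y'**2)**1.5, with y'(0) = c and y''(0) = 2*b."""
    return abs(2.0 * b) / (1.0 + c ** 2) ** 1.5

def road_type(a, b, c, d):
    """Classify the road as curved or straight from the camera lane-line cubic."""
    return "curved" if curvature_at_origin(a, b, c, d) > CURVATURE_THRESHOLD else "straight"

print(road_type(1e-6, 2e-3, 0.01, 0.2))  # "curved" for this made-up lane line
```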
Specifically, information on both sides of the road can be obtained through millimeter wave radar equipment installed on the vehicle, and the stationary target information on both sides of the road is then extracted from the radar detections, including the trees planted along the road, the guardrails on both sides of the road, the street lamps on both sides of the road, and the like.
Acquiring the stationary target information on both sides of the road with the vehicle's millimeter wave radar greatly reduces system cost, and because the road information is extracted from a different sensing direction, it adds another dimension to the available information.
S102, clustering the static target information, and performing curve fitting on the classified clusters.
In order to determine road features from the extracted stationary target information, the stationary targets are first clustered. In one possible implementation, the stationary target information is clustered by a clustering algorithm based on density growth. Common clustering algorithms generally require the number of clusters to be specified in advance, but in this application scenario the environment is complex and the number of stationary targets is unknown, so algorithms that require a fixed cluster count are not applicable. In one possible implementation, following the ideas of the DBSCAN, OPTICS and DENCLUE clustering algorithms and of region growing, the clustering starts from a seed point, processes the points within a certain spatial radius of the seed point, merges the points that meet the requirement into the same cluster, and discards the points that do not, yielding the clustered stationary target clusters.
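As an illustration of this density-growth idea (not code from the patent), the following Python sketch grows clusters outward from seed points within a fixed spatial radius; the radius, the minimum cluster size and all names are assumptions chosen for the example.

```python
import numpy as np

def grow_clusters(points, radius=2.0, min_points=3):
    """points: (N, 2) array of stationary radar detections (x, y) in metres.
    Grows a cluster outward from each unvisited seed point; detections whose
    neighbourhood stays too sparse end up in clusters smaller than min_points
    and are discarded, mirroring the "discard points that do not meet the
    requirement" step described above."""
    points = np.asarray(points, dtype=float)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            candidates = list(unvisited)
            if not candidates:
                continue
            dists = np.linalg.norm(points[candidates] - points[idx], axis=1)
            for cand, d in zip(candidates, dists):
                if d <= radius:
                    unvisited.remove(cand)   # absorb the neighbour into this cluster
                    cluster.append(cand)
                    frontier.append(cand)
        if len(cluster) >= min_points:
            clusters.append(np.array(cluster))
    return clusters

# Example: two roadside point groups plus one isolated return that gets dropped.
detections = np.array([[5.0, 3.5], [7.0, 3.6], [9.0, 3.4],
                       [6.0, -3.5], [8.0, -3.6], [50.0, 20.0]])
print([c.tolist() for c in grow_clusters(detections, radius=3.0, min_points=2)])
```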
Fig. 3 is a schematic diagram illustrating the clustering of stationary targets according to an exemplary embodiment. As shown in fig. 3, the left half clusters the stationary targets on both sides of a straight road; after clustering, two clusters are obtained, one on each side of the road. The right half clusters the stationary targets on both sides of a curved road, and clustering likewise yields two clusters, one on each side of the road.
Further, in order to represent the road information conveniently, curve fitting is performed on each classified cluster to obtain a fitted cubic function curve. Fig. 4 is a schematic diagram showing curve fitting of the clustered clusters according to an exemplary embodiment: the two thick vertical lines in the left half and the curved thick lines in the right half are the fitted curves. The fitted curves are cubic function curves described by the equation ax³ + bx² + cx + d = 0, and the parameters of the fitted curve are obtained by solving this equation.
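A minimal fitting sketch, assuming the cluster points are expressed as (x, y) with x the longitudinal distance ahead of the vehicle and y the lateral offset, so the roadside curve is modelled as y = a·x³ + b·x² + c·x + d (the patent only names the cubic form; this coordinate convention is an assumption):

```python
import numpy as np

def fit_cluster_cubic(cluster_points):
    """cluster_points: (M, 2) array of (x, y) detections from one cluster,
    with M >= 4 so the degree-3 least-squares problem is well posed.
    Returns the coefficients (a, b, c, d) of y = a*x**3 + b*x**2 + c*x + d."""
    pts = np.asarray(cluster_points, dtype=float)
    a, b, c, d = np.polyfit(pts[:, 0], pts[:, 1], deg=3)
    return a, b, c, d

# Example: a gently curving guardrail roughly 3.5 m to the side of the vehicle.
xs = np.linspace(5.0, 60.0, 12)
ys = 3.5 + 0.002 * xs**2 + 1e-5 * xs**3
print(fit_cluster_cubic(np.column_stack([xs, ys])))  # ~ (1e-5, 0.002, 0.0, 3.5)
```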
S103, performing feature matching on the fitted curve and the lane line information identified by the camera; if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve.
In one embodiment, after curve fitting is performed on the classified clusters, the method further comprises determining the confidence coefficient of the fitted curve according to the distance between the vehicle and the fitted curve and the number of stationary targets in the cluster to which the fitted curve belongs. Specifically, the closer the vehicle is to the fitted curve and the more stationary targets the cluster contains, the higher the confidence of the fitted curve; a higher confidence means the curve represents the actual features of the road more accurately.
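One way to encode that monotonic relationship (purely illustrative; the patent specifies only the direction of the dependence, so the functional form and the constants below are assumptions):

```python
def curve_confidence(distance_to_curve_m, num_points, d0=10.0, n0=20):
    """Confidence in [0, 1]: grows as the vehicle gets closer to the fitted
    curve and as the cluster behind the fit contains more stationary targets."""
    distance_term = max(0.0, 1.0 - distance_to_curve_m / d0)  # nearer -> larger
    density_term = min(1.0, num_points / n0)                  # denser -> larger
    return 0.5 * distance_term + 0.5 * density_term

print(curve_confidence(3.5, 15))  # 0.7 for a close, reasonably dense fit
```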
Further, the confidence coefficient of the fitted curve is compared with a preset first threshold, which can be set by a person skilled in the art and is not specifically limited by the embodiments of the disclosure. When the confidence coefficient is smaller than the preset first threshold, the confidence of the fitted curve is low, and the lane line information is not calibrated with the fitted curve. Instead, obstacle information output by the radar and the camera is extracted, the trajectory the vehicle is about to travel is predicted from the steering control instruction to be executed, and the steering control instruction is executed only if no obstacle appears in that trajectory within a preset time period; otherwise the instruction is not executed and alarm information is sent to notify the driver.
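A sketch of that low-confidence fallback, assuming the pending steering command has already been expanded into a short predicted path of (x, y) points and the fused radar/camera obstacles are given as points as well; the lateral margin is an assumed parameter:

```python
import numpy as np

def trajectory_is_clear(path_xy, obstacles_xy, lateral_margin=1.5):
    """path_xy: (K, 2) predicted path points; obstacles_xy: (M, 2) obstacle points.
    Returns True when every obstacle stays farther than lateral_margin from
    every predicted path point, i.e. the command may be executed."""
    if len(obstacles_xy) == 0:
        return True
    path = np.asarray(path_xy, dtype=float)[:, None, :]      # (K, 1, 2)
    obs = np.asarray(obstacles_xy, dtype=float)[None, :, :]   # (1, M, 2)
    dists = np.linalg.norm(path - obs, axis=2)                # (K, M) pairwise distances
    return bool(dists.min() > lateral_margin)

# Usage idea: if trajectory_is_clear(predicted_path, fused_obstacles) is False,
# withhold the steering command and raise a driver warning instead.
```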
When the confidence coefficient is greater than or equal to the preset first threshold value, feature matching is performed between the fitted curve and the lane line information identified by the camera. If the matching is successful, the normal vehicle control instruction is executed; if the matching is unsuccessful, the lane line information is calibrated according to the fitted curve.
Specifically, if the confidence of the fitted curve reaches the threshold, the fitted curve is trustworthy and the lane line information can be further calibrated against it. The fitted curve with the highest confidence on either side of the road is selected and its second curvature and second curvature change rate are calculated; the first curvature and first curvature change rate of the road are then calculated from the lane line information identified by the camera. The road features determined by the two methods are matched by computing the difference between the first curvature and the second curvature and the difference between the first curvature change rate and the second curvature change rate. If the difference between the curvatures is smaller than a preset second threshold and the difference between the curvature change rates is smaller than a preset third threshold, the differences are within the preset range, the lane information identified by the two methods is similar, and the road features match. The second threshold and the third threshold define the allowed range of the differences and can be set by a person skilled in the art.
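Sketching the comparison with the small-heading approximations curvature ≈ 2b and curvature change rate ≈ 6a at x = 0 for a cubic y = a·x³ + b·x² + c·x + d; the second and third thresholds below are placeholders, since the patent leaves their values open:

```python
def road_features(a, b, c, d):
    """Curvature and curvature change rate of y = a*x**3 + b*x**2 + c*x + d at x = 0,
    under the small-heading approximation (y'(0) neglected)."""
    curvature = 2.0 * b        # y''(0)
    curvature_rate = 6.0 * a   # y'''(0)
    return curvature, curvature_rate

def features_match(camera_coeffs, radar_coeffs, curv_tol=5e-4, rate_tol=5e-5):
    """True when both the curvature difference and the curvature-rate difference
    stay below their (assumed) second and third thresholds."""
    k1, r1 = road_features(*camera_coeffs)
    k2, r2 = road_features(*radar_coeffs)
    return abs(k1 - k2) < curv_tol and abs(r1 - r2) < rate_tol
```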
If the road features match, the normal vehicle control instruction can be executed and large-angle steering control of the vehicle is permitted. If the road features do not match, the lane line information identified by the camera can be corrected according to the fitted-curve information, and the fused result of the two is used to control the vehicle.
In one possible implementation, the confidence coefficient of the fitted curve and the confidence coefficient of the lane line recognized by the camera are obtained, and the feature information of the lane line is corrected by a weighted average according to these two confidences. For example, if the road curvature calculated from the fitted curve is 0.8 with a fitted-curve confidence of 0.7, and the road curvature calculated from the lane line identified by the camera is 0.4 with a camera confidence of 0.3, the fused road curvature is 0.8 × 0.7 + 0.4 × 0.3 = 0.68, and the final road curvature is 0.68.
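The worked example as code (a plain confidence-weighted average; dividing by the confidence sum is an assumption for the general case and is a no-op here because 0.7 + 0.3 = 1):

```python
def fuse(radar_value, radar_conf, camera_value, camera_conf):
    """Confidence-weighted average of the radar-derived and camera-derived values."""
    total = radar_conf + camera_conf
    return (radar_value * radar_conf + camera_value * camera_conf) / total

print(fuse(0.8, 0.7, 0.4, 0.3))  # 0.68, matching the example in the text
```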
The vehicle is then controlled with the fused result: for example, the distance between the vehicle and the lane lines on both sides and the road curvature change rate are used to judge the vehicle's position within the lane, the deviation between the vehicle and the lane center is finally calculated, and steering is controlled to reduce that deviation so that the vehicle drives centered in the lane.
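A minimal sketch of the centring idea under an assumed sign convention (positive deviation means the vehicle sits right of the lane centre) and an assumed proportional gain; the patent does not prescribe a particular controller:

```python
def lane_centre_deviation(dist_to_left_m, dist_to_right_m):
    """Lateral offset from the lane centre; positive when the vehicle is
    closer to the right lane line (i.e. right of centre)."""
    return 0.5 * (dist_to_left_m - dist_to_right_m)

def steering_correction(dist_to_left_m, dist_to_right_m, gain=0.1):
    # Steer against the deviation so it shrinks toward zero (centred driving).
    return -gain * lane_centre_deviation(dist_to_left_m, dist_to_right_m)

print(steering_correction(2.0, 1.5))  # -0.025: a small correction back toward the centre
```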
In this way the two sources verify each other: different sensors detect different features of the road and extract the lane line information independently, so in terms of functional safety they are independent systems, and their mutual verification maximizes the accuracy of the check.
In order to facilitate understanding of the lane line calibration method according to the embodiment of the present application, the following description is made with reference to fig. 2. As shown in fig. 2, the method mainly comprises the following steps:
First, the lane line information identified by the vehicle camera is obtained and the road type is classified according to it. If the road type is a straight road, the control instruction of the vehicle is executed directly; if the road type is a curved road, data on both sides of the road are acquired by the millimeter wave radar on the vehicle, and the lane lines are calibrated according to the data extracted by the millimeter wave radar.
Specifically, the millimeter wave radar acquires the stationary target information on both sides of the road, the stationary targets are clustered by a clustering algorithm based on density growth, and curve fitting is performed on each classified cluster to obtain a fitted cubic function curve. The confidence coefficient of the fitted curve is determined according to the distance between the vehicle and the fitted curve and the number of stationary targets in its cluster. When the confidence coefficient is smaller than the preset first threshold value, the obstacles and the predicted trajectory are analyzed: obstacle information output by the radar and the camera is extracted, the trajectory to be driven by the vehicle is determined according to the control instruction to be executed, the control instruction is executed if no obstacle exists in the trajectory within a preset time period, and otherwise the instruction is not executed and alarm information is sent.
When the confidence coefficient is greater than or equal to the preset first threshold value, feature matching is performed between the fitted curve and the lane line information identified by the camera. If the matching is successful, the normal vehicle control instruction is executed; if the matching is unsuccessful, the confidence coefficient of the fitted curve and the confidence coefficient of the lane line identified by the camera are obtained, the feature information of the lane line is corrected by a weighted average according to the two confidence coefficients, and the vehicle is controlled according to the fused data.
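Stitching the earlier sketches into the overall flow of fig. 2 (again illustrative: the helper names come from the sketches above, and the use of |d| as a proxy for the distance to the curve and the coefficient-wise fusion are assumptions layered on the patent's description):

```python
def calibrate_lane_line(camera_line, camera_conf, detections, conf_threshold=0.6):
    """camera_line: (a, b, c, d) cubic reported by the camera; camera_conf in [0, 1];
    detections: (N, 2) NumPy array of stationary radar targets.
    Returns (lane_line, action) using the helper sketches defined earlier."""
    if road_type(*camera_line) == "straight":
        return camera_line, "execute_normal_control"          # no radar cross-check needed
    clusters = grow_clusters(detections)
    if not clusters:
        return camera_line, "restrict_control_and_check_obstacles"
    best = max(clusters, key=len)                              # densest cluster as "best"
    radar_line = fit_cluster_cubic(detections[best])
    radar_conf = curve_confidence(abs(radar_line[3]), len(best))  # |d| ~ lateral distance at x = 0
    if radar_conf < conf_threshold:
        return camera_line, "restrict_control_and_check_obstacles"
    if features_match(camera_line, radar_line):
        return camera_line, "execute_normal_control"
    fused = tuple(fuse(r, radar_conf, c, camera_conf)
                  for r, c in zip(radar_line, camera_line))
    return fused, "execute_with_fused_line"
```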
According to the method for calibrating lane line information provided herein, a millimeter wave radar is used to extract the stationary target information on both sides of the road, road features are extracted from guardrails, trees and other roadside objects, and those features are cross-checked against the lane line information identified by the camera. Because different kinds of road information are used for the check, the verification accuracy and the system safety are greatly improved, which solves the prior-art problem that blurred or damaged lane markings on a real road interfere with the camera's lane line recognition and make it inaccurate. In addition, the method in the embodiments of the disclosure adds no sensor to the system and therefore reduces cost.
The embodiments of the disclosure further provide an apparatus for calibrating lane line information based on a radar, configured to perform the method for calibrating lane line information based on a radar of the foregoing embodiments. As shown in fig. 5, the apparatus includes:
an acquisition module 501 for acquiring stationary target information on both sides of a road through radar equipment installed on a vehicle;
the curve fitting module 502 is configured to cluster the static target information and perform curve fitting on the classified clusters;
and the calibration module 503 is configured to perform feature matching on the fitted curve and the lane line information identified by the camera, execute a normal vehicle control instruction if the matching is successful, and calibrate the lane line information according to the fitted curve if the matching is unsuccessful.
It should be noted that, when the apparatus for calibrating lane line information based on a radar provided in the above embodiment performs the method for calibrating lane line information based on a radar, the division into the above functional modules is used only as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus provided in the above embodiment and the method embodiment for calibrating lane line information based on a radar belong to the same concept; the detailed implementation process is embodied in the method embodiment and is not repeated here.
The embodiment of the disclosure also provides an electronic device corresponding to the method for calibrating lane line information based on the radar provided by the previous embodiment, so as to execute the method for calibrating lane line information based on the radar.
Referring to fig. 6, a schematic diagram of an electronic device according to some embodiments of the application is shown. As shown in fig. 6, the electronic device includes: a processor 600, a memory 601, a bus 602 and a communication interface 603, the processor 600, the communication interface 603 and the memory 601 being connected by the bus 602; the memory 601 stores a computer program executable on the processor 600, and the processor 600 executes the method for calibrating lane line information based on the radar according to any one of the above embodiments of the present application when the computer program is executed.
The memory 601 may include a high-speed random access memory (RAM), and may further include non-volatile memory, such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 603 (which may be wired or wireless), and the internet, a wide area network, a local network, a metropolitan area network or the like may be used.
The bus 602 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus, a control bus, etc. The memory 601 is configured to store a program; after receiving an execution instruction, the processor 600 executes the program, and the method for calibrating lane line information based on a radar disclosed in any of the foregoing embodiments of the present application may be applied to, or implemented by, the processor 600.
The processor 600 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the methods described above may be completed by integrated logic circuits in hardware or by software instructions in the processor 600. The processor 600 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP) and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and it may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 601, and the processor 600 reads the information in the memory 601 and completes the steps of the method described above in combination with its hardware.
Because it is based on the same inventive concept, the electronic equipment provided by the embodiment of the application has the same beneficial effects as the method for calibrating lane line information based on a radar provided by the embodiment of the application.
An embodiment of the present application further provides a computer readable storage medium corresponding to the method for calibrating lane line information based on a radar according to the foregoing embodiments. Referring to fig. 7, the computer readable storage medium is shown as an optical disc 700 on which a computer program (i.e. a program product) is stored; when executed by a processor, the computer program performs the method for calibrating lane line information based on a radar according to any of the foregoing embodiments.
It should be noted that examples of the computer readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical or magnetic storage medium, which will not be described in detail herein.
Because it is based on the same inventive concept, the computer readable storage medium provided by the above embodiment of the present application has the same beneficial effects as the method for calibrating lane line information based on a radar provided by the embodiments of the present application.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail, but they are not to be construed as limiting the scope of the application. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the application, and all of these fall within the protection scope of the application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.

Claims (8)

1. A method of calibrating lane line information based on radar, comprising:
acquiring static target information on two sides of a road through radar equipment installed on a vehicle;
clustering the static target information, and performing curve fitting on the classified clusters; determining the confidence coefficient of the fitted curve according to the distance between the vehicle and the fitted curve and the number of stationary targets in the cluster where the fitted curve is located; when the confidence coefficient is larger than or equal to a preset first threshold value, performing feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve, wherein the method comprises the steps of obtaining the confidence coefficient of the fitted curve and the confidence coefficient of the lane line identified by the camera; correcting the characteristic information of the lane line in a weighted average mode according to the confidence coefficient of the two;
and when the confidence coefficient is smaller than a preset first threshold value, extracting barrier information output by the radar and the camera, determining a track to be driven by the vehicle according to a control instruction to be executed, executing the control instruction if no barrier exists in the track within a preset time period, otherwise, not executing, and sending out alarm information.
2. The method according to claim 1, further comprising, before acquiring the stationary target information on both sides of the road by the radar apparatus mounted on the vehicle:
and acquiring lane line information identified by the vehicle camera and confidence information of the lane line.
3. The method of claim 2, further comprising, after obtaining lane line information identified by a vehicle camera and confidence information of the lane line:
judging the road type according to the lane line information;
if the road type is a straight road, lane line information is not calibrated, and a vehicle control instruction is directly executed;
and if the road type is a curved road, acquiring static target information on two sides of the road through radar equipment installed on the vehicle.
4. The method of claim 1, wherein clustering the stationary target information and curve fitting the classified clusters comprises:
clustering the static target information by a clustering algorithm based on density growth;
and performing curve fitting on each classified cluster to obtain a fitted cubic function curve.
5. The method of claim 1, wherein feature matching the fitted curve with lane line information identified by the camera comprises:
determining a first curvature and a first curvature change rate of a road according to lane line information identified by a camera;
determining a second curvature and a second curvature change rate of the road according to the fitted curve;
and if the difference value between the first curvature and the second curvature is smaller than a preset second threshold value and the difference value between the first curvature change rate and the second curvature change rate is smaller than a preset third threshold value, determining that the matching is successful.
6. An apparatus for calibrating lane line information based on a radar, comprising:
the acquisition module is used for acquiring static target information on two sides of a road through radar equipment installed on a vehicle;
the curve fitting module is used for clustering the static target information and performing curve fitting on the classified clusters;
the calibration module is used for determining the confidence coefficient of the fitting curve according to the distance between the vehicle and the fitting curve and the number of stationary targets in the cluster where the fitting curve is located; when the confidence coefficient is larger than or equal to a preset first threshold value, performing feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve, wherein the method comprises the steps of obtaining the confidence coefficient of the fitted curve and the confidence coefficient of the lane line identified by the camera; correcting the characteristic information of the lane line in a weighted average mode according to the confidence coefficient of the two;
and when the confidence coefficient is smaller than a preset first threshold value, extracting barrier information output by the radar and the camera, determining a track to be driven by the vehicle according to a control instruction to be executed, executing the control instruction if no barrier exists in the track within a preset time period, otherwise, not executing, and sending out alarm information.
7. An apparatus for calibrating lane line information based on a radar, comprising a processor and a memory storing program instructions, the processor being configured, when executing the program instructions, to perform the method for calibrating lane line information based on a radar of any one of claims 1 to 5.
8. A computer readable medium having stored thereon computer readable instructions executable by a processor to implement a method of calibrating lane line information based on a radar according to any of claims 1 to 5.
CN202011641279.3A 2020-12-31 2020-12-31 Method, device, equipment and storage medium for calibrating lane line information based on radar Active CN112712040B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011641279.3A CN112712040B (en) 2020-12-31 2020-12-31 Method, device, equipment and storage medium for calibrating lane line information based on radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011641279.3A CN112712040B (en) 2020-12-31 2020-12-31 Method, device, equipment and storage medium for calibrating lane line information based on radar

Publications (2)

Publication Number Publication Date
CN112712040A CN112712040A (en) 2021-04-27
CN112712040B true CN112712040B (en) 2023-08-22

Family

ID=75547999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011641279.3A Active CN112712040B (en) 2020-12-31 2020-12-31 Method, device, equipment and storage medium for calibrating lane line information based on radar

Country Status (1)

Country Link
CN (1) CN112712040B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706731A (en) * 2021-07-15 2021-11-26 北京无限感测科技有限公司 ETC integrated antenna power self-adaptive adjusting method and device
CN113569800A (en) * 2021-08-09 2021-10-29 北京地平线机器人技术研发有限公司 Lane recognition and verification method and device, readable storage medium and electronic equipment
CN113673438A (en) * 2021-08-23 2021-11-19 上海商汤临港智能科技有限公司 Collision early warning method and device, electronic equipment and storage medium
CN113791414B (en) * 2021-08-25 2023-12-29 南京市德赛西威汽车电子有限公司 Scene recognition method based on millimeter wave vehicle-mounted radar view
CN113990052B (en) * 2021-10-29 2023-08-15 南京慧尔视防务科技有限公司 Incoming vehicle early warning method and device, storage medium and road protection vehicle
CN114280582A (en) * 2021-12-31 2022-04-05 中国第一汽车股份有限公司 Calibration and calibration method and device for laser radar, storage medium and electronic equipment
CN115236627B (en) * 2022-09-21 2022-12-16 深圳安智杰科技有限公司 Millimeter wave radar data clustering method based on multi-frame Doppler velocity dimension expansion

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108519605A (en) * 2018-04-09 2018-09-11 重庆邮电大学 Curb detection method based on laser radar and video camera
CN108960183A (en) * 2018-07-19 2018-12-07 北京航空航天大学 A kind of bend target identification system and method based on Multi-sensor Fusion
CN109532826A (en) * 2017-09-21 2019-03-29 天津所托瑞安汽车科技有限公司 A kind of radar anticollision method for early warning based on the optimization of lane line Visual identification technology
CN110174113A (en) * 2019-04-28 2019-08-27 福瑞泰克智能***有限公司 A kind of localization method, device and the terminal in vehicle driving lane
CN110962858A (en) * 2019-12-23 2020-04-07 潍柴动力股份有限公司 Target identification method and device
CN111247525A (en) * 2019-01-14 2020-06-05 深圳市大疆创新科技有限公司 Lane detection method and device, lane detection equipment and mobile platform
CN112132109A (en) * 2020-10-10 2020-12-25 北京百度网讯科技有限公司 Lane line processing and lane positioning method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN112712040A (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN112712040B (en) Method, device, equipment and storage medium for calibrating lane line information based on radar
US10983523B2 (en) Autonomous driving support apparatus and method
US20220373353A1 (en) Map Updating Method and Apparatus, and Device
CN111507327B (en) Target detection method and device
US10698084B2 (en) Method and system for carrying out a calibration of a sensor
US20210216796A1 (en) Apparatus for managing driving pattern based on object recognition, vehicle driving controlling apparatus using the same and method thereof
US10592784B2 (en) Detection based on fusion of multiple sensors
CN112461257A (en) Method and device for determining lane line information
CN116872957A (en) Early warning method and device for intelligent driving vehicle, electronic equipment and storage medium
CN112885074B (en) Road information detection method and device
CN109583418B (en) Lane line deviation self-correction method and device based on parallel relation
CN116148821A (en) Laser radar external parameter correction method and device, electronic equipment and storage medium
CN112739599B (en) Vehicle lane change behavior identification method and device
US11386675B2 (en) Device and method for generating vehicle data, and system
CN112733778A (en) Vehicle front guide determination method and device and computer equipment
Lee et al. The real-time implementation for the parking line departure warning system
CN112101177A (en) Map construction method and device and carrier
US20240142590A1 (en) Online sensor alignment using feature registration
CN116625384B (en) Data association method and device and electronic equipment
JP2011216012A (en) Image recognition apparatus
US20220136859A1 (en) Apparatus and method for updating map
US20230258805A1 (en) Vehicle driving control system and method using sensor fusion technology
US20220390593A1 (en) Apparatus and method for monitoring surrounding environment of vehicle
KR102406520B1 (en) System and Method for correcting parameter for determining obstacle
CN115691127A (en) Obstacle detection method and device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant