CN112798293B - Test method and device of lane departure early warning system - Google Patents


Info

Publication number
CN112798293B
CN112798293B
Authority
CN
China
Prior art keywords
vehicle
instrument panel
image frames
image
frame
Prior art date
Legal status
Active
Application number
CN201911113612.0A
Other languages
Chinese (zh)
Other versions
CN112798293A (en)
Inventor
段雄
吕传龙
汪少林
郎咸朋
Current Assignee
Beijing CHJ Automobile Technology Co Ltd
Original Assignee
Beijing CHJ Automobile Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing CHJ Automobile Technology Co Ltd
Priority to CN201911113612.0A
Publication of CN112798293A
Application granted
Publication of CN112798293B
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01M — TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 — Testing of vehicles
    • G01M17/007 — Wheeled or endless-tracked vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

Embodiments of the disclosure provide a testing method and a testing device for a lane departure early warning system. They relate to the field of automotive technology and mainly aim to reduce the cost of testing such systems. The main technical scheme comprises the following steps: acquiring a plurality of image frames of the road ahead of the vehicle and a plurality of vehicle instrument panel image frames during the driving of the vehicle, where each image frame has a corresponding acquisition time point; early warning information is marked in at least one of the instrument panel image frames, the early warning information being displayed on the instrument panel when the lane departure early warning system recognizes that the vehicle has departed from its lane; selecting key image frames from the road image frames according to the acquisition time points of the instrument panel image frames marked with the early warning information and the acquisition time points of the road image frames; and determining lane departure parameters from the key image frames and testing the lane departure early warning system based on those parameters.

Description

Test method and device of lane departure early warning system
Technical Field
Embodiments of the disclosure relate to the field of automotive technology, and in particular to a method and a device for testing a lane departure early warning system.
Background
A lane departure early warning system assists the driver by issuing alerts that help reduce traffic accidents caused by unintended lane departure. Its processing precision directly determines the driving safety of the vehicle, so the system needs to be tested.
Currently, lane departure early warning systems are usually tested with expensive high-precision positioning equipment, such as Genesys positioning equipment, in combination with high-precision maps. The positioning equipment is costly, and constructing a high-precision map is also expensive. The existing test process for lane departure early warning systems is therefore costly.
Disclosure of Invention
In view of this, embodiments of the present disclosure provide a method and an apparatus for testing a lane departure warning system, and mainly aim to reduce the testing cost of the lane departure warning system.
The embodiment of the disclosure mainly provides the following technical scheme:
in a first aspect, an embodiment of the present disclosure provides a method for testing a lane departure warning system, where the method includes:
acquiring a plurality of image frames of the road ahead of the vehicle and a plurality of vehicle instrument panel image frames during the driving of the vehicle, wherein each road image frame and each instrument panel image frame has a corresponding acquisition time point, early warning information is marked in at least one of the vehicle instrument panel image frames, and the early warning information is displayed on the vehicle instrument panel when the lane departure early warning system recognizes that the vehicle has departed from its lane;
selecting key image frames from the road image frames in front of the vehicles according to the acquisition time points of the image frames of the instrument panel of the vehicle marked with the early warning information and the acquisition time points of the image frames of the road in front of the vehicles;
determining a lane departure parameter according to the key image frame, and testing the lane departure early warning system based on the lane departure parameter.
In a second aspect, an embodiment of the present disclosure provides a testing apparatus for a lane departure warning system, the apparatus including:
an image acquisition unit, configured to acquire a plurality of image frames of the road ahead of the vehicle and a plurality of vehicle instrument panel image frames during the driving of the vehicle, wherein each road image frame and each instrument panel image frame has a corresponding acquisition time point, at least one of the vehicle instrument panel image frames is marked with early warning information, and the early warning information is displayed on the vehicle instrument panel when the lane departure early warning system recognizes that the vehicle has departed from its lane;
a selecting unit, configured to select key image frames from the road image frames according to the acquisition time points of the vehicle instrument panel image frames marked with the early warning information and the acquisition time points of the road image frames;
a testing unit, configured to determine a lane departure parameter according to the key image frames and to test the lane departure early warning system based on the lane departure parameter.
In a third aspect, an embodiment of the present disclosure provides a storage medium. The storage medium includes a stored program which, when run, controls a device on which the storage medium resides to execute the method for testing the lane departure early warning system according to the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a human-computer interaction device, which includes a storage medium and one or more processors coupled to the storage medium, the processors being configured to execute program instructions stored in the storage medium; the program instructions, when executed, implement the method for testing the lane departure early warning system of the first aspect.
By means of the above technical scheme, the method and device for testing a lane departure early warning system provided by the embodiments of the disclosure first acquire a plurality of image frames of the road ahead of the vehicle and a plurality of vehicle instrument panel image frames during the driving of the vehicle, each frame having its own acquisition time point. Early warning information is marked in at least one of the instrument panel image frames; it is displayed on the instrument panel when the lane departure early warning system recognizes that the vehicle has departed from its lane. Key image frames are then selected from the road image frames according to the acquisition time points of the instrument panel image frames marked with the early warning information and the acquisition time points of the road image frames. Finally, lane departure parameters are determined from the key image frames, and the lane departure early warning system is tested based on those parameters. The embodiments of the disclosure can thus test the lane departure early warning system using only visual images captured during driving, without expensive positioning equipment or a high-precision map system, and can therefore reduce the cost of the test.
The foregoing is only an overview of the embodiments of the present disclosure. To make the technical means of the embodiments clearer and implementable in accordance with this description, and to make the above and other objects, features, and advantages more readily understood, a detailed description of the embodiments follows.
Drawings
Various additional advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the embodiments of the present disclosure. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
Fig. 1 shows a flowchart of a testing method of a lane departure early warning system provided by an embodiment of the present disclosure;
Fig. 2 shows an example of an image frame of the road ahead of a vehicle provided by an embodiment of the disclosure;
Fig. 3 shows an example of a semantically segmented image provided by an embodiment of the present disclosure;
Fig. 4 shows an example of an image frame of the road ahead of a vehicle provided by an embodiment of the disclosure;
Fig. 5 shows another example of an image frame of the road ahead of a vehicle provided by an embodiment of the present disclosure;
Fig. 6 shows an example bird's-eye view provided by an embodiment of the present disclosure;
Fig. 7 shows another example bird's-eye view provided by an embodiment of the present disclosure;
Fig. 8 shows a schematic structural diagram of a testing device of a lane departure early warning system provided by an embodiment of the present disclosure;
Fig. 9 shows a schematic structural diagram of another testing device of a lane departure early warning system provided by an embodiment of the disclosure.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In a first aspect, an embodiment of the present disclosure provides a method for testing a lane departure warning system, as shown in fig. 1, the method mainly includes:
101. Acquire a plurality of image frames of the road ahead of the vehicle and a plurality of vehicle instrument panel image frames during the driving of the vehicle, wherein each road image frame and each instrument panel image frame has a corresponding acquisition time point, early warning information is marked in at least one of the vehicle instrument panel image frames, and the early warning information is displayed on the vehicle instrument panel when the lane departure early warning system recognizes that the vehicle has departed from its lane.
In practical applications, a lane departure early warning system assists the driver by issuing alerts that help reduce traffic accidents caused by lane departure, and its processing precision directly determines the driving safety of the vehicle. Therefore, to improve its accuracy, the lane departure early warning system needs to be tested.
In this embodiment, when testing the lane departure early warning system, a plurality of image frames of the road ahead of the vehicle during driving need to be acquired; these frames represent the road conditions encountered during the drive. In addition, a plurality of vehicle instrument panel image frames need to be acquired; these capture the various driving parameters shown during driving and thus represent the driving states of the vehicle. Early warning information is marked in at least one of the instrument panel image frames; it is displayed on the instrument panel when the lane departure early warning system recognizes that the vehicle has departed from its lane, so that the driver knows a lane departure has occurred.
In this embodiment, to ensure the accuracy of the test of the lane departure early warning system, the image frames of the road ahead of the vehicle and the vehicle instrument panel image frames are acquired at least as follows: two image acquisition devices with the same frame rate capture, at the same frequency, the road image frames and the instrument panel image frames respectively during the driving of the vehicle. This ensures that the moment at which the vehicle departs from its lane and the moment at which the early warning system raises an alarm on the instrument panel are matched as closely as possible, which improves test precision. While the two devices are capturing frames, the vehicle drives between the two lane lines of the road. The specific types of the two image acquisition devices may be determined by business requirements and are not limited in this embodiment; optionally, both are high-frame-rate cameras. Illustratively, two high-definition pan-tilt cameras of the same specification, each with a frame rate above 60 frames per second, are used to capture the road image frames and the instrument panel image frames respectively.
In practical application, the image frames of the road in front of the vehicle and the image frames of the instrument panel of the vehicle have at least the following two types according to different acquisition time points:
firstly, in order to ensure that the plurality of image frames of the road in front of the vehicle and the plurality of image frames of the instrument panel of the vehicle can completely reflect all driving conditions in the whole driving process of the vehicle, the plurality of image frames of the road in front of the vehicle and the plurality of image frames of the instrument panel of the vehicle are all image frames in the whole driving process from the start of the vehicle to the stop of the vehicle.
Secondly, to reduce the number of road image frames and instrument panel image frames, they may all be frames captured while the vehicle is in a stable driving state, i.e. a state in which the vehicle has driven continuously for a preset duration at a speed within a preset speed range.
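The stable driving state described above — a preset duration at a speed within a preset range — can be sketched as a simple window search over speed samples. This is an illustrative reconstruction, not the patent's implementation; the function name, sample format, and thresholds are all assumptions.

```python
def stable_window(samples, v_min, v_max, min_duration):
    """Return (start, end) of the first span in which every sampled speed
    stays within [v_min, v_max] for at least min_duration seconds, or
    None if no such span exists.  samples: list of (time_s, speed)."""
    start = None
    for t, v in samples:
        if v_min <= v <= v_max:
            if start is None:
                start = t          # window opens at the first in-range sample
            if t - start >= min_duration:
                return (start, t)  # held the speed band long enough
        else:
            start = None           # out of range: window resets
    return None

# Example: speed settles near 60 km/h from t = 2 s onward
samples = [(0, 20), (1, 45), (2, 60), (3, 61), (4, 60), (5, 59), (6, 60)]
print(stable_window(samples, 55, 65, 3))  # (2, 5)
```

Only frames whose acquisition time points fall inside the returned span would then be retained.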
102. Select key image frames from the road image frames according to the acquisition time points of the vehicle instrument panel image frames marked with the early warning information and the acquisition time points of the road image frames.
In practical applications, to clearly distinguish the road image frames from one another, each road image frame has its own frame number. The frame number indicates the order in which the frame was collected, so the position of a road image frame in the acquisition sequence can be read from its frame number. Frame numbers are unique. In addition, each road image frame carries its acquisition time point, so the moment at which it was captured is known.
Similarly, to clearly distinguish the vehicle instrument panel image frames, each of them has its own unique frame number indicating its position in the acquisition sequence, and each carries its acquisition time point. It should be emphasized that, to distinguish the two streams, the road image frames and the instrument panel image frames use different frame numbers.
In this embodiment, to know the real driving state of the vehicle at the moment the lane departure early warning system recognizes a lane departure, the road image frames and the instrument panel image frames need to be time-synchronized, so that frames with the same acquisition time point can be identified in both streams.
After time synchronization, key image frames are selected from the road image frames according to the acquisition time points of the instrument panel image frames marked with the early warning information and the acquisition time points of the road image frames. The selected key image frames represent the real driving state of the vehicle at the moment the lane departure was recognized. There are at least the following two ways to select the key image frames:
Firstly, determine a time interval from the acquisition time point of the vehicle instrument panel image frame marked with the early warning information, such that this acquisition time point is contained in the interval; then select, as key image frames, the road image frames whose acquisition time points lie within that interval.
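The interval-based selection can be sketched as follows, here assuming the variant in which the warning frame's time point is the middle value of the interval; the function name, timestamps, and window width are illustrative, not taken from the patent.

```python
def select_key_frames(road_frames, warning_time, half_window):
    """road_frames: list of (timestamp_s, frame_id) for the front-road
    stream.  Returns the ids of all frames whose acquisition time lies
    inside [warning_time - half_window, warning_time + half_window]."""
    lo, hi = warning_time - half_window, warning_time + half_window
    return [fid for ts, fid in road_frames if lo <= ts <= hi]

# Dashboard frame carrying the warning captured at t = 10.0 s, 1 s either side
road_frames = [(8.5, "r1"), (9.6, "r2"), (10.0, "r3"), (10.4, "r4"), (11.5, "r5")]
print(select_key_frames(road_frames, 10.0, 1.0))  # ['r2', 'r3', 'r4']
```

Putting the warning time at the maximum or minimum of the interval, as in the other two variants, only changes how `lo` and `hi` are derived.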
It should be noted that, if several instrument panel image frames are marked with the early warning information, a corresponding time interval may be determined, and key image frames selected, for each of them; alternatively, one of the marked instrument panel image frames may be chosen, a time interval determined from it, and key image frames selected accordingly.
Specifically, the road image frames corresponding to the acquisition time point of the instrument panel image frame marked with the early warning information reflect the actual lane departure, so that acquisition time point must be contained in the determined time interval. The time interval may be determined in at least the following three ways:
1. and the acquisition time point of the vehicle instrument panel image frame marked with the early warning information is used as the maximum value of the time interval.
2. And the acquisition time point of the image frame of the vehicle instrument panel marked with the early warning information is used as the minimum value of the time interval.
3. And the acquisition time point of the image frame of the vehicle instrument panel marked with the early warning information is used as a value in a time interval. Illustratively, the collection time point of the vehicle instrument panel image frame identified with the warning information is taken as a middle value in the time interval.
Illustratively, if the acquisition time point of the instrument panel image frame marked with the early warning information is 8:00, the time interval may be set to [7:59, 8:01], and every road image frame whose acquisition time point lies within [7:59, 8:01] is selected as a key image frame.
Secondly, determine the frame number difference d between an instrument panel image frame and a road image frame acquired at the same acquisition time point, where each road image frame and each instrument panel image frame has its own frame number indicating the acquisition order; determine the frame number n of the instrument panel image frame marked with the early warning information; then select, as key image frames, the road image frames whose frame numbers lie in the interval [n + d − t, n + d + t], where t is a preset number of frames.
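The frame-number rule [n + d − t, n + d + t] can be sketched directly; the clamping to the valid frame range and all the numbers in the example are assumptions added for illustration.

```python
def key_frame_numbers(n, d, t, total_frames):
    """Frame numbers of the key front-road frames, per the rule
    [n + d - t, n + d + t]: n is the dashboard frame carrying the
    warning, d the constant frame-number offset between the two
    streams, t the preset margin in frames.  Clamped to 1..total_frames."""
    lo = max(1, n + d - t)
    hi = min(total_frames, n + d + t)
    return list(range(lo, hi + 1))

# Warning on dashboard frame 120, offset d = 5, margin t = 2 frames
print(key_frame_numbers(120, 5, 2, 1000))  # [123, 124, 125, 126, 127]
```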
In practical applications, the two image acquisition devices capturing the road image frames and the instrument panel image frames may start working at different moments. To select the key image frames, the frame number difference d between an instrument panel image frame and a road image frame acquired at the same acquisition time point must therefore be determined, completing the time synchronization of the two streams.
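One plausible way to obtain d is to scan both timestamp streams for the first pair of frames captured at (nearly) the same instant. The patent does not specify the matching procedure, so the tolerance value and the two-pointer scan below are assumptions.

```python
def frame_offset(road, dash, tol=0.005):
    """road, dash: lists of (frame_number, timestamp_s), sorted by time.
    Returns d = road_number - dashboard_number for the first pair of
    frames whose timestamps agree to within tol seconds, else None."""
    j = 0
    for rn, rt in road:
        # advance the dashboard pointer until it is not clearly earlier
        while j < len(dash) and dash[j][1] < rt - tol:
            j += 1
        if j < len(dash) and abs(dash[j][1] - rt) <= tol:
            return rn - dash[j][0]
    return None

# Road camera started later: road frame 1 lines up with dashboard frame 4
road = [(1, 0.050), (2, 0.067), (3, 0.083)]
dash = [(1, 0.000), (2, 0.017), (3, 0.033), (4, 0.050), (5, 0.067)]
print(frame_offset(road, dash))  # -3
```

With d = −3, the road frame matching dashboard frame n has number n + d, which is exactly how d is used in the interval [n + d − t, n + d + t].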
It should be noted that, because the original frame numbers may follow different labelling rules, there are at least the following two cases when determining the frame number difference d between an instrument panel image frame and a road image frame acquired at the same acquisition time point:
first, if the original frame number of the image frame of the road in front of the vehicle is a continuous number label, the original frame number is determined to be the frame number of the image frame of the road in front of each vehicle. And if the frame number of the image frame of the vehicle instrument panel is a continuous digital mark, determining that the original frame number is the frame number of the image frame of each vehicle instrument panel.
For example, the original frame numbers of the image frames of the road ahead of the vehicle are the numbers "1, 2, 3, 4 \8230;" etc., and the original frame numbers are determined to be the frame numbers of the image frames of the road ahead of each vehicle.
Secondly, if the original frame number of the image frames of the road in front of the vehicle is not a continuous number label, determining the acquisition sequence of the image frames of the road in front of the vehicle according to the original frame number, digitally numbering the image frames of the road in front of the vehicle by using continuous numbers based on the acquisition sequence, and determining the digital number of the image frames of the road in front of the vehicle as the frame number of the image frames of the road in front of the vehicle. And if the frame number of the image frames of the vehicle instrument panel is not a continuous digital mark, determining the acquisition sequence of the image frames of the vehicle instrument panel according to the original frame number, digitally numbering the image frames of the vehicle instrument panel by using continuous numbers based on the acquisition sequence, and determining the digital number of the image frames of the vehicle instrument panel as the frame number of the image frames of the vehicle instrument panel.
For example, the original frame number of the image frames of the road ahead of the vehicle is not a continuous number, but a type of "a, B, C \8230;" or the like, and the like, the acquisition sequence of the image frames of the road ahead of the vehicle is determined according to the original frame number, the image frames of the road ahead of the vehicle are digitally numbered by using continuous numbers based on the acquisition sequence, and the digital number of the image frames of the road ahead of the vehicle is determined as the frame number of the image frames of the road ahead of the vehicle, for example, the frame number of the image frames of the road ahead of the vehicle marked as "a" is determined as 1.
Therefore, to ensure that the frame number difference d can be determined correctly, before determining d between an instrument panel image frame and a road image frame acquired at the same acquisition time point, the method further includes:
judging whether the original frame numbers of the road image frames are consecutive numeric labels; if so, using the original frame numbers as the frame numbers of the road image frames; otherwise, determining the acquisition order of the road image frames from the original frame numbers, renumbering them with consecutive integers based on that order, and using the new numbers as their frame numbers. Likewise, judging whether the original frame numbers of the instrument panel image frames are consecutive numeric labels; if so, using the original frame numbers as the frame numbers of the instrument panel image frames; otherwise, determining their acquisition order from the original frame numbers, renumbering with consecutive integers, and using the new numbers as the frame numbers of the instrument panel image frames.
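The renumbering step for non-numeric frame labels amounts to assigning consecutive integers in acquisition order. A minimal sketch (the function name and labels are illustrative):

```python
def normalize_frame_numbers(original_labels):
    """Assign consecutive integer frame numbers 1..N to frames whose
    original labels are not a continuous numeric sequence.  The input
    list is assumed to already be in acquisition order."""
    return {label: i + 1 for i, label in enumerate(original_labels)}

# Letter labels in acquisition order, as in the example above
print(normalize_frame_numbers(["A", "B", "C", "D"]))
# {'A': 1, 'B': 2, 'C': 3, 'D': 4}
```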
103. Determining a lane departure parameter according to the key image frame, and testing the lane departure early warning system based on the lane departure parameter.
In this embodiment, the process of determining the lane departure parameter from a key image frame at least includes: processing the key image frame with a preset semantic segmentation algorithm to obtain a semantically segmented image; converting the semantically segmented image into a bird's-eye view using a preset inverse perspective transformation matrix; and determining the lane departure parameter from the bird's-eye view and a preset reference image.
Specifically, the preset semantic segmentation algorithm may be a deep-learning semantic segmentation algorithm; applying it to a key image frame yields a semantically segmented image. Illustratively, Fig. 2 shows a key image frame (only one is taken as an example here); after semantic segmentation with a deep-learning algorithm, the segmented image shown in Fig. 3 is obtained.
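The segmentation itself is a deep-learning model in the patent and cannot be reproduced here; as a toy stand-in that only mimics the output format (a per-pixel lane/background mask), a brightness threshold can illustrate the step. Everything below is an assumption for illustration, not the patent's algorithm.

```python
def segment_lane_pixels(gray, threshold=200):
    """Toy stand-in for the semantic-segmentation step: labels each pixel
    1 (lane marking) or 0 (background) by a brightness threshold.  The
    patent uses a deep-learning segmentation model; only the output
    format (a per-pixel label mask) is the same here."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

# 4x6 synthetic grayscale image with two bright "lane line" columns
gray = [
    [30, 240, 30, 30, 250, 30],
    [30, 235, 30, 30, 245, 30],
    [30, 230, 30, 30, 240, 30],
    [30, 225, 30, 30, 235, 30],
]
mask = segment_lane_pixels(gray)
print(mask[0])  # [0, 1, 0, 0, 1, 0]
```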
Specifically, the inverse perspective transformation matrix may be obtained as follows: with the vehicle parked between two lane lines and its centre line parallel to them, capture an image of the road ahead as a reference image; select two lane line points on each of the two lane lines in the reference image, such that the four points form a rectangle; and compute the inverse perspective transformation matrix from these four lane line points.
Illustratively, Fig. 4 is a reference image; two lane line points are selected on each of its two lane lines, forming the rectangle 20 shown in Fig. 5. Based on the coordinates of the four lane line points in the reference image, the inverse perspective transformation matrix is obtained using the following formula. Note that this matrix can convert Fig. 4 into the bird's-eye view shown in Fig. 6.
The inverse perspective transformation can be expressed as:

    [x]   [a11 a12 a13]   [u]
    [y] = [a21 a22 a23] · [v]
    [z]   [a31 a32 a33]   [1]

wherein [x, y, z] characterizes the homogeneous coordinates obtained by the inverse perspective transformation; [u, v, 1] characterizes the homogeneous coordinates in the reference image; and the 3×3 matrix characterizes the inverse perspective transformation matrix, wherein the values in the transformation matrix may be set based on actual requirements.
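The matrix operation above can be sketched in code. The following is an illustrative Python sketch (not part of the patent disclosure) that solves for the 3×3 inverse perspective transformation matrix from the four lane line point correspondences by a direct linear transform, fixing the bottom-right matrix entry to 1; the sample point coordinates are hypothetical.

```python
import numpy as np

def perspective_matrix(src_pts, dst_pts):
    """Solve for the 3x3 matrix mapping [u, v, 1] to [x, y, z] from four
    point correspondences (direct linear transform; the bottom-right entry
    of the matrix is fixed to 1, leaving eight unknowns)."""
    rows, rhs = [], []
    for (u, v), (x, y) in zip(src_pts, dst_pts):
        rows.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); rhs.append(x)
        rows.append([0, 0, 0, u, v, 1, -u * y, -v * y]); rhs.append(y)
    a = np.linalg.solve(np.array(rows, float), np.array(rhs, float))
    return np.append(a, 1.0).reshape(3, 3)

def apply_ipm(matrix, u, v):
    """Map an image pixel (u, v) into the bird's-eye view by multiplying
    with the matrix and normalising by the homogeneous coordinate z."""
    x, y, z = matrix @ np.array([u, v, 1.0])
    return x / z, y / z

# Hypothetical correspondences: a road trapezoid in the reference image
# mapped to the rectangle it should form in the bird's-eye view.
src = [(100, 300), (540, 300), (0, 480), (640, 480)]
dst = [(0, 0), (400, 0), (0, 300), (400, 300)]
M = perspective_matrix(src, dst)
```

In practice a library routine such as OpenCV's getPerspectiveTransform performs the same four-point computation.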
Further, in order to make the transformation matrix more accurate, before performing the inverse perspective transformation matrix operation, it is necessary to determine whether the reference image is an image taken when the vehicle is parked midway between the two lane lines; if yes, the inverse perspective transformation matrix operation is directly carried out; otherwise, the positions of the four lane line points are corrected, and the inverse perspective transformation matrix operation is performed based on the corrected four lane line points. It should be noted that the method for correcting the positions of the four lane line points may be: determining the amount of deviation of the vehicle center from the middle of the two lane lines, and increasing or decreasing the coordinates of the lane line points by the deviation amount.
Illustratively, the semantic segmentation image is converted, by using the preset inverse perspective transformation matrix, into the bird's-eye view shown in FIG. 7.
Specifically, the method for determining the lane departure parameter according to the bird's-eye view and the preset reference image at least comprises the following steps:
first, determining a reference position of a lane line in the reference image, and determining a position of the lane line in the bird's eye view; and determining the deviation position of the vehicle according to the coordinates of the reference position and the coordinates of the lane line position.
Specifically, the reference image includes two lane lines, and a distance between the two lane lines is the same as a distance between the two lane lines in the image in front of the vehicle. The process of determining the reference position of the lane line in the reference image is as follows: and determining coordinates of the two lane line positions in the reference image.
Specifically, the process of determining the lane line position in the bird's-eye view image includes: and determining coordinates of the positions of the two lane lines in the bird's-eye view. The coordinate system used by the bird's eye view and the reference image is the same.
Specifically, the method for determining the vehicle deviation position based on the coordinates of the reference position and the coordinates of the lane line position includes: and determining corresponding coordinates in the aerial view and the reference image, and determining a difference value between the corresponding coordinates in the aerial view and the reference image as the offset position of the vehicle.
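Assuming the lane line positions are sampled as per-row horizontal coordinates in the shared coordinate system of the bird's-eye view and the reference image (a sampling scheme the patent does not fix), the difference-of-coordinates step might look like this minimal Python sketch:

```python
def vehicle_offset(ref_lane_xs, bev_lane_xs):
    """Offset position of the vehicle: the difference between corresponding
    lane line coordinates in the bird's-eye view and the reference image,
    averaged over the sampled rows."""
    diffs = [bev - ref for ref, bev in zip(ref_lane_xs, bev_lane_xs)]
    return sum(diffs) / len(diffs)
```

A positive value indicates drift toward one side and a negative value toward the other, in whatever units the shared coordinate system uses (pixels here).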
Secondly, on the basis of the vehicle offset positions determined above, determining a vehicle offset curve; selecting a deviation time point; and taking a derivative of the vehicle offset curve at the vehicle offset position corresponding to the deviation time point to obtain the deviation speed corresponding to the deviation time point.
Specifically, since the lane line position in the bird's-eye view is a series of coordinates and the lane line position in the reference image is likewise a series of coordinates, a vehicle offset curve can be obtained from the acquisition time points of the key image frames and the corresponding offset positions.
Specifically, the deviation time point may be the acquisition time point of the vehicle instrument panel image frame marked with the early warning information, and the deviation speed corresponding to the deviation time point is obtained by taking a derivative of the vehicle offset curve at the vehicle offset position corresponding to the deviation time point.
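The derivative step can be sketched numerically; the following assumes the vehicle offset curve is available as sampled (time, offset) pairs and uses a central finite difference at the sample nearest the warning frame's acquisition time (the differentiation scheme is an assumption, since the patent does not prescribe one):

```python
def deviation_speed(times, offsets, t_warn):
    """Approximate the derivative of the offset curve at the deviation
    time point t_warn (the acquisition time of the dashboard frame marked
    with the warning) using a central finite difference."""
    i = min(range(len(times)), key=lambda k: abs(times[k] - t_warn))
    i = max(1, min(i, len(times) - 2))  # keep a full central stencil
    return (offsets[i + 1] - offsets[i - 1]) / (times[i + 1] - times[i - 1])
```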
In this embodiment, the specific process of testing the lane departure warning system based on the lane departure parameter is as follows: judging whether the lane departure parameters meet preset lane departure early warning conditions or not, and if so, judging that the early warning of the lane departure early warning system is normal; otherwise, judging that the lane departure early warning system is abnormal in early warning, and sending an early warning abnormal report.
Specifically, the determining whether the lane departure parameter satisfies the preset lane departure warning condition may include: judging whether the deviation position and the deviation speed at the acquisition time point of the image frame marked with the early warning information indicate that the vehicle is departing from the lane; if so, judging that the early warning of the lane departure early warning system is normal; otherwise, judging that the early warning of the lane departure early warning system is abnormal, and sending an early warning abnormality report.
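As a concrete illustration of this judgment, one might compare the measured deviation position and deviation speed at the warning frame's acquisition time against thresholds; the threshold values and units below are purely illustrative and not taken from the patent:

```python
def warning_is_normal(offset_m, speed_m_s, offset_thresh=0.3, speed_thresh=0.1):
    """Deem the early warning normal if both the offset position and the
    deviation speed at the warning frame's acquisition time indicate an
    actual departure. Thresholds (metres, metres per second) are
    hypothetical, not from the patent."""
    return abs(offset_m) >= offset_thresh and abs(speed_m_s) >= speed_thresh
```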
The method for testing the lane departure early warning system provided by the embodiment of the disclosure comprises the steps of firstly obtaining a plurality of vehicle front road image frames and a plurality of vehicle instrument panel image frames in the vehicle driving process, wherein each vehicle front road image frame and each vehicle instrument panel image frame are provided with corresponding acquisition time points. Early warning information is marked in at least one vehicle instrument panel image frame of the vehicle instrument panel image frames, wherein the early warning information is marked in the vehicle instrument panel when the lane departure early warning system identifies that the vehicle has lane departure. And selecting key image frames from the road image frames in front of the vehicles according to the acquisition time points of the image frames marked with the early warning information on the vehicle instrument panel and the acquisition time points of the image frames in the road images in front of the vehicles. And finally, determining lane departure parameters according to the key image frames, and testing the lane departure early warning system based on the lane departure parameters. Therefore, the embodiment of the disclosure can test the lane departure early warning system only by using the visual image in the driving process of the vehicle without using expensive positioning equipment and a high-precision map system. Therefore, the embodiment of the disclosure can reduce the cost of the lane departure early warning system test.
In a second aspect, according to the method shown in fig. 1, another embodiment of the present disclosure further provides a testing apparatus of a lane departure warning system, as shown in fig. 8, the apparatus mainly includes:
the image acquisition unit 31 is configured to acquire a plurality of road image frames in front of a vehicle and a plurality of vehicle instrument panel image frames in a vehicle driving process, where each of the road image frames in front of the vehicle and each of the vehicle instrument panel image frames have a corresponding acquisition time point, and early warning information is identified in at least one of the vehicle instrument panel image frames in the plurality of vehicle instrument panel image frames, where the early warning information is identified in a vehicle instrument panel when a lane departure early warning system identifies that a vehicle has a lane departure;
the selecting unit 32 is configured to select key image frames from the road image frames in front of the vehicle according to the acquisition time point of the vehicle instrument panel image frame identified with the early warning information and the acquisition time point of each road image frame in front of the vehicle;
and the testing unit 33 is configured to determine a lane departure parameter according to the key image frame, and test the lane departure warning system based on the lane departure parameter.
The embodiment of the disclosure provides a testing device of a lane departure early warning system, which firstly obtains a plurality of vehicle front road image frames and a plurality of vehicle instrument panel image frames in the vehicle driving process, wherein each vehicle front road image frame and each vehicle instrument panel image frame are provided with corresponding acquisition time points. Early warning information is marked in at least one vehicle instrument panel image frame of the vehicle instrument panel image frames, wherein the early warning information is marked in the vehicle instrument panel when the lane departure early warning system identifies that the vehicle has lane departure. And selecting key image frames from the road image frames in front of the vehicles according to the acquisition time points of the image frames marked with the early warning information on the vehicle instrument panel and the acquisition time points of the image frames in the road images in front of the vehicles. And finally, determining lane departure parameters according to the key image frames, and testing the lane departure early warning system based on the lane departure parameters. Therefore, the embodiment of the disclosure can test the lane departure early warning system by only using the visual image in the driving process of the vehicle without using expensive positioning equipment and a high-precision map system. Therefore, the embodiment of the disclosure can reduce the cost of the lane departure early warning system test.
In some embodiments, as shown in fig. 9, the selecting unit 32 includes:
the first determining module 321 is configured to determine a time interval according to an acquisition time point of a vehicle instrument panel image frame identified with the early warning information, where the acquisition time point of the vehicle instrument panel image frame identified with the early warning information is included in the time interval;
the first selecting module 322 is configured to select the image frames of the road ahead of each vehicle whose collection time point is located in the time interval as the key image frame.
In some embodiments, as shown in fig. 9, the selecting unit 32 includes:
the second determining module 323 is configured to determine a frame number difference d between vehicle instrument panel image frames and vehicle front road image frames acquired at the same acquisition time point, where each of the vehicle front road image frames and each of the vehicle instrument panel image frames have their respective corresponding frame numbers, and the frame numbers represent acquisition orders of the image frames;
a third determining module 324, configured to determine a frame number n of the vehicle instrument panel image frame marked with the warning information;
the second selecting module 325 is configured to select a vehicle front road image frame with a frame number within an interval [ n + d-t, n + d + t ] as the key image frame, where t represents a preset frame number.
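The frame-number selection performed by modules 323–325 can be sketched directly from the interval [n + d - t, n + d + t]; variable names follow the text, while the data layout is an assumption:

```python
def select_key_frames_by_number(road_frame_numbers, n, d, t):
    """n: frame number of the dashboard frame marked with the warning;
    d: frame-number difference between dashboard and road frames captured
    at the same acquisition time point; t: preset margin in frames."""
    lo, hi = n + d - t, n + d + t
    return [f for f in road_frame_numbers if lo <= f <= hi]
```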
In some embodiments, as shown in fig. 9, the apparatus further comprises:
a first judging unit 34, configured to judge whether the original frame numbers of the road image frames in front of the vehicle are consecutive number labels, and if so, determine the original frame numbers as the frame numbers of the road image frames in front of the vehicle; otherwise, determine the acquisition order of the road image frames in front of the vehicle according to the original frame numbers, number the road image frames in front of the vehicle with consecutive numbers based on the acquisition order, and determine the resulting numbers as the frame numbers of the road image frames in front of the vehicle;
the second judging unit 35 is configured to judge whether the original frame numbers of the vehicle instrument panel image frames are consecutive number labels, and if so, determine the original frame numbers as the frame numbers of the vehicle instrument panel image frames; otherwise, determine the acquisition order of the vehicle instrument panel image frames according to the original frame numbers, number the vehicle instrument panel image frames with consecutive numbers based on the acquisition order, and determine the resulting numbers as the frame numbers of the vehicle instrument panel image frames.
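The renumbering performed by the judging units can be sketched as follows: keep original frame numbers that are already consecutive, and otherwise assign consecutive numbers in acquisition order. A minimal sketch; the patent does not prescribe a data structure:

```python
def normalised_frame_numbers(original_numbers):
    """Return the frame numbers to use: the originals if they are already
    consecutive, otherwise consecutive numbers assigned in the acquisition
    order implied by the original numbering."""
    nums = list(original_numbers)
    if all(b - a == 1 for a, b in zip(nums, nums[1:])):
        return nums
    order = sorted(range(len(nums)), key=lambda i: nums[i])
    result = [0] * len(nums)
    for new_number, idx in enumerate(order):
        result[idx] = new_number
    return result
```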
In some embodiments, as shown in fig. 9, the test unit 33 includes:
the processing module 331 is configured to process the key image frame by using a preset semantic segmentation algorithm to obtain a semantic segmentation image;
a conversion module 332, configured to convert the semantic segmentation image into an aerial view by using a preset inverse perspective transformation matrix;
a fourth determining module 333, configured to determine a lane departure parameter according to the bird's eye view and a preset reference image.
In some embodiments, as shown in fig. 9, a fourth determining module 333 is configured to determine a reference position of a lane line in the reference image and determine a position of a lane line in the bird's eye view; and determine a vehicle deviation position according to the coordinates of the reference position and the coordinates of the lane line position.
In some embodiments, as shown in fig. 9, the fourth determining module 333 is further configured to determine a vehicle offset curve according to the vehicle offset position; select a deviation time point; and take a derivative of the vehicle offset curve at the vehicle offset position corresponding to the deviation time point to obtain the deviation speed corresponding to the deviation time point.
In some embodiments, as shown in fig. 9, the test unit 33 further includes:
an obtaining module 334, configured to take an obtained road image in front of the vehicle when the vehicle is parked between two lane lines as a reference image, where a center line of the vehicle is parallel to the lane lines when the vehicle is parked;
a third selecting module 335, configured to select two lane line points on two lane lines of the reference image, where four lane line points form a rectangle;
and the operation module 336 is configured to perform inverse perspective transformation matrix operation based on the four lane line points to obtain the inverse perspective transformation matrix.
In some embodiments, as shown in fig. 9, the test unit 33 further includes:
a third judging module 337, configured to judge whether the reference image is an image when the vehicle is parked in the middle of two lane lines; if not, correcting the positions of the four lane line points, and performing inverse perspective transformation matrix operation based on the corrected four lane line points.
In some embodiments, as shown in fig. 9, the acquiring unit 31 is configured to respectively acquire the image frames of the road ahead of the vehicle and the image frames of the dashboard of the vehicle during the driving of the vehicle at the same frequency by using two image acquiring devices with the same frame rate.
In some embodiments, as shown in fig. 9, the test unit 33 includes:
a fifth determining module 338, configured to determine whether the lane departure parameter meets a preset lane departure warning condition; if so, judging that the lane departure early warning system has normal early warning; otherwise, judging that the lane departure early warning system is abnormal in early warning, and sending an early warning abnormality report.
The test device of the lane departure warning system provided by the embodiment of the second aspect may be used to execute the test method of the lane departure warning system provided by the embodiment of the first aspect, and the related meanings and specific embodiments may be referred to the related descriptions in the embodiment of the first aspect, and will not be described in detail here.
In a third aspect, an embodiment of the present disclosure provides a storage medium, where the storage medium includes a stored program, and when the program runs, the apparatus on which the storage medium is located is controlled to execute the test method of the lane departure warning system according to the first aspect.
The storage medium may include volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory such as Read Only Memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip.
In a fourth aspect, embodiments of the present disclosure provide a human-computer interaction device, which includes a storage medium; and one or more processors, the storage medium coupled with the processors, the processors configured to execute program instructions stored in the storage medium; the program instructions when executed perform the method of testing a lane departure warning system of the first aspect.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
As will be appreciated by one of skill in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, embodiments of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase change Random Access Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one of skill in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, embodiments of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A method for testing a lane departure warning system, the method comprising:
the method comprises the steps that a plurality of vehicle front road image frames and a plurality of vehicle instrument panel image frames in the running process of a vehicle are obtained, wherein each vehicle front road image frame and each vehicle instrument panel image frame have corresponding acquisition time points, early warning information is marked in at least one vehicle instrument panel image frame in the plurality of vehicle instrument panel image frames, and the early warning information is marked in a vehicle instrument panel when a lane departure early warning system recognizes that the vehicle deviates from a lane;
according to the acquisition time point of a vehicle instrument panel image frame marked with early warning information and the acquisition time point of each road image frame in front of the vehicle, selecting a key image frame from each road image frame in front of the vehicle, wherein the key image frame comprises the following steps: determining a frame number difference d between vehicle instrument panel image frames and vehicle front road image frames acquired at the same acquisition time point, wherein each vehicle front road image frame and each vehicle instrument panel image frame have respective corresponding frame numbers, and the frame numbers represent the acquisition sequence of the image frames; determining a frame number n of an image frame of a vehicle instrument panel marked with early warning information; selecting a road image frame in front of the vehicle with a frame number in an interval [ n + d-t, n + d + t ] as the key image frame, wherein t represents a preset frame number;
determining a lane departure parameter according to the key image frame, and testing the lane departure early warning system based on the lane departure parameter;
the method for acquiring a plurality of vehicle front road image frames and a plurality of vehicle instrument panel image frames in the vehicle driving process comprises the following steps:
two image acquisition devices with the same frame rate are adopted to respectively acquire road image frames in front of the vehicle and vehicle instrument panel image frames in the running process of the vehicle at the same frequency.
2. The method of claim 1, wherein selecting key image frames from each vehicle front road image frame according to the acquisition time point of the vehicle instrument panel image frame identified with the early warning information and the acquisition time point of each vehicle front road image frame comprises:
determining a time interval according to the acquisition time point of the vehicle instrument panel image frame marked with the early warning information, wherein the acquisition time point of the vehicle instrument panel image frame marked with the early warning information is contained in the time interval;
and selecting the image frames of the road in front of the vehicle with the acquisition time points positioned in the time interval as the key image frames.
3. The method of claim 1, further comprising:
judging whether original frame numbers of the road image frames in front of the vehicle are consecutive number labels, and if so, determining the original frame numbers as the frame numbers of the road image frames in front of the vehicle; otherwise, determining the acquisition order of the road image frames in front of the vehicle according to the original frame numbers, numbering the road image frames in front of the vehicle with consecutive numbers based on the acquisition order, and determining the resulting numbers as the frame numbers of the road image frames in front of the vehicle;
judging whether original frame numbers of the vehicle instrument panel image frames are consecutive number labels, and if so, determining the original frame numbers as the frame numbers of the vehicle instrument panel image frames; otherwise, determining the acquisition order of the vehicle instrument panel image frames according to the original frame numbers, numbering the vehicle instrument panel image frames with consecutive numbers based on the acquisition order, and determining the resulting numbers as the frame numbers of the vehicle instrument panel image frames.
4. The method of claim 1, wherein determining lane departure parameters from the keyframe frames comprises:
processing the key image frame by using a preset semantic segmentation algorithm to obtain a semantic segmentation image;
converting the semantic segmentation image into a bird's-eye view by adopting a preset inverse perspective transformation matrix;
and determining the lane departure parameters according to the aerial view and a preset reference image.
5. The method of claim 4, wherein determining lane departure parameters from the bird's eye view and a preset reference image comprises:
determining a reference position of a lane line in the reference image, and determining a lane line position in the aerial view;
and determining the deviation position of the vehicle according to the coordinates of the reference position and the coordinates of the lane line position.
6. The method of claim 5, further comprising:
determining a vehicle offset curve according to the vehicle offset position;
selecting a deviation time point;
and taking a derivative of the vehicle offset curve at the vehicle deviation position corresponding to the deviation time point to obtain the deviation speed corresponding to the deviation time point.
7. The method of claim 4, further comprising:
taking an image of a road ahead of the vehicle as a reference image when the vehicle is parked between two lane lines, wherein a center line of the vehicle is parallel to the lane lines when the vehicle is parked;
respectively selecting two lane line points on two lane lines of the reference image, wherein the four lane line points form a rectangle;
and performing inverse perspective transformation matrix operation based on the four lane line points to obtain the inverse perspective transformation matrix.
8. The method of claim 7, further comprising:
judging whether the reference image is an image when the vehicle is parked at the middle of the two lane lines;
if not, correcting the positions of the four lane line points, and performing inverse perspective transformation matrix operation based on the corrected four lane line points.
9. The method according to any one of claims 1-8, wherein testing the lane departure warning system based on the lane departure parameter comprises:
judging whether the lane departure parameters meet preset lane departure early warning conditions or not;
if so, judging that the lane departure early warning system has normal early warning;
otherwise, judging that the early warning of the lane departure early warning system is abnormal, and sending an early warning abnormal report.
10. A testing device for a lane departure warning system, the device comprising:
the system comprises an image acquisition unit, a data acquisition unit and a data processing unit, wherein the image acquisition unit is used for acquiring a plurality of road image frames in front of a vehicle and a plurality of vehicle instrument panel image frames in the running process of the vehicle, each road image frame in front of the vehicle and each vehicle instrument panel image frame have respective corresponding acquisition time points, early warning information is marked in at least one vehicle instrument panel image frame in the plurality of vehicle instrument panel image frames, and the early warning information is marked in a vehicle instrument panel when a lane departure early warning system identifies that the vehicle has lane departure;
the selecting unit is used for selecting key image frames from the vehicle front road image frames according to the acquisition time points of the vehicle instrument panel image frames marked with the early warning information and the acquisition time points of the vehicle front road image frames, and comprises the following steps: determining a frame number difference d between vehicle instrument panel image frames and vehicle front road image frames acquired at the same acquisition time point, wherein each vehicle front road image frame and each vehicle instrument panel image frame have respective corresponding frame numbers, and the frame numbers represent the acquisition sequence of the image frames; determining a frame number n of an image frame of a vehicle instrument panel marked with early warning information; selecting a vehicle front road image frame with a frame number in an interval [ n + d-t, n + d + t ] as the key image frame, wherein t represents a preset frame number;
the testing unit is used for determining lane departure parameters according to the key image frames and testing the lane departure early warning system based on the lane departure parameters;
the image acquisition unit is specifically used for acquiring road image frames in front of the vehicle and vehicle instrument panel image frames in the running process of the vehicle respectively at the same frequency by adopting two image acquisition devices with the same frame rate.
11. The apparatus of claim 10, wherein the selecting unit comprises:
a first determining module, configured to determine a time interval according to the acquisition time point of the vehicle instrument panel image frame marked with the early warning information, the time interval containing that acquisition time point;
and a first selecting module, configured to select, as the key image frames, the vehicle front road image frames whose acquisition time points fall within the time interval.
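The time-interval variant of claim 11 can be sketched in the same spirit. Again an illustrative sketch under our own assumptions: the stream is a list of `(timestamp, image)` pairs, and the interval is modeled as a symmetric window of half-width `half_window` around the warning frame's acquisition time:

```python
def select_key_frames_by_time(road_frames, warning_time, half_window):
    """Select front-road frames acquired close in time to the warning.

    road_frames: list of (acquisition_time, image) pairs
    warning_time: acquisition time of the dashboard frame with the warning
    half_window: half-width of the time interval around warning_time
    """
    lo, hi = warning_time - half_window, warning_time + half_window
    # Keep every frame whose acquisition time lies within [lo, hi].
    return [img for ts, img in road_frames if lo <= ts <= hi]
```

Using timestamps rather than frame numbers removes the need to know the inter-stream offset d, at the cost of requiring the two acquisition devices to share a synchronized clock.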
12. A storage medium comprising a stored program, wherein the program, when executed, controls a device in which the storage medium is located to execute the test method of the lane departure early warning system according to any one of claims 1 to 9.
13. A human-computer interaction device, characterized in that the device comprises a storage medium and one or more processors, the storage medium being coupled with the processors and the processors being configured to execute program instructions stored in the storage medium, wherein the program instructions, when executed, perform the test method of the lane departure early warning system according to any one of claims 1 to 9.
CN201911113612.0A 2019-11-14 2019-11-14 Test method and device of lane departure early warning system Active CN112798293B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911113612.0A CN112798293B (en) 2019-11-14 2019-11-14 Test method and device of lane departure early warning system

Publications (2)

Publication Number Publication Date
CN112798293A CN112798293A (en) 2021-05-14
CN112798293B true CN112798293B (en) 2023-03-17

Family

ID=75803824

Country Status (1)

Country Link
CN (1) CN112798293B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070031559A (en) * 2005-09-15 2007-03-20 현대모비스 주식회사 A hadware in the loop simulation apparatus and a test method thereof for lane departure warning system
CN101915672A (en) * 2010-08-24 2010-12-15 清华大学 Testing device and testing method of lane departure warning system
CN201812368U (en) * 2010-08-24 2011-04-27 清华大学 Testing device for lane departure warning system
CN104590122A (en) * 2014-12-11 2015-05-06 重庆长安汽车股份有限公司 Testing device for driveway deviation alarm system and method
CN206797383U (en) * 2017-05-25 2017-12-26 安徽江淮汽车集团股份有限公司 Track line skew warning test system
CN107817018A * 2016-09-12 2018-03-20 沃尔沃汽车公司 Test system and test method for a lane line departure warning system
CN108263387A * 2016-12-30 2018-07-10 意法半导体股份有限公司 Method for generating a lane departure warning in a vehicle, and related system
CN110203210A * 2019-06-19 2019-09-06 厦门金龙联合汽车工业有限公司 Lane departure warning method, terminal device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8311283B2 * 2008-07-06 2012-11-13 Automotive Research & Testing Center Method for detecting lane departure and apparatus thereof
TWI338643B (en) * 2009-01-19 2011-03-11 Univ Nat Taiwan Science Tech Lane departure warning method and system thereof

Similar Documents

Publication Publication Date Title
CN112069643B (en) Automatic driving simulation scene generation method and device
CN110796007B (en) Scene recognition method and computing device
CN109871745A (en) Identify method, system and the vehicle of empty parking space
CN107886104A (en) A kind of mask method of image
KR20160112580A (en) Apparatus and method for reconstructing scene of traffic accident using OBD, GPS and image information of vehicle blackbox
CN112683284B (en) Method and device for updating high-precision map
CN110956214B (en) Training method and device for automatic driving vision positioning model
CN114463984B (en) Vehicle track display method and related equipment
CN111483464A (en) Dynamic automatic driving lane changing method, equipment and storage medium based on road side unit
CN114782924A (en) Traffic light detection method and device for automatic driving and electronic equipment
CN112798293B (en) Test method and device of lane departure early warning system
CN113049264B (en) Test system and method for advanced driving assistance system of vehicle
CN117079238A (en) Road edge detection method, device, equipment and storage medium
CN116543271A (en) Method, device, electronic equipment and medium for determining target detection evaluation index
CN112284402B (en) Vehicle positioning method and device
CN112556703B (en) Method, device and system for updating high-precision map
CN112598314B (en) Method, device, equipment and medium for determining perception confidence of intelligent driving automobile
CN115762153A (en) Method and device for detecting backing up
Bubeníková et al. The ways of streamlining digital image processing algorithms used for detection of lines in transport scenes video recording
CN114943940A (en) Method, equipment and storage medium for visually monitoring vehicles in tunnel
CN112380313B (en) Method and device for updating confidence coefficient of high-precision map
CN115235484A (en) Method and device for generating high-precision map stop line
CN113085861A (en) Control method and device for automatic driving vehicle and automatic driving vehicle
CN115457138A (en) Position calibration method, system and storage medium based on radar and camera
CN112132115B (en) Image screening method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant