CN111811828A - Unmanned vehicle driving test method, device, system and storage medium


Info

Publication number
CN111811828A
Authority
CN
China
Prior art keywords
frame
driving
image data
route
test
Prior art date
Legal status
Granted
Application number
CN202010537890.5A
Other languages
Chinese (zh)
Other versions
CN111811828B (en)
Inventor
齐云
周卓
方晓波
张辉
Current Assignee
Newpoint Intelligent Technology Group Co Ltd
Original Assignee
Newpoint Intelligent Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Newpoint Intelligent Technology Group Co Ltd
Publication of CN111811828A
Application granted
Publication of CN111811828B
Legal status: Active


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01M - TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 - Testing of vehicles
    • G01M17/007 - Wheeled or endless-tracked vehicles
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 - Testing or monitoring of control systems or parts thereof
    • G05B23/02 - Electric testing or monitoring

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention provides a method, device, system and storage medium for unmanned vehicle driving testing, applied in the technical field of vehicle testing. The method comprises the following steps: acquiring first driving data of manual driving on a test route; acquiring second driving data of the unmanned vehicle driving on a virtual route corresponding to the test route; and scoring the second driving data according to the first driving data. By this method, the unmanned vehicle can be given the effect of real-vehicle travel on the test route, its real reactions when driving the test route can be tested, and the accuracy and safety of unmanned vehicle testing are improved. Moreover, because the first driving data is convenient to collect, rich and comprehensive test routes can be provided for the unmanned vehicle's driving tests, making testing more efficient and less costly.

Description

Unmanned vehicle driving test method, device, system and storage medium
Technical Field
The invention relates to the technical field of vehicle testing, and in particular to an unmanned vehicle driving test method, device, system and storage medium.
Background
An unmanned vehicle is a type of intelligent vehicle, also called a wheeled mobile robot, which achieves driverless operation mainly by means of an in-vehicle intelligent driver centered on a computer system.
To ensure driving safety, an unmanned vehicle must undergo safe-driving tests before it is formally put into use. However, existing unmanned vehicle driving test methods fall short in test accuracy, safety and comprehensiveness.
Disclosure of Invention
In view of the above problems, embodiments of the present invention provide an unmanned vehicle driving test method, apparatus, system and storage medium that overcome, or at least partially solve, these problems.
According to one aspect of the invention, the embodiment of the invention discloses an unmanned vehicle driving test method, which comprises the following steps:
acquiring first driving data of manual driving on a test route;
acquiring second driving data of the unmanned vehicle driving on the virtual route corresponding to the test route;
and grading the second driving data according to the first driving data.
In an optional embodiment of the present invention, the first driving data includes multiple frames of road condition image data within a preset time period, and the method further includes:
calculating the frame integrity rate of the multi-frame road condition image data; the frame integrity rate is the ratio of the total frame number of the multi-frame road condition image data to the theoretical total frame number to be obtained;
judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to a preset frame integrity rate threshold value or not;
and when the frame integrity rate of the multi-frame road condition image data is greater than or equal to the preset frame integrity rate threshold value, saving the multi-frame road condition image data as the first driving data.
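By way of illustration only (not part of the original disclosure), the frame-integrity check above can be sketched as follows; the function names and the 0.95 threshold are illustrative assumptions:

```python
def frame_integrity_rate(total_frames: int, theoretical_total: int) -> float:
    # Ratio of the frames actually obtained to the theoretical total.
    return total_frames / theoretical_total

def accept_as_first_driving_data(total_frames: int, theoretical_total: int,
                                 threshold: float = 0.95) -> bool:
    # Save the road condition image data as first driving data only when
    # the frame integrity rate meets the preset threshold.
    return frame_integrity_rate(total_frames, theoretical_total) >= threshold
```

For instance, 98 usable frames out of a theoretical 100 would pass a 0.95 threshold, while 90 would not.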
In an optional embodiment of the present invention, the method further comprises:
acquiring the starting test time, the ending test time and the theoretical acquisition frequency of the multi-frame road condition image data, wherein the preset time period is a time period from the starting test time to the ending test time;
determining the theoretical total frame number to be obtained according to the time stamp of the starting test time, the time stamp of the ending test time and the theoretical acquisition frequency;
counting the actual frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the time stamp of the ending test time, the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image in the multi-frame road condition image data, and determining the frame loss number which cannot be interpolated in the actual frame loss number;
and determining the difference value between the theoretical total frame number to be obtained and the lost frame number which can not be interpolated as the total frame number of the multi-frame road condition image data.
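The two determinations above can be expressed in code as a sketch, under the assumption (illustrative, not stated in the patent) that one frame is captured at the start time and one every 1/frequency seconds thereafter; timestamps are in seconds and the frequency in Hz:

```python
def theoretical_total_frames(start_ts: float, end_ts: float, freq_hz: float) -> int:
    # One frame at start_ts, then one every 1/freq_hz seconds up to end_ts.
    return int(round((end_ts - start_ts) * freq_hz)) + 1

def total_frames(theoretical: int, uninterpolatable_losses: int) -> int:
    # Lost frames that CAN be interpolated still count toward the total;
    # only the uninterpolatable losses are subtracted.
    return theoretical - uninterpolatable_losses
```

For example, a 10-second test at 10 Hz yields a theoretical total of 101 frames; with 5 uninterpolatable losses, the total frame number is 96.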
In an optional embodiment of the present invention, the actual frame loss number includes at least one of a middle frame loss number, a front frame loss number, and a rear frame loss number; counting the actual frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the time stamp of the ending test time, the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image in the multi-frame road condition image data, and the method comprises the following steps:
determining the front frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the acquisition time stamp of the first frame image in the multi-frame road condition image data and the theoretical acquisition frequency;
determining the number of rear lost frames of the multi-frame road condition image data according to the time stamp of the test ending time, the acquisition time stamp of the last frame of image in the multi-frame road condition image data and the theoretical acquisition frequency;
determining the middle frame loss number of the multi-frame road condition image data according to the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image;
and counting the actual frame loss number of the multi-frame road condition image data based on the front frame loss number, the rear frame loss number and the middle frame loss number of the multi-frame road condition image data.
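The three counts can be derived from the timestamps alone. A minimal sketch (illustrative function name; timestamps in seconds, frequency in Hz):

```python
def count_frame_losses(start_ts, end_ts, freq_hz, frame_timestamps):
    # Split the actual frame losses into front, middle and rear parts.
    first_ts, last_ts = frame_timestamps[0], frame_timestamps[-1]
    # Frames that should exist between the start time and the first capture.
    front = int(round((first_ts - start_ts) * freq_hz))
    # Frames that should exist between the last capture and the end time.
    rear = int(round((end_ts - last_ts) * freq_hz))
    # Frames expected between the first and last captures, minus those obtained.
    expected_between = int(round((last_ts - first_ts) * freq_hz)) + 1
    middle = expected_between - len(frame_timestamps)
    return front, middle, rear
```

For example, at 10 Hz over a test window of 0 s to 1 s, captures at 0.2 to 0.8 s with the 0.4 s frame missing give 2 front, 1 middle and 2 rear lost frames.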
In an optional embodiment of the present invention, determining the number of lost frames that cannot be interpolated in the actual number of lost frames includes:
when the actual frame loss number comprises the previous frame loss number, the frame loss number which cannot be interpolated in the actual frame loss number comprises the previous frame loss number;
when the actual frame loss number comprises the post frame loss number, the frame loss number which cannot be interpolated in the actual frame loss number comprises the post frame loss number;
and when the actual frame loss number comprises the intermediate frame loss number, comparing the intermediate frame loss number with a preset pluggable frame number, and when the intermediate frame loss number is greater than the preset pluggable frame number, determining the difference value between the intermediate frame loss number and the preset pluggable frame number as the frame loss number which cannot be interpolated in the actual frame loss number.
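The three rules above combine as in the following sketch (illustrative; `max_pluggable` stands for the preset pluggable frame number):

```python
def uninterpolatable_loss_count(front: int, middle: int, rear: int,
                                max_pluggable: int) -> int:
    # Front and rear losses can never be interpolated, since there is no
    # captured frame on one side to interpolate from; middle losses beyond
    # the preset pluggable budget cannot be interpolated either.
    uninterp = front + rear
    if middle > max_pluggable:
        uninterp += middle - max_pluggable
    return uninterp
```

With 2 front, 5 middle and 1 rear lost frames and a pluggable budget of 3, the uninterpolatable count is 2 + 1 + (5 - 3) = 5.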
In an optional embodiment of the present invention, the method further comprises:
determining the continuous frame loss number in the middle frame loss number of the multi-frame road condition image data;
and when the continuous frame loss number is more than or equal to a preset middle continuous maximum frame loss number, removing the multi-frame road condition image data from the first driving data.
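A sketch of this consecutive-loss check (illustrative; the default maximum run of 5 is an assumed value, not from the patent):

```python
def max_consecutive_middle_losses(frame_timestamps, freq_hz):
    # Largest number of consecutive missing frames between two captures.
    worst = 0
    for prev, cur in zip(frame_timestamps, frame_timestamps[1:]):
        missing = int(round((cur - prev) * freq_hz)) - 1
        worst = max(worst, missing)
    return worst

def reject_for_consecutive_loss(frame_timestamps, freq_hz, max_run=5):
    # Remove the data from the first driving data when the longest run of
    # consecutive middle losses reaches the preset maximum.
    return max_consecutive_middle_losses(frame_timestamps, freq_hz) >= max_run
```

At 10 Hz, captures at 0.0, 0.1, 0.5 and 0.6 s contain a run of 3 consecutive missing frames, so the data is rejected when the preset maximum is 3 but kept when it is 4.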
In an optional embodiment of the present invention, the multiple frames of road condition image data include multiple frames of optical image data and multiple frames of radar image data, and the preset frame integrity threshold includes an optical frame integrity threshold and a radar frame integrity threshold;
judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to a preset frame integrity rate threshold value or not, including:
judging whether the frame integrity rate of the multi-frame optical image data is greater than or equal to the optical frame integrity rate threshold value or not, and judging whether the frame integrity rate of the multi-frame radar image data is greater than or equal to the radar frame integrity rate threshold value or not;
when the frame integrity rate of the multiple frames of road condition image data is greater than or equal to the preset frame integrity rate threshold, saving the multiple frames of road condition image data as the first driving data, including:
and when the frame integrity rate of the multiple frames of optical image data is greater than or equal to the optical frame integrity rate threshold value and the frame integrity rate of the multiple frames of radar image data is greater than or equal to the radar frame integrity rate threshold value, storing the multiple frames of optical image data and the multiple frames of radar image data as the first driving data.
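The dual-modality check reduces to requiring both thresholds independently; a sketch with assumed threshold values (the patent does not fix concrete numbers):

```python
def save_dual_modality_data(optical_rate: float, radar_rate: float,
                            optical_threshold: float = 0.95,
                            radar_threshold: float = 0.90) -> bool:
    # Both the optical and the radar frame integrity rates must meet their
    # own thresholds before the data is saved as first driving data.
    return optical_rate >= optical_threshold and radar_rate >= radar_threshold
```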
In an optional embodiment of the invention, the first driving data further comprises a first driving control parameter, and the second driving data comprises a second driving control parameter;
when the test route is a variable route, the variable route includes a plurality of events, each event including at least 2 options; scoring the second driving data according to the first driving data, including:
taking the option of the event corresponding to the first driving control parameter as a standard option of the variable route, and making a score of the option of the same event of the variable route;
and determining the score of the corresponding option of the second driving control parameter in the event according to the scores of the options of the same event of the variable route.
In an optional embodiment of the invention, the first driving data further comprises a first driving control parameter, and the second driving data comprises a second driving control parameter;
when the test route is an invariable route, scoring the second driving data according to the first driving data, including:
determining the first driving control parameter as a unique execution criterion for the invariable route;
scoring the second driving control parameter in accordance with the unique execution criterion.
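The patent leaves the concrete scoring rule open. One plausible sketch deducts marks in proportion to each parameter's relative deviation from the manual-driving reference; this proportional-deduction rule, and the parameter names, are illustrative assumptions, not part of the disclosure:

```python
def score_invariable_route(first_params: dict, second_params: dict,
                           full_marks: float = 100.0) -> float:
    # The first driving control parameters are the unique execution
    # criterion; each unmanned-vehicle parameter (speed, braking, ...)
    # loses marks in proportion to its relative deviation from it.
    per_param = full_marks / len(first_params)
    score = 0.0
    for key, reference in first_params.items():
        deviation = abs(second_params[key] - reference) / abs(reference)
        score += per_param * max(0.0, 1.0 - deviation)
    return score
```

An unmanned vehicle that exactly reproduces the manual parameters scores full marks; one that drives at half the reference speed loses half of that parameter's share.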
According to another aspect of the present invention, an embodiment of the present invention further discloses an unmanned vehicle driving test system, which includes:
a driving route simulation platform, an unmanned vehicle, and a collection device installed on a manually driven vehicle, wherein:
the acquisition equipment is used for acquiring first driving data of manual driving on the test route;
the driving route simulation platform is used for simulating a virtual route corresponding to the test route for the unmanned vehicle running on the driving route simulation platform according to the first driving data, collecting second driving data of the unmanned vehicle driving on the virtual route, and grading the second driving data;
and the unmanned vehicle is used for driving control on the virtual route.
In an optional embodiment of the present invention, the first driving data includes a plurality of frames of optical image data, a plurality of frames of radar image data, and a first driving control parameter, and the collecting device includes an optical image collecting device, a radar image collecting device, and a driving control parameter collecting device, wherein:
the optical image collecting device is used for collecting the multiple frames of optical image data during manual driving on a test route;
the radar image collecting device is used for collecting the multiple frames of radar image data during manual driving on a test route;
the driving control parameter collecting device is used for collecting the first driving control parameters during manual driving on the test route.
According to another aspect of the invention, the embodiment of the invention also discloses an unmanned vehicle driving testing device, which is characterized by comprising:
the first driving data acquisition module is used for acquiring first driving data of manual driving on the test route;
the second driving data acquisition module is used for acquiring second driving data of the unmanned vehicle driven on the virtual route corresponding to the test route;
and the second driving data scoring module is used for scoring the second driving data according to the first driving data.
According to another aspect of the present invention, an embodiment of the present invention further discloses an unmanned vehicle driving testing apparatus, including:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the unmanned vehicle driving testing method as described above.
According to another aspect of the present invention, the embodiment of the present invention also discloses a computer readable storage medium storing a computer program for causing a processor to execute the unmanned vehicle driving test method as described above.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, a test route is determined and first driving data of manual driving on the test route is obtained; the test route is then simulated to obtain a virtual route, which is provided to the unmanned vehicle, so that when the unmanned vehicle drives on the virtual route it experiences the effect of real-vehicle travel on the test route. Second driving data of the unmanned vehicle driving on the virtual route corresponding to the test route is then collected and scored according to the first driving data, so that the real reactions of the unmanned vehicle driving on the test route can be tested, improving the accuracy and safety of unmanned vehicle testing. Moreover, because the first driving data is convenient to collect, rich and comprehensive test routes can be provided for the unmanned vehicle's driving tests, making testing more efficient and less costly.
Drawings
FIG. 1 is a schematic structural diagram of an unmanned vehicle driving test system according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating steps of an unmanned vehicle driving test method according to an embodiment of the present invention;
FIG. 3 is a flow chart of steps of a first driving data processing method in accordance with an embodiment of the present invention;
FIG. 4 is a schematic diagram of a checking process of multi-frame optical image data and multi-frame radar image data according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a complete test evaluation flow according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an unmanned vehicle driving test device according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
To solve the technical problem of the embodiment of the present invention, referring to fig. 1, a schematic structural diagram of an unmanned vehicle driving test system according to the embodiment of the present invention is shown, where the system specifically includes:
a driving route simulation platform 101, an unmanned vehicle 102, and a collection device 103 installed on a manually driven vehicle, wherein:
the acquisition equipment 103 is used for acquiring first driving data of manual driving on the test route;
the driving route simulation platform 101 is configured to simulate, according to the first driving data, a virtual route corresponding to the test route for the unmanned vehicle running on the driving route simulation platform, collect second driving data of the unmanned vehicle driving on the virtual route, and score the second driving data;
the unmanned vehicle 102 is configured to perform driving control on the virtual route.
In the embodiment of the invention, the manually driven vehicle may be any vehicle that is manually controlled and can travel freely on real roads; manually driven vehicles already deployed at large scale, such as gasoline and electric cars, are preferred.
The collection device can collect first driving data of manual driving on each test route, where the first driving data includes multiple frames of optical image data, multiple frames of radar image data, and first driving control parameters. In an optional mode of the present application, the collection device includes an optical image collection device, a radar image collection device and a driving control parameter collection device, wherein:
the optical image collection device is used for collecting the multiple frames of optical image data during manual driving on a test route;
the radar image collection device is used for collecting the multiple frames of radar image data during manual driving on a test route;
the driving control parameter collection device is used for collecting the first driving control parameters during manual driving on the test route.
The optical image collection device can be any of various cameras, such as an industrial camera, a high-speed camera or a wide-angle camera, and can capture the road environment, weather conditions, pedestrians and other moving vehicles on a test route, forming the multiple frames of optical image data.
The radar image collection device detects targets using electromagnetic waves, obtaining information such as the distance from the target to the emission point, the rate of change of that distance (radial speed), azimuth and altitude; it may specifically be a rotating lidar, a single-line lidar, a millimeter-wave radar or the like. For example, by scanning the surrounding environment it can obtain the distance between the vehicle and the ground, the distances to surrounding vehicles, pedestrians and obstacles, the height of a tunnel portal, and so on, from which the multiple frames of radar image data are generated.
The driving control parameter collection device can be various sensors, positioning chips, processors and the like, and can collect the first driving control parameters of manual driving on the test route, such as speed, acceleration and deceleration, and braking.
The driving route simulation platform, which includes a computer measurement and control system, is a multi-functional integrated intelligent vehicle test bench with multiple interfaces. It can establish a communication connection with the unmanned vehicle through the relevant interface and, according to the first driving data obtained from manual driving on the test route, simulate for the unmanned vehicle running on the platform the virtual route corresponding to the test route. For example, it simulates the slope, roll angle, road adhesion coefficient, road wetness and slipperiness, translational inertia during travel, road running resistance and the like of a test route, plans a real-time path in combination with a high-precision driving map, and controls the vehicle's actuators through an electronically controlled hydraulic combination, so that the unmanned vehicle running on the platform experiences the effect of real-vehicle travel on the test route. The platform can also detect the unmanned vehicle's component functions and decision-making performance, performing speed tests, safety tests, automatic-driving assistance tests, intelligent lighting tests and the like; it can collect second driving data of the unmanned vehicle driving on the virtual route, score the second driving data according to the first driving data, judge the unmanned vehicle's real reactions on a real road, and determine the gap between it and manual driving.
In practice, the first driving data may be sent to the driving route simulation platform by the collection device installed on the manually driven vehicle, or entered into the platform manually or by other means.
The unmanned vehicle, also called a driverless car, is equipped with corresponding sensing and intelligent control systems. On the basis of high-precision driving-map positioning, it can sense the road conditions and surrounding environment of the virtual route through sensor-processing technologies such as monocular vision, three-dimensional environment reconstruction, laser point cloud data and millimeter-wave radar, and can perform driving control operations on the virtual route such as lateral and longitudinal control, acceleration and deceleration, and braking.
The embodiment of the invention uses ordinary, widely deployed manually driven vehicles, which can drive on many test routes at the same time. By fitting them with the corresponding collection devices, the first driving data of manual driving on various test routes can be collected conveniently, providing rich and comprehensive test-route material for unmanned vehicle driving tests; the collection of this material is efficient, convenient and low-cost. Secondly, with the driving route simulation platform and the unmanned vehicle, the road conditions of a real test route can be reproduced indoors from the first driving data, so that the unmanned vehicle running on the platform experiences the effect of real-vehicle travel on the test route. The real reactions of the unmanned vehicle driving on a real road can thus be tested, improving the accuracy of the test; and compared with putting the unmanned vehicle onto real roads for real-vehicle road tests, the safety of unmanned vehicle testing is greatly improved.
Based on the above unmanned vehicle driving test system, a detailed description will be given below of an unmanned vehicle driving test method of the present invention.
Referring to fig. 2, a flowchart illustrating steps of an unmanned vehicle driving test method according to an embodiment of the present invention is shown, where the method may specifically include the following steps:
step S201: acquiring first driving data of manual driving on the test route;
the test route refers to a route formed by real roads in the embodiment of the invention, and has a starting point and an end point, and the starting point and the end point can be planned and established by a tester. The embodiment of the invention can determine the test route based on the starting point and the end point planned by the tester. For example, the starting point is a capital east station, and the ending point is a Chongqing West station, wherein three roads from the capital east station to the Chongqing West station can be selected, namely road 1, road 2 and road 3, so that the determined test route also has 3 roads, which are respectively: (1) starting from the east station of Chengdu, and arriving at the west station of Chongqing through a road 1; (2) starting from the east station of Chengdu, and arriving at the west station of Chongqing through a road 2; (3) from the east station of the Chengdu, the station arrives at the west station of Chongqing through a road 3. The test route is a variable route, and since it is unknown which test route the unmanned vehicle will select from the starting point to the destination point, all the test routes between the starting point and the destination point need to be manually driven once before actual test, and first driving data of each test route is obtained respectively. Of course, the test route may also be an unchangeable route, i.e. there is only one road between the starting point and the end point, and only one form of mode is allowed on the road, such as straight going, not turning or turning around.
In practice, the road conditions of the test routes are different due to different roads, and some roads are expressways, some roads are mud pit roads, some roads are bridges, and some roads are tunnels; some roads have large traffic flow, some have small traffic flow, some have accidents, some have rainwater, wind or rain, some are uphill or downhill, and some have many curves. Accordingly, the first driving data collected by the collecting device installed in the manually driven vehicle may be different.
Step S202: acquiring second driving data of the unmanned vehicle driving on the virtual route corresponding to the test route;
in the embodiment of the invention, the first driving data is input into the driving route simulation platform, which simulates the test route indoors through virtual-reality generation technology according to the first driving data: it reproduces the road-surface state of the test route for the unmanned vehicle running on the platform and provides the unmanned vehicle with a virtual route corresponding to the test route. The real test route is thereby moved indoors, achieving the effect of testing the unmanned vehicle's real-vehicle driving on the test route indoors.
Specifically, the first driving data includes multiple frames of road condition image data, and in an optional embodiment of the present invention, the step S202 may include:
generating a virtual route corresponding to the test route according to the multi-frame road condition image data;
and inputting the virtual route into the unmanned vehicle, and collecting second driving data of the unmanned vehicle driving on the virtual route.
Because test routes differ, the multiple frames of road condition image data collected on different routes may also differ. Corresponding to the description of road conditions in step S201, some road condition image data may include images of a tunnel and some images of a bridge; further, some may show bridge A and others bridge B.
For a given test route, the driving route simulation platform simulates the route from the multiple frames of road condition image data to obtain a virtual route, and then inputs the virtual route into the unmanned vehicle. Because each virtual route is simulated from its test route, the virtual routes the unmanned vehicle drives differ correspondingly: some pass through tunnels, some cross bridges, and so on.
In an optional embodiment of the present invention, the multiple frames of road condition image data include multiple frames of optical image data and multiple frames of radar image data, and in a specific implementation, a virtual route corresponding to the test route may be generated according to the multiple frames of optical image data and the multiple frames of radar image data. In the embodiment of the present invention, how to simulate the test route by the driving route simulation platform can be realized by using related prior art, such as a virtual reality technology, various electric hydraulic control technologies, a high-precision driving map, and the like, which are not the key points of the embodiment of the present invention and are not described herein.
When driving on the virtual route, the unmanned vehicle performs the corresponding driving control according to the road conditions and environment it senses there. The driving route simulation platform is connected to the unmanned vehicle through the relevant detection devices and collects the second driving data of the unmanned vehicle's driving control on the virtual route.
Step S203: and grading the second driving data according to the first driving data.
Since test routes are divided into variable routes and invariable routes, in the embodiment of the invention the scoring of the unmanned vehicle's second driving data according to the manually driven first driving data is carried out separately for the two kinds of route.
The first driving data further comprises a first driving control parameter, and the second driving data comprises a second driving control parameter; when the test route is a variable route, the variable route includes a plurality of events, each event including at least 2 options; in an alternative embodiment of the present invention, step S203 may include the following sub-steps:
substep S203-1: taking the option of the event corresponding to the first driving control parameter as a standard option of the variable route, and making a score of the option of the same event of the variable route;
substep S203-2: and determining the score of the corresponding option of the second driving control parameter in the event according to the scores of the options of the same event of the variable route.
Step S201 explained the variable route: a plurality of roads are selectable between the same start point and end point, and selecting one road among them is an event, with the selectable roads being the options of that event. A variable route may also contain multiple other events. If an event is overtaking, the options are overtaking on the left and overtaking on the right; if an event is highway driving, the options are driving below 80 km/h, between 80 km/h and 120 km/h, and above 120 km/h.
In the embodiment of the present invention, assuming that the manually driven car's selections for the different events on the variable route are very safe and standard, an alternative example of sub-step S203-1 may be: the event is selecting one road from among road A, road B, and road C, and road A is selected in the first driving control parameter; then the score of road A is set to 100, and the scores of road B and road C are set to 80 and 70 respectively. Another alternative example of sub-step S203-1 may be: the event is overtaking, and overtaking on the left is selected in the first driving control parameter; then the score of overtaking on the left is set to 100 points and the score of overtaking on the right to 60 points.
According to the established scores of the options of the same event, the score of the selection made by the second driving control parameter when facing that event can be determined, so that the difference between the selection made by the unmanned vehicle and the selection made by the human driver for the same event can be judged.
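The option-scoring scheme of sub-steps S203-1 and S203-2 can be sketched as follows. This is a minimal illustration: the data layout, function names, and the non-standard option scores are assumptions, since the description does not prescribe a concrete implementation.

```python
# Sketch of the variable-route scoring in sub-steps S203-1 and S203-2.
# The event names, option scores, and function names are illustrative
# assumptions; the description does not fix a concrete data layout.

def build_score_table(events, human_choices):
    """Sub-step S203-1: the option picked by the human driver (the first
    driving control parameter) becomes the standard option and receives
    the top score; the remaining options keep lower, pre-agreed scores."""
    table = {}
    for event, options in events.items():
        standard = human_choices[event]
        table[event] = dict(options)
        table[event][standard] = 100  # the standard option always scores 100
    return table

def score_unmanned_choice(score_table, event, unmanned_choice):
    """Sub-step S203-2: look up the score of the option the unmanned
    vehicle (second driving control parameter) actually selected."""
    return score_table[event][unmanned_choice]

# Example values taken from the description: overtaking on the left is the
# human driver's choice (scores 100); overtaking on the right scores 60.
events = {"overtake": {"left": 100, "right": 60}}
human = {"overtake": "left"}
table = build_score_table(events, human)
print(score_unmanned_choice(table, "overtake", "right"))  # 60
```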
When the test route is an unchangeable route, in an alternative embodiment of the present invention, step S203 may include the following sub-steps:
substep S203-3: determining the first driving control parameter as a unique execution criterion for the invariable route;
substep S203-4: scoring the second driving control parameter in accordance with the unique performance criteria.
On an invariable route there is only one execution standard. If the driving operations of the manually driven automobile on the invariable route are all very standard and safe, then the second driving control parameter obtained from the unmanned vehicle driving on the invariable route is unqualified whenever it is inconsistent with the first driving control parameter. Thus, scoring the second driving control parameter according to the unique execution standard may be further interpreted as: judging whether the second driving control parameter is qualified according to the unique execution standard. For example, if the vehicle can only go straight on the invariable route and the unmanned vehicle turns or makes a U-turn, it is unqualified.
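Sub-steps S203-3 and S203-4 reduce to a consistency check, which can be sketched as below; the parameter structure (a dictionary of manoeuvres) and the function name are assumptions.

```python
# Sketch of sub-steps S203-3 and S203-4: on an invariable route the human
# driver's control parameters are the sole execution standard, and the
# unmanned vehicle's parameters are simply checked for agreement.
# The parameter structure (a dict of manoeuvres) is an assumption.

def score_invariable_route(first_params, second_params):
    """Return 'qualified' only if every second (unmanned) control
    parameter matches the unique standard set by the first (human) one."""
    for key, standard_value in first_params.items():
        if second_params.get(key) != standard_value:
            return "unqualified"
    return "qualified"

# Example from the description: the route only allows going straight;
# a turn or U-turn by the unmanned vehicle is unqualified.
human = {"manoeuvre": "straight"}
print(score_invariable_route(human, {"manoeuvre": "straight"}))  # qualified
print(score_invariable_route(human, {"manoeuvre": "u-turn"}))    # unqualified
```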
It should be noted that the test route of the present invention may include both variable routes and invariable routes, or only one of the two.
In summary, according to the first driving data, the embodiment of the invention can simulate the road conditions of the real test route indoors, so that an unmanned vehicle running on the driving route simulation platform achieves the effect of really driving on the test route; and by scoring the second driving data obtained from unmanned driving against the first driving data obtained from manual driving on the test route, the true reaction of the unmanned vehicle when driving the test route can be tested, improving the accuracy and safety of the unmanned vehicle test.
On the basis of the unmanned vehicle driving test method, the embodiment of the invention optimizes the first driving data after the first driving data is obtained, so as to improve the reliability of the simulated virtual route. Referring to fig. 3, a flowchart illustrating steps of a first driving data processing method according to an embodiment of the present invention is shown, where the method may specifically include the following steps:
step S301: calculating the frame integrity rate of the multi-frame road condition image data; the frame integrity rate is the ratio of the total frame number of the multi-frame road condition image data to the theoretical total frame number to be obtained;
step S302: judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to a preset frame integrity rate threshold value or not;
step S303: and when the frame integrity rate of the multi-frame road condition image data is greater than or equal to the preset frame integrity rate threshold value, saving the multi-frame road condition image data as the first driving data.
The first driving data includes multi-frame road condition image data within a preset time period. Ideally, the total number of frames actually obtained should match the theoretical total number of frames to be obtained; that is, the frame integrity rate should reach 100%. In practice, however, the network or the acquisition devices installed on the manually driven vehicle may compromise the integrity of some of the acquired data, and when too much image data is lost the simulation effect of the virtual route becomes unsatisfactory. Therefore, the embodiment of the invention calculates and checks the frame integrity rate of the multi-frame road condition image data within the preset time period, and only saves data whose frame integrity rate reaches the preset frame integrity rate threshold as the first driving data, to serve as the data source for simulating the test route, thereby improving the reliability of the simulated virtual route. If the frame integrity rate does not reach the preset threshold, the multi-frame road condition image data is discarded, and manual driving is performed again later to fill in the corresponding data.
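The screening of steps S301 to S303 can be sketched as a single ratio check. The 95% threshold below is one value from the optional 95%-100% range given in the description; the function name is an assumption.

```python
# Sketch of steps S301-S303: compute the frame integrity rate and keep the
# image data as first driving data only when the rate reaches the preset
# threshold. The 95% default threshold and function name are assumptions.

def screen_image_data(total_frames, theoretical_total, threshold=0.95):
    """Frame integrity rate = actual total frames / theoretical total.
    Returns True when the data may be saved as first driving data."""
    integrity = total_frames / theoretical_total
    return integrity >= threshold

print(screen_image_data(35500, 36000))  # ~0.986 -> True: keep the data
print(screen_image_data(30000, 36000))  # ~0.833 -> False: discard, redrive
```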
In an optional embodiment of the present invention, a method for calculating a total frame number of multiple frames of road condition image data and a theoretical total frame number to be obtained is provided, including:
step S11: acquiring the starting test time, the ending test time and the theoretical acquisition frequency of the multi-frame road condition image data, wherein the preset time period is a time period from the starting test time to the ending test time;
in the embodiment of the invention, the starting test time of the multi-frame road condition image data may be the time when the manually driven automobile departs from the starting point of the test route, and the ending test time may be the time when it reaches the end point of the test route. Of course, the starting and ending test times may also be any two time nodes between departure and arrival, which reduces the number of tests and the test distance and thus the test cost. For example, if the manually driven automobile departs from the starting point at 9:00:00 am on 3/4/2019 and reaches the end point at 9:30:25 am, the starting test time may be 9:10:30 am, the ending test time may be 9:20:30 am, and the preset time period is then the period from 9:10:30 am to 9:20:30 am on 3/4/2019.
The theoretical acquisition frequency is determined by the device performance of the acquisition equipment corresponding to the multi-frame road condition image data. The multi-frame road condition image data may include multi-frame optical image data and multi-frame radar image data: the theoretical acquisition frequency of the optical image data is determined by the device sampling frequency of the optical image acquisition equipment, and that of the radar image data by the device sampling frequency of the radar image acquisition equipment. For example, if the device sampling frequency of the optical image acquisition equipment is 30 frames/second, the theoretical acquisition frequency of the multi-frame optical image data is 30 frames/second; if the device sampling frequency of the radar image acquisition equipment is 60 frames/second, the theoretical acquisition frequency of the multi-frame radar image data is 60 frames/second.
Step S12: determining the theoretical total frame number to be obtained according to the time stamp of the starting test time, the time stamp of the ending test time and the theoretical acquisition frequency;
during calculation, the theoretical total frame number which should be obtained by the multi-frame road condition image data is calculated respectively according to different multi-frame road condition image data, namely the theoretical total frame number which should be obtained by the multi-frame optical image data and the theoretical total frame number which should be obtained by the multi-frame radar image data are calculated respectively.
The duration of the preset time period is obtained from the timestamp of the starting test time and the timestamp of the ending test time, and the theoretical total frame number to be obtained is then determined by multiplying that duration by the theoretical acquisition frequency (frames = duration x frequency). For example, if the theoretical acquisition frequency of the multi-frame radar image data is 60 frames/second and the preset time period lasts 10 minutes, i.e. 600 seconds, then the theoretical total number of frames to be obtained is 36000 frames.
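A minimal worked form of step S12 follows, assuming the timestamps can be modelled with Python's datetime; reading the date "3/4/2019" as 3 April 2019 is an assumption made only for the example.

```python
# Worked form of step S12: theoretical total frames = duration of the
# preset period (difference of the two timestamps) x theoretical
# acquisition frequency.
from datetime import datetime

def theoretical_total_frames(start_ts, end_ts, freq_hz):
    duration_s = (end_ts - start_ts).total_seconds()
    return int(duration_s * freq_hz)

# Example from the description: radar data at 60 frames/second over the
# 10-minute (600 s) preset period from 9:10:30 to 9:20:30.
start = datetime(2019, 4, 3, 9, 10, 30)
end = datetime(2019, 4, 3, 9, 20, 30)
print(theoretical_total_frames(start, end, 60))  # 36000
```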
When calculating the total frame number of the multi-frame road condition image data, the embodiment does not simply take all the frames actually collected within the preset time period as the total frame number, compute the frame integrity rate from it, and discard the data entirely whenever the rate is unqualified, which would be too wasteful. Therefore, the embodiment of the present invention proposes the calculation of step S13 and step S14.
Step S13: counting the actual frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the time stamp of the ending test time, the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image in the multi-frame road condition image data, and determining the frame loss number which cannot be interpolated in the actual frame loss number;
in the embodiment of the invention, the actual frame loss number comprises at least one of a middle frame loss number, a front frame loss number and a back frame loss number; the step S13 may count the actual number of frames lost of the multiple frames of road condition image data through the following sub-steps:
substep S13-1: determining the number of frames of the multi-frame road condition image data to be lost according to the time stamp of the starting test time, the acquisition time stamp of the first frame image in the multi-frame road condition image data and the theoretical acquisition frequency;
The front frame loss number refers to the number of frames lost between the starting test time and the time when the first frame image in the multi-frame road condition image data is obtained.
For example, if the timestamp of the starting test time is 9:00:00, the acquisition timestamp of the first frame image in the multi-frame road condition image data is 9:00:01, and the theoretical acquisition frequency is 60 frames/second, then the front frame loss number of the multi-frame road condition image data is 60 frames.
Substep S13-2: determining the number of rear lost frames of the multi-frame road condition image data according to the time stamp of the test ending time, the acquisition time stamp of the last frame of image in the multi-frame road condition image data and the theoretical acquisition frequency;
and the frame loss number refers to the number of frames lost between the time of obtaining the last frame image in the multi-frame road condition image data and the test ending time.
If the time stamp of the test ending time is 9 points, 20 minutes and 30 seconds, and the acquisition time stamp of the last frame of image in the multi-frame road condition image data is 9 points, 20 minutes and 29 seconds, and the theoretical sampling frequency is 60 frames/second, the number of the last lost frames of the multi-frame road condition image data is 60 frames.
Substep S13-3: determining the middle frame loss number of the multi-frame road condition image data according to the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image;
the middle frame loss number refers to the number of frames lost between the time of obtaining the first frame image of the multi-frame road condition image data and the time of obtaining the last frame image.
There are two methods for determining the number of middle lost frames:
First, the duration of each frame is determined from the theoretical acquisition frequency; then, starting from the acquisition timestamp of the first frame image, this per-frame duration is added frame by frame to obtain the expected timestamp of every frame; finally, the expected timestamps are compared with the timestamps of the actually obtained multi-frame road condition image data to determine exactly which frames were lost.
Second, the number of frames that should be obtained between the first frame image and the last frame image can be determined as the product of the duration from the acquisition timestamp of the first frame image to that of the last frame image and the theoretical acquisition frequency; subtracting the number of frames actually obtained in that period from this expected number gives the middle frame loss number. For example, if the acquisition timestamp of the first frame image in the multi-frame road condition image data is 9:00:01, that of the last frame image is 9:05:01, and the theoretical acquisition frequency is 60 frames/second, then normally 60 x 300 = 18000 frames should be obtained between 9:00:01 and 9:05:01. If in practice only 17500 frames are obtained in that period, 500 frames have been lost in the middle, so the middle frame loss number is 500 frames.
Substep S13-4: and counting the actual frame loss number of the multi-frame road condition image data based on the front frame loss number, the rear frame loss number and the middle frame loss number of the multi-frame road condition image data.
The actual frame loss number is the sum of the front, rear, and middle frame loss numbers of the multi-frame road condition image data. For example, if the front frame loss number is 60 frames, the rear frame loss number is 60 frames, and the middle frame loss number is 500 frames, then the actual frame loss number of the multi-frame road condition image data is 620 frames.
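The counting of sub-steps S13-1 to S13-4 can be sketched with the figures from the examples above; timestamps are reduced to plain seconds for brevity, and the function names are assumptions.

```python
# Sketch of sub-steps S13-1 to S13-4: count the front, rear, and middle
# lost frames from the timestamps and the theoretical acquisition
# frequency, then sum them. Timestamps are seconds; names are assumptions.

def front_lost(start_ts, first_frame_ts, freq_hz):
    # frames missing between the starting test time and the first frame
    return int((first_frame_ts - start_ts) * freq_hz)

def rear_lost(last_frame_ts, end_ts, freq_hz):
    # frames missing between the last frame and the ending test time
    return int((end_ts - last_frame_ts) * freq_hz)

def middle_lost(first_frame_ts, last_frame_ts, freq_hz, frames_received):
    # second method from the description: expected frames in the span
    # between first and last frame, minus the frames actually received
    expected = int((last_frame_ts - first_frame_ts) * freq_hz)
    return expected - frames_received

# Example figures from the description (60 frames/second):
f = front_lost(0, 1, 60)            # first frame arrives 1 s late -> 60
r = rear_lost(1229, 1230, 60)       # last frame stops 1 s early  -> 60
m = middle_lost(1, 301, 60, 17500)  # 18000 expected, 17500 got  -> 500
print(f + r + m)                    # actual frame loss number: 620
```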
In the embodiment of the invention, in order to improve the usability of the multi-frame road condition image data and further reduce the test cost, a method for performing image interpolation on the multi-frame road condition image data is provided. However, since some images cannot be interpolated, in the embodiment of the present invention, the number of lost frames that cannot be interpolated in the actual number of lost frames is determined first, and the determining method may include:
when the actual frame loss number comprises the previous frame loss number, the frame loss number which cannot be interpolated in the actual frame loss number comprises the previous frame loss number;
when the actual frame loss number comprises the post frame loss number, the frame loss number which cannot be interpolated in the actual frame loss number comprises the post frame loss number;
and when the actual frame loss number comprises the intermediate frame loss number, comparing the intermediate frame loss number with a preset pluggable frame number, and when the intermediate frame loss number is greater than the preset pluggable frame number, determining the difference value between the intermediate frame loss number and the preset pluggable frame number as the frame loss number which cannot be interpolated in the actual frame loss number.
Interpolation reconstructs a missing portion of a motion track by calculating the coordinates of intermediate points from the known starting point, end point, curve type, and trend of the track. Therefore, when frames are lost at the front or rear of the multi-frame road condition image data, those front and rear lost frames cannot be interpolated. When frames are lost in the middle of the multi-frame road condition image data, adjacent images exist on both sides, so interpolation is possible. Of course, when too many frames are lost, the difference between the two existing adjacent images becomes too large and interpolation can no longer be performed. Therefore, based on the inventors' research on multi-frame road condition image data within the preset time period, a preset pluggable frame number is set: when the middle frame loss number is less than or equal to the preset pluggable frame number, the images lost between the first and last frame images of the multi-frame road condition image data can be interpolated; when the middle frame loss number is greater than the preset pluggable frame number, the part exceeding the preset pluggable frame number is the part that cannot be interpolated.
Therefore, the embodiment of the invention can determine the number of lost frames which can not be interpolated by comparing the middle lost frame number with the preset pluggable frame number and adding the lost previous lost frame number and the lost next lost frame number.
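The rule for the frames that cannot be interpolated reduces to a small formula, sketched below; the 300-frame pluggable limit is an assumed value, since the description leaves the preset pluggable frame number to the implementer.

```python
# Sketch of the non-interpolatable-frame rule: front and rear losses can
# never be interpolated; of the middle losses, only the part exceeding
# the preset pluggable frame number cannot. Names are assumptions.

def non_interpolatable(front, rear, middle, pluggable_limit):
    excess_middle = max(0, middle - pluggable_limit)
    return front + rear + excess_middle

# With 60 front, 60 rear, and 500 middle lost frames, and an assumed
# pluggable limit of 300 frames, 60 + 60 + 200 = 320 frames cannot
# be interpolated.
print(non_interpolatable(60, 60, 500, 300))  # 320
```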
Step S14: and determining the difference value between the theoretical total frame number to be obtained and the lost frame number which can not be interpolated as the total frame number of the multi-frame road condition image data.
Finally, the embodiment of the invention subtracts the number of lost frames which cannot be interpolated from the theoretical total frame number which is to be obtained, and determines the difference value as the total frame number of the multi-frame road condition image data. Then, the total frame number of the multi-frame road condition image data is divided by the theoretical total frame number to be obtained, and the frame integrity rate of the multi-frame road condition image data can be calculated.
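A worked form of step S14 and the final integrity computation: the 36000-frame theoretical total reuses the radar example from the description, while the 320 non-interpolatable frames are an assumed figure.

```python
# Worked form of step S14: total frame number = theoretical total minus
# the frames that cannot be interpolated; the frame integrity rate then
# divides the two. The 320-frame figure is an assumption.

theoretical_total = 36000
non_interpolatable_frames = 320

total_frames = theoretical_total - non_interpolatable_frames  # 35680
integrity_rate = total_frames / theoretical_total             # ~0.991
print(round(integrity_rate, 3))  # 0.991
```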
Because the multi-frame road condition image data of the embodiment of the invention comprises multi-frame optical image data and multi-frame radar image data, the embodiment of the invention respectively sets the frame integrity threshold values of the multi-frame optical image data and the multi-frame radar image data, and the preset frame integrity threshold values comprise an optical frame integrity threshold value and a radar frame integrity threshold value;
step S302 may include the steps of: judging whether the frame integrity rate of the multi-frame optical image data is greater than or equal to the optical frame integrity rate threshold value or not, and judging whether the frame integrity rate of the multi-frame radar image data is greater than or equal to the radar frame integrity rate threshold value or not;
step S303 may include the steps of: and when the frame integrity rate of the multiple frames of optical image data is greater than or equal to the optical frame integrity rate threshold value and the frame integrity rate of the multiple frames of radar image data is greater than or equal to the radar frame integrity rate threshold value, storing the multiple frames of optical image data and the multiple frames of radar image data as the first driving data.
In the embodiment of the invention, the optical frame integrity rate threshold and the radar frame integrity rate threshold may be the same or different. When the frame integrity rate of the multi-frame optical image data is greater than or equal to the optical frame integrity rate threshold and the frame integrity rate of the multi-frame radar image data is greater than or equal to the radar frame integrity rate threshold, it indicates that the optical and radar image data acquired this time can still achieve a good interpolation effect, and this group of multi-frame road condition image data can be used as the first driving data to simulate the virtual route; otherwise, the data is discarded. Both the optical frame integrity rate threshold and the radar frame integrity rate threshold are optionally set to 95% to 100%.
In an optional embodiment of the present invention, the method may further include the following steps:
determining the continuous frame loss number in the middle frame loss number of the multi-frame road condition image data;
and when the continuous frame loss number is more than or equal to a preset middle continuous maximum frame loss number, removing the multi-frame road condition image data from the first driving data.
Based on the first determination method of the middle frame loss number, the embodiment of the present invention can determine the consecutive frame loss numbers among the middle lost frames. For example, suppose the middle frame loss number is 50 frames, the preset middle continuous maximum frame loss number is 10 frames, and among these 50 lost frames there are 3 groups of consecutive losses: a first group of 8 consecutive frames, a second group of 10, and a third group of 12. Since the third group of 12 consecutive lost frames already exceeds the preset middle continuous maximum of 10 frames (and the second group reaches it), the data cannot be recovered well even by interpolation, the availability of this group of multi-frame road condition image data is low, and it is recommended to discard and reacquire it. In the embodiment of the invention, before judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to the preset frame integrity rate threshold, the middle continuous maximum frame loss number is used as an intermediate screening condition; obviously unavailable multi-frame road condition image data can thus be screened out, improving the speed of screening the multi-frame road condition image data.
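The middle-screening rule can be sketched as a run-length check; modelling the lost frames as a sorted list of missing frame indices, and the function names, are assumptions.

```python
# Sketch of the middle-screening rule: find runs of consecutively lost
# frames among the middle losses and reject the data when any run reaches
# the preset middle continuous maximum frame loss number.

def longest_consecutive_run(lost_indices):
    """Longest run of consecutive indices in a sorted list of missing
    frame indices (assumed representation of the middle lost frames)."""
    longest = run = 1
    for prev, cur in zip(lost_indices, lost_indices[1:]):
        run = run + 1 if cur == prev + 1 else 1
        longest = max(longest, run)
    return longest

def keep_data(lost_indices, max_consecutive=10):
    # discard when a run is greater than or equal to the preset maximum
    return longest_consecutive_run(lost_indices) < max_consecutive

# A run of 12 consecutive lost frames (as in the third group of the
# example) reaches the preset maximum of 10, so the data is removed.
lost = list(range(100, 112))  # 12 consecutive missing frame indices
print(keep_data(lost))  # False
```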
To more clearly show the processing procedure of the first driving data processing method according to the embodiment of the present invention, referring to fig. 4, taking multi-frame road condition image data as multi-frame optical image data and multi-frame radar image data as an example, a schematic diagram of a verification process of the multi-frame optical image data and the multi-frame radar image data according to the embodiment of the present invention is shown.
Referring to fig. 5, a complete test evaluation flow diagram according to an embodiment of the present invention is shown in combination with a first driving data processing method according to an embodiment of the present invention. In fig. 5, the driving control parameters are represented by a plotted road map and a speed map, which can more clearly show the comparison process.
It should be noted that, in fig. 4 and 5, the multiple frames of optical image data are collected by the camera, the multiple frames of radar image data are collected by the radar, and the terminals refer to the camera and the radar respectively, so that the multiple frames of optical image data and the multiple frames of radar image data are abbreviated as optical and radar image data for saving space. In fig. 5, the test vehicle refers to an unmanned vehicle to be tested, and the embodiment of the invention verifies the inputted multi-frame optical image data and the multi-frame radar image data according to the time stamp, so that the synchronism and integrity of the multi-frame optical image data and the multi-frame radar image data acquired by manual driving can be verified, and the reliability of the simulated virtual route is improved.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 6, a schematic structural diagram of an unmanned vehicle driving test device according to an embodiment of the present invention is shown based on the same inventive concept, and the device may include the following modules:
the first driving data acquisition module 601 is used for acquiring first driving data of manual driving on a test route;
the second driving data acquisition module 602 is configured to acquire second driving data of the unmanned vehicle driving on the virtual route corresponding to the test route;
a second driving data scoring module 603, configured to score the second driving data according to the first driving data.
In an optional embodiment of the present invention, the first driving data includes multiple frames of road condition image data within a preset time period, and the apparatus may further include the following modules:
the frame integrity rate calculation module is used for calculating the frame integrity rate of the multi-frame road condition image data; the frame integrity rate is the ratio of the total frame number of the multi-frame road condition image data to the theoretical total frame number to be obtained;
the frame integrity rate judging module is used for judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to a preset frame integrity rate threshold value or not;
and the first screening module of the multi-frame road condition image data is used for saving the multi-frame road condition image data as the first driving data when the frame integrity rate of the multi-frame road condition image data is greater than or equal to the preset frame integrity rate threshold value.
In an optional embodiment of the present invention, the apparatus may further include the following modules:
the first acquisition module is used for acquiring the starting test time, the ending test time and the theoretical acquisition frequency of the multi-frame road condition image data, and the preset time period is a time period from the starting test time to the ending test time;
a theoretical total frame number determining module, configured to determine a theoretical total frame number to be obtained according to the timestamp of the start test time, the timestamp of the end test time, and the theoretical acquisition frequency;
a frame loss number determining module, configured to count an actual frame loss number of the multiple frames of road condition image data according to the timestamp of the start test time, the timestamp of the end test time, the theoretical acquisition frequency, and the acquisition timestamps of the first frame image and the last frame image in the multiple frames of road condition image data, and determine a frame loss number that cannot be interpolated in the actual frame loss number;
and the total frame number calculating module is used for determining the difference value between the theoretical total frame number to be obtained and the lost frame number which cannot be interpolated as the total frame number of the multi-frame road condition image data.
In an optional embodiment of the present invention, the actual frame loss number includes at least one of a middle frame loss number, a front frame loss number, and a rear frame loss number; the frame loss number determination module may include the following sub-modules:
the front lost frame number determining submodule is used for determining the front lost frame number of the multi-frame road condition image data according to the time stamp of the starting test time, the acquisition time stamp of the first frame image in the multi-frame road condition image data and the theoretical acquisition frequency;
a back frame loss number determining submodule, configured to determine a back frame loss number of the multi-frame road condition image data according to the timestamp of the end test time, the acquisition timestamp of the last frame image in the multi-frame road condition image data, and the theoretical acquisition frequency;
the middle lost frame number determining submodule is used for determining the middle lost frame number of the multi-frame road condition image data according to the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image;
and the actual frame loss number determining submodule is used for counting the actual frame loss number of the multi-frame road condition image data based on the front frame loss number, the rear frame loss number and the middle frame loss number of the multi-frame road condition image data.
In an optional embodiment of the present invention, the lost frame number determining module may further include the following sub-modules:
a first non-interpolatable frame number determining submodule, configured such that, when the actual frame loss number includes the front frame loss number, the frame loss number that cannot be interpolated in the actual frame loss number includes the front frame loss number;
a second non-interpolatable frame number determining submodule, configured such that, when the actual frame loss number includes the rear frame loss number, the frame loss number that cannot be interpolated in the actual frame loss number includes the rear frame loss number;
and a third non-interpolatable frame number determining submodule, configured to compare, when the actual frame loss number includes the middle frame loss number, the middle frame loss number with a preset pluggable frame number, and, when the middle frame loss number is greater than the preset pluggable frame number, determine the difference between the middle frame loss number and the preset pluggable frame number as the frame loss number that cannot be interpolated in the actual frame loss number.
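The three sub-modules above amount to a simple rule: boundary losses can never be interpolated (there is no frame on one side), while middle losses can be, up to a limit. A sketch, assuming the three loss counts and the preset pluggable frame number are available as integers (all names are illustrative):

```python
def non_interpolatable(front, rear, middle, max_pluggable):
    """Count lost frames that cannot be recovered by interpolation.

    All parameter names are assumptions, not taken from the patent.
    """
    count = front + rear  # boundary losses are always unrecoverable
    if middle > max_pluggable:
        # Only the excess over the pluggable limit is unrecoverable;
        # up to max_pluggable middle frames can be interpolated.
        count += middle - max_pluggable
    return count
```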
In an optional embodiment of the present invention, the apparatus may further include the following modules:
a continuous frame loss number determining module, configured to determine the continuous frame loss number within the middle frame loss number of the multi-frame road condition image data;
and a second screening module for the multi-frame road condition image data, configured to remove the multi-frame road condition image data from the first driving data when the continuous frame loss number is greater than or equal to a preset middle continuous maximum frame loss number.
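One plausible way to realize this check, assuming capture timestamps in seconds and a nominal frequency in Hz (the names and the rounding strategy are assumptions, not the patent's implementation):

```python
def max_consecutive_loss(timestamps, freq):
    # Largest number of consecutive missing frames between any two
    # successively captured frames, given the nominal frequency `freq`.
    period = 1.0 / freq
    worst = 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        missing = round((cur - prev) / period) - 1
        worst = max(worst, missing)
    return worst

def keep_segment(timestamps, freq, max_allowed):
    # Discard the whole image-data segment when the longest consecutive
    # loss reaches the preset maximum, as the embodiment above describes.
    return max_consecutive_loss(timestamps, freq) < max_allowed
```

For instance, frames timestamped [0.0, 0.1, 0.5, 0.6] at a nominal 10 Hz have a gap of three consecutive missing frames between 0.1 s and 0.5 s, so the segment is kept only if the preset maximum exceeds 3.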
In an optional embodiment of the present invention, the multiple frames of road condition image data include multiple frames of optical image data and multiple frames of radar image data, and the preset frame integrity threshold includes an optical frame integrity threshold and a radar frame integrity threshold;
the frame integrity rate judging module comprises the following sub-modules:
the optical frame integrity rate judging submodule is used for judging whether the frame integrity rate of the plurality of frames of optical image data is greater than or equal to the optical frame integrity rate threshold value or not;
the radar frame integrity rate judgment submodule is used for judging whether the frame integrity rate of the multi-frame radar image data is greater than or equal to the radar frame integrity rate threshold value or not;
the first screening module for the multi-frame road condition image data may include the following sub-modules:
and the multi-frame road condition image data storage submodule is used for storing the multi-frame optical image data and the multi-frame radar image data as the first driving data when the frame integrity rate of the multi-frame optical image data is greater than or equal to the optical frame integrity rate threshold and the frame integrity rate of the multi-frame radar image data is greater than or equal to the radar frame integrity rate threshold.
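The two-threshold gate can be sketched like this; the 0.95 default thresholds are purely illustrative, as the patent leaves the threshold values open:

```python
def frame_integrity(received, theoretical_total):
    # Frame integrity rate: total frames actually obtained over the
    # theoretical total frame number to be obtained.
    return received / theoretical_total

def keep_as_first_driving_data(optical_received, radar_received,
                               optical_total, radar_total,
                               optical_threshold=0.95, radar_threshold=0.95):
    # Both modalities must independently clear their own thresholds
    # before the data is saved as first driving data.
    return (frame_integrity(optical_received, optical_total) >= optical_threshold
            and frame_integrity(radar_received, radar_total) >= radar_threshold)
```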
In an optional embodiment of the invention, the first driving data further comprises a first driving control parameter, and the second driving data comprises a second driving control parameter; when the test route is a variable route, the variable route includes a plurality of events, each event including at least 2 options; the second driving data scoring module 603 may include the following sub-modules:
the variable route selection score formulating submodule is used for formulating the score of the option of the same event of the variable route by taking the option of the event corresponding to the first driving control parameter as the standard option of the variable route;
and the first scoring submodule is used for determining the score of the corresponding option of the second driving control parameter in the event according to the score of the option of the same event of the variable route.
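A toy illustration of the variable-route scoring idea, assuming a flat scoring scale (the patent does not fix one) and treating each event's options as strings:

```python
def make_event_scores(event_options, standard_option, full_score=10):
    # The option the human driver chose becomes the standard option and
    # gets full marks; other options of the same event get an
    # illustrative flat partial score.
    return {opt: (full_score if opt == standard_option else full_score // 2)
            for opt in event_options}

def score_unmanned_choice(event_options, human_choice, unmanned_choice):
    # Score the unmanned vehicle's option against the human-derived scale.
    scores = make_event_scores(event_options, human_choice)
    return scores[unmanned_choice]
```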
In an optional embodiment of the invention, the first driving data further comprises a first driving control parameter, and the second driving data comprises a second driving control parameter; when the test route is an unchangeable route, the second driving data scoring module 603 may include the following sub-modules:
a unique execution criterion determining sub-module for determining the first driving control parameter as a unique execution criterion of the invariable route;
and the second scoring submodule is used for scoring the second driving control parameter according to the unique execution standard.
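A hypothetical deviation-based realization of scoring against the unique execution standard: the manual-driving control parameters serve as the reference, and the unmanned vehicle's parameters are scored by how closely they match. The patent does not specify a formula, so the averaging and normalization below are assumptions.

```python
def score_against_standard(standard_params, test_params, full_score=100):
    # Relative deviation per control parameter, averaged over all
    # parameters and mapped onto [0, full_score].
    total = 0.0
    for key, ref in standard_params.items():
        dev = abs(test_params[key] - ref) / max(abs(ref), 1e-9)
        total += max(0.0, 1.0 - dev)
    return full_score * total / len(standard_params)
```

An exact match scores 100; a vehicle that, say, halves the reference speed while matching the steering angle scores 75 under this particular (assumed) weighting.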
According to an aspect of the present invention, an unmanned vehicle driving test apparatus is provided, including:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the unmanned vehicle driving testing method as described above.
According to an aspect of the present invention, there is also provided a computer-readable storage medium storing a computer program for causing a processor to execute the unmanned vehicle driving test method as described above.
Since the device embodiments are substantially similar to the method embodiments, they are described relatively briefly; for relevant details, refer to the corresponding parts of the method embodiment description.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The unmanned vehicle driving test method, device, system, and storage medium provided by the present invention have been introduced in detail above. Specific examples are applied herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (14)

1. An unmanned vehicle driving test method, the method comprising:
acquiring first driving data of manual driving on a test route;
acquiring second driving data of the unmanned vehicle driving on the virtual route corresponding to the test route;
and grading the second driving data according to the first driving data.
2. The method of claim 1, wherein the first driving data comprises a plurality of frames of road condition image data within a preset time period, and the method further comprises:
calculating the frame integrity rate of the multi-frame road condition image data; the frame integrity rate is the ratio of the total frame number of the multi-frame road condition image data to the theoretical total frame number to be obtained;
judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to a preset frame integrity rate threshold value or not;
and when the frame integrity rate of the multi-frame road condition image data is greater than or equal to the preset frame integrity rate threshold value, saving the multi-frame road condition image data as the first driving data.
3. The method of claim 2, further comprising:
acquiring the starting test time, the ending test time and the theoretical acquisition frequency of the multi-frame road condition image data, wherein the preset time period is a time period from the starting test time to the ending test time;
determining the theoretical total frame number to be obtained according to the time stamp of the starting test time, the time stamp of the ending test time and the theoretical acquisition frequency;
counting the actual frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the time stamp of the ending test time, the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image in the multi-frame road condition image data, and determining the frame loss number which cannot be interpolated in the actual frame loss number;
and determining the difference value between the theoretical total frame number to be obtained and the lost frame number which can not be interpolated as the total frame number of the multi-frame road condition image data.
4. The method of claim 3, wherein the actual frame loss number comprises at least one of a middle frame loss number, a front frame loss number, and a rear frame loss number; counting the actual frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the time stamp of the ending test time, the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image in the multi-frame road condition image data comprises:
determining the front frame loss number of the multi-frame road condition image data according to the time stamp of the starting test time, the acquisition time stamp of the first frame image in the multi-frame road condition image data and the theoretical acquisition frequency;
determining the number of rear lost frames of the multi-frame road condition image data according to the time stamp of the test ending time, the acquisition time stamp of the last frame of image in the multi-frame road condition image data and the theoretical acquisition frequency;
determining the middle frame loss number of the multi-frame road condition image data according to the theoretical acquisition frequency, the acquisition time stamp of the first frame image and the acquisition time stamp of the last frame image;
and counting the actual frame loss number of the multi-frame road condition image data based on the front frame loss number, the rear frame loss number and the middle frame loss number of the multi-frame road condition image data.
5. The method of claim 4, wherein determining the number of lost frames that cannot be interpolated from the number of actual lost frames comprises:
when the actual frame loss number comprises the previous frame loss number, the frame loss number which cannot be interpolated in the actual frame loss number comprises the previous frame loss number;
when the actual frame loss number comprises the post frame loss number, the frame loss number which cannot be interpolated in the actual frame loss number comprises the post frame loss number;
and when the actual frame loss number comprises the intermediate frame loss number, comparing the intermediate frame loss number with a preset pluggable frame number, and when the intermediate frame loss number is greater than the preset pluggable frame number, determining the difference value between the intermediate frame loss number and the preset pluggable frame number as the frame loss number which cannot be interpolated in the actual frame loss number.
6. The method according to claim 4 or 5, characterized in that the method further comprises:
determining the continuous frame loss number in the middle frame loss number of the multi-frame road condition image data;
and when the continuous frame loss number is more than or equal to a preset middle continuous maximum frame loss number, removing the multi-frame road condition image data from the first driving data.
7. The method according to claim 2, wherein the plurality of frames of road condition image data comprise a plurality of frames of optical image data and a plurality of frames of radar image data, and the preset frame integrity threshold comprises an optical frame integrity threshold and a radar frame integrity threshold;
judging whether the frame integrity rate of the multi-frame road condition image data is greater than or equal to a preset frame integrity rate threshold value or not, including:
judging whether the frame integrity rate of the multi-frame optical image data is greater than or equal to the optical frame integrity rate threshold value or not, and judging whether the frame integrity rate of the multi-frame radar image data is greater than or equal to the radar frame integrity rate threshold value or not;
when the frame integrity rate of the multiple frames of road condition image data is greater than or equal to the preset frame integrity rate threshold, saving the multiple frames of road condition image data as the first driving data, including:
and when the frame integrity rate of the multiple frames of optical image data is greater than or equal to the optical frame integrity rate threshold value and the frame integrity rate of the multiple frames of radar image data is greater than or equal to the radar frame integrity rate threshold value, storing the multiple frames of optical image data and the multiple frames of radar image data as the first driving data.
8. The method of claim 1, wherein the first driving data further comprises a first driving control parameter and the second driving data comprises a second driving control parameter; when the test route is a variable route, the variable route includes a plurality of events, each event including at least 2 options; scoring the second driving data according to the first driving data, including:
taking the option of the event corresponding to the first driving control parameter as a standard option of the variable route, and making a score of the option of the same event of the variable route;
and determining the score of the corresponding option of the second driving control parameter in the event according to the scores of the options of the same event of the variable route.
9. The method of claim 1, wherein the first driving data further comprises a first driving control parameter and the second driving data comprises a second driving control parameter; when the test route is an unchangeable route, scoring the second driving data according to the first driving data, including:
determining the first driving control parameter as a unique execution criterion for the invariable route;
scoring the second driving control parameter in accordance with the unique performance criteria.
10. An unmanned vehicle driving testing system, the system comprising:
a driving route simulation platform, an unmanned vehicle, and acquisition equipment mounted on a manually driven vehicle, wherein:
the acquisition equipment is used for acquiring first driving data of manual driving on the test route;
the driving route simulation platform is used for simulating a virtual route corresponding to the test route for the unmanned vehicle running on the driving route simulation platform according to the first driving data, collecting second driving data of the unmanned vehicle driving on the virtual route, and grading the second driving data;
and the unmanned vehicle is used for driving control on the virtual route.
11. The system of claim 10, wherein the first driving data comprises a plurality of frames of optical image data, a plurality of frames of radar image data, and a first driving control parameter, and the acquisition device comprises an optical image acquisition device, a radar image acquisition device, and a driving control parameter acquisition device, wherein:
the optical image acquisition equipment is used for acquiring the multiple frames of optical image data during manual driving on the test route;
the radar image acquisition equipment is used for acquiring the multiple frames of radar image data during manual driving on the test route;
and the driving control parameter acquisition equipment is used for acquiring the first driving control parameter during manual driving on the test route.
12. An unmanned vehicle driving testing device, the device comprising:
the first driving data acquisition module is used for acquiring first driving data of manual driving on the test route;
the second driving data acquisition module is used for acquiring second driving data of the unmanned vehicle driven on the virtual route corresponding to the test route;
and the second driving data scoring module is used for scoring the second driving data according to the first driving data.
13. An unmanned vehicle driving test device, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the unmanned vehicle driving testing method of any of claims 1-9.
14. A computer-readable storage medium storing a computer program for causing a processor to execute the unmanned vehicle driving test method according to any one of claims 1 to 9.
CN202010537890.5A 2020-04-17 2020-06-12 Unmanned vehicle driving test method, device, system and storage medium Active CN111811828B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010307445 2020-04-17
CN202010307445X 2020-04-17

Publications (2)

Publication Number Publication Date
CN111811828A true CN111811828A (en) 2020-10-23
CN111811828B CN111811828B (en) 2022-05-24

Family

ID=72845063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010537890.5A Active CN111811828B (en) 2020-04-17 2020-06-12 Unmanned vehicle driving test method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN111811828B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112525542A (en) * 2020-10-26 2021-03-19 泉州装备制造研究所 Intelligent vehicle performance test system and method thereof
CN112837555A (en) * 2020-12-31 2021-05-25 清华大学苏州汽车研究院(吴江) Test route selection method and device, computer equipment and storage medium
CN113627372A (en) * 2021-08-17 2021-11-09 北京伟景智能科技有限公司 Running test method, system and computer readable storage medium
CN116465647A (en) * 2023-04-18 2023-07-21 日照朝力信息科技有限公司 Automobile performance testing method and system based on virtual reality technology

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103335853A (en) * 2013-07-18 2013-10-02 中国科学院自动化研究所 Unmanned driving vehicle cognitive competence testing system and method
KR20130134572A (en) * 2012-05-31 2013-12-10 주식회사 코아로직 Method and apparatus for managing and verifying traveling information of car, and system using thereof
CN104112363A (en) * 2014-07-04 2014-10-22 西安交通大学 Multi-sensing-data space-time synchronization method and road multi-sensing-data vehicle-mounted acquisition system
CN107063713A (en) * 2017-04-27 2017-08-18 百度在线网络技术(北京)有限公司 Method of testing and device applied to pilotless automobile
CN109215433A (en) * 2017-07-03 2019-01-15 百度(美国)有限责任公司 The Driving Scene generator of view-based access control model for automatic Pilot emulation
CN109725630A (en) * 2018-12-29 2019-05-07 驭势科技(北京)有限公司 Intelligent driving vehicle control device test method, device, server and computer-readable medium
CN109815555A (en) * 2018-12-29 2019-05-28 百度在线网络技术(北京)有限公司 The environmental modeling capability assessment method and system of automatic driving vehicle
US20200064837A1 (en) * 2018-08-24 2020-02-27 Baidu Usa Llc Image data acquisition logic of an autonomous driving vehicle for capturing image data using cameras

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130134572A (en) * 2012-05-31 2013-12-10 주식회사 코아로직 Method and apparatus for managing and verifying traveling information of car, and system using thereof
CN103335853A (en) * 2013-07-18 2013-10-02 中国科学院自动化研究所 Unmanned driving vehicle cognitive competence testing system and method
CN104112363A (en) * 2014-07-04 2014-10-22 西安交通大学 Multi-sensing-data space-time synchronization method and road multi-sensing-data vehicle-mounted acquisition system
CN107063713A (en) * 2017-04-27 2017-08-18 百度在线网络技术(北京)有限公司 Method of testing and device applied to pilotless automobile
CN109215433A (en) * 2017-07-03 2019-01-15 百度(美国)有限责任公司 The Driving Scene generator of view-based access control model for automatic Pilot emulation
US20200064837A1 (en) * 2018-08-24 2020-02-27 Baidu Usa Llc Image data acquisition logic of an autonomous driving vehicle for capturing image data using cameras
CN110895147A (en) * 2018-08-24 2020-03-20 百度(美国)有限责任公司 Image data acquisition logic for an autonomous vehicle that captures image data with a camera
CN109725630A (en) * 2018-12-29 2019-05-07 驭势科技(北京)有限公司 Intelligent driving vehicle control device test method, device, server and computer-readable medium
CN109815555A (en) * 2018-12-29 2019-05-28 百度在线网络技术(北京)有限公司 The environmental modeling capability assessment method and system of automatic driving vehicle

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112525542A (en) * 2020-10-26 2021-03-19 泉州装备制造研究所 Intelligent vehicle performance test system and method thereof
CN112525542B (en) * 2020-10-26 2022-09-27 泉州装备制造研究所 Intelligent vehicle performance test system and method thereof
CN112837555A (en) * 2020-12-31 2021-05-25 清华大学苏州汽车研究院(吴江) Test route selection method and device, computer equipment and storage medium
CN113627372A (en) * 2021-08-17 2021-11-09 北京伟景智能科技有限公司 Running test method, system and computer readable storage medium
CN113627372B (en) * 2021-08-17 2024-01-05 北京伟景智能科技有限公司 Running test method, running test system and computer readable storage medium
CN116465647A (en) * 2023-04-18 2023-07-21 日照朝力信息科技有限公司 Automobile performance testing method and system based on virtual reality technology
CN116465647B (en) * 2023-04-18 2024-03-26 日照朝力信息科技有限公司 Automobile performance testing method and system based on virtual reality technology

Also Published As

Publication number Publication date
CN111811828B (en) 2022-05-24

Similar Documents

Publication Publication Date Title
CN111811828B (en) Unmanned vehicle driving test method, device, system and storage medium
CN112789619B (en) Simulation scene construction method, simulation method and device
CN109884916B (en) Automatic driving simulation evaluation method and device
EP3877740B1 (en) Method and system for modifying a control unit of an autonomous car
CN109816811B (en) Natural driving data acquisition device
US20230325550A1 (en) Method, device, equipment for determining test evaluation information and computer storage medium
CN111795832B (en) Intelligent driving vehicle testing method, device and equipment
CN107807542A (en) Automatic Pilot analogue system
CN112819968B (en) Test method and device for automatic driving vehicle based on mixed reality
CN105702152A (en) Map generation method and device
CN113378305B (en) Driverless trolley-based vehicle-road cooperative testing method and device
CN112307594A (en) Road data acquisition and simulation scene establishment integrated system and method
CN109839922B (en) Method and apparatus for controlling unmanned vehicle
CN110716529A (en) Automatic generation method and device for automatic driving test case
CN112015164A (en) Intelligent networking automobile complex test scene implementation system based on digital twin
CN112816226B (en) Automatic driving test system and method based on controllable traffic flow
CN117436821B (en) Method, device and storage medium for generating traffic accident diagnosis report
CN114360240A (en) High-precision positioning method based on vehicle networking track characteristics
CN113222407A (en) Highway project security evaluation system based on BIM
CN116859881A (en) Test method and device
CN111816022A (en) Simulation method and device for simulation scene, storage medium and electronic equipment
CN113465608B (en) Road side sensor calibration method and system
US20230042001A1 (en) Weighted planning trajectory profiling method for autonomous vehicle
CN114925457A (en) Early warning function test method and device applied to Internet of vehicles
CN113341764A (en) Electric automobile optimization control simulation method and system with real automobile working condition on line

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant