CN115904957A - Automated testing method, apparatus, storage medium, and program product - Google Patents


Info

Publication number
CN115904957A
CN115904957A
Authority
CN
China
Prior art keywords
data
model
replayed
perception
played back
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211405190.6A
Other languages
Chinese (zh)
Inventor
顾瑞红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecarx Hubei Tech Co Ltd
Original Assignee
Ecarx Hubei Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecarx Hubei Tech Co Ltd filed Critical Ecarx Hubei Tech Co Ltd
Priority to CN202211405190.6A priority Critical patent/CN115904957A/en
Publication of CN115904957A publication Critical patent/CN115904957A/en
Pending legal-status Critical Current

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 — Engine management systems

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

An embodiment of the application provides an automated testing method, device, storage medium, and program product. The method includes: acquiring first data to be replayed, where the first data to be replayed includes first raw sensing data and first process data, the first process data being process data output by a first model running on the first raw sensing data; optimizing the first model into a second model by playing the first data to be replayed, and generating second data to be replayed from the first raw sensing data based on the second model; and playing and comparing the first data to be replayed with the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result. The method provided by the embodiment of the application can ensure the effect and quality of the second model during development and testing.

Description

Automated testing method, apparatus, storage medium, and program product
Technical Field
The embodiments of the application relate to the technical field of automated driving, and in particular to an automated testing method, device, storage medium, and program product.
Background
The autopilot system contains a large number of machine learning algorithms and requires a large amount of data for testing during the development and testing stages.
In the related art, a test data packet can be collected by a test vehicle and played back through a visualization tool to locate problems, and the corresponding algorithm can then be optimized.
However, in the process of implementing the present application, the inventors found that the prior art has at least the following problem: in the above test method, the quality of the test data in the initial development stage is poor and the amount of test data is small, so the algorithm optimized based on the test results performs poorly.
Disclosure of Invention
The embodiments of the application provide an automated testing method, device, storage medium, and program product, so as to improve the effect and quality of algorithm optimization.
In a first aspect, an embodiment of the present application provides an automated testing method, including:
acquiring first data to be played back; the first data to be played back includes: first raw sensing data and first process data; the first process data is process data which is output by the first model based on the first original sensing data;
optimizing the first model into a second model by playing the first data to be replayed, and generating second data to be replayed according to the first original sensing data on the basis of the second model;
and playing and comparing the first data to be replayed with the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result.
In one possible design, the obtaining the first data to be played back includes:
acquiring a test data packet acquired by a target vehicle, analyzing the test data packet, and acquiring a label of the test data packet;
and extracting first data to be replayed from the test data packet based on the label and the test requirement.
In one possible design, the optimizing the first model to the second model by playing the first to-be-played-back data includes:
determining a problem code in a first model by playing the first data to be played back;
repairing the problem code to obtain a repair code;
and performing cross compilation on the repair code through a heterogeneous environment to obtain a second model.
In one possible design, the generating second data to be replayed from the first raw sensing data based on the second model includes:
inputting the first original sensing data into the second model based on the heterogeneous environment, and operating to obtain second process data;
and forming second data to be played back by the second process data and the first original sensing data.
In a possible design, before optimizing the first model to the second model by playing the first to-be-replayed data, the method further includes:
acquiring first simulation data; the first simulation data comprises second original sensing data and third process data; the third process data is process data which is output by the first model based on the second original sensing data;
supplementing the first data to be played back based on the first simulation data to obtain third data to be played back;
the optimizing the first model to a second model by playing the first to-be-replayed data includes: optimizing the first model into a second model by playing the third data to be played back;
the playing and comparing the first data to be played back with the second data to be played back includes: and playing and comparing the third data to be played back with the second data to be played back.
In one possible design, if the first data to be replayed includes perceptual data, optimizing a first perceptual model to a second perceptual model based on the first data to be replayed;
determining first perception truth value data according to the first to-be-replayed data;
evaluating the first perception model and the second perception model based on the first perception truth value data to obtain an evaluation result;
and determining whether to update the first perception model to the second perception model according to the evaluation result.
In one possible design, the determining first perceptual truth data from the first to-be-replayed data includes:
and labeling first original sensing data of the first data to be replayed to obtain first perception truth value data.
In a possible design, before evaluating the first perception model and the second perception model based on the first perception truth data and obtaining an evaluation result, the method further includes:
acquiring second simulation data; the second simulation data comprises second perceptual truth data;
evaluating the first perception model and the second perception model based on the first perception truth data to obtain an evaluation result, wherein the evaluating comprises:
evaluating the first perception model and the second perception model based on the first perception truth data and the second perception truth data to obtain an evaluation result.
In a possible design, after determining whether to update the first model to the second model according to the verification result, the method further includes:
adding the first data to be replayed into a regression scene set; the regression scene set comprises historical data to be replayed; the historical data to be replayed is first data to be replayed corresponding to a previous verification result;
if the first model is determined to be updated to the second model, generating a new software version according to the second model;
and performing regression testing on the new software version based on the regression scene set.
In a second aspect, an embodiment of the present application provides an automated testing device, including:
the acquisition module is used for acquiring first data to be played back; the first data to be played back includes: first raw sensing data and first process data; the first process data is process data which is output by the first model based on the first original sensing data;
the optimization module is used for optimizing the first model into a second model by playing the first data to be played back, and generating second data to be played back according to the first original sensing data based on the second model;
and the verification module is used for playing and comparing the first data to be replayed and the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result.
In a third aspect, an embodiment of the present application provides an automated testing device, including: at least one processor and a memory;
the memory stores computer-executable instructions;
the at least one processor executes computer-executable instructions stored by the memory to cause the at least one processor to perform the method as set forth in the first aspect above and in various possible designs of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which computer-executable instructions are stored, and when the computer-executable instructions are executed by a processor, the method according to the first aspect and various possible designs of the first aspect are implemented.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program that, when executed by a processor, implements the method as set forth in the first aspect and various possible designs of the first aspect.
The method includes: acquiring first data to be replayed, where the first data to be replayed includes first raw sensing data and first process data, the first process data being process data output by the first model running on the first raw sensing data; optimizing the first model into a second model by playing the first data to be replayed, and generating second data to be replayed from the first raw sensing data based on the second model; and playing and comparing the first data to be replayed with the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result. In the method provided by the embodiments of the application, after a problem of the first model is found by replaying the data packet, the first model related to the problem is optimized to obtain the second model, and a new data packet is then generated based on the original packet and replayed to verify the second model, so that the effect and quality of the second model can be ensured during development and testing.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and those skilled in the art can obtain other drawings without inventive labor.
FIG. 1 is a schematic diagram of an autonomous driving system to be tested provided by an embodiment of the present application;
fig. 2 is a first flowchart illustrating an automated testing method according to an embodiment of the present disclosure;
fig. 3 is a second schematic flowchart of an automated testing method according to an embodiment of the present application;
fig. 4 is a third schematic flowchart of an automated testing method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an automated testing system provided in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an automated test equipment provided in an embodiment of the present application;
fig. 7 is a block diagram of an automated testing apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The autopilot system contains a large number of machine learning algorithms and requires a large amount of data for testing during the development and testing phases. In the related art, an autonomous driving system may need at least a billion miles of driving data for system and algorithm test verification before reaching mass production. The construction and maintenance costs of an autonomous driving fleet are undoubtedly high, the number of real vehicles is limited, and early-stage data quality is poor. In the related art, a test data packet is collected by a test vehicle and played back through a visualization tool to locate problems and optimize the corresponding algorithm. However, in this test method, the quality of the test data in the initial development stage is poor and the amount of test data is small, so the algorithm optimized based on the test results performs poorly.
To solve the above problems, the inventors of the present application found that problems can be verified by using data more effectively. Specifically, after a problem of the first model is found by replaying a data packet, the first model related to the problem is optimized to obtain a second model; a new data packet is then generated based on the original packet and replayed to verify the second model, so that the effect and quality of the second model can be ensured during development and testing. Based on this, the embodiments of the present application provide an automated testing method.
Fig. 1 is a schematic diagram of an automatic driving system to be tested according to an embodiment of the present application. As shown in FIG. 1, the system 100 includes a perception module 101, a planning module 102, and a control module 103. The sensing module 101 is configured to obtain raw sensor data output by a sensor, process the raw sensor data based on a sensing model, obtain sensing data (e.g., a position and a name of a target object), and further send the sensing data to the planning module 102, the planning module 102 is configured to perform route planning according to the sensing data and send the route planning to the control module 103, and the control module 103 generates a control instruction (e.g., acceleration, deceleration, turning, braking, etc.) according to the route planning.
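For concreteness, the following is a minimal sketch of the data flow in fig. 1; the class names, method names, and data fields are illustrative assumptions rather than the actual interfaces of the system described in this application.

# Minimal, hypothetical sketch of the perception -> planning -> control flow of
# fig. 1; every name and field here is assumed for illustration only.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Detection:
    label: str                       # e.g. "vehicle", "pedestrian"
    position: Tuple[float, float]    # position of the target object

@dataclass
class Plan:
    waypoints: List[Tuple[float, float]] = field(default_factory=list)

def perception_module(raw_sensor_frame) -> List[Detection]:
    # Placeholder for the perception model running on raw sensor data.
    return [Detection("vehicle", (12.0, 0.5))]

def planning_module(detections: List[Detection]) -> Plan:
    # Placeholder for route planning based on the perception result.
    return Plan(waypoints=[(0.0, 0.0), (10.0, 0.0)])

def control_module(plan: Plan) -> dict:
    # Placeholder for generating control instructions from the planned route.
    return {"accelerate": 0.0, "brake": 0.0, "steer": 0.0}

def run_step_and_record(raw_sensor_frame, bag: list):
    """One pipeline step; the raw frame and all process data are recorded
    so they can later form a test data packet for replay."""
    detections = perception_module(raw_sensor_frame)   # perception module 101
    plan = planning_module(detections)                 # planning module 102
    command = control_module(plan)                     # control module 103
    bag.append({"raw": raw_sensor_frame,
                "process": {"perception": detections, "plan": plan, "control": command}})
    return command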
In a specific implementation process, a vehicle end (e.g., a test vehicle) provided with an automatic driving system can record sensing data output by the sensing module 101 and process data (route planning, control instructions and the like) output by the planning module 102 and the control module 103 to obtain a test data packet, the test data packet is stored in a cloud, and the automatic test system obtains first data to be replayed from the test data packet stored in the cloud; the first data to be played back includes: first raw sensing data and first process data; the first process data are process data which are operated and output by the first model based on the first original sensing data; optimizing the first model into a second model by playing the first data to be replayed, and generating second data to be replayed according to the first original sensing data on the basis of the second model; and playing and comparing the first data to be replayed with the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result. According to the automatic testing method provided by the embodiment of the application, after the problem of the first model is found through the replay data packet, the first model related to the problem is optimized to obtain the second model, and then the new data packet is generated based on the data packet to be replayed so as to verify the second model, so that the effect and the quality of the second model can be guaranteed in the research, development and testing processes.
The technical means of the present application will be described in detail with specific examples. These several specific embodiments may be combined with each other below, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 2 is a first schematic flow chart of an automated testing method according to an embodiment of the present application. As shown in fig. 2, the method includes:
201. acquiring first data to be played back; the first data to be played back includes: first raw sensing data and first process data; the first process data is process data output by the first model based on the first original sensing data.
The execution subject of this embodiment may be a terminal or a server, such as a cloud server.
In this embodiment, the first model may be an automatic driving system, and may also be a specific algorithm in the automatic driving system, such as an algorithm of a planning class, a control class, a perception class, and the like. The process data refers to positioning, perception, fusion, prediction, route planning, control, and the like.
In this embodiment, there are multiple ways to acquire the first data to be replayed. For example, a test data packet collected by a target vehicle can be acquired and parsed to obtain the tags of the test data packet, and the first data to be replayed can then be extracted from the test data packet based on the tags and the test requirements.
The tags can be problem tags (tracking loss, weak signal, manual takeover, lane-change failure, etc.) or scene tags (elevated-road driving, waterlogged-road driving, tunnel driving, etc.).
For example, a test vehicle equipped with an Automated Driving (AD) system may collect bag data (test data packets), where the bag data includes raw sensor data and the process data output during operation of the AD system, such as perception, planning, and control results. The collected bag data is then uploaded to a server, such as a cloud data platform, where packet parsing, meta-information extraction, and scene or problem tag extraction are performed, and the data is stored and managed on the data platform.
During development or testing, the first data to be replayed can be selected at the cloud from the stored bag data according to the test requirements and tags, and the cloud then replays the data packet. Through a visualization tool, the original driving scene recorded at the vehicle end can be replayed together with the perceived, tracked, and predicted target objects output by the AD system; the planned vehicle route, control instructions, and execution results can be displayed; and logs and some statistical indicators can also be shown. Problems can then be analyzed and located based on the displayed data.
There are multiple ways to select the first data to be replayed from the bag data stored in the cloud. In one implementation, when the vehicle-end data is collected, a marking tool at the vehicle end can mark the current moment whenever a special scene (elevated-road driving, waterlogged-road driving, tunnel driving, etc.) or a system problem (tracking loss, weak signal, manual takeover, lane-change failure, etc.) is encountered, and the cloud then finds the first data to be replayed by tag retrieval. In another implementation, if the vehicle end does not have marking capability, the whole data packet can be recorded and stored in the cloud; the packet is replayed in the cloud, and a cloud marking tool is used to tag the scene and problem segments, so that the first data to be replayed can subsequently be selected based on the tags and test requirements. This helps form a closed loop of problem feedback, problem repair, and iterative verification.
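As a rough illustration of the tag-based selection described above, the following sketch filters an index of stored bag packets by scene/problem tags; the field names and the sample index are hypothetical and only stand in for the cloud data platform's actual retrieval interface.

# Hypothetical sketch of tag retrieval over bag packets stored in the cloud;
# field names and sample entries are assumptions for illustration.
def select_replay_data(packet_index, required_tags):
    """Return the packets whose tags satisfy the test requirement."""
    required = set(required_tags)
    return [pkt for pkt in packet_index if required.issubset(pkt["tags"])]

# Toy index standing in for the cloud data platform's metadata store.
packet_index = [
    {"id": "bag_0001", "tags": {"tunnel_scene", "tracking_loss"}},
    {"id": "bag_0002", "tags": {"elevated_road_scene"}},
    {"id": "bag_0003", "tags": {"tunnel_scene", "manual_takeover"}},
]

# Example: the test requirement asks for tunnel scenes with a tracking-loss problem.
first_data_to_replay = select_replay_data(packet_index, {"tunnel_scene", "tracking_loss"})
print([pkt["id"] for pkt in first_data_to_replay])   # ['bag_0001']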
It should be noted that the test data may be recorded by a dedicated test vehicle, or obtained by converting data collected by a data-collection vehicle into the format of the test data recorded by the test vehicle. Simulation data from the simulation platform can likewise be converted into the test data format.
202. And optimizing the first model into a second model by playing the first data to be played back, and generating second data to be played back according to the first original sensing data based on the second model.
In this embodiment, the second model is a model obtained by optimizing the first model, and may be, for example, an optimized automatic driving system or a specific algorithm in the optimized automatic driving system, such as an algorithm of a planning class, a control class, a sensing class, and the like.
In some embodiments, when the test data is supplemented with simulation data to mitigate the small amount of test data, the optimizing the first model into the second model by playing the first data to be replayed may include: acquiring first simulation data, where the first simulation data includes second raw sensing data and third process data, the third process data being process data output by the first model running on the second raw sensing data; supplementing the first data to be replayed based on the first simulation data to obtain third data to be replayed; and optimizing the first model into the second model by playing the third data to be replayed. Correspondingly, the playing and comparing the first data to be replayed with the second data to be replayed may include: playing and comparing the third data to be replayed with the second data to be replayed.
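A minimal sketch of this supplementing step, under the assumption that both recorded frames and simulated frames carry a timestamp, a raw sensing sample, and process data, might look as follows; the dictionary layout is illustrative, not the format actually used by the data platform.

# Illustrative sketch of merging recorded replay data with simulation data to
# form the third data to be replayed; the frame layout is an assumption.
def supplement_with_simulation(first_replay_data, first_simulation_data):
    merged = []
    for frame in first_replay_data:
        merged.append({**frame, "source": "vehicle"})      # first raw sensing + first process data
    for frame in first_simulation_data:
        merged.append({**frame, "source": "simulation"})   # second raw sensing + third process data
    # Keep the combined stream in time order so it replays as one coherent scene set.
    merged.sort(key=lambda f: f["timestamp"])
    return merged   # third data to be replayed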
203. And playing and comparing the first data to be replayed with the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result.
Specifically, after a problem is located, an algorithm developer modifies the code locally, submits it to a development branch (a branch used for development and not yet formally submitted for testing), triggers automatic compilation of a new software version, and deploys the new software version as a service. By replaying the current problem data, developers can visually compare the output results of the new and old software versions, and the verified update can then be submitted to the mainline branch. This realizes continuous code integration and continuous testing across the full link from the development stage to the integration testing stage.
In this embodiment, the first data to be replayed may be played through the visualization tool, a user (e.g., a research and development staff) may perform problem repair of the algorithm based on related content displayed by the visualization tool, submit the second model to the development branch, trigger compilation and mirroring through Continuous Integration/Continuous Deployment of codes (CICD), form a new algorithm mirror package and deploy the new algorithm mirror package to the cloud, and then play a problem scene through the package playing tool again, output a new result through the second model, form a new data package again, and perform visualization result verification.
Specifically, in the process of generating the second data to be played back based on the first data to be played back, the original sensor data recorded by the vehicle end and the process data (the first process data of the first data to be played back) output by the vehicle end AD system at that time may be saved, and the second data to be played back (the results of positioning, sensing, fusing, predicting, and route planning) output after the original sensor data of the first data to be played back is input to the algorithm service updated by the cloud end may be regenerated to correspond to the original scene, and the first data to be played back and the second data to be played back correspond to different algorithm versions. When verification and comparison are carried out, the image frames output by the original sensor and the results output by the new and old algorithm versions (versions of the first model and the second model) can be displayed at the same time in different colors, the comparison between the effect output by the new and old models and the real environment is observed, the effect of problem repair is verified, and whether the main line branch can be submitted or not is judged. Therefore, the verification of the second model is completed in a multi-version simultaneous visualization mode.
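As a hedged sketch of this comparison step, the loop below pairs each recorded frame's old (first-model) output with the regenerated second-model output so a visualization tool could render both over the same sensor frames; the model-service call and the frame layout are assumed interfaces, not the actual cloud service.

# Sketch of replay comparison between the recorded (first-model) outputs and
# the regenerated (second-model) outputs; new_model_service is an assumed callable.
def compare_model_versions(first_data_to_replay, new_model_service):
    for frame in first_data_to_replay:
        raw = frame["raw"]
        old_output = frame["process"]          # first process data recorded on the vehicle
        new_output = new_model_service(raw)    # second process data regenerated in the cloud
        yield {"raw": raw, "old": old_output, "new": new_output}

# A visualization tool could overlay "old" and "new" in different colors on the
# original sensor frames to judge whether the repair should go to the mainline branch.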
It should be noted that, in order to improve the testing efficiency, a heterogeneous environment is adopted in this embodiment, the optimizing the first model to the second model by playing the first data to be replayed may include: determining a problem code in a first model by playing the first data to be played back; repairing the problem code to obtain a repair code; and performing cross compilation on the repair code through a heterogeneous environment to obtain a second model. Correspondingly, the generating second data to be played back according to the first raw sensing data based on the second model may include: inputting the first original sensing data into the second model based on the heterogeneous environment, and operating to obtain second process data; and forming second data to be played back by the second process data and the first original sensing data.
Specifically, in this embodiment, the compiling supports cross compiling in various hardware environments, and a heterogeneous AD algorithm verification computing environment constructed by a cloud X86 machine, a local industrial personal computer, an AGX and other various environments is provided, so that an algorithm effect in a vehicle-side environment can be verified at the cloud.
In the automated testing method provided by this embodiment, after a problem of the first model is found by replaying the data packet, the first model related to the problem is optimized to obtain the second model, and a new data packet is then generated based on the original packet and replayed to verify the second model, so that the effect and quality of the second model can be ensured during development and testing.
Compared with planning and control problems, perception problems are hard to optimize merely by visually inspecting the model output, and doing so is low in both accuracy and efficiency. In order to solve perception problems more efficiently, in some embodiments, on the basis of the foregoing embodiments, after step 201, the method may further include: if the first data to be replayed includes perception data, optimizing a first perception model into a second perception model based on the first data to be replayed; determining first perception truth data according to the first data to be replayed; evaluating the first perception model and the second perception model based on the first perception truth data to obtain an evaluation result; and determining whether to update the first perception model to the second perception model according to the evaluation result. Optionally, the determining first perception truth data according to the first data to be replayed may include: labeling the first raw sensing data of the first data to be replayed to obtain the first perception truth data.
Specifically, when the first data to be replayed is replayed, the cause of the problem is located as a perception problem, and the result cannot be accurately judged by visually inspecting the effect, the image/point-cloud frames output by the original sensors can be parsed from the originally recorded problem scene segment (the first data to be replayed), and truth data can be generated by manual labeling or by automatic labeling with a large cloud-side model. Evaluation metrics analysis (TP, FP, FN, precision, recall, etc.) is then performed against the truth data for both the vehicle-end baseline model output and the updated model output, and the model version to be submitted is finally determined.
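To make these metrics concrete, the following is a minimal sketch that matches predicted boxes to truth boxes with an IoU threshold and computes TP, FP, FN, precision, and recall for one frame; the box format and the 0.5 threshold are assumptions for illustration, not the rules actually used by the perception evaluation system.

# Minimal sketch of per-frame perception evaluation metrics (TP, FP, FN,
# precision, recall); boxes are assumed to be (x1, y1, x2, y2) tuples.
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def frame_metrics(predictions, truths, iou_thr=0.5):
    matched = set()
    tp = 0
    for pred in predictions:
        # Greedy match each prediction to the best unmatched truth box.
        best_i, best_iou = None, 0.0
        for i, truth in enumerate(truths):
            if i in matched:
                continue
            overlap = iou(pred, truth)
            if overlap > best_iou:
                best_i, best_iou = i, overlap
        if best_i is not None and best_iou >= iou_thr:
            matched.add(best_i)
            tp += 1
    fp = len(predictions) - tp
    fn = len(truths) - tp
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return {"TP": tp, "FP": fp, "FN": fn, "precision": precision, "recall": recall}

The same computation, run once with the baseline (first) perception model's outputs and once with the updated (second) model's outputs against the same truth data, yields the comparison on which the submission decision can be based.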
In this embodiment, the relationship between the first perception model and the first model may be an equal relationship, and the first model may specifically be the first perception model, and the first perception model and the first model are the same; the first perception model can be used as a perception class algorithm contained in the first model (AD system); also possible is a parallel relationship: the first model can also be a regulation and control algorithm in the AD system, namely, the first model and the perception model are in parallel relation and belong to the AD system. This embodiment is not limited to this.
In some embodiments, in order to compensate for the small amount of test data, the simulation data may be effectively utilized. As shown in fig. 3, on the basis of the above embodiment, after step 201, the method may further include:
301. and optimizing the first perception model into a second perception model based on the first data to be played back.
302. And labeling first original sensing data of the first data to be replayed to obtain first perception truth value data.
303. Acquiring second simulation data; the second simulation data includes second perceptual truth data.
304. And evaluating the first perception model and the second perception model based on the first perception truth value data and the second perception truth value data to obtain an evaluation result.
305. And determining whether to update the first perception model to the second perception model according to the evaluation result.
Specifically, in the early stage of development of an automated driving system, the number of vehicles recording test data packets is limited and vehicle performance is limited. On one hand, the data recorded by real vehicles is insufficient in volume and scene coverage; on the other hand, because a vehicle records raw data from multiple sensors simultaneously while the AD system is running and its performance is limited, sensor data is usually recorded at a reduced frequency or down-sampled, so such data lacks integrity and is insufficient for later use as evaluation data.
In view of the shortage of evaluation data, the inventors found through research that a large number of scenes can be constructed in simulation. Besides end-to-end simulation testing from sensor input to vehicle control in the simulation environment, these scenes can also be recorded in the same data format as the vehicle end and stored on the data platform for management. The problem scenes of a simulation test can therefore be retrieved, and debug analysis and verification can be carried out through the playback visualization tool. Moreover, a simulated scene can simultaneously output sensor truth data and virtual scene data, so the data labeling step can be skipped and the data can be used directly for evaluation.
In some embodiments, to improve the accuracy of the regression test and also to make efficient use of the data, the development phase packet data for algorithm verification may be subjected to the regression test. Specifically, on the basis of the above embodiment, for example, after step 203, the method may include:
adding the first data to be replayed into a regression scene set; the regression scene set comprises historical data to be replayed; the historical data to be replayed is first data to be replayed corresponding to a previous verification result; if the first model is determined to be updated to the second model, generating a new software version according to the second model; and performing regression testing on the new software version based on the regression scene set.
Further, the scene data added to the regression scene set is the scene data of failed AD real-vehicle or simulation runs. It includes the original environment data (images, point clouds, Global Positioning System (GPS) signals, vehicle Controller Area Network (CAN) signals, etc.) and may also include data generated while the AD system runs (recognition results, route planning results, etc.), which serve different purposes: when the perception module is verified, the original environment data is used as input and the perception result is inspected; when only the planning module needs to be repaired and verified, the perception result data output by the AD system can be used as input and only the planning result needs to be inspected.
Specifically, after the first data to be replayed is replayed through the visualization tool and the algorithm is optimized, and/or the perception model is optimized through the evaluation means, the relevant code of the algorithm and/or model can be updated and submitted. After the code is submitted, a new software version of the automated driving system can be deployed to the cloud simulation test platform through the daily CICD update. The scenes repaired this time can be added to the replay verification scene set used for regression testing, and regression testing is performed by batch replay of the data to be replayed corresponding to each problem scene stored in the verification scene set, so as to guarantee the version quality.
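A rough sketch of the batch regression replay over the regression scene set could look like the following; the replay and verify callables are placeholders for the cloud replay service and the result-comparison step described above, not actual APIs.

# Illustrative sketch of regression testing a new software version by batch
# replay of the regression scene set; replay/verify are assumed callables.
def regression_test(new_version, regression_scene_set, replay, verify):
    failures = []
    for scene in regression_scene_set:
        regenerated = replay(scene, new_version)    # regenerate process data on the new version
        if not verify(scene, regenerated):          # compare against the stored/expected result
            failures.append(scene)
    passed = len(regression_scene_set) - len(failures)
    print(f"regression: {passed}/{len(regression_scene_set)} scenes passed")
    return failures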
The automated testing method provided by this embodiment builds a highly automated, complete link of continuous code integration and continuous verification from the development stage to the testing stage, covering test problem scene collection, construction of a problem database and evaluation data, cloud replay and problem analysis, model evaluation, and algorithm updating. AD test data and simulation test data are recorded, stored, and managed uniformly, providing an efficient tool chain and data support for problem analysis and repair verification.
In order to clearly illustrate the automated testing method provided by the embodiment, the following processes of testing data source, visualization tool playback, model evaluation, regression testing, and the like are fully exemplified with reference to fig. 4 and 5. As shown in fig. 4, the method includes:
401. the method comprises the steps of obtaining original high-precision sensor data collected by a collection vehicle, and carrying out format conversion on the high-precision sensor data to obtain first test data.
402. And acquiring original sensor data and corresponding process data acquired by the test vehicle to acquire second test data.
403. And acquiring simulation sensor data and corresponding process data acquired by the simulation vehicle to acquire third test data.
404. And performing format processing on the first test data, the second test data and the third test data to obtain original sensing data, perception truth value data and packet data. The sensing truth value data can comprise truth value data marked by the collection vehicle and the test vehicle and truth value data output by the simulation vehicle, and the packet data can be bag packet data under a Cyber framework.
405. And evaluating the perception model based on the perception truth value data, and updating the perception model in the automatic driving system based on the evaluation result to obtain a new perception model.
406. And optimizing and updating the algorithm through data playback based on the original sensing data and the packet data, and verifying the optimized algorithm to obtain a new algorithm.
407. And deploying the new algorithm and the new perception model to a test vehicle and a simulation platform, so that the test vehicle performs road test on the new algorithm and the new perception model, and the simulation platform performs simulation test on the new algorithm and the new perception model.
In a specific implementation process, as shown in fig. 5, the collection vehicle collects original high-precision sensor data and sends the data to the data processing service, and format conversion is performed on the high-precision sensor data through the data processing service to obtain first test data. The test vehicle collects the original sensor data and the corresponding process data and sends the data to the data processing service to obtain second test data. The simulation vehicle collects the simulation sensor data and the corresponding process data and sends the data to the data processing service to obtain third test data. The data processing service carries out format processing on the first test data, the second test data and the third test data to obtain original sensing data, perception true value data and packet data. The sensing truth value data can comprise truth value data marked by the collection vehicle and the test vehicle and truth value data output by the simulation vehicle, and the packet data can be bag packet data under a Cyber framework. The perception evaluation system evaluates the perception model based on the perception truth value data to obtain an evaluation result, the CICD updates the perception model in the automatic driving system based on the evaluation result, and deploys the perception model to a vehicle end model (AD algorithm service) to obtain a new perception model. And the data playback visualization tool optimizes the updating algorithm through data playback based on the original sensing data and the packet data, and serves the AD algorithm by submitting the second model and deploying through a CICD. The AD algorithm service verifies the second model based on the original sensing data and the packet data to obtain a new algorithm. And deploying the new algorithm and the new perception model to a test vehicle and a simulation platform through the CICD, so that the test vehicle performs road test on the new algorithm and the new perception model, and the simulation platform performs simulation test on the new algorithm and the new perception model.
The automated testing method provided by the embodiment can firstly establish a set of complete iterative closed loop processes of automatic driving function testing, problem scene acquisition, problem analysis, problem positioning, playback verification based on an original scene after problem solution, algorithm evaluation (including function testing and scale regression testing of research and development self-testing and testing), version release and deployment to a vehicle end. Secondly, problem data in the simulation test and drive test processes can be managed uniformly, a problem library with a uniform format is formed, a test data set is enriched, and problem tracking and management are facilitated. And finally, a perfect tool chain can be provided for cloud scene playback, problem analysis, continuous integration of codes, continuous deployment, continuous verification and continuous testing, and a data closed loop of problem feedback updating iteration is formed. And moreover, development and test efficiency is improved, development and cooperation of all modules at the cloud end, problem circulation, program development, self-test and scale test are facilitated, and an automation flow of continuous development, continuous integration and continuous deployment of codes is constructed. Finally, the method not only can be used for verifying different system versions, optimizing functions and improving indexes based on one problem, but also can be used for verifying different scenes and realizing the robustness and generalization capability of the functions based on one version.
Fig. 6 is a schematic structural diagram of an automated testing device according to an embodiment of the present application. As shown in fig. 6, the automated testing device 60 includes: an acquisition module 601, an optimization module 602, and a verification module 603.
An obtaining module 601, configured to obtain first data to be played back; the first data to be replayed comprises: first raw sensing data and first process data; the first process data is process data output by the first model based on the first original sensing data.
The optimizing module 602 is configured to optimize the first model into a second model by playing the first data to be played back, and generate second data to be played back according to the first original sensing data based on the second model.
The verification module 603 is configured to play and compare the first data to be replayed with the second data to be replayed to obtain a verification result, and determine whether to update the first model to the second model according to the verification result.
The automatic test equipment provided by the embodiment of the application optimizes the first model related to the problem to obtain the second model after finding the problem of the first model through the replay data packet, and generates a new data packet based on the data packet to replay to verify the second model, so that the effect and the quality of the second model can be ensured in the research, development and test processes.
The automated testing device provided in the embodiment of the present application can be used to implement the above method embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
Fig. 7 is a block diagram of an automated testing device according to an embodiment of the present disclosure, where the automated testing device may be a terminal or a server. The terminal can be a computer, a message receiving and sending device, a tablet device, a medical device and the like, and the server can be a cloud data platform, a cluster server and the like.
Device 70 may include one or more of the following components: processing components 701, memory 702, power components 703, multimedia components 704, audio components 705, input/output (I/O) interfaces 706, sensor components 707, and communication components 708.
The processing component 701 generally controls the overall operation of the device 70, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 701 may include one or more processors 709 to execute instructions to perform all or part of the steps of the methods described above. Further, processing component 701 may include one or more modules that facilitate interaction between processing component 701 and other components. For example, the processing component 701 may include a multimedia module to facilitate interaction between the multimedia component 704 and the processing component 701.
Memory 702 is configured to store various types of data to support operation at device 70. Examples of such data include instructions for any application or method operating on device 70, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 702 may be implemented by any type or combination of volatile or non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power supply component 703 provides power to the various components of the device 70. The power components 703 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 70.
The multimedia component 704 includes a screen that provides an output interface between the device 70 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 704 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 70 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 705 is configured to output and/or input audio signals. For example, the audio component 705 includes a Microphone (MIC) configured to receive external audio signals when the device 70 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 702 or transmitted via the communication component 708. In some embodiments, audio component 705 also includes a speaker for outputting audio signals.
The I/O interface 706 provides an interface between the processing component 701 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 707 includes one or more sensors for providing various aspects of status assessment for the device 70. For example, the sensor component 707 may detect an open/closed state of the device 70, the relative positioning of components, such as a display and keypad of the device 70, the sensor component 707 may also detect a change in the position of the device 70 or a component of the device 70, the presence or absence of user contact with the device 70, orientation or acceleration/deceleration of the device 70, and a change in the temperature of the device 70. The sensor assembly 707 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 707 can also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 707 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 708 is configured to facilitate communication between the device 70 and other devices in a wired or wireless manner. The device 70 may access a wireless network based on a communication standard, such as WiFi,2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 708 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 708 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 70 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as the memory 702 including instructions executable by the processor 709 of the device 70 to perform the above-described method. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile storage device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. A readable storage medium may be any available medium that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the apparatus.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Embodiments of the present application further provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the automatic test method performed by the above automatic test equipment is implemented.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. An automated testing method, comprising:
acquiring first data to be played back; the first data to be played back includes: first raw sensing data and first process data; the first process data is process data which is output by the first model based on the first original sensing data;
optimizing the first model into a second model by playing the first data to be played back, and generating second data to be played back according to the first original sensing data based on the second model;
and playing and comparing the first data to be replayed with the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result.
2. The method according to claim 1, wherein said obtaining the first data to be replayed comprises:
acquiring a test data packet acquired by a target vehicle, analyzing the test data packet, and acquiring a label of the test data packet;
and extracting first data to be replayed from the test data packet based on the label and the test requirement.
3. The method according to claim 1, wherein said optimizing the first model to the second model by playing the first to-be-played-back data comprises:
determining a problem code in a first model by playing the first data to be played back;
repairing the problem code to obtain a repair code;
and performing cross compilation on the repair code through a heterogeneous environment to obtain a second model.
4. The method according to claim 3, wherein the generating second data to be replayed from the first raw sensing data based on the second model comprises:
inputting the first original sensing data into the second model based on the heterogeneous environment, and operating to obtain second process data;
and forming second data to be played back by the second process data and the first original sensing data.
5. The method according to any one of claims 1 to 4, wherein the optimizing the first model to the second model by playing the first data to be replayed comprises:
acquiring first simulation data; the first simulation data comprises second original sensing data and third process data; the third process data is process data which is output by the first model based on the second original sensing data;
supplementing the first data to be played back based on the first simulation data to obtain third data to be played back;
optimizing the first model into a second model by playing the third data to be played back;
the playing and comparing the first data to be played back with the second data to be played back includes: and playing and comparing the third data to be played back with the second data to be played back.
6. The method according to any of claims 1-4, wherein if the first data to be played back comprises perceptual data,
optimizing the first perception model into a second perception model based on the first data to be replayed;
determining first perception truth value data according to the first data to be replayed;
evaluating the first perception model and the second perception model based on the first perception truth value data to obtain an evaluation result;
and determining whether to update the first perception model to the second perception model according to the evaluation result.
7. The method of claim 6, wherein determining first perceptual truth data from the first to-be-replayed data comprises:
and labeling first original sensing data of the first data to be replayed to obtain first perception truth value data.
8. The method according to claim 6, wherein the evaluating the first perception model and the second perception model based on the first perception truth data, and obtaining an evaluation result comprises:
acquiring second simulation data; the second simulation data comprises second perceptual truth data;
and evaluating the first perception model and the second perception model based on the first perception truth value data and the second perception truth value data to obtain an evaluation result.
9. The method according to any one of claims 1-4, further comprising:
adding the first data to be replayed into a regression scene set; the regression scene set comprises historical data to be replayed; the historical data to be replayed is first data to be replayed corresponding to a previous verification result;
if the first model is determined to be updated to the second model, generating a new software version according to the second model;
and performing regression testing on the new software version based on the regression scene set.
10. An automated test equipment, comprising:
the acquisition module is used for acquiring first data to be played back; the first data to be played back includes: first raw sensing data and first process data; the first process data are process data which are operated and output by the first model based on the first original sensing data;
the optimization module is used for optimizing the first model into a second model by playing the first data to be played back, and generating second data to be played back according to the first original sensing data based on the second model;
and the verification module is used for playing and comparing the first data to be replayed and the second data to be replayed to obtain a verification result, and determining whether to update the first model to the second model according to the verification result.
11. An automated test equipment, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing computer-executable instructions stored by the memory, causing the at least one processor to perform the automated testing method of any of claims 1 to 9.
12. A computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the automated testing method of any one of claims 1 to 9.
13. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the automated testing method according to any one of claims 1 to 9.
CN202211405190.6A 2022-11-10 2022-11-10 Automated testing method, apparatus, storage medium, and program product Pending CN115904957A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211405190.6A CN115904957A (en) 2022-11-10 2022-11-10 Automated testing method, apparatus, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211405190.6A CN115904957A (en) 2022-11-10 2022-11-10 Automated testing method, apparatus, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN115904957A true CN115904957A (en) 2023-04-04

Family

ID=86491820

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211405190.6A Pending CN115904957A (en) 2022-11-10 2022-11-10 Automated testing method, apparatus, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN115904957A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination