CN114817015A - Test case coverage rate statistical method and device, electronic equipment and storage medium - Google Patents
- Publication number: CN114817015A
- Application number: CN202210394920.0A
- Authority: CN (China)
- Legal status: Pending (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06F11/3676 — Test management for coverage analysis
- G06F11/3688 — Test management for test execution, e.g. scheduling of test suites
- G06F11/3692 — Test management for test results analysis
- G06F30/20 — Design optimisation, verification or simulation
Abstract
The invention relates to the field of non-volatile memory, and discloses a method and an apparatus for computing the coverage of test cases, together with an electronic device and a storage medium. The method comprises the following steps: running all test cases in simulation and generating corresponding coverage files and log information; screening out and deleting, according to the log information, the coverage files whose verification failed; and generating a coverage report from the remaining coverage files. Because the method uses the log information produced during simulation to discard the coverage files of failed runs and merges only the coverage files of the test cases that ran normally, the resulting report reflects how the test cases cover the test points that actually passed verification. It therefore tracks the verification progress of the test cases more accurately and shows more clearly whether the project's verification progress meets the design expectation.
Description
Technical Field
The present disclosure relates to the field of non-volatile memory technology, and in particular to a method and an apparatus for computing the coverage of test cases, as well as an electronic device and a storage medium.
Background
A test case is the description of a test task carried out on a specific software product; it embodies the test scheme, methods, techniques and strategy. It covers the test object, the test environment, the input data, the test steps, the expected results, the test scripts and so on, and is finally compiled into a document. In short, a test case is a set of test inputs, execution conditions and expected results tailored to a particular purpose, used to verify that a specific software requirement is met.
In flash-memory-chip verification, the coverage achieved by the test cases is an important measure of both verification progress and verification completeness. Typically, before the test cases are designed, the test points to be verified are determined from the verification requirements; a number of test cases are then designed against those test points; finally, the test cases are run in simulation to obtain corresponding coverage files, and these files are merged to obtain the coverage of the test cases relative to the verification goal.
In the prior art, however, the coverage is obtained simply by collecting and merging the ranges of test points touched by each test case, without considering whether the case actually completed the verification of those test points. When the verification bugs have not yet been fully cleared, the directly collected coverage therefore cannot properly represent the verification status: test points whose verification ultimately failed are still counted into the coverage, so the figure cannot accurately reflect the verification progress.
In view of the above problems, no effective technical solution exists at present.
Disclosure of Invention
The aim of the present application is to provide a method and an apparatus for computing the coverage of test cases, together with an electronic device and a storage medium, so as to reflect the verification progress of the test cases accurately.
In a first aspect, the present application provides a method for computing the coverage of test cases, comprising the following steps:
running all test cases in simulation and generating corresponding coverage files and log information;
screening out and deleting, according to the log information, the coverage files whose verification failed;
generating a coverage report from the remaining coverage files.
Because only the coverage files of the test cases that ran normally are merged into the report, the coverage report reflects how the test cases cover the test points that actually passed verification, and therefore reflects the verification progress of the test cases more accurately.
In the above method, the step of running all test cases in simulation and generating the corresponding coverage files and log information comprises:
running all test cases in simulation and generating corresponding simulation results;
generating the coverage files from the simulation results;
generating the log information from the simulation results and a preset reference result.
By comparing the simulation results against a reference result, the method generates log information that indicates whether each coverage file failed verification, so the failed coverage files can be screened out quickly and accurately.
In the above method, the step of running all test cases in simulation and generating corresponding simulation results comprises:
running the test cases in simulation under a preset assertion-check code and generating the simulation results.
Because assertion checks are inserted into the simulation itself, verification failures can be detected during the run: the simulation result records any failure that meets an assertion condition. The log information generated in step S13 is therefore more accurate, covering both the simulation process and its result, and step S2 can delete the failed coverage files more reliably.
In the above method, the step of screening out and deleting, according to the log information, the coverage files whose verification failed comprises:
extracting, from the log information, the failure-number information of each coverage file whose verification failed;
deleting the failed coverage files according to the failure-number information.
In the above method, the step of extracting the failure-number information comprises:
searching the text of the log information and treating every coverage file whose log contains a failure flag entry as a failed coverage file;
extracting the failure-number information of those failed coverage files.
In the above method, the step of deleting the failed coverage files according to the failure-number information comprises:
generating a vector file from the failure-number information;
deleting the failed coverage files one by one according to the vector file.
In the above method, the log information includes the name, number and run information of the corresponding test case.
In a second aspect, the present application further provides an apparatus for computing the coverage of test cases, comprising:
a simulation module for running all test cases in simulation and generating corresponding coverage files and log information;
a deletion module for screening out and deleting, according to the log information, the coverage files whose verification failed;
a generation module for generating a coverage report from the remaining coverage files.
Because the generation module merges only the coverage files of the test cases that ran normally, the coverage report reflects how the test cases cover the test points that actually passed verification, and therefore reflects the verification progress of the test cases more accurately.
In a third aspect, the present application further provides an electronic device, comprising a processor and a memory, where the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, perform the steps of the method as provided in the first aspect.
In a fourth aspect, the present application also provides a storage medium having a computer program stored thereon, which when executed by a processor performs the steps of the method as provided in the first aspect above.
In summary, the present application provides a method and an apparatus for computing the coverage of test cases, an electronic device and a storage medium. The method screens out and deletes, according to the log information generated during simulation, the coverage files whose verification failed, and merges only the coverage files of the test cases that ran normally into a coverage report. The report therefore reflects how the test cases cover the test points that actually passed verification, tracks the verification progress of the test cases more accurately, and shows more clearly whether the project's verification progress meets the design expectation.
Drawings
Fig. 1 is a flowchart of a coverage statistical method for test cases according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a coverage rate statistics apparatus for test cases according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals: 201. simulation module; 202. deletion module; 203. generation module; 301. processor; 302. memory; 303. communication bus.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
The coverage of a test case is the percentage of the test points defined at design time that are covered during the simulation test; the test points to be verified are generally determined from the design document.
After the coverage has been obtained, whether the test cases meet the design expectation is generally decided by checking whether the coverage exceeds a preset threshold (e.g., 90%).
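As a minimal sketch (an illustrative helper, not part of the patent itself), the threshold comparison described above can be written as follows; the 90% figure is only the example value given in the text:

```python
# Assumed helper for the threshold check: decide whether a measured
# coverage figure meets the design expectation.
def meets_expectation(covered_points, total_points, threshold=0.9):
    """True when the covered/total ratio reaches the threshold (e.g. 90%)."""
    if total_points == 0:
        return False
    return covered_points / total_points >= threshold
```

A project would substitute its own threshold for the 0.9 default.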
In the existing verification technology for flash memory chips, computing the coverage of a test case only records whether the case performed the relevant verification operation on the corresponding test point, not whether that operation completed successfully. The resulting coverage can reflect the completeness of the test cases (whether the test points were touched) but not the verification progress (whether the test points were verified successfully); it gives only a rough, one-sided picture of the relationship between the test cases and the design requirements, so it cannot clearly show whether the project's verification progress meets the design expectation.
In a first aspect, referring to Fig. 1, some embodiments of the present application provide a method for computing the coverage of test cases, comprising the following steps:
S1, running all test cases in simulation and generating corresponding coverage files and log information;
Specifically, the test cases are run in simulation on a pre-configured verification platform: each test case exercises a chip model preset in the platform, and the coverage and log information are obtained from the run process and run result of the simulated chip model.
More specifically, a coverage file records the coverage of the test case currently being simulated, i.e., the proportion of the test points in the design document that are exercised by that test case.
More specifically, the log information is the interaction between the chip model and the system, arranged in time order, recorded while the test case executes in simulation; it includes the interaction information of both the run process and the run result.
More specifically, this step runs all the test cases in simulation, either one by one or concurrently, and collects the coverage file and log information of each test case separately.
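As an illustrative sketch of step S1 only — the simulator wrapper name ("run_sim"), its flags and the file layout are all assumptions, not part of this disclosure — the pairing of one coverage file and one log file with each test case might be planned like this:

```python
from pathlib import Path

def plan_runs(case_names, out_dir="sim_out"):
    """Build one run plan per test case: each simulation run is expected
    to produce its own coverage file and log file (step S1). The
    "run_sim" command is a hypothetical wrapper for the platform."""
    plans = []
    for number, name in enumerate(case_names):
        cov = str(Path(out_dir) / f"{name}.cov")
        log = str(Path(out_dir) / f"{name}.log")
        plans.append({
            "number": number,  # ties case, log and coverage file together
            "name": name,
            "coverage_file": cov,
            "log_file": log,
            "command": ["run_sim", name, "--coverage", cov, "--log", log],
        })
    return plans
```

Each plan's `number` field is what later steps use to match a log to its coverage file.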
S2, screening out and deleting, according to the log information, the coverage files whose verification failed;
Specifically, when an instruction fails while the verification platform simulates a test case on the chip model, a run-failure result is produced and, at the same time, the failure text is recorded in the log information so that the verification defect of that test case is documented. This step therefore locates and deletes the coverage files generated by such test cases, so that only the coverage files of the test cases that ran normally are retained.
More specifically, because the log information records the run process and run result of each test case on the chip model, the test cases whose verification failed, together with their coverage files, can be screened out from the log information.
S3, generating a coverage report from the remaining coverage files.
Specifically, the coverage report includes a total coverage figure, which reflects how the simulation results of all retained test cases cover the test points required by the design.
Specifically, after the failed coverage files have been deleted, the remaining files all correspond to test cases that ran normally; the coverage report generated from them therefore clearly reflects the test range of the usable test cases, i.e., the merged coverage of all usable test runs.
More specifically, in this step the remaining coverage files are merged into the coverage report by a preset merge script or report-generation tool; such scripts and tools are prior art and are not described further here.
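A toy sketch of the merge in step S3, under the assumption (for illustration only) that each surviving coverage file has already been parsed into a set of covered test-point names; a real flow would use the simulator's own report tool:

```python
def merge_coverage(per_case_points, all_points):
    """Merge surviving per-case coverage into one total figure (step S3).
    Each entry of per_case_points models one retained coverage file as
    the set of test-point names it covers (an assumed simplification)."""
    covered = set()
    for points in per_case_points:
        covered |= points  # union of everything the retained cases exercised
    total = len(covered) / len(all_points) if all_points else 0.0
    return {"covered_points": covered, "total_coverage": total}
```

Because only retained (passing) files contribute to the union, the total figure reflects verified coverage rather than merely touched test points.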
With the statistical method of the embodiments of the present application, the coverage files whose verification failed are screened out and deleted according to the log information generated during simulation, and the coverage files of the test cases that ran normally are merged into a coverage report. The report therefore reflects how the test cases cover the test points that actually passed verification, tracks the verification progress more accurately, and shows more clearly whether the project's verification progress meets the design expectation.
More specifically, engineers can then adjust or redesign the test cases whose verification failed according to the coverage report, so that the verification of the test cases better meets the design expectation.
In some preferred embodiments, the step of running all test cases in simulation and generating corresponding coverage files and log information comprises:
S11, running all test cases in simulation and generating corresponding simulation results;
Specifically, this step is performed on a verification platform containing one or more chip models. With a single chip model, the test cases are applied to it one by one to obtain the corresponding simulation results; with several chip models, different test cases are applied to different models, simultaneously or in sequence, to obtain the corresponding simulation results.
S12, generating the coverage files from the simulation results;
Specifically, a simulation result reflects which test points the chip model exercised during the simulation of a test case, so step S12 can generate a coverage file for those test points from the simulation result. The generation is automated by the verification platform's generation script, which is prior art and not described further here.
S13, generating the log information from the simulation results and a preset reference result.
Specifically, the preset reference result is a reference set according to the design requirement of the test case, i.e., the execution result that would be obtained if the test case could be used normally.
More specifically, a simulation result only shows the behaviour produced when the chip model executes the test case in simulation; by itself it generally cannot show whether the test case executed smoothly. Step S13 therefore compares the simulation result with the preset reference result to decide whether execution succeeded, and generates the corresponding log information, which then indicates whether the coverage file failed verification.
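A minimal sketch of the comparison in step S13; the PASS/FAIL wording echoes the failure flag entries mentioned later in the text, but the exact log-line format here is an assumption:

```python
def make_log_entry(case_number, case_name, sim_result, reference_result):
    """Compare a simulation result with the preset reference result and
    produce one log line (step S13). The line format is an assumption."""
    status = "PASS" if sim_result == reference_result else "FAIL"
    return f"[{case_number}] {case_name}: {status}"
```

The case number carried in each line is what allows a later step to map a failing log back to its coverage file.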
By introducing a reference result and comparing it with the simulation results, the method generates log information that indicates whether each coverage file failed verification, so the failed coverage files can be screened out quickly and accurately.
In some preferred embodiments, the step of running all test cases in simulation and generating corresponding simulation results comprises:
running all test cases in simulation under a preset assertion-check code and generating the corresponding simulation results.
Specifically, during the simulated execution of a test case on the chip model, the preset assertion-check code performs an assertion judgment at a preset assertion point. This imposes a further checking constraint on the simulation result, so the result carries more simulation information and clearly reflects the simulation status of the test case.
More specifically, the preset assertion-check code is placed in the verification platform. During simulation, when the platform detects, according to the check condition of the assertion-check code, that the chip model has reached the assertion point, the corresponding assertion fires, producing a simulation result that records the failure or success associated with that assertion; accordingly, when step S13 generates the log information from the simulation result and the preset reference result, the fired assertions must be recorded in the log. In general, a fired assertion corresponds to a verification failure. Taking signals a and b as an example: when the chip model runs a given test case correctly, a and b should never be at a high level at the same time. In the verification platform this can be checked in Verilog by adding the following condition at the corresponding assertion point:
if (a & b)
    $display("Error: asserted check failed.\n");
This check means that whenever signals a and b of the chip model under test are high simultaneously, the simulation prints "Error: asserted check failed." to record the verification failure.
As another example, to check on the rising edge of the clk signal whether the req signal has been pulled up, the check can be implemented with a SystemVerilog assertion sequence by adding the following code:
sequence s1;
    @(posedge clk) $rose(req);
endsequence
Under this condition, the assertion succeeds if the req signal is pulled up on the clock edge, and fails otherwise.
More specifically, by adding assertion-check code during simulation, the statistical method of the embodiments of the present application can detect verification failures from within the simulation itself: the simulation result records any failure that meets an assertion condition. The log information generated in step S13 is therefore more accurate, covering both the simulation process and its result, and the failed coverage files can be deleted more accurately in step S2.
It is worth noting that the assertion-check code is optional: the test cases can be simulated and produce their simulation results with or without it.
In some preferred embodiments, the step of screening out and deleting the failed coverage files according to the log information comprises:
S21, extracting, from the log information, the failure-number information of each coverage file whose verification failed;
Specifically, the test cases, the log information and the coverage files are in one-to-one correspondence, and each coverage file carries a number corresponding to its test case or to the test order or time sequence. The log information indicates whether the corresponding coverage file failed verification; step S21 extracts, from each log that records a failure, the number of the corresponding coverage file, and that number is treated as failure-number information.
More specifically, the log information merely records the interaction with the system in time order and has no judgment capability of its own. The verification platform analyses the log of each test case to decide whether it contains failure content; if it does, the corresponding coverage file is regarded as failed, and the failure-number information is then extracted from the log to locate that coverage file.
S22, deleting the failed coverage files according to the failure-number information.
Specifically, after the coverage files have been located through the failure-number information obtained in step S21, the failing files are deleted, leaving only the coverage files that passed verification.
In some preferred embodiments, the step of extracting, from the log information, the failure-number information of each failed coverage file comprises:
S211, searching the text of the log information, and treating every coverage file whose log contains a failure flag entry as a failed coverage file;
Specifically, a failure flag entry is a preset text token such as "fail" or "error"; it is generated and written to the log only when the verification of a coverage file fails. Whenever such an entry is found in a log, the corresponding coverage file is regarded as having failed verification.
S212, extracting the failure-number information of the failed coverage files.
Specifically, once a coverage file has been identified as failed through its log, the failure-number information is extracted from the log so that the file can be located precisely.
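Steps S211 and S212 can be sketched as a simple text scan; the failure flag tokens follow the "fail"/"error" examples in the text, while the mapping from case number to log text is an assumed representation:

```python
FAIL_FLAGS = ("fail", "error")  # failure flag entries named in the text

def failed_numbers(logs):
    """logs maps a coverage-file number to the text of the corresponding
    log (an assumed representation). Returns, sorted, the numbers of all
    coverage files whose log contains a failure flag entry."""
    failed = []
    for number, text in logs.items():
        lowered = text.lower()
        if any(flag in lowered for flag in FAIL_FLAGS):
            failed.append(number)
    return sorted(failed)
```

The returned numbers are exactly the failure-number information that step S22 uses to delete the matching coverage files.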
In some preferred embodiments, the step of deleting the coverage rate file that failed verification according to the verification failure number information includes:
S221, generating a vector file according to the verification failure number information;
Specifically, after the coverage rate files that failed verification are located through the log information, deleting them directly one at a time may cause logical confusion or missed deletions. The statistical method in the embodiments of the present application therefore deletes them in a batch: step S221 first collects all the verification failure number information and integrates it into a vector file, where the vector file contains the verification failure number information of every coverage rate file that failed verification.
S222, deleting the coverage rate files that failed verification one by one according to the vector file.
Specifically, the vector file has a sequential characteristic, so the coverage rate files that failed verification, each corresponding to an entry of verification failure number information, can be deleted one by one in a directed manner.
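As an illustrative sketch of steps S221/S222 (not the patent's implementation; the vector-file layout and the `case_<number>.cov` naming are assumptions), the failure numbers are first written to an ordered vector file, which is then traversed sequentially to delete the matching coverage files:

```python
import os

def write_vector_file(fail_numbers, vector_path):
    """S221: integrate all verification failure numbers into one ordered vector file."""
    with open(vector_path, "w") as f:
        for num in fail_numbers:                   # order is preserved line by line
            f.write(num + "\n")

def delete_failed_coverage(vector_path, cov_dir):
    """S222: delete the coverage files one by one, following the vector file's order."""
    deleted = []
    with open(vector_path) as f:
        for line in f:                             # sequential traversal of the vector file
            cov = os.path.join(cov_dir, f"case_{line.strip()}.cov")  # assumed naming scheme
            if os.path.exists(cov):
                os.remove(cov)
                deleted.append(os.path.basename(cov))
    return deleted
```

Batching through a file rather than deleting ad hoc gives a single auditable record of what was removed, which matches the "avoid logical confusion or missed deletions" rationale above.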
In some preferred embodiments, the log information includes name information, number information, and running information of the corresponding test case.
Specifically, the name information and the number information help engineers screen the log information of a particular test case for dedicated analysis; the running information comprises the whole simulation process and the result log text, and the failure flag entry is recorded in the running information. Step S211 therefore mainly searches the running information for a failure flag entry.
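One way to model such a log record is sketched below. The field names (`name`, `number`, `running`) and flag words are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CaseLog:
    name: str     # test case name, used by engineers to screen logs
    number: str   # test case number, used to locate the coverage file
    running: str  # whole simulation process and result log text

def has_failure_flag(log: CaseLog, flags=("FAIL", "ERROR")) -> bool:
    """S211 searches only the running information for a failure flag entry."""
    return any(flag in log.running.upper() for flag in flags)
```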
In a second aspect, referring to fig. 2, fig. 2 shows a coverage rate statistical device for test cases according to some embodiments of the present application, where the device includes:
the simulation module 201 is used for simulating and running all test cases and generating corresponding coverage rate files and log information;
the deleting module 202 is used for screening and deleting the coverage rate file which fails to be verified according to the log information;
a generating module 203 for generating a coverage report from the remaining coverage files.
According to the coverage rate statistical device for test cases described above, the coverage rate files that failed verification are screened out and deleted according to the log information generated during simulation, and the generating module 203 integrates the coverage rate files of the test cases that ran normally into a coverage rate report. The coverage rate report therefore reflects the coverage of the test points by the test cases that passed verification normally, reflects the verification progress of the test cases more accurately, and shows more clearly whether the project verification progress meets the design expectation.
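The three-module pipeline can be sketched end to end as follows. This is a minimal illustration under stated assumptions: coverage "files" are modeled as in-memory strings and the module interfaces are invented for the sketch.

```python
def simulate_all(cases):
    """Simulation module 201: run every case, yielding (coverage, log) per case."""
    return {c: (f"cov({c})", "PASS" if ok else "ERROR") for c, ok in cases}

def delete_failed(results):
    """Deleting module 202: keep only coverage whose log has no failure flag."""
    return {c: cov for c, (cov, log) in results.items() if "ERROR" not in log}

def make_report(coverage):
    """Generating module 203: merge the remaining coverage into one report."""
    return " + ".join(sorted(coverage.values()))

cases = [("case_a", True), ("case_b", False), ("case_c", True)]
report = make_report(delete_failed(simulate_all(cases)))
print(report)  # cov(case_a) + cov(case_c)
```

The report is built only from cases that ran cleanly, which is the point of the screening step: a failing case contributes no coverage.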
In some preferred embodiments, the coverage statistical apparatus for test cases in the embodiments of the present application is configured to execute the coverage statistical method for test cases provided in the first aspect.
In a third aspect, referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application, where the present application provides an electronic device, including: the processor 301 and the memory 302, the processor 301 and the memory 302 being interconnected and communicating with each other via a communication bus 303 and/or other form of connection mechanism (not shown), the memory 302 storing a computer program executable by the processor 301, the processor 301 executing the computer program when the computing device is running to perform the method of any of the alternative implementations of the embodiments described above.
In a fourth aspect, the present application provides a storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program performs the method in any optional implementation manner of the foregoing embodiments. The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic Memory, a flash Memory, a magnetic disk, or an optical disk.
In summary, the embodiments of the present application provide a test case coverage rate statistical method and device, an electronic device, and a storage medium. The statistical method screens out and deletes the coverage rate files that failed verification according to the log information generated during simulation, and integrates the coverage rate files of the test cases that ran normally into a coverage rate report, so that the coverage rate report reflects the coverage of the test points by the test cases that passed verification normally, reflects the verification progress of the test cases more accurately, and shows more clearly whether the project verification progress meets the design expectation.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (10)
1. A coverage rate statistical method for test cases, used for counting the coverage rate of test cases, characterized by comprising the following steps:
simulating and operating all the test cases and generating corresponding coverage rate files and log information;
screening out and deleting a coverage rate file that failed verification according to the log information;
and generating a coverage rate report according to the remaining coverage rate files.
2. The method according to claim 1, wherein the step of performing simulation operation on all the test cases and generating corresponding coverage files and log information comprises:
simulating and operating all the test cases and generating corresponding simulation results;
generating the coverage rate file according to the simulation result;
and generating log information according to the simulation result and a preset reference result.
3. The method according to claim 1, wherein the step of performing simulation operation on all the test cases and generating corresponding simulation results comprises:
and simulating and operating the test case according to a preset assertion check code and generating the simulation result.
4. The method according to claim 1, wherein the step of screening and deleting the coverage files that fail to be verified according to the log information comprises:
extracting verification failure number information corresponding to the coverage rate file with verification failure according to the log information;
and deleting the coverage rate file of the verification failure according to the verification failure number information.
5. The method according to claim 4, wherein the step of extracting verification failure number information corresponding to the coverage file of the verification failure according to the log information comprises:
searching the text content of the log information, and regarding a coverage rate file corresponding to the log information containing the failure flag entry as the coverage rate file of the failure verification;
and extracting the verification failure number information of the coverage rate file with the verification failure.
6. The method according to claim 4, wherein the step of deleting the coverage file that fails to verify according to the verification failure number information comprises:
generating a vector file according to the verification failure number information;
and deleting the coverage rate files which fail to be verified one by one according to the vector files.
7. The method according to claim 1, wherein the log information includes name information, number information, and running information of the corresponding test case.
8. A coverage rate statistical device for test cases, used for counting the coverage rate of test cases, the device comprising:
the simulation module is used for simulating and operating all the test cases and generating corresponding coverage rate files and log information;
the deleting module is used for screening and deleting the coverage rate file which fails in verification according to the log information;
and the generating module is used for generating a coverage rate report according to the remaining coverage rate files.
9. An electronic device comprising a processor and a memory, said memory storing computer readable instructions which, when executed by said processor, perform the steps of the method according to any one of claims 1 to 7.
10. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210394920.0A CN114817015A (en) | 2022-04-14 | 2022-04-14 | Test case coverage rate statistical method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114817015A true CN114817015A (en) | 2022-07-29 |
Family
ID=82536387
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210394920.0A Pending CN114817015A (en) | 2022-04-14 | 2022-04-14 | Test case coverage rate statistical method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114817015A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107797929A (en) * | 2017-10-26 | 2018-03-13 | 北京广利核***工程有限公司 | The statistical method and device of FPGA emulation testing function coverage |
CN111832236A (en) * | 2020-06-29 | 2020-10-27 | 山东云海国创云计算装备产业创新中心有限公司 | Chip regression testing method and system, electronic equipment and storage medium |
CN112131807A (en) * | 2020-09-18 | 2020-12-25 | 山东云海国创云计算装备产业创新中心有限公司 | Cross-clock domain verification method, device, equipment and medium |
CN114297961A (en) * | 2021-11-30 | 2022-04-08 | 山东云海国创云计算装备产业创新中心有限公司 | Chip test case processing method and related device |
Non-Patent Citations (2)
Title |
---|
Shi Zhu (ed.); Yu Jin et al. (writers): "Software Quality Management", vol. 1, 30 September 2003, Aviation Industry Press, pages: 138 - 139 *
Guo Liwen; Deng Yueming: "CPLD/FPGA Design and Application Fundamentals: From Verilog HDL to SystemVerilog" ("13th Five-Year Plan" textbook series for regular universities), vol. 1, 31 August 2019, Beihang University Press, pages: 308 *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115470125A (en) * | 2022-09-02 | 2022-12-13 | 芯华章科技(北京)有限公司 | Debugging method and device based on log file and storage medium |
CN115248783A (en) * | 2022-09-26 | 2022-10-28 | 江西萤火虫微电子科技有限公司 | Software testing method, system, readable storage medium and computer equipment |
CN115248783B (en) * | 2022-09-26 | 2022-12-23 | 江西萤火虫微电子科技有限公司 | Software testing method, system, readable storage medium and computer equipment |
CN116245057A (en) * | 2022-09-26 | 2023-06-09 | 上海合见工业软件集团有限公司 | Effective execution area determining system based on time sequence type coverage database |
CN116245057B (en) * | 2022-09-26 | 2023-12-19 | 上海合见工业软件集团有限公司 | Effective execution area determining system based on time sequence type coverage database |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114817015A (en) | Test case coverage rate statistical method and device, electronic equipment and storage medium | |
US8627296B1 (en) | Unified unit and integration test with automatic mock creation | |
US7188061B2 (en) | Simulation monitors based on temporal formulas | |
US7293213B1 (en) | Method for detecting software errors and vulnerabilities | |
US20110055777A1 (en) | Verification of Soft Error Resilience | |
US20120254662A1 (en) | Automated test system and automated test method | |
US10657028B2 (en) | Method for replicating production behaviours in a development environment | |
CN111752833B (en) | Software quality system approval method, device, server and storage medium | |
US7389482B2 (en) | Method and apparatus for analyzing post-layout timing violations | |
US8560991B1 (en) | Automatic debugging using automatic input data mutation | |
CN111475411A (en) | Server problem detection method, system, terminal and storage medium | |
CN114048129A (en) | Automatic testing method, device, equipment and system for software function change | |
US10929108B2 (en) | Methods and systems for verifying a software program | |
CN110287700B (en) | iOS application security analysis method and device | |
CN112612716A (en) | Method, system, equipment and storage medium for enhancing marking of coverage rate of difference line codes | |
CN112270110A (en) | Compatibility testing method and system for industrial internet platform assembly | |
CN112597718A (en) | Verification method, verification device and storage medium for integrated circuit design | |
US10546080B1 (en) | Method and system for identifying potential causes of failure in simulation runs using machine learning | |
Elmqvist et al. | Safety-oriented design of component assemblies using safety interfaces | |
CN113220594B (en) | Automatic test method, device, equipment and storage medium | |
CN113378502B (en) | Test method, device, medium and equipment for verifying signal trend code matching | |
CN115757099A (en) | Automatic test method and device for platform firmware protection recovery function | |
CN111767222A (en) | Data model verification method and device, electronic equipment and storage medium | |
CN113238953A (en) | UI automation test method and device, electronic equipment and storage medium | |
CN113342632A (en) | Simulation data automatic processing method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||