CN112540916A - Automatic rerun method and device for failed case, computer equipment and storage medium

Automatic rerun method and device for failed case, computer equipment and storage medium

Info

Publication number
CN112540916A
Authority
CN
China
Prior art keywords
execution, result, test, case, tested
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011378638.0A
Other languages
Chinese (zh)
Inventor
Wang Tao (王涛)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dilu Technology Co Ltd
Original Assignee
Dilu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dilu Technology Co Ltd
Priority to CN202011378638.0A
Publication of CN112540916A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application relates to a method and an apparatus for automatically rerunning a failed case, a computer device, and a storage medium. The method comprises the following steps: acquiring a to-be-tested case and test data in a test scene; executing the to-be-tested case according to the test data to obtain an execution result; comparing the execution result with a preset expected result to determine a comparison result; when the comparison result is a mismatch, updating the remaining number of re-executions to obtain an updated remaining number of re-executions; and when the updated remaining number of re-executions is greater than 0, returning to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, thereby obtaining a final execution result. Test cases whose results do not match the preset expected result can be re-executed automatically, so the method is relatively flexible, less restrictive, and improves software testing efficiency.

Description

Automatic rerun method and device for failed case, computer equipment and storage medium
Technical Field
The present application relates to the field of automated testing technologies, and in particular, to an automatic rerun method and apparatus for failed cases, a computer device, and a storage medium.
Background
Automated testing is a process that converts human-driven test behavior into machine execution. Typically, after a test case is designed and passes review, a tester executes the test step by step according to the procedure described in the test case and compares the actual results with the expected results. To save manpower, time, or hardware resources and to improve testing efficiency in this process, the concept of automated testing was introduced.
Existing automated testing mainly uses the unittest framework to execute entire test suites, which is restrictive and inflexible, so software testing efficiency is low.
Disclosure of Invention
In view of the above, it is necessary to provide a method, an apparatus, a computer device, and a storage medium for automatically rerunning failed cases, which can improve software testing efficiency.
A method for automatic rerun of failed use cases, the method comprising:
acquiring a to-be-tested case and test data in a test scene;
executing the to-be-tested case according to the test data to obtain an execution result;
comparing the execution result with a preset expected result to determine a comparison result;
when the comparison result is a mismatch, updating the remaining number of re-executions to obtain an updated remaining number of re-executions;
and when the updated remaining number of re-executions is greater than 0, returning to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, thereby obtaining a final execution result.
In one embodiment, the step of updating the remaining number of re-executions to obtain the updated remaining number of re-executions when the comparison result is a mismatch includes:
when the comparison result is a mismatch, recording the number of times the execution result of the to-be-tested case has failed to match;
and subtracting 1 from the current remaining number of re-executions to obtain the updated remaining number of re-executions.
In one embodiment, the step of obtaining a to-be-tested case and test data in a test scenario includes:
when the preset test time is reached, triggering a test instruction;
and acquiring a to-be-tested case and test data in a test scene according to the test instruction.
In one embodiment, the method further comprises:
acquiring a test report template;
and generating a test report according to the test report template and the final execution result.
In one embodiment, the method further comprises:
acquiring contact information of a person in charge related to the case to be tested;
and sending the test report to the relevant responsible person according to the contact information of the relevant responsible person.
In one embodiment, the method further comprises:
when the final execution result includes a result of execution failure, the test report further includes the specific cause of the error and the modules involved, to prompt the user.
In one embodiment, the method further comprises:
and generating an execution log of the specific component modules involved according to the execution of the to-be-tested case, and associating the execution log with the test report.
An automatic rerun apparatus for a failed use case, the apparatus comprising:
the data acquisition module is used for acquiring a to-be-tested case and test data in a test scene;
the execution module is used for executing the to-be-tested case according to the test data to obtain an execution result;
the comparison module is used for comparing the execution result with a preset expected result to determine a comparison result;
the number updating module is used for updating the remaining number of re-executions when the comparison result is a mismatch, to obtain the updated remaining number of re-executions;
and a final execution result obtaining module, configured to, when the updated remaining number of re-executions is greater than 0, return to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, thereby obtaining a final execution result.
A computer device comprising a memory storing a computer program and a processor implementing the steps of the method when executing the computer program.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method.
With the automatic rerun method and apparatus for failed cases, the computer device, and the storage medium described above, a to-be-tested case and test data in a test scene are acquired; the to-be-tested case is executed according to the test data to obtain an execution result; the execution result is compared with a preset expected result to determine a comparison result; when the comparison result is a mismatch, the remaining number of re-executions is updated to obtain an updated remaining number of re-executions; and when the updated remaining number of re-executions is greater than 0, the process returns to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, at which point a final execution result is obtained. Test cases whose results do not match the preset expected result can be re-executed automatically, so the method is relatively flexible, less restrictive, and improves software testing efficiency.
Drawings
FIG. 1 is a flow diagram illustrating a method for automatic rerun of failed use cases in one embodiment;
FIG. 2 is a diagram of an embodiment of an APP system interface automation test report;
FIG. 3 is a diagram illustrating the prompt to the user of the specific cause of an error and the modules involved, in one embodiment;
fig. 4 is a block diagram of an automatic rerun apparatus for a failed case in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, the provided automatic rerun method for failed cases uses the Python 3.5 language (a cross-platform computer programming language) as the programming language, uses the HTMLTestRunner library (a TestRunner for the Python unit test framework, which generates an HTML report that shows the results at a glance) to output the test results and generate the test report, and uses Jenkins (an open source software project and a continuous integration tool developed in Java) for continuous integration management. Building the automatic rerun method for failed cases comprises the following steps:
step 1.1: install Python version 3.5 and pip3 (a software package management system written in Python computer programming language) tools.
Step 1.2: htmltestrunner was installed using pip 3.
Step 1.3: obtaining jenkins version on github (a hosting platform facing open source and private software project), and then carrying out dpkg (which is the abbreviation of Debian Package, is a suite management system specially developed by Debian and is convenient for software installation, update and removal) to install deb (which is a file extension name in the format of Debian software package) package.
Step 1.4: and packaging the original API, and repackaging the unit library interface.
In one embodiment, a method for automatically rerunning a failed case is provided. The method can be applied to a terminal or a server; as shown in FIG. 1, it is described here as applied to a terminal and includes the following steps:
step S220, a to-be-tested case and test data in the test scene are obtained.
The to-be-tested cases are the test tasks that the software needs to run; there may be one to-be-tested case, or two or more. The test data is the data the to-be-tested case needs during testing. For example, for a test case that checks whether the login program works, the test data includes the user name, password, and other data required to log in. The to-be-tested cases and the test data can be pre-stored in an automated case library.
In one embodiment, the step of obtaining a case to be tested and test data in a test scenario includes: when the preset test time is reached, triggering a test instruction; and obtaining a to-be-tested case and test data in the test scene according to the test instruction.
The person responsible for the test can set the trigger time of the test instruction according to the test requirements, for example, triggering the test instruction at 24:00 every day.
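For illustration, a minimal sketch of such a timed trigger, assuming a simple polling loop on the terminal; the function name and polling interval are illustrative, and the Jenkins continuous integration setup from the earlier embodiment could equally do the scheduling.

```python
# Illustrative daily trigger: block until the configured test time arrives,
# then let the caller fire the test instruction.
import datetime
import time


def wait_for_trigger(hour=0, minute=0):
    while True:
        now = datetime.datetime.now()
        if now.hour == hour and now.minute == minute:
            return  # preset test time reached: trigger the test instruction
        time.sleep(30)  # poll twice a minute
```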
Step S240, executing the to-be-tested case according to the test data to obtain an execution result.
The execution result is the result obtained after the to-be-tested case is executed. For example, for a test case that checks whether the login program works, executing the test case should yield a result of either successful or failed login; that result is the execution result. After the execution result is obtained, it is stored in the automated case library.
Step S260, comparing the execution result with a preset expected result to determine a comparison result.
When the comparison is needed, the expected result corresponding to the to-be-tested case is obtained from the automated case library. The preset expected result is compared with the execution result: if they are consistent, the to-be-tested case executed successfully; if they are inconsistent, the execution of the to-be-tested case failed.
In step S280, when the comparison result is a mismatch, the remaining number of re-executions is updated to obtain the updated remaining number of re-executions.
The remaining number of re-executions can be configured as required, which reduces the limitations of the approach. Each sheet page (worksheet) in the excel configuration file that configures the execution parameters of the automatic rerun method corresponds to one set of to-be-tested cases, and each set contains the following fields: case_name, case_param, and execution_time (that is, the remaining number of re-executions; the initial value n defaults to 2, though the default may be another value, which is not specifically limited herein). A sketch of reading this configuration follows.
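To make that configuration concrete, the sketch below reads one such worksheet with the third-party openpyxl library; the patent does not name an excel library, and the header row of case_name, case_param, execution_time is assumed from the description above.

```python
# Hedged reader for the excel configuration: one worksheet per set of cases,
# first row holding the field names case_name, case_param, execution_time
# (the remaining number of re-executions).
from openpyxl import load_workbook


def load_case_set(path, sheet_name):
    ws = load_workbook(path, read_only=True)[sheet_name]
    rows = ws.iter_rows(values_only=True)
    header = next(rows)  # e.g. ("case_name", "case_param", "execution_time")
    return [dict(zip(header, row)) for row in rows]
```

Each returned dict then carries one case's execution_time counter for the rerun loop described below.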
In one embodiment, when the comparison result is a mismatch, the step of updating the remaining number of re-executions to obtain the updated remaining number of re-executions includes: when the comparison result is a mismatch, recording the number of times the execution result of the to-be-tested case has failed to match; and subtracting 1 from the current remaining number of re-executions to obtain the updated remaining number of re-executions.
The automated case library stores the case_name, expect (expected result), fail_count (failure count; the default value f is 0, though the default may be another value, which is not specifically limited herein), and so on for every to-be-tested case. Recording a mismatch of the execution result of the to-be-tested case means adding 1 to fail_count: for example, if the current failure count f of the to-be-tested case is 0, then after one mismatch is recorded, the failure count f becomes 1. Subtracting 1 from the current remaining number of re-executions yields the updated value, i.e., the execution_time field is decremented by 1: for example, if the current remaining number of re-executions of the to-be-tested case is n = 2, subtracting 1 gives an updated remaining number of re-executions of n = 1.
In step S300, when the updated remaining number of re-executions is greater than 0, the process returns to the step of executing the to-be-tested case according to the test data to obtain the execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, at which point the final execution result is obtained.
If the updated remaining number of re-executions is greater than 0, the to-be-tested case is executed again and the fail_count and execution_time fields are modified accordingly, until the updated remaining number of re-executions equals 0 or the comparison result is a match; the final execution result is then obtained and stored in the automated case library.
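Steps S240 through S300 combine into a single retry loop. Below is a minimal sketch of that loop, where execute_case stands in for whatever actually runs the case against the test data; the dict keys mirror the case library and configuration fields named above.

```python
# Minimal rerun loop over one case record holding the fields described above:
# expect (preset expected result), fail_count (failure count), and
# execution_time (remaining number of re-executions, 2 by default).
def run_with_rerun(case, test_data, execute_case):
    while True:
        result = execute_case(case["case_name"], test_data)  # step S240
        if result == case["expect"]:                          # step S260: match
            return result
        case["fail_count"] += 1       # record the mismatch    (step S280)
        case["execution_time"] -= 1   # one re-execution used up
        if case["execution_time"] <= 0:                       # step S300
            return result             # final (failing) execution result
```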
In one embodiment, the method for automatically rerunning the failed use case further includes:
acquiring a test report template; and generating a test report according to the test report template and the final execution result.
The test report is generated as an HTML report, and the test report template is used to display the final execution result. The APP system interface automation test report shown in FIG. 2 is such a generated test report; it includes the tester, the start time, the duration, the final execution result, and other information.
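A sketch of the report generation step with the HTMLTestRunner library named earlier; the title, description, and output path are illustrative, and constructor arguments differ slightly between HTMLTestRunner ports, so treat this as an assumption-laden example rather than the patented implementation.

```python
# Generate the HTML test report from a discovered unittest suite using the
# classic HTMLTestRunner interface (stream, title, description).
import unittest
from HTMLTestRunner import HTMLTestRunner

suite = unittest.defaultTestLoader.discover("tests")  # illustrative test dir
with open("report.html", "wb") as fp:  # many Python 3 ports want binary mode
    runner = HTMLTestRunner(stream=fp,
                            title="APP system interface automation test report",
                            description="Final execution results")
    runner.run(suite)
```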
In one embodiment, the method for automatically rerunning the failed use case further includes: acquiring contact information of a person in charge related to a case to be tested; and sending a test report to the related responsible person according to the contact information of the related responsible person.
The contact information of the relevant responsible person can be a mobile phone number, a WeChat ID, a QQ number, and the like. Sending the test report to the relevant responsible person makes it convenient for that person to obtain the final execution result and other information without being limited to viewing it on the terminal that executes the automatic rerun method. When the test report is sent, the web address associated with the test report may be sent instead, and the relevant responsible person views the test report through that address.
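For illustration only, the sketch below delivers the report by email with the standard library; the text lists phone, WeChat, and QQ as possible channels, whose delivery APIs are not shown here, and every address below is a placeholder.

```python
# Assumption-laden delivery sketch: mail the HTML report to the relevant
# responsible person (email used as a stand-in for the contact channel).
import smtplib
from email.message import EmailMessage


def send_report(report_path, recipient, smtp_host="localhost"):
    msg = EmailMessage()
    msg["Subject"] = "Automated test report"
    msg["From"] = "qa-bot@example.com"  # placeholder sender
    msg["To"] = recipient
    msg.set_content("The latest automated test report is attached.")
    with open(report_path, "rb") as fp:
        msg.add_attachment(fp.read(), maintype="text", subtype="html",
                           filename="report.html")
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```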
In one embodiment, the method for automatically rerunning failed cases further includes: when the final execution result includes a result of execution failure, the test report further includes the specific cause of the error and the modules involved, to prompt the user.
A result of execution failure refers to the final execution result output when the execution result obtained by executing the to-be-tested case according to the test data does not match the preset expected result and the updated remaining number of re-executions equals 0. The specific cause of the error and the modules involved, as prompted to the user, are shown in FIG. 3.
In one embodiment, the method for automatically rerunning failed cases further includes: generating an execution log of the specific component modules involved according to the execution of the to-be-tested case, and associating the execution log with the test report.
Because the execution log is associated with the test report, the relevant responsible person can access and view the execution log through the log link on the test report page, which improves that person's working efficiency.
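One hedged way to realize such a per-module execution log with the standard logging library is sketched below; the directory layout and the idea of linking the log file's path into the report page are assumptions, since the patent does not describe the mechanism.

```python
# Create one file logger per component module; the returned path can be
# embedded in the test report page so the log is reachable from the report.
import logging
import os


def make_module_logger(module_name, log_dir="logs"):
    os.makedirs(log_dir, exist_ok=True)
    log_path = os.path.join(log_dir, f"{module_name}.log")
    logger = logging.getLogger(module_name)
    if not logger.handlers:  # avoid duplicate handlers on repeated calls
        handler = logging.FileHandler(log_path)
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger, log_path
```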
With the automatic rerun method for failed cases, a to-be-tested case and test data in a test scene are acquired; the to-be-tested case is executed according to the test data to obtain an execution result; the execution result is compared with a preset expected result to determine a comparison result; when the comparison result is a mismatch, the remaining number of re-executions is updated to obtain an updated remaining number of re-executions; and when the updated remaining number of re-executions is greater than 0, the process returns to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, at which point a final execution result is obtained. Test cases whose results do not match the preset expected result can be re-executed automatically, so the method is relatively flexible, less restrictive, and improves software testing efficiency.
It should be understood that, although the steps in the flowchart of FIG. 1 are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to this order and may be performed in other orders. Moreover, at least some of the steps in FIG. 1 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and the order of their performance is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 4, there is provided a failed use case automatic rerun apparatus, including: a data obtaining module 310, an executing module 320, a comparing module 330, a number updating module 340, and a final execution result obtaining module 350, wherein:
a data obtaining module 310, configured to obtain a to-be-tested case and test data in a test scene;
the execution module 320 is used for executing the to-be-tested case according to the test data to obtain an execution result;
a comparison module 330, configured to compare the execution result with a preset expected result to determine a comparison result;
the number updating module 340 is configured to update the remaining number of re-executions when the comparison result is a mismatch, to obtain the updated remaining number of re-executions;
and a final execution result obtaining module 350, configured to, when the updated remaining number of re-executions is greater than 0, return to the step of executing the to-be-tested case according to the test data to obtain the execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, thereby obtaining the final execution result.
In one embodiment, the number updating module 340 is further configured to: when the comparison result is a mismatch, record the number of times the execution result of the to-be-tested case has failed to match; and subtract 1 from the current remaining number of re-executions to obtain the updated remaining number of re-executions.
In one embodiment, the data acquisition module 310 is further configured to: when the preset test time is reached, triggering a test instruction; and obtaining a to-be-tested case and test data in the test scene according to the test instruction.
In one embodiment, the apparatus further comprises a report generation module: acquiring a test report template; and generating a test report according to the test report template and the final execution result.
In one embodiment, the apparatus further comprises a sending module: acquiring contact information of a person in charge related to a case to be tested; and sending a test report to the related responsible person according to the contact information of the related responsible person.
In one embodiment, the report generation module is further configured to: when the final execution result has the result of execution failure, the test report also includes specific reasons for prompting the user of errors and related modules.
In one embodiment, the apparatus further comprises a log generation module: and generating an execution condition log of the specific related element component module according to the execution condition of the to-be-tested case, and associating the execution condition log with the test report.
For specific limitations of the automatic rerun apparatus for failed cases, reference may be made to the above limitations of the automatic rerun method for failed cases, which are not repeated here. All or part of the modules in the automatic rerun apparatus for failed cases can be implemented by software, hardware, or a combination thereof. Each module can be embedded in hardware form in, or be independent of, the processor of the computer device, or can be stored in software form in the memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the automatic rerun method for the failed use case when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when executed by a processor, implements the steps of the above-described failed use case auto-rerun method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by instructing the relevant hardware through a computer program, which can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method for automatically rerunning a failed use case, the method comprising:
acquiring a to-be-tested case and test data in a test scene;
executing the to-be-tested case according to the test data to obtain an execution result;
comparing the execution result with a preset expected result to determine a comparison result;
when the comparison result is a mismatch, updating the remaining number of re-executions to obtain an updated remaining number of re-executions;
and when the updated remaining number of re-executions is greater than 0, returning to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, thereby obtaining a final execution result.
2. The method according to claim 1, wherein the step of updating the remaining number of re-executions to obtain the updated remaining number of re-executions when the comparison result is a mismatch comprises:
when the comparison result is a mismatch, recording the number of times the execution result of the to-be-tested case has failed to match;
and subtracting 1 from the current remaining number of re-executions to obtain the updated remaining number of re-executions.
3. The method of claim 1, wherein the step of obtaining the test cases and test data to be tested in the test scenario comprises:
when the preset test time is reached, triggering a test instruction;
and acquiring a to-be-tested case and test data in a test scene according to the test instruction.
4. The method of claim 1, further comprising:
acquiring a test report template;
and generating a test report according to the test report template and the final execution result.
5. The method of claim 4, further comprising:
acquiring contact information of a person in charge related to the case to be tested;
and sending the test report to the relevant responsible person according to the contact information of the relevant responsible person.
6. The method of claim 4, further comprising:
when the final execution result includes a result of execution failure, the test report further includes the specific cause of the error and the modules involved, to prompt the user.
7. The method of claim 4, further comprising:
and generating an execution log of the specific component modules involved according to the execution of the to-be-tested case, and associating the execution log with the test report.
8. An automatic rerun apparatus for a failed use case, the apparatus comprising:
the data acquisition module is used for acquiring a to-be-tested case and test data in a test scene;
the execution module is used for executing the to-be-tested case according to the test data to obtain an execution result;
the comparison module is used for comparing the execution result with a preset expected result to determine a comparison result;
the number updating module is used for updating the remaining number of re-executions when the comparison result is a mismatch, to obtain the updated remaining number of re-executions;
and a final execution result obtaining module, configured to, when the updated remaining number of re-executions is greater than 0, return to the step of executing the to-be-tested case according to the test data to obtain an execution result, until the updated remaining number of re-executions equals 0 or the comparison result is a match, thereby obtaining a final execution result.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011378638.0A 2020-11-30 2020-11-30 Automatic rerun method and device for failed case, computer equipment and storage medium Pending CN112540916A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011378638.0A CN112540916A (en) 2020-11-30 2020-11-30 Automatic rerun method and device for failed case, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011378638.0A CN112540916A (en) 2020-11-30 2020-11-30 Automatic rerun method and device for failed case, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112540916A (en) 2021-03-23

Family

ID=75016584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011378638.0A Pending CN112540916A (en) 2020-11-30 2020-11-30 Automatic rerun method and device for failed case, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112540916A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113392009A (en) * 2021-06-21 2021-09-14 中国工商银行股份有限公司 Exception handling method and device for automatic test
CN114036074A (en) * 2022-01-07 2022-02-11 荣耀终端有限公司 Test method and test device for terminal equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105117344A (en) * 2015-09-19 2015-12-02 北京暴风科技股份有限公司 Interface integration test method and system based on PB
CN108228400A (en) * 2016-12-15 2018-06-29 北京兆易创新科技股份有限公司 A kind of EMMC test methods and device
CN108459953A (en) * 2017-02-22 2018-08-28 北京京东尚科信息技术有限公司 test method and device
CN109359053A (en) * 2018-12-18 2019-02-19 上海科梁信息工程股份有限公司 Generation method and relevant apparatus, the test macro and storage medium of test report
CN109445309A (en) * 2018-12-21 2019-03-08 核动力运行研究所 A kind of nuclear heating device is from start-up and shut-down control analogue system and test method
CN109828906A (en) * 2018-12-15 2019-05-31 中国平安人寿保险股份有限公司 UI automated testing method, device, electronic equipment and storage medium
CN110324612A (en) * 2019-07-05 2019-10-11 深圳市康冠技术有限公司 Test method, testing and control terminal and the television set of television set
CN110941546A (en) * 2019-10-12 2020-03-31 平安健康保险股份有限公司 Automatic test method, device, equipment and storage medium for WEB page case
CN111767210A (en) * 2020-06-12 2020-10-13 浙江大搜车软件技术有限公司 Policy testing method and device, computer equipment and storage medium


Similar Documents

Publication Publication Date Title
CN108400978B (en) Vulnerability detection method and device, computer equipment and storage medium
CN112540916A (en) Automatic rerun method and device for failed case, computer equipment and storage medium
CN111611172A (en) Project test defect analysis method, device, equipment and storage medium
CN112395202B (en) Interface automation test method and device, computer equipment and storage medium
CN110704312A (en) Pressure testing method and device, computer equipment and storage medium
CN113282513B (en) Interface test case generation method and device, computer equipment and storage medium
CN109324961B (en) System automatic test method, device, computer equipment and storage medium
CN110990289B (en) Method and device for automatically submitting bug, electronic equipment and storage medium
CN111159025B (en) Application program interface testing method and device, computer equipment and storage medium
CN111930579A (en) Test machine deployment permission verification method and device, computer equipment and storage medium
CN113553088A (en) Patch package distribution method and device, computer equipment and storage medium
CN111277476A (en) Gateway controller function verification method, gateway controller function verification device, computer equipment and storage medium
CN112612706A (en) Automated testing method, computer device and storage medium
CN110309057B (en) Automatic script-based flow project testing method and related equipment
CN114579473B (en) Application testing method, device, equipment and storage medium
CN113094251A (en) Embedded system testing method and device, computer equipment and storage medium
CN110633213A (en) Unit testing method, unit testing device, computer equipment and storage medium
CN115934129A (en) Software project updating method and device, computer equipment and storage medium
CN115576810A (en) Automatic testing method, system, medium and computing device for real-time alarm
CN110737426B (en) Program block creating method, program block creating device, computer equipment and storage medium
CN114564385A (en) Software testing method and device, computer equipment and storage medium
CN109240906B (en) Database configuration information adaptation method and device, computer equipment and storage medium
CN112486824A (en) Use case code generation method and device, computer equipment and storage medium
CN114756417B (en) Network loop line inspection method, device, equipment and medium of MOC card
CN109918290B (en) Automatic screening method and device for target equipment, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination