CN111240989A - Interface automation test framework and method - Google Patents


Info

Publication number
CN111240989A
CN111240989A (application CN202010053687.0A)
Authority
CN
China
Prior art keywords
test
interface
request
data
case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010053687.0A
Other languages
Chinese (zh)
Other versions
CN111240989B (en)
Inventor
方鑫 (Fang Xin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Start Zhihe (Jiangsu) Digital Technology Co.,Ltd.
Original Assignee
Joint Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Joint Digital Technology Co ltd filed Critical Joint Digital Technology Co ltd
Priority to CN202010053687.0A priority Critical patent/CN111240989B/en
Publication of CN111240989A publication Critical patent/CN111240989A/en
Application granted granted Critical
Publication of CN111240989B publication Critical patent/CN111240989B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention discloses an interface automation test framework and a corresponding test method. The framework comprises test cases, a Jenkins batch execution module, an initialization and restoration module, and an api request module. The Jenkins batch execution module batch-executes all test cases in the same directory as a runtest.py file by executing that file. Each test case carries a request connection, request data and an expected result, and calls the api request module; the api request module sends a request to the system interface under test, receives the return value of the interface, asserts on the result, and marks the case according to the assertion result. The initialization and restoration module is used for initializing and restoring the data of the system under test. By encapsulating the request-sending function and calling it from cases, the invention saves code, enables rapid writing of interface automation test cases, and can automatically trigger mail notification.

Description

Interface automation test framework and method
Technical Field
The invention belongs to the technical field of computer software, and relates to an interface automation test framework and a corresponding test method.
Background
Almost all existing interface automation test frameworks on the market write test cases directly as script code, which makes a single case span many lines with a great deal of redundancy. Some third-party tools such as Postman and SoapUI are used to manage interface cases, but these third-party tools cannot execute cases in batch or generate test reports.
In summary, the prior art does not provide automated management of interface testing for a program, and has the following drawbacks:
(1) cases cannot be executed in batch, nor can mail notification be automatically triggered in combination with Jenkins;
(2) the rapidly iterating business scenarios of companies are not satisfied;
(3) test reports are single-format and not intuitive.
Disclosure of Invention
In order to solve the above problems, the invention discloses an interface automation test framework and a corresponding test method, which save code, enable rapid writing of interface automation test cases, and can automatically trigger mail notification.
In order to achieve the purpose, the invention provides the following technical scheme:
an interface automation test framework comprises a test case, a Jenkins batch execution module, an initialization and restoration module and an api request module, wherein the Jenkins batch execution module is used for creating a construction task, configuring a timing task, configuring a mail template and executing the test case with the same directory as a runtest. The test case is provided with a request connection, a request data and an expected result, the request connection, the request data and the expected result are called by the api request module, the api request module sends the request data to a system interface to be tested according to the request connection, receives a return value of the interface, then uses the expected result and an actual return result for assertion, and marks the case according to the assertion result; the initialization and restoration module is used for deleting database table data to be processed and then inserting test data when the interface is initialized; and delete the processed database table data at the end of use case execution.
Further, the framework also comprises a getToken request module; the api request module calls the getToken request module, and the getToken request module is used for acquiring the Token.
Further, the framework also comprises a simulation test module, which is used for simulating the interface that needs to be accessed and configuring the stub's access IP address and port number; the test case contains "interface object": "test stub", so that the access path points to the address of the test stub.
Further, the "request data" and "expected result" in the test case are defined as parameters, so that several sets of test data that can be substituted into them are included in one case.
Furthermore, the runtest.py file has a "test report template" flag, and different flag values correspond to different test report templates.
The invention also provides an interface automation test method, which comprises the following steps:
step 1, using Jenkins to execute test cases in batch at regular times according to the configured timed task: the runtest.py file is executed, and all test cases in the same directory as runtest.py are executed in batch;
the process of executing the test case comprises the following substeps:
1) initializing an interface, deleting database table data to be processed, and then inserting test data;
2) executing all test cases under the interface, wherein each test case sends an encapsulated api request, and the api request initiates an interface request to the system under test according to the parameters in "data";
3) the api request receives the return value of the interface and then asserts the expected result against the actual returned result, wherein if the assertion succeeds the case is marked as passed, and if the assertion fails the case is marked as failed;
4) restoring the interface data, deleting the processed database table data;
step 2, generating a corresponding test report in the directory where runtest.py is located;
step 3, automatically triggering the sending of the mail.
Further, before step 1, the method further comprises the step of writing a test case:
the url, message body and expected result are filled in each test case "data".
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. Test cases are not written in native script mode; a test case framework convenient to reuse is adopted, and the request-sending function is encapsulated and then called. Interface automation testing can be achieved simply by passing parameters such as the url, the message body and the expected result into a case, so that test case writing is more intuitive and faster, and code is saved.
2. runtest.py files are distributed, so that the test cases of each functional module are independent and each tester writes test cases under his or her own functional directory without interference. When the test cases need to be run as a whole, the runtest.py in the framework's root directory batch-executes every independent module.
3. Test case parameterization is supported. Test cases can be written in batch, including test cases of boundary values and equivalence classes.
4. Supporting database operations and database result assertions.
5. Switching among multiple test report templates is supported, and in combination with Jenkins the test cases are run and test report mails are sent at a scheduled time every day, which makes the function powerful.
Drawings
FIG. 1 is a basic flow diagram of the present invention.
FIG. 2 is the directory of test cases.
FIG. 3 is an example of test case content.
FIG. 4 is an example of additional content in a test case.
FIG. 5 is an example of parameterized test case content.
FIG. 6 is a test framework root directory.
FIG. 7 is the "test report template" flag in the runtest.py file.
FIG. 8 is a first example of a test report format.
FIG. 9 is a second example of a test report format.
FIG. 10 is an example of a sent mail.
FIG. 11 is a data directory.
FIG. 12 is the test stub .py file.
FIG. 13 is an example of test case content using the "test stub".
Detailed Description
The technical solutions provided by the present invention will be described in detail below with reference to specific examples, and it should be understood that the following specific embodiments are only illustrative of the present invention and are not intended to limit its scope. Additionally, the steps illustrated in the flow charts of the figures may be performed in a computer system, such as by a set of computer-executable instructions, and although a logical order is illustrated in the flow charts, in some cases the steps illustrated or described may be performed in an order different from that described here.
The interface automation test framework provided by the invention mainly comprises test cases, a Jenkins batch execution module, an initialization and restoration module, an api request module, and a getToken request module. The Jenkins batch execution module executes the test cases in batch; the api request module is called from within the test cases and exchanges information with the interface of the system under test; the api request module in turn calls the getToken request module, which is used to acquire the Token. As an improvement, the framework also comprises a simulation test module which can be called by a test case.
The directory in which the test cases are located is shown in FIG. 2. Each test interface of the system under test corresponds to one .py file; as shown in FIG. 2, the "file-deletion interface" .py file holds the test cases related to deleting a test file, and all test cases related to one interface are placed in one file. The content of a case is shown in FIG. 3: "test_05_invitation code mandatory item check" is the case title, and the information of the interface, including the url, the message body and the expected result, is stored in "data". Each case calls the "api.send request" method; this method refers to the api request module, i.e. the encapsulated api.py file, and the url, message body and expected result in the case are passed to that file. Assertions on json, on character strings, and on the database are supported. If one assertion is not enough, multiple assertions can be used, such as "expected result 1", "expected result 2", and so on.
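A minimal Python sketch of this case structure (the url, message body, response text and helper names below are illustrative assumptions rather than values taken from the patent's figures; a stub function stands in for the real HTTP transport that the encapsulated api.py would use):

```python
import unittest

def fake_http_post(url, body):
    # Illustrative stand-in for the real HTTP call made by the encapsulated
    # api.py; it returns a canned response from the system under test.
    return '{"code": "400", "msg": "invitation code is required"}'

def send_request(data):
    """Sketch of the "api.send request" helper: send the message body to the
    url, then assert each expected result against the actual response."""
    response = fake_http_post(data["url"], data["body"])
    for key in ("expected result", "expected result 2"):
        if key in data:
            assert data[key] in response, f"{key!r} not found in response"
    return response

class TestFileDeletionInterface(unittest.TestCase):
    # Mirrors "test_05_invitation code mandatory item check": the case only
    # fills in "data" and delegates request + assertion to send_request.
    def test_05_invitation_code_mandatory_check(self):
        data = {
            "url": "/api/file/delete",                 # hypothetical path
            "body": {"invitationCode": ""},
            "expected result": "invitation code is required",
        }
        send_request(data)
```

Each case thus reduces to filling a "data" dict, which is what keeps the case code short.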
In addition to the keywords "request connection", "request data" and "expected result", a case may also include "request method", "request header" and "interface object".
"Request method": if not written, the get method is used by default; for the raw mode of post, use "request method": "post"; for the params mode of post, use "request method": "post_query".
"Request header": if not written, the default is the message header of the logged-in user; if the interfaces of other users need to be invoked, both the token and the message header change, in which case "request header": "XXX" (defined in api.py) can be used.
"Interface object": the default is the domain name of the logged-in user's url; "interface object": "XXX" (defined in api.py) can be used if an interface under another domain name needs to be invoked.
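How the three optional keys might be resolved inside the encapsulated api.py can be sketched as follows (the default token, the domain names and the named "XXX" entries are hypothetical placeholders, not values from the patent):

```python
# Assumed defaults: the logged-in user's message header and domain name.
DEFAULT_HEADER = {"token": "login-user-token"}
DEFAULT_DOMAIN = "https://login-user.example.com"

# Named alternatives, i.e. the "XXX" values said to be defined in api.py.
NAMED_HEADERS = {"admin": {"token": "admin-token"}}
NAMED_DOMAINS = {"test stub": "http://127.0.0.1:8888"}

def resolve_request(data):
    """Apply the defaults described in the text to a case's "data" dict."""
    method = data.get("request method", "get")   # default: the get method
    params_mode = method == "post_query"         # post in params mode
    if params_mode:
        method = "post"
    header = NAMED_HEADERS.get(data.get("request header"), DEFAULT_HEADER)
    domain = NAMED_DOMAINS.get(data.get("interface object"), DEFAULT_DOMAIN)
    return method, header, domain, params_mode
```

A case that writes none of the three keys is sent as a get request with the logged-in user's header and domain.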
In addition, some methods can be added to a case file and set to execute once before or after all cases start, or before or after each case. Specifically, as shown in FIG. 4, setUpClass is executed once before all cases start, tearDownClass once after all cases finish, setUp once before each case starts, and tearDown once after each case finishes.
As an improvement, the invention also parameterizes test cases. As shown in FIG. 5, the name and the expected result in the case are parameters, and four sets of data that can be substituted into those parameters are written above the case; for example, ["", "parameter check failure"] is one set of data, representing that the name is "" and the expected result is "parameter check failure". In this way, by defining the request data and the expected result as parameters, multiple sets of test data can be included in one case without repeating the case's code, which saves resources. Furthermore, boundary value (parameter value range) and equivalence class (parameter type) checks can conveniently be performed on the parameters in this parameterized manner.
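FIG. 5 suggests a decorator-based parameterization library; a stdlib-only sketch of the same idea uses unittest's subTest (the data sets and the stand-in check function here are invented for illustration):

```python
import unittest

# Four data sets of [name, expected result], in the spirit of
# ["", "parameter check failure"] from the figure.
CASES = [
    ["", "parameter check failure"],        # boundary value: empty name
    ["a" * 65, "parameter check failure"],  # boundary value: over-length name
    [123, "parameter check failure"],       # equivalence class: wrong type
    ["valid-name", "success"],              # normal value
]

def fake_create(name):
    # Illustrative stand-in for the interface under test: accept only
    # non-empty strings of at most 64 characters.
    if isinstance(name, str) and 0 < len(name) <= 64:
        return "success"
    return "parameter check failure"

class TestNameParameterized(unittest.TestCase):
    def test_name_check(self):
        # One case body, several data sets; each subTest is marked
        # passed or failed independently.
        for name, expected in CASES:
            with self.subTest(name=name):
                self.assertEqual(fake_create(name), expected)
```

This keeps boundary-value and equivalence-class data next to a single case body instead of duplicating the case.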
The api request module sends the message body to the interface of the system under test according to the url, receives the return value of the interface, and then asserts the expected result against the actual returned result: if the assertion succeeds, the case is marked as passed; if it fails, the case is marked as failed.
A build task is created in the Jenkins batch execution module, a timed task and a mail template are configured, and then the runtest.py file is executed. When the runtest.py file is executed, all test cases in the directory where it is located are collected, executed in batch, and a test report is generated in the corresponding directory. The test framework's root directory is shown in FIG. 6 and includes a data directory for storing test tools, a test report directory generated after test cases are executed in batch, and a directory for writing and storing test cases. As can be seen from the figure, the directory structure, the file names and the test cases use Chinese in many places, which gives strong readability. As shown in FIG. 6, there is also a runtest.py file under the root directory of the framework; executing this file batch-executes all test cases under the test case directory and generates a corresponding test report under the root directory. Within the test cases, there is likewise a runtest.py file under each directory; executing that file batch-executes all test cases in that directory and generates a corresponding test report in the current directory. Unlike the prior art, the invention provides diversified test report options: a "test report template" flag is provided in the runtest.py file, as shown in FIG. 7. The flag takes two values, 1 and 2, and setting the flag value switches between them. Configuring "test report template" equal to 1 generates the test report shown in FIG. 8; configuring it equal to 2 generates the test report shown in FIG. 9. After the runtest.py file is executed, Jenkins automatically triggers the sending of mail; the mail sent is shown in FIG. 10.
FIG. 11 is a schematic diagram of the files under the data directory, where the initialization and restoration module is the mysql.py file. When the interface is initialized, the module deletes the database table data to be processed and then inserts the test data; when case execution ends, it deletes the processed database table data, so that the system under test is kept as it was.
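An illustrative sketch of such an initialization-and-restore module; sqlite3 stands in for MySQL so the sketch stays self-contained, and the table name, columns and test rows are hypothetical:

```python
import sqlite3

TEST_ROWS = [(1, "file-a"), (2, "file-b")]  # assumed fixed test data

def initialize(conn):
    """On interface initialization: delete the table data to be
    processed, then insert the test data."""
    conn.execute("DELETE FROM files")
    conn.executemany("INSERT INTO files (id, name) VALUES (?, ?)", TEST_ROWS)
    conn.commit()

def restore(conn):
    """At the end of case execution: delete the processed table data so
    the system under test is left as it was."""
    conn.execute("DELETE FROM files WHERE id IN (1, 2)")
    conn.commit()
```

initialize would be wired into setUpClass and restore into tearDownClass of a case file, matching the hooks described earlier.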
The simulation test module is the test stub under the data directory, a .py file that integrates the server end of the stub: the interface that needs to be accessed is simulated there, and the stub's access IP address and port number are configured, as shown in FIG. 12. The calling end of the stub is located in the test case; it is only necessary to add "interface object": "test stub", so that the access path points to the address of the test stub, as shown in FIG. 13. Based on this simulation module, test cases can be written in advance and simulation tests carried out, instead of waiting to write the cases until the system under test has been developed.
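The server end of such a stub might be sketched as a tiny HTTP service (the canned JSON response is an assumption, and port 0 asks the OS for a free port; a real stub would use the configured IP address and port number):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Canned reply simulating an interface that is not yet developed.
        body = json.dumps({"code": "200", "msg": "stubbed result"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the stub quiet during test runs

def start_stub(host="127.0.0.1", port=0):
    """Start the stub server on a background thread; the actual
    (host, port) pair is available in server.server_address."""
    server = HTTPServer((host, port), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A case whose "interface object" equals "test stub" would then have its access path pointed at this address instead of the real system.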
Based on the above framework, the invention further provides an interface automation test method, the flow of which is shown in FIG. 1. The method comprises the following steps:
writing test cases: the url, message body and expected result are filled into the "data" of each test case, preferably in the parameterized writing mode;
step 1, using Jenkins to execute test cases in batch at regular times according to the configured timed task: the runtest.py file is executed, and all test cases in the same directory as runtest.py are executed in batch;
the process of executing the test case comprises the following substeps:
1) initializing an interface, deleting database table data to be processed, and then inserting test data;
2) executing all test cases under the interface, wherein each test case sends an encapsulated api request, and the api request initiates an interface request to the system under test according to the parameters in "data";
3) the api request receives the return value of the interface and then asserts the expected result against the actual returned result, wherein if the assertion succeeds the case is marked as passed, and if the assertion fails the case is marked as failed;
4) restoring the interface data, deleting the processed database table data;
step 2, generating a corresponding test report in the directory where runtest.py is located;
step 3, automatically triggering the sending of the mail.
The technical means disclosed in the scheme of the invention are not limited to those disclosed in the above embodiments, and also include technical schemes formed by any combination of the above technical features. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the invention, and such improvements and modifications are also considered to be within the scope of the invention.

Claims (7)

1. An interface automation test framework, characterized in that: the framework comprises test cases, a Jenkins batch execution module, an initialization and restoration module and an api request module; the Jenkins batch execution module is used for creating a build task, configuring a timed task, configuring a mail template and batch-executing all test cases in the same directory as a runtest.py file; each test case is provided with a "request connection", "request data" and an "expected result" and calls the api request module; the api request module sends the request data to the system interface under test according to the request connection, receives the return value of the interface, then asserts the expected result against the actual returned result, and marks the case according to the assertion result; the initialization and restoration module is used for deleting the database table data to be processed and then inserting test data when the interface is initialized, and for deleting the processed database table data when case execution ends.
2. The interface automation test framework of claim 1, wherein: the framework further comprises a getToken request module; the api request module calls the getToken request module, and the getToken request module is used for acquiring the Token.
3. The interface automation test framework of claim 1, wherein: the framework further comprises a simulation test module, which is used for simulating the interface that needs to be accessed and configuring the stub's access IP address and port number; the test case contains "interface object": "test stub", so that the access path points to the address of the test stub.
4. The interface automation test framework of claim 1, wherein: the "request data" and "expected result" in the test case are defined as parameters, so that several sets of test data that can be substituted into them are included in one case.
5. The interface automation test framework of claim 1, wherein: the runtest.py file has a "test report template" flag, and different flag values correspond to different test report templates.
6. An interface automation test method, characterized by comprising the following steps:
step 1, using Jenkins to execute test cases in batch at regular times according to the configured timed task: the runtest.py file is executed, and all test cases in the same directory as runtest.py are executed in batch;
the process of executing the test case comprises the following substeps:
1) initializing an interface, deleting database table data to be processed, and then inserting test data;
2) executing all test cases under the interface, wherein each test case sends an encapsulated api request, and the api request initiates an interface request to the system under test according to the parameters in "data";
3) the api request receives the return value of the interface and then asserts the expected result against the actual returned result, wherein if the assertion succeeds the case is marked as passed, and if the assertion fails the case is marked as failed;
4) restoring the interface data, deleting the processed database table data;
step 2, generating a corresponding test report in the directory where runtest.py is located;
step 3, automatically triggering the sending of the mail.
7. The automated interface testing method according to claim 6, further comprising the step of writing test cases before step 1:
the url, message body and expected result are filled in each test case "data".
CN202010053687.0A 2020-01-17 2020-01-17 Interface automation test framework and method Active CN111240989B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010053687.0A CN111240989B (en) 2020-01-17 2020-01-17 Interface automation test framework and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010053687.0A CN111240989B (en) 2020-01-17 2020-01-17 Interface automation test framework and method

Publications (2)

Publication Number Publication Date
CN111240989A (en) 2020-06-05
CN111240989B (en) 2021-05-28

Family

ID=70874602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010053687.0A Active CN111240989B (en) 2020-01-17 2020-01-17 Interface automation test framework and method

Country Status (1)

Country Link
CN (1) CN111240989B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112241373A (en) * 2020-10-30 2021-01-19 久瓴(江苏)数字智能科技有限公司 Automatic test method, test device, processor and test system
CN112765038A (en) * 2021-01-29 2021-05-07 北京联创信安科技股份有限公司 Distributed cluster software testing method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016177124A1 (en) * 2015-07-29 2016-11-10 中兴通讯股份有限公司 Method and device for implementing continuous integration test
CN108681507A (en) * 2018-05-08 2018-10-19 浪潮软件集团有限公司 Method for realizing automatic testing of RESTful API and web service
WO2018202173A1 (en) * 2017-05-05 2018-11-08 平安科技(深圳)有限公司 Method and system for database testing
CN109408357A (en) * 2017-12-04 2019-03-01 深圳市珍爱网信息技术有限公司 A kind of automatic interface testing method and device
CN110471831A (en) * 2019-06-21 2019-11-19 南京壹进制信息科技有限公司 A kind of automatic method and device of compatibility test


Also Published As

Publication number Publication date
CN111240989B (en) 2021-05-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220223

Address after: 210000 4th floor, yisituo building, building 6, No. 19, ningshuang Road, Yuhuatai District, Nanjing, Jiangsu Province

Patentee after: Start Zhihe (Jiangsu) Digital Technology Co.,Ltd.

Address before: Room 801, building C, Nanjing (Yuhua) International Software Outsourcing Industrial Park, 17 tulip Road, Yuhuatai District, Nanjing City, Jiangsu Province, 210000

Patentee before: Joint Digital Technology Co.,Ltd.
