CN115168205A - Automatic testing method and device - Google Patents

Automatic testing method and device

Info

Publication number
CN115168205A
CN115168205A (Application CN202210800271.XA)
Authority
CN
China
Prior art keywords
target
test data
data
test
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210800271.XA
Other languages
Chinese (zh)
Inventor
廖旭旺
肖洪华
林津如
林丽云
刘斌
彭上尉
邱德宗
石明睿
陶曾明
叶博文
尹天晴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd filed Critical Bank of China Ltd
Priority to CN202210800271.XA priority Critical patent/CN115168205A/en
Publication of CN115168205A publication Critical patent/CN115168205A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application provides an automated testing method and apparatus. Target scene test data is read, and the corresponding target basic test data is determined according to a pre-configured mapping between scene test data and basic test data; the target message specification corresponding to the target basic test data is determined according to a pre-configured mapping between basic test data and message specifications; a target request data packet is generated from the target scene test data, the target basic test data, and the target message specification; and the target request data packet is sent to the system under test, whose return data packet is then checked. By separating test data from test execution logic, the method and apparatus can reduce the cost of automated testing.

Description

Automatic testing method and device
Technical Field
The present disclosure relates to software testing, and more particularly, to an automated testing method and apparatus.
Background
In software enterprises, automated testing is increasingly widespread, and the number of test scripts is growing rapidly. As software is continually upgraded, the maintenance cost of automated test scripts also rises significantly. In bank card transaction systems in particular, there may be hundreds or thousands of automation scripts. The common framework of one automation script per test scenario has the advantage that transactions can be initiated independently and concurrently, but it has two major disadvantages:
First, the maintenance burden is huge: each automation script is independent and the message fields are numerous, so if a single field of a message interface changes, every affected automation test script must be modified and re-debugged.
Second, extending and recombining test scenarios is inconvenient: adding a new test scenario requires copying a similar automation script and then fine-tuning it. Testing a new combined scenario, for example withdrawal and query together, or adding abnormal message-field values for robustness testing, requires adding a large number of nearly identical automation scripts.
Disclosure of Invention
In view of this, the present application provides an automated testing method and apparatus that separate test processing logic from test data, reducing the workload of writing and modifying scripts and making them easier to use and maintain.
In one aspect, an embodiment of the present application provides an automated testing method, where the method includes:
reading target scene test data, and determining target basic test data corresponding to the target scene test data according to a mapping relation between the preset scene test data and the basic test data;
determining a target message specification corresponding to the target basic test data according to a mapping relation between the pre-configured basic test data and the message specification;
generating a target request data packet according to the target scene test data, the target basic test data and the target message specification;
and sending the target request data packet to a tested system, and checking a return data packet sent by the tested system.
Optionally, the packet specification includes:
at least one card organization message format definition and at least one transaction type message format definition.
Optionally, the basic test data includes:
request data and expected response data for at least one conventional test scenario.
Optionally, the basic test data is configured according to the following method:
and configuring each field of the request data and the expected response data corresponding to the basic test scene according to the message specification and the basic test scene.
Optionally, the scenario test data is configured according to the following method:
and according to the basic test data and the branch test scene, modifying, adding or deleting the request field corresponding to the basic test data to obtain the request data and the expected response data corresponding to the branch test scene.
Optionally, the method further includes:
generating a target expected response data packet according to the target scene test data, the target basic test data and the target message specification;
the checking the return data packet sent by the system under test includes:
and outputting a test result based on the comparison between the target expected response data packet and a return data packet sent by the tested system.
In another aspect, an embodiment of the present application further provides an automated testing apparatus, the apparatus including:
the scene data reading unit is used for reading target scene test data and determining target basic test data corresponding to the target scene test data according to a mapping relation between the preset scene test data and the basic test data;
the message specification determining unit is used for determining a target message specification corresponding to the target basic test data according to a mapping relation between the pre-configured basic test data and the message specification;
a request generating unit, configured to generate a target request data packet according to the target scene test data, the target basic test data, and the target packet specification;
and the test execution unit is used for sending the target request data packet to a tested system and checking a return data packet sent by the tested system.
Optionally, the apparatus further comprises:
an expected response data packet generating unit, configured to generate a target expected response data packet according to the target scene test data, the target basic test data, and the target packet specification;
the test execution unit is specifically configured to:
and outputting a test result based on the comparison between the target expected response data packet and a return data packet sent by the tested system.
On the other hand, an embodiment of the present application further provides an apparatus, where the apparatus includes: a processor and a memory;
the memory to store instructions;
the processor, executing the instructions in the memory, performs the method of the above aspect.
In another aspect, the present application also provides a computer-readable storage medium, which stores program codes or instructions, and when the program codes or instructions are executed on a computer, the computer is caused to execute the method of the above aspect.
Therefore, the embodiment of the application has the following beneficial effects:
the method comprises the steps of reading target scene test data, and determining target basic test data corresponding to the target scene test data according to a mapping relation between the preset scene test data and the basic test data; determining a target message standard corresponding to the target basic test data according to a mapping relation between the pre-configured basic test data and the message standard; generating a target request data packet according to the target scene test data, the target basic test data and the target message specification; and finally, sending the target request data packet to the tested system, and checking a return data packet sent by the tested system. By separating the test data such as the pre-configured scene test data, the basic test data and the message specification from the execution logic of the test, the method and the system can meet the requirement of large-scale increase of the test data and the scene, improve the working efficiency and scale of the automatic test, reduce the cost of the automatic test and improve the efficiency of the automatic test.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described below show some embodiments of the present application; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a conventional bank card test script according to an embodiment of the present application;
fig. 2 is a flowchart of an automated testing method according to an embodiment of the present disclosure;
FIG. 3 is a block diagram of an automated test script according to an embodiment of the present disclosure;
FIG. 4 is a block diagram illustrating an automated test script according to an embodiment of the present application;
fig. 5 is a schematic diagram of an automated testing apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments accompanying figures are described in detail below.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, but the present application may be practiced in other ways than those described herein, and it will be apparent to those of ordinary skill in the art that the present application is not limited by the specific embodiments disclosed below.
Software test automation is a technique for expanding test coverage and improving test execution efficiency by executing automated test scripts in batches and repeatedly. Automation scripts are commonly developed in, among other languages, Robot Framework and Python. An automation script typically sends specific input data to the system under test and checks whether the system's return value is as expected. At present, when automation scripts are developed with Robot Framework, the common scheme is one automation script per test scenario.
Taking a bank card transaction system as an example, as the software is continually upgraded, the maintenance cost of automated test scripts also rises significantly. Referring to fig. 1, a schematic diagram of a conventional bank card test script provided in an embodiment of the present application is shown.
As shown in fig. 1, the system under test is an acquirer or issuer switching system. The test scenarios are divided according to card organization (Mastercard, Visa, UnionPay), transaction channel (ATM, POS), transaction type (cash withdrawal, inquiry, consumption, pre-authorization), and specific test case (transaction success, insufficient balance, wrong password), each corresponding to its own automation script. The number of automation scripts listed in the figure is already large; in a real bank card transaction system it may reach hundreds or thousands.
At present, the common automated script framework has the advantage that transactions can be initiated independently and concurrently, but it has two major disadvantages:
First, the maintenance burden is huge: each automation script is independent and the message fields are numerous, so if a single field of a message interface changes, every affected automation test script must be modified and re-debugged.
Second, extending and recombining test scenarios is inconvenient: adding a new test scenario requires copying a similar automation script and then fine-tuning it. Testing a new combined scenario, for example withdrawal and query together, or adding abnormal message-field values for robustness testing, requires adding a large number of nearly identical automation scripts.
In order to solve the above problems, the present application provides an automated testing method and apparatus, which reduce the workload of script writing and modification and are more convenient to use and maintain by separating the test processing logic from the test data.
For convenience of understanding, an automated testing method and an automated testing apparatus provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Referring to fig. 2, a flowchart of an automated testing method provided in an embodiment of the present application, the method may include the following steps:
S101: reading target scene test data, and determining target basic test data corresponding to the target scene test data according to a mapping relation between the pre-configured scene test data and the basic test data;
S102: determining a target message specification corresponding to the target basic test data according to a mapping relation between the pre-configured basic test data and the message specification;
S103: generating a target request data packet according to the target scene test data, the target basic test data and the target message specification;
S104: sending the target request data packet to a system under test, and checking a return data packet sent by the system under test.
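The four steps above (S101 to S104) can be sketched as a minimal driver. This is only an illustration of the data flow: all table contents, field names, and the `send` transport are invented stand-ins, not the patented implementation.

```python
# Hypothetical sketch of steps S101-S104; every mapping below is a stand-in.

# Pre-configured mapping tables (scene data -> basic data -> message spec).
SCENARIO_TO_BASE = {"wrong_password": "atm_withdrawal"}   # used in S101
BASE_TO_SPEC = {"atm_withdrawal": "unionpay_atm"}         # used in S102

SCENARIO_DATA = {"wrong_password": {"PIN": "9999"}}       # only the diff
BASE_DATA = {"atm_withdrawal": {"PAN": "6222020000000001",
                                "PIN": "1234", "AMT": "100"}}
SPECS = {"unionpay_atm": {"field_order": ["PAN", "PIN", "AMT"]}}

def build_request(scenario_id):
    base_id = SCENARIO_TO_BASE[scenario_id]               # S101
    spec = SPECS[BASE_TO_SPEC[base_id]]                   # S102
    # S103: scenario fields override the basic test data, then the merged
    # fields are serialized per the message specification.
    fields = {**BASE_DATA[base_id], **SCENARIO_DATA[scenario_id]}
    return "|".join(fields[name] for name in spec["field_order"])

def run_test(scenario_id, send):
    """S104: send the request packet and return the reply for checking."""
    return send(build_request(scenario_id))
```

In use, `send` would wrap the real connection to the system under test; here any callable that accepts the packet string will do.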
In the embodiment of the application, the message specification includes data message format definitions of various card organizations and different transaction types.
In a possible implementation manner, the packet specification includes:
at least one card organization message format definition and at least one transaction type message format definition.
Specifically, the message specification may be configured as follows:
the interface specification of a card organization is converted into a parameter configuration file of the message specification, defining, for example: the name and length of each field; the character types allowed as input; whether the field is mandatory; for variable-length fields, the number of characters occupied by the length value; and for composite fields, the length and character type of each sub-field.
In the embodiment of the application, the basic test data may be the test data of a conventional test scenario; if several substantially different conventional test scenarios exist, several sets of basic test data may be defined.
In one possible implementation, the basic test data includes:
request data and expected response data for at least one conventional test scenario.
In one possible implementation, the basic test data is configured according to the following method:
and configuring each field of the request data and the expected response data corresponding to the basic test scene according to the message specification and the basic test scene.
Specifically, basic test data may be defined according to a common service scenario, including request data and expected response data, where each basic test data may individually correspond to one test scenario.
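As a concrete illustration, one basic test data record could pair the request with its expected response, keyed by the conventional scenario it covers. All field names and values here are hypothetical:

```python
# Hypothetical basic test data: request and expected response for one
# conventional (success-path) scenario.
BASE_TEST_DATA = {
    "atm_withdrawal": {
        "request": {"MTI": "0200", "PAN": "6222020000000001",
                    "PROC_CODE": "010000", "AMT": "000000010000"},
        "expected_response": {"MTI": "0210", "RESP_CODE": "00"},
    }
}

def lookup_base(name):
    """Fetch one basic test data record by scenario name."""
    return BASE_TEST_DATA[name]
```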
In a possible implementation manner, the scenario test data is configured according to the following method:
and modifying, adding or deleting the request field corresponding to the basic test data according to the basic test data and the branch test scene to obtain the request data and the expected response data corresponding to the branch test scene.
In the embodiment of the application, the scenario test data is obtained by selecting specified basic test data and then appropriately adding, deleting, or modifying specific fields, such as changing the password to a wrong one, changing the card number to a wrong one, or deleting the bank card number, to form test data for a specific scenario.
Specifically, the scenario test data covers branch test scenarios derived from the scenario of the basic test data: specific request fields are added, deleted, or modified on top of the basic test data to form the scenario test data, which comprises request data and expected response data. Only the differing parts need to be defined; parts identical to the basic test data need not be filled in again.
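The derivation described above, storing only the differences from the basic test data, can be sketched as follows. The response code 55 for a wrong password is an assumed value for illustration:

```python
# Hypothetical derivation of scenario test data: only differences from the
# basic test data are stored; identical parts are reused unchanged.
def derive_scenario(base, request_diff=None, drop=(), expected_diff=None):
    request = {**base["request"], **(request_diff or {})}
    for field in drop:          # deleted fields, e.g. a missing card number
        request.pop(field, None)
    expected = {**base["expected_response"], **(expected_diff or {})}
    return {"request": request, "expected_response": expected}

base = {"request": {"PAN": "6222020000000001", "PIN": "1234"},
        "expected_response": {"RESP_CODE": "00"}}

# Branch scenario: wrong password, expecting an error response code.
wrong_pin = derive_scenario(base, request_diff={"PIN": "9999"},
                            expected_diff={"RESP_CODE": "55"})
```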
Fig. 3 is a block diagram of an automated test script according to an embodiment of the present disclosure. The figure comprises four parts: parameter configuration, basic test data, scenario test data, and execution logic. The parameter configuration is the parameter configuration of the message specification, including message formats for card organizations, transaction types, and the like; the basic test data comprises request and response data of conventional service scenarios; the scenario test data comprises request and response data of branch test scenarios, such as wrong password, wrong bank card number, missing password field, and missing bank card number.
In an embodiment of the present application, the execution logic includes: reading the scenario test data and the corresponding message specification, generating the corresponding request data packet, checking the return data packet, and finally verifying and recording the execution result. The next scenario test data can then be read and the logic executed again, achieving efficient automated testing.
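That repeated read-build-send-verify cycle can be sketched as a loop; the `build`, `send`, and `verify` callables stand in for the data modules, the connection to the system under test, and the result check, and are assumptions rather than the patent's interfaces:

```python
# Hypothetical execution-logic loop: for each scenario, generate the
# request, send it to the (stubbed) system under test, verify, and record.
def execute_all(scenario_ids, build, send, verify):
    results = []
    for sid in scenario_ids:
        request, expected = build(sid)   # request + expected response packet
        actual = send(request)           # transact with the system under test
        results.append((sid, verify(expected, actual)))
    return results
```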
In a possible implementation, the method further includes:
generating a target expected response data packet according to the target scene test data, the target basic test data and the target message specification;
the checking the return data packet sent by the system under test comprises:
and outputting a test result based on the comparison between the target expected response data packet and a return data packet sent by the tested system.
Specifically, in the embodiment of the application, after the target scene test data is read, the corresponding target basic test data and parameter configurations such as the target message specification are found, from which a target request data packet and a target expected response data packet are generated. The request data is then sent to the system under test, the expected response data packet is compared with the actual response data of the system under test, and a test result is output. The next scenario test data is then processed in the same way until the test is complete.
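The comparison step above can be sketched as checking only the fields defined in the expected response packet against the actual return packet; the PASS/FAIL labels and field names are illustrative assumptions:

```python
# Hypothetical response check: compare only the fields present in the
# expected response packet against the actual return packet.
def check_response(expected, actual):
    mismatches = {f: (want, actual.get(f))
                  for f, want in expected.items() if actual.get(f) != want}
    return ("PASS" if not mismatches else "FAIL", mismatches)
```

Fields present only in the actual reply are deliberately ignored, matching the idea that the expected packet defines what must be verified.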
Referring to fig. 4, a schematic block diagram of an automated test script is further provided in an embodiment of the present application. As shown in fig. 4, the automated test script includes four modules, which specifically include:
A. the parameter configuration module, used for configuring the parameters related to the message specification; specifically, an Excel file can be filled in directly according to a template and passed to the basic test data module, the scenario test data module, and the execution logic module;
B. the basic test data module, used for reading the output of the parameter configuration module; specifically, the source data may be the Excel file filled in via the parameter configuration module; basic test data is formed according to the definitions of the main service scenarios and passed to the scenario test data module and the execution logic module;
C. the scenario test data module, used for reading the message specification and the basic test data from the parameter configuration module and the basic test data module; specifically, according to the definition of a specific service scenario, specific fields of the basic test data are added, deleted, or modified to form the scenario test data, which is passed to the execution logic module;
D. the execution logic module, used for generating a test request data packet and an expected response data packet according to the scenario test data and basic test data supplied by the other three modules and the message specification defined by the parameter configuration module, sending the test request data packet to the system under test, comparing the actually returned data packet with the expected response data packet, and outputting test result information to the user.
Modules A, B, and C serve as data preparation and configuration modules; module D can be executed repeatedly during the test process.
In a possible implementation, centralized configuration management, cloud deployment, and automatic scheduling capabilities can be added to the execution logic module, making automated testing more intelligent, meeting the need to execute automation scripts concurrently from multiple points on demand, forming an automated testing service, and supporting various functional and non-functional test scenarios.
In the embodiment of the application, the parameter configuration module standardizes the various message definitions; separating the test data from the test execution logic meets the need for large-scale growth of test data and scenarios; and generalizing the parameter configuration and the execution logic forms an organization-level standard automated testing framework, avoiding repeated investment, reducing the cost of automated testing, and improving its efficiency.
According to the method and apparatus, target scene test data is read, and the corresponding target basic test data is determined according to the pre-configured mapping between scene test data and basic test data; the target message specification corresponding to the target basic test data is determined according to the pre-configured mapping between basic test data and message specifications; a target request data packet is generated according to the target scene test data, the target basic test data, and the target message specification; and finally the target request data packet is sent to the system under test and the return data packet sent by the system under test is checked. By separating pre-configured test data, such as the scene test data, basic test data, and message specifications, from the execution logic of the test, the embodiment of the application can meet the need for large-scale growth of test data and scenarios, improve the efficiency and scale of automated testing, and reduce its cost.
Based on the automated testing method above, an embodiment of the present application further provides an automated testing apparatus; fig. 5 is a schematic diagram of the automated testing apparatus provided in an embodiment of the present application. The apparatus includes:
a scene data reading unit 201, configured to read target scene test data, and determine target basic test data corresponding to the target scene test data according to a mapping relationship between the pre-configured scene test data and the basic test data;
a message specification determining unit 202, configured to determine a target message specification corresponding to target basic test data according to a mapping relationship between the pre-configured basic test data and the message specification;
a request generating unit 203, configured to generate a target request data packet according to the target scene test data, the target basic test data, and the target packet specification;
and the test execution unit 204 is configured to send the target request packet to a system under test, and check a return packet sent by the system under test.
In a possible implementation manner, the apparatus further includes:
an expected response data packet generating unit, configured to generate a target expected response data packet according to the target scene test data, the target basic test data, and the target packet specification;
the test execution unit is specifically configured to:
and outputting a test result based on the comparison between the target expected response data packet and a return data packet sent by the tested system.
In a possible implementation manner, the packet specification includes:
at least one card organization message format definition and at least one transaction type message format definition.
In one possible implementation, the basic test data includes:
request data and expected response data for at least one conventional test scenario.
In one possible implementation, the basic test data is configured according to the following method:
and configuring each field of the request data and the expected response data corresponding to the basic test scene according to the message specification and the basic test scene.
In one possible implementation, the scenario test data is configured according to the following method:
and modifying, adding or deleting the request field corresponding to the basic test data according to the basic test data and the branch test scene to obtain the request data and the expected response data corresponding to the branch test scene.
Based on the above automatic testing method, an embodiment of the present application further provides a device, where the device may include: a processor and a memory;
a memory to store instructions;
a processor for executing the instructions in the memory to perform the automated testing method described above.
Based on the above automated testing method, the present application further provides a computer-readable storage medium storing program codes or instructions, which when executed on a computer, cause the computer to execute the above automated testing method.
It should be noted that, in the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the system or the device disclosed by the embodiment, the description is simple because the system or the device corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
It should be understood that, in this application, "at least one" means one or more, "a plurality" means two or more. "and/or" is used to describe the association relationship of the associated object, indicating that there may be three relationships, for example, "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b and c may be single or plural.
It is further noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An automated testing method, the method comprising:
reading target scene test data, and determining target basic test data corresponding to the target scene test data according to a preset mapping relation between scene test data and basic test data;
determining a target message specification corresponding to the target basic test data according to a preconfigured mapping relation between basic test data and message specifications;
generating a target request data packet according to the target scene test data, the target basic test data, and the target message specification;
and sending the target request data packet to a system under test, and checking a return data packet sent by the system under test.
2. The method of claim 1, wherein the message specification comprises:
at least one card organization message format definition and at least one transaction type message format definition.
3. The method of claim 1, wherein the base test data comprises:
request data and expected response data for at least one conventional test scenario.
4. The method of claim 1, wherein the base test data is configured according to the following method:
configuring each column of the request data and the expected response data corresponding to a basic test scene according to the message specification and the basic test scene.
5. The method of claim 1, wherein the scenario test data is configured according to the following method:
modifying, adding, or deleting request fields of the basic test data according to the basic test data and a branch test scene, to obtain the request data and the expected response data corresponding to the branch test scene.
6. The method of claim 1, further comprising:
generating a target expected response data packet according to the target scene test data, the target basic test data, and the target message specification;
wherein the checking of the return data packet sent by the system under test comprises:
outputting a test result based on a comparison between the target expected response data packet and the return data packet sent by the system under test.
7. An automated testing apparatus, the apparatus comprising:
a scene data reading unit, configured to read target scene test data and determine target basic test data corresponding to the target scene test data according to a preset mapping relation between scene test data and basic test data;
a message specification determining unit, configured to determine a target message specification corresponding to the target basic test data according to a preconfigured mapping relation between basic test data and message specifications;
a request generating unit, configured to generate a target request data packet according to the target scene test data, the target basic test data, and the target message specification;
and a test execution unit, configured to send the target request data packet to a system under test and check a return data packet sent by the system under test.
8. The apparatus of claim 7, further comprising:
an expected response data packet generating unit, configured to generate a target expected response data packet according to the target scene test data, the target basic test data, and the target message specification;
wherein the test execution unit is specifically configured to:
output a test result based on a comparison between the target expected response data packet and the return data packet sent by the system under test.
9. An apparatus, characterized in that the apparatus comprises: a processor and a memory;
the memory being configured to store instructions;
and the processor being configured to execute the instructions in the memory, to perform the method of any one of claims 1-6.
10. A computer-readable storage medium, characterized in that it stores program code or instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-6.
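The end-to-end flow of claims 1 and 6 can be sketched in code: look up the base test data for a scenario via the preset mapping, look up the message specification via the preconfigured mapping, build the request and expected-response packets, send the request to the system under test, and compare. This is a minimal illustrative sketch, not the patented implementation; all identifiers (`SCENE_TO_BASE`, `BASE_TO_SPEC`, `build_packet`, `run_case`, the refund scenario, and the key=value serialization) are assumptions invented for illustration — a real message specification would define binary field layouts per card organization and transaction type.

```python
# Hypothetical sketch of the claimed automated-testing flow (claims 1 and 6).
# All names and data below are illustrative assumptions, not from the patent.

# Preset mapping: scene test data -> basic test data (claim 1, step 1).
SCENE_TO_BASE = {"scene_refund_timeout": "base_refund"}
# Preconfigured mapping: basic test data -> message specification (claim 1, step 2).
BASE_TO_SPEC = {"base_refund": "spec_card_org_v2"}

BASE_DATA = {
    "base_refund": {"request": {"txn_type": "refund", "amount": "100"},
                    "expected": {"resp_code": "00"}},
}
SCENE_DATA = {
    "scene_refund_timeout": {"request_overrides": {"timeout": "1"},
                             "expected_overrides": {"resp_code": "98"}},
}

def build_packet(spec_id, fields):
    """Serialize fields under a message specification (simplified here to a
    sorted key=value string; a real spec defines field order and encoding)."""
    return spec_id + "|" + "&".join(f"{k}={v}" for k, v in sorted(fields.items()))

def run_case(scene_id, send):
    base_id = SCENE_TO_BASE[scene_id]           # scene -> basic test data
    spec_id = BASE_TO_SPEC[base_id]             # basic test data -> message spec
    base, scene = BASE_DATA[base_id], SCENE_DATA[scene_id]
    request = {**base["request"], **scene["request_overrides"]}
    expected = {**base["expected"], **scene["expected_overrides"]}
    req_pkt = build_packet(spec_id, request)    # target request data packet
    exp_pkt = build_packet(spec_id, expected)   # target expected response packet
    ret_pkt = send(req_pkt)                     # return packet from system under test
    return "PASS" if ret_pkt == exp_pkt else "FAIL"  # claim 6: output test result
```

In use, `send` would be a client for the system under test; here a stub standing in for it is enough to exercise the comparison logic.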
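Claim 5's derivation of branch-scenario data from base test data — modifying, adding, or deleting request fields — can be sketched as a small helper. This is a hedged illustration only; the function name `derive_scene` and the sample field names (`amount`, `currency`, `flag`) are invented for the example.

```python
# Hypothetical sketch of claim 5: deriving a branch test scene's request data
# from base test data by modifying, adding, or deleting request fields.

def derive_scene(base_request, modify=None, add=None, delete=()):
    request = dict(base_request)     # never mutate the shared base test data
    request.update(modify or {})     # modify values of existing fields
    request.update(add or {})        # add fields specific to the branch scene
    for field in delete:             # delete fields the branch scene omits
        request.pop(field, None)
    return request
```

A branch scenario then only needs to declare its deltas, while the base test data stays the single source of truth for the common fields.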
CN202210800271.XA 2022-07-08 2022-07-08 Automatic testing method and device Pending CN115168205A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210800271.XA CN115168205A (en) 2022-07-08 2022-07-08 Automatic testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210800271.XA CN115168205A (en) 2022-07-08 2022-07-08 Automatic testing method and device

Publications (1)

Publication Number Publication Date
CN115168205A (en) 2022-10-11

Family

ID=83492885

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210800271.XA Pending CN115168205A (en) 2022-07-08 2022-07-08 Automatic testing method and device

Country Status (1)

Country Link
CN (1) CN115168205A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117194253A (en) * 2023-09-11 2023-12-08 易方达基金管理有限公司 Method and system for generating test data of service scene
CN117194253B (en) * 2023-09-11 2024-04-19 易方达基金管理有限公司 Method and system for generating test data of service scene

Similar Documents

Publication Publication Date Title
CN108846659B (en) Block chain-based transfer method and device and storage medium
CN102831052B (en) Test exemple automation generating apparatus and method
CN101339532B (en) Web application system automatized test method and apparatus
CN101908015B (en) Device and method for creating test case based on components
US20140359362A1 (en) Information interaction test device and method based on automatic generation of associated test cases
CN112035363A (en) Automatic interface testing method and device
CN108241720B (en) Data processing method, device and computer readable storage medium
CN111400387A (en) Conversion method and device for import and export data, terminal equipment and storage medium
CN104657274A (en) Method and device for testing software interface
CN109361628A (en) Message assemble method, device, computer equipment and storage medium
CN110659870A (en) Business audit test method, device, equipment and storage medium
CN108460068A (en) Method, apparatus, storage medium and the terminal that report imports and exports
CN112181854A (en) Method, device, equipment and storage medium for generating flow automation script
CN112464632A (en) Form style dynamic storage and conversion method under excel report
WO2022199076A1 (en) Service processing method for multiple types of services, computer device, and storage medium
CN115952758A (en) Chip verification method and device, electronic equipment and storage medium
CN201435074Y (en) Device for generating test case based on member
CN115168205A (en) Automatic testing method and device
CN111752846A (en) Interface testing method and device
CN111190814A (en) Software test case generation method and device, storage medium and terminal
KR102160379B1 (en) Testing method for decentralized application based on blockchain and testing apparatus
CN106528718B (en) Method and apparatus for processing data from third party
CN116561003A (en) Test data generation method, device, computer equipment and storage medium
CN115470139A (en) Interface testing method and related equipment
CN113177021B (en) Data export method and device for different data sources

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination