CN110765026B - Automatic test method, device, storage medium and equipment

Info

Publication number
CN110765026B
CN110765026B (application CN201911053638.0A)
Authority
CN
China
Prior art keywords
test
target
containers
target container
test case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911053638.0A
Other languages
Chinese (zh)
Other versions
CN110765026A (en)
Inventor
张乐源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wanghai Kangxin Beijing Technology Co ltd
Original Assignee
Wanghai Kangxin Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wanghai Kangxin Beijing Technology Co ltd filed Critical Wanghai Kangxin Beijing Technology Co ltd
Priority to CN201911053638.0A priority Critical patent/CN110765026B/en
Publication of CN110765026A publication Critical patent/CN110765026A/en
Application granted granted Critical
Publication of CN110765026B publication Critical patent/CN110765026B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5083 Techniques for rebalancing the load in a distributed system
    • G06F 9/5088 Techniques for rebalancing the load in a distributed system involving task migration
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The present application relates to the field of automated testing technologies, and in particular to an automated testing method, apparatus, storage medium and device. The automated testing method includes: acquiring a plurality of test cases according to preset test requirements; creating, on a preset distributed system, a plurality of target containers for executing the test cases; and matching the data size of each test case with the processing performance of the target containers, and distributing each test case to its matched target container for execution. By constructing a plurality of target containers and distributing the test cases to the target containers matched with them, the scheme provided by the application executes multiple test cases in parallel on the same host, improving both resource utilization and automated testing efficiency.

Description

Automatic test method, device, storage medium and equipment
Technical Field
The application relates to the technical field of automated testing, and in particular to an automated testing method, an automated testing apparatus, a computer-readable storage medium, and a computer device.
Background
With the continuous development of the software industry, newly developed software grows ever more complex and feature-rich, and software testing grows correspondingly more complex. To ensure test quality, a large number of test cases usually have to be designed and written, and the written test cases are then used for functional or performance testing of the software; the traditional manual approach to running these test cases is inefficient.
At present, automated testing with test cases is common, but such testing typically runs single-threaded on one machine: a single process executes the whole test case set, so test time grows with the number of test cases and the complexity of the process. When an interface test case set reaches several hundred cases, a run usually takes more than ten minutes. For projects with fast iteration cycles, such as agile development projects, automated cases accumulate continuously, so the number of test cases keeps growing with the number of test rounds; running all test cases under this scheme consumes a large amount of time, and test efficiency is low.
Disclosure of Invention
The application provides an automated testing method, an automated testing apparatus, a computer-readable storage medium, and a computer device, so as to improve the efficiency of the automated testing process.
The embodiment of the application firstly provides an automatic test method, which comprises the following steps:
acquiring a plurality of test cases according to preset test requirements;
creating a plurality of target containers for executing the test cases on a preset distributed system;
and matching the data size of the test case with the processing performance of the target container, and distributing the test case to the matched target container for operation.
In one embodiment, the step of matching the data size of the test cases with the processing performance of the target containers includes:
dividing the plurality of test cases into batches according to the number of target containers, wherein the number of test cases in each batch is the same as the number of target containers;
for each batch, acquiring the data size of each test case, and sorting the test cases by data size;
for each batch, retrieving a ranking of the target containers sorted in advance by their processing performance;
for each batch, matching each test case to the target container with the same rank.
In one embodiment, the step of creating a plurality of target containers for executing the test cases on a preset distributed system includes:
acquiring node information of a distributed system in which a target container is located;
reading the system resource occupancy rate of each node on the distributed system according to the node information;
determining the processing performance of a target container corresponding to each node according to the system resource occupancy rate of each node;
generating a configuration file for each node according to the test requirements and the processing performance, and creating the target containers according to the configuration files.
In one embodiment, after the step of distributing the test cases to the target containers matched with the test cases for operation, the method further comprises:
collecting the system resource occupancy rate of each node to obtain the current system resource occupancy rate;
and correcting the processing performance of the target container according to the system resource occupancy rate.
In one embodiment, after the step of distributing the test cases to the target containers matched with the test cases for operation, the method further comprises:
acquiring a log file for recording the running process of the test case;
extracting a test result and a corresponding test case identifier from the log file;
and integrating the test result and the test case identifier to generate a test report and sending the test report.
In one embodiment, after the step of generating the test report, the method further comprises:
detecting that a test result in a test report exceeds a preset standard reference range, and determining a test case identifier associated with the test result;
performing secondary execution on the target test case corresponding to the test case identifier to obtain a secondary test result;
and determining abnormal information which causes the abnormal test result according to the secondary test result.
In one embodiment, the step of determining the abnormality information causing the abnormality of the test result according to the secondary test result includes:
and when the secondary test result also exceeds the standard reference range, determining that the test case corresponding to the test result is abnormal.
Correspondingly, the application also provides an automatic testing device, which comprises:
the test case acquisition module is used for acquiring a plurality of test cases according to preset test requirements;
the target container creating module is used for creating a plurality of target containers for executing the test cases on a preset distributed system;
and the distribution operation module is used for matching the data size of the test case with the processing performance of the target container and distributing the test case to the matched target container for operation.
Further, an embodiment of the application also provides a computer-readable storage medium for storing computer instructions which, when run on a computer, cause the computer to execute the steps of the automated testing method according to any one of the above technical solutions.
Still further, embodiments of the present application also provide a computer device, including:
one or more processors;
storage means for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the steps of the automated test method according to any one of the above-mentioned aspects.
Compared with the prior art, the scheme provided by the application has the following advantages:
according to the automatic test method, corresponding test cases are obtained according to test requirements, a plurality of target containers for executing the test cases are created on the distributed system, the test cases matched with the processing performance of the target containers are processed by the target containers, and compared with a traditional single-thread test case execution scheme, the purpose of executing the plurality of test cases on the same host in parallel is achieved by constructing the plurality of target containers, and the utilization rate of resources and the test efficiency of the test cases are improved.
In addition, the data size of the test case is matched with the processing performance of the target container, and the test case with larger data size is distributed to the target container with stronger processing performance for execution, so that the execution efficiency of the test case is improved, and the efficiency of the whole test process is improved.
Drawings
FIG. 1 is a diagram of an environment for implementing an automated test method according to one embodiment of the present application;
FIG. 2 is a flow chart of an automated test method provided in one embodiment of the present application;
FIG. 3 is a flow chart of creating a plurality of target containers for executing the test cases on a preset distributed system according to one embodiment of the present application;
FIG. 4 is a flowchart of matching a data size of the test case with a processing performance of a target container according to one embodiment of the present application;
FIG. 5 is a flow chart of data analysis of a test report provided in one embodiment of the present application;
FIG. 6 is a schematic structural diagram of an automated testing apparatus according to one embodiment of the present application;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of illustrating the present application and are not to be construed as limiting the present application.
Fig. 1 is a diagram of an implementation environment of the automated testing method according to one embodiment. The implementation environment includes a user terminal, a server, and a distributed system deployed at the user terminal or the server, the distributed system comprising a plurality of nodes. Fig. 1 illustrates the case where the distributed system is deployed at the server and includes N nodes (N is an integer greater than 1).
In combination with the environment diagram of fig. 1, the scheme provided by the present application, when used at the server side, is implemented as follows: a plurality of test cases meeting the test requirements are obtained according to those requirements; a plurality of target containers for executing the test cases are created on a preset distributed system; matching is performed between the data size of each test case and the processing performance of the target containers; and each test case is distributed to its matched target container to run. This allows the test cases to run in parallel across multiple target containers, improving the efficiency of automated testing.
The user terminal provided by the application can be a smartphone, tablet computer, notebook computer, desktop computer or the like, and the server side can be any computer device with processing capability, without being limited to these examples. The server and the user terminal may be connected to each other through Bluetooth, USB (Universal Serial Bus), or other communication methods, which are not limited herein.
Fig. 2 is a flowchart of an automated testing method according to an embodiment of the present application, including the following steps:
step S210, acquiring a plurality of test cases according to preset test requirements;
step S220, a plurality of target containers for executing test cases are created on a preset distributed system according to the test requirements;
and step S230, matching the data size of the test case with the processing performance of the target container, and distributing the test case to the matched target container for operation.
The server receives a test request for an application and parses it to obtain the test requirement. The test requirement targets a software development result in the software development process and can cover the whole program or part of it, for example a functional test or a performance test. The present application preferably adopts regression testing, so that each new software version can be tested by re-running the regression suite, greatly reducing the cost of system testing, maintenance, upgrading and similar stages.
Test cases are either written according to the test requirements or retrieved from test cases stored in advance for those requirements. Retrieving pre-stored test cases works as follows: the storage address of the test case is loaded in the distributed system, that is, the running address of the test case is pointed at the distributed system, and the distributed system fetches the test case through its storage address.
After the test cases are obtained, they are started on a schedule using a configuration file, for example by using Jenkins to trigger the test cases at fixed times. The configuration file can be set according to the actual situation; a Jenkins configuration file can dynamically monitor the start-up of the test cases, and the start period and start time can be set through the configuration file, achieving automatic, scheduled launching of large numbers of test cases.
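As an illustration only, the scheduled start-up can be sketched in Python as below. This is a stand-in for the Jenkins configuration the embodiment actually uses, written under assumptions: the JSON keys start_time, period_seconds and command are invented for the example.

```python
# Minimal sketch of a timed test-case launcher driven by a configuration file.
# The config format is an assumption; in the embodiment this role is played by
# a Jenkins configuration with a start period and start time.
import json
import subprocess
import time
from datetime import datetime

def run_scheduled(config_path: str) -> None:
    with open(config_path) as f:
        cfg = json.load(f)
    start = datetime.fromisoformat(cfg["start_time"])  # e.g. "2019-10-31T02:00:00"
    period = cfg["period_seconds"]                     # e.g. 86400 for a daily run

    while datetime.now() < start:                      # wait for the start time
        time.sleep(1)
    while True:                                        # then trigger periodically
        subprocess.run(cfg["command"], shell=True, check=False)  # e.g. "pytest tests/"
        time.sleep(period)
```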
The distributed system provided by the application comprises a plurality of nodes, which may be server nodes distributed over several different hardware devices. Besides automated testing, each server node may also allocate part of its data processing capacity to other applications; that is, several applications run on one server node at the same time, so the processing capacity available on different server nodes differs. For a target container built on a server node, its processing performance can either be preset to a fixed value on that node, or be set to the processing capacity currently available in that node.
The target container provided by the application implements virtualization at the operating-system level: multiple target containers directly reuse the operating system of the local host during testing, which achieves parallel execution of multiple test cases on the same host, improves resource utilization, reduces cost, and simplifies management, fault tolerance, and disaster recovery.
After the step of creating a plurality of target containers for executing the test cases in step S220, the method further includes: acquiring the processing performance of each target container. If the processing performance of a target container is the processing capacity currently available in its server node, the currently available system resources are determined from the node's currently occupied and rated system resources; that is, the processing performance of the target container is determined through the occupancy rate of the remaining system resources. If the processing performance of a target container is a preset fixed value, different fixed values can be set for different target containers, so as to improve the system resource utilization of the different nodes in the distributed system.
The data size of each test case is obtained and matched against the processing performance of the target containers, and test cases with larger data sizes are distributed to target containers with stronger processing performance for execution, improving test efficiency.
According to the scheme provided by the application, corresponding test cases are acquired according to the test requirements and their test process is started; a plurality of target containers for executing the test cases are then created on the distributed system, and each target container processes the test cases matched with its processing performance. Compared with the traditional single-threaded execution scheme, constructing a plurality of target containers achieves parallel execution of multiple test cases on the same host and improves both resource utilization and test efficiency.
In addition, the data size of each test case is matched with the processing performance of the target containers, and test cases with larger data sizes are distributed to target containers with stronger processing performance, improving the execution efficiency of individual test cases and of the testing process as a whole.
In order to make the automatic test scheme and its technical effects provided in the present application more clear, specific embodiments thereof will be described in detail below with reference to a plurality of examples.
In one embodiment, the step of creating a plurality of target containers for executing the test cases in the preset distributed system in step S220 may be performed by the following manner, where the flowchart is shown in fig. 3, and includes:
s310, acquiring node information of a distributed system in which a target container is located;
s320, reading the system resource occupancy rate of each node on the distributed system according to the node information;
s330, determining the processing performance of a target container corresponding to each node according to the system resource occupancy rate of each node;
s340, generating configuration files corresponding to the nodes according to the test requirements and the processing performance, and creating a target container according to the configuration files.
Before creating the plurality of target containers for executing the test cases on the preset distributed system, the configuration files of the target containers are determined according to the test requirements and the node information of the distributed system.
The node information includes the system resource occupancy rate; the system resources available to a target container are determined from the current occupancy rate of each node, and the processing performance of the target container is determined from those available resources.
More fully, the node information of the distributed system includes the number of nodes, their locations, and their system resource occupancy rates, where the occupancy rate of a node represents the resource share used by each application on the node, or the sum of the resource shares occupied by all applications on it. The processing performance of a target container is characterized by one or more parameters such as CPU computation, I/O interfaces, and external systems, and is proportional to the amount of system resources allocated to it. The system resources of the target container may be the remaining system resources of the current node that are available to it, or a fixed amount set based on those remaining resources. For example: if the system resource occupancy of the current server node is 47%, the processing capability of the target container is that characterized by the remaining system resources (53% of the node's total operating memory), or a fixed allocation derived from them (say 50% of the total operating memory) is taken directly as the processing capability of the target container.
According to the embodiment of the application, the processing performance of the target container is determined according to the system resource occupancy rate, so that the test cases are distributed according to the processing performance of the target container, and the execution efficiency of the test cases is improved.
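To make the 47%/53% example concrete, here is a minimal sketch that derives a container's share of a node's resources from the current occupancy. The psutil calls are real library calls; the fixed-quota rounding rule is only an illustrative choice, not prescribed by the embodiment.

```python
# Sketch: derive a target container's processing performance (as a resource
# share) from the node's current memory occupancy, per the example above.
import math

import psutil

def available_performance(fixed_quota: bool = False) -> float:
    """Return the fraction of the node's memory a new target container may use."""
    occupancy = psutil.virtual_memory().percent / 100.0  # e.g. 0.47 occupied
    remaining = 1.0 - occupancy                          # e.g. 0.53 available
    if fixed_quota:
        # Round down to a fixed share (nearest 5%), e.g. 0.53 -> 0.50.
        return math.floor(remaining * 20) / 20
    return remaining
```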
A configuration file is generated from the test requirements and the processing performance obtained in the above embodiment, and a target container is created from the configuration file corresponding to each node. The test requirements are converted into data, configuration parameters in the configuration file, such as the number of target containers, are determined from them, the configuration file of each target container is generated using the test requirements and the container's processing performance, and the target container is generated from the configuration file and a creation function.
The target container provided by the application is a docker container, and its creation-and-run process is as follows: create a corresponding Job instance; configure environment parameters for the Job using the Docker Daemon; execute the Job's run function. In the Job implementation, there are two ways to configure parameters for a Job: first, directly initialize the Job's Args attribute with the specified parameters when creating the Job instance; second, add a specified environment variable to the Job after it is created.
Because the target container created by the method is a docker container, a software development user can package the test application and its dependencies into a portable docker container in a unified way, and then publish the container to any server on which a docker engine is installed, making migration and scaling easy. Moreover, target containers effectively divide the resources managed by a single operating system into several isolated groups, balancing conflicting resource demands among the groups; test cases run in multiple containers at the same time, the containers are isolated from and do not interfere with each other, and operating-system resource utilization is higher.
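For illustration, such a container could be created with the Docker SDK for Python (docker-py) as sketched below; this is not the patent's own creation code, and the image name and resource limits are placeholder assumptions.

```python
# Sketch: create a target container with per-container resource limits using
# the Docker SDK for Python. The image "test-runner:latest" is hypothetical.
import docker

def create_target_container(name: str, cpu_share: float, mem_limit: str):
    client = docker.from_env()
    return client.containers.run(
        image="test-runner:latest",            # image holding the test application
        name=name,
        nano_cpus=int(cpu_share * 1e9),        # e.g. 0.5 CPUs -> 500_000_000
        mem_limit=mem_limit,                   # e.g. "512m"
        environment={"ROLE": "test-target"},   # per-container environment variables
        command="sleep infinity",              # keep the container alive for runs
        detach=True,
    )
```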
Further, to improve the efficiency of creating target containers, an image repository may be used: by reducing the time needed to download components when the image is built, or by combining several run commands into one to reduce the number of layers in the image, image builds become faster, the time to build a target container is shortened, and the efficiency of the whole testing process improves.
According to the embodiment of the application, the plurality of target containers are created on the distributed system, so that the plurality of test cases can be executed in parallel by using the plurality of target containers later, and the test efficiency is improved.
On this basis, since multiple programs can run simultaneously on each node of the distributed system, the system resources available on each node vary over time. To improve the accuracy of the target containers' processing performance, after the step of distributing the test cases to their matched target containers to run, the present application further provides the following scheme, including:
A1, collecting the system resource occupancy rate of each node to obtain the current system resource occupancy rate;
a2, correcting the processing performance of the target container according to the system resource occupancy rate.
According to the scheme provided by this embodiment, after the test cases have been distributed to the target containers for execution, the processing performance of each target container can be corrected by collecting the system resource occupancy rates. The correction can be performed periodically, with the period set according to the actual situation: for example, an update after each test case finishes, or an update at each tick of a timed update period. The processing performance of each target container is corrected based on the currently collected system resource occupancy, so that subsequent matching between test cases and target containers is performed with the corrected performance; this improves matching accuracy and further improves test case execution efficiency.
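A minimal sketch of such a correction loop follows; collect_occupancy is a hypothetical callable standing in for whatever monitoring interface the nodes expose, and the update rule simply re-derives the available share.

```python
# Sketch: periodically correct each node's recorded container performance from
# freshly collected occupancy figures. `collect_occupancy` is hypothetical and
# returns {node_id: occupancy in [0.0, 1.0]}.
import threading
from typing import Callable, Dict

def start_correction_loop(
    performance: Dict[str, float],
    collect_occupancy: Callable[[], Dict[str, float]],
    period_seconds: float = 60.0,
) -> threading.Timer:
    def correct() -> None:
        for node_id, occupancy in collect_occupancy().items():
            performance[node_id] = 1.0 - occupancy  # corrected available share
        # Re-arm the timer so the correction repeats every period.
        start_correction_loop(performance, collect_occupancy, period_seconds)

    timer = threading.Timer(period_seconds, correct)
    timer.daemon = True
    timer.start()
    return timer
```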
The step of matching the data size of the test case with the processing performance of the target container in step S230 may be performed by the following manner, and the flowchart thereof is as shown in fig. 4, and includes:
s410, dividing batches of a plurality of test cases according to the number of target containers, wherein the number of each batch of test cases is the same as the number of the target containers;
s420, acquiring the data size of each test case for each batch, and sequencing the test cases according to the data size;
s430, calling a sequencing result of sequencing the target containers according to the processing performance of the target containers in advance for each batch;
s440, matching the test cases with the same sequence with the target container for each batch.
This embodiment provides a matching scheme between target containers and test cases. Each target container executes one test case at a time, so the test cases are tested batch by batch, with the number of test cases in each batch equal to the number of target containers. The target containers are ranked in advance by processing performance to obtain a container ranking, and the test cases in each batch are sorted by data size to obtain a case ranking. Since each batch holds as many test cases as there are target containers, the test case and the target container with the same rank are matched, and the matching is one to one.
If the number of test cases is not evenly divisible by the number of target containers, the number r of test cases in the final batch is obtained, the test cases of the final batch are sorted by data size, the r top-ranked target containers are selected, and matching is again performed between container processing performance and test case data size, so that test case processing efficiency is maximized.
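The batch matching of steps S410-S440, including the remainder batch of size r just described, can be sketched as follows; the (id, size) and (id, performance) pair shapes are assumptions made for the example.

```python
# Sketch: split test cases into batches the size of the container pool, sort
# each batch by data size, and pair same-ranked cases and containers. zip()
# truncates on the final short batch, so its r largest cases go to the r
# strongest containers, as described above.
from typing import List, Tuple

def match_batches(
    cases: List[Tuple[str, int]],          # (case_id, data_size)
    containers: List[Tuple[str, float]],   # (container_id, performance)
) -> List[Tuple[str, str]]:
    ranked = sorted(containers, key=lambda c: c[1], reverse=True)  # S430
    n = len(ranked)
    pairs: List[Tuple[str, str]] = []
    for i in range(0, len(cases), n):                              # S410
        batch = sorted(cases[i:i + n], key=lambda c: c[1], reverse=True)  # S420
        for (case_id, _), (container_id, _) in zip(batch, ranked):        # S440
            pairs.append((case_id, container_id))
    return pairs
```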
The above embodiments improve test case execution efficiency by setting up multiple target containers on the distributed system and distributing the test cases to matched containers for execution. The testing process is further improved by the following embodiment: after the step of distributing the test cases to their matched target containers to run in step S230, the method further includes:
b1, acquiring a log file for recording the running process of the test case;
b2, extracting test results and corresponding test case identifications from the log file;
and B3, integrating the test result and the test case identifier to generate a test report and sending the test report.
The log file records the whole process of a test case running in its target container, together with intermediate process data and the final test result. After the log file of each test case is obtained, it is crawled to extract the test result and the test case identifier. The log files of multiple test cases are collected, the test results and associated identifiers are crawled, the results and identifiers are integrated to generate a test report, and the test report is sent to the user side by e-mail.
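A sketch of the log crawling step follows; since the embodiment does not fix a log layout, the "case_id=... result=..." line format is an assumed convention for the example.

```python
# Sketch: crawl test results and their case identifiers out of a run log.
# The log line convention is an assumption, not the patent's format.
import re
from typing import Dict

LOG_PATTERN = re.compile(r"case_id=(?P<case_id>\S+)\s+result=(?P<result>\S+)")

def extract_results(log_path: str) -> Dict[str, str]:
    results: Dict[str, str] = {}
    with open(log_path, encoding="utf-8") as f:
        for line in f:
            match = LOG_PATTERN.search(line)
            if match:
                results[match.group("case_id")] = match.group("result")
    return results
```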
The whole process of creating the target containers, running the test cases in them, and producing the test report can be executed automatically through a script file. When writing the script that generates the test report, the run() method of HTMLTestRunner can be used to execute the test cases in the target container and generate the report. HTMLTestRunner is an extension of the Python standard library's unittest framework, mainly used to generate HTML test reports, so that the automated test results are presented in an accessible, easily understood report.
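For illustration, report generation with HTMLTestRunner's run() method typically looks like the sketch below; constructor arguments vary between HTMLTestRunner ports, so treat the parameters as assumptions.

```python
# Sketch: discover unittest cases and write an HTML report via HTMLTestRunner,
# the unittest extension mentioned above. Argument names follow the classic
# HTMLTestRunner module and may differ in other ports.
import unittest

from HTMLTestRunner import HTMLTestRunner

def run_and_report(test_dir: str, report_path: str) -> None:
    suite = unittest.defaultTestLoader.discover(test_dir)
    with open(report_path, "wb") as report:
        runner = HTMLTestRunner(
            stream=report,
            title="Automated test report",
            description="Results of test cases executed in target containers",
        )
        runner.run(suite)  # executes the suite and renders the HTML report
```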
After the test report is obtained through the above embodiment, data analysis is performed on it. The analysis may proceed in the following manner, with the flowchart shown in fig. 5:
s510, detecting that a test result in a test report exceeds a preset standard reference range, and calling a test case identifier associated with the test result;
s520, performing secondary execution on the target test case corresponding to the test case identifier to obtain a secondary test result;
s530, determining abnormal information which causes the abnormal test result according to the secondary test result.
A standard reference range is set in advance for each test result. Each result in the test report is compared with its standard reference range to judge whether any result falls outside the range; a result outside the range is an abnormal condition, indicating that the test result is abnormal. This may be caused by problems during execution in the target container, such as an incomplete test due to insufficient running space or an abnormal runtime environment, or by an error in the test case itself, such as damaged test case data. Further testing is needed to determine the cause of the abnormal result. The scheme provided by the application is: for an abnormal test result, the associated test case identifier is retrieved, and the target test case corresponding to that identifier is placed in a target container for secondary execution. To distinguish between containers, the container in which the result was first obtained is called the first target container, and the container for the secondary execution is preferably a target container other than the first one. For example: if the abnormal result was produced in target container 4 while containers 1 to 3 operate normally, the corresponding test case can be executed again in any of containers 1 to 3 to obtain the secondary test result, and the abnormality information causing the abnormal result is determined by analyzing the secondary result, e.g. by comparing it with the result of the first run. The first target container may also be used to execute the target test case a second time, in order to rule out test abnormalities caused by sporadic or environmental factors.
According to the scheme provided by this embodiment, the target test case corresponding to an abnormal test result is executed a second time, and the abnormality information causing the abnormal result is determined from the first and secondary test results. This makes the abnormality information easy to pin down; for example, abnormal results caused by sporadic or environmental factors can be ruled out directly.
On the basis, the application also provides the following embodiment for further defining the test reason of the abnormal test result, and in combination with fig. 5, the steps for determining the abnormal information causing the abnormal test result according to the test result include:
s540, judging whether the secondary test result exceeds the standard reference range;
s550, when the secondary test result is beyond the standard reference range, determining that the test case corresponding to the test result is abnormal.
Since the test result and the secondary test result come from tests of the same test case, the standard reference range preset for that test case is retrieved and the secondary test result is compared with it. If the secondary result also falls outside the standard reference range, the test case is determined to be erroneous; if it does not, the target container that produced the first result is likely to be abnormal.
With the scheme provided by the application, it can be determined whether the problematic test result was caused by an error in the test case itself, which avoids misjudging the test case and helps pinpoint the exact cause of the problem.
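Putting the secondary execution and the judgment of steps S540-S550 together gives a minimal sketch under assumptions; run_in_container is a hypothetical executor returning a numeric result.

```python
# Sketch: re-run an out-of-range case in another normally operating container
# and classify the anomaly; falls back to the first container if no other
# healthy container exists, matching the sporadic-factor check above.
from typing import Callable, List, Tuple

def diagnose(
    case_id: str,
    first_container: str,
    healthy_containers: List[str],
    reference: Tuple[float, float],                 # preset standard reference range
    run_in_container: Callable[[str, str], float],  # hypothetical executor
) -> str:
    low, high = reference
    other = next((c for c in healthy_containers if c != first_container),
                 first_container)
    second_result = run_in_container(case_id, other)
    if not (low <= second_result <= high):
        return "test case abnormal"                 # both runs out of range (S550)
    return "first target container suspected abnormal"
```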
Further, after the step of determining that the test case corresponding to the test result is abnormal in step S550, the method further includes: S560, sending prompt information about the test case abnormality, and performing the operation of deleting the abnormal test case in response to a deletion instruction issued against the prompt information.
After the test case abnormality is confirmed, prompt information reporting the test case error is sent; the prompt carries a deletion option, and the operation of deleting the abnormal test case is performed in response to the deletion instruction. This keeps the test case set accurate, prevents other test runs from using the erroneous test case, and thus safeguards the accuracy of subsequent test results.
Further, after the step of distributing the test cases to their matched target containers to run, the method further comprises: deleting the target containers.
Completion of the test process means that all test cases have finished, including the secondary execution of any test cases whose results exceeded the preset standard reference range; that is, all test cases have completed testing and verification. The target containers are then no longer needed to execute the test process, so they are deleted, releasing the system resources they occupied.
The foregoing is an embodiment of an automated testing method provided herein, for which embodiments of an automated testing apparatus corresponding thereto are described below.
An embodiment of the application also provides an automated testing apparatus, whose structural diagram is shown in fig. 6. It comprises a test case acquisition module 610, a target container creation module 620, and a distribution operation module 630, specified as follows:
the test case acquisition module 610 is configured to acquire a plurality of test cases according to a preset test requirement;
a create target container module 620, configured to create a plurality of target containers for executing the test cases on a preset distributed system;
and the allocation operation module 630 is configured to match the data size of the test case with the processing performance of the target container, and allocate the test case to the matched target container for operation.
The specific manner in which the individual modules of the automated testing apparatus perform their operations in the above embodiments has been described in detail in the method embodiments and is not repeated here.
Further, embodiments of the present application also provide a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of any of the automated testing methods above. The storage medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, magnetic cards, or optical cards. That is, a storage medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer), such as a read-only memory, a magnetic disk, or an optical disk.
Still further, embodiments of the present application also provide a computer device, including:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the steps of the automated test method of any of the preceding claims.
Fig. 7 is a block diagram of a computer device 700 according to an exemplary embodiment. For example, the computer device 700 may be provided as a server. Referring to fig. 7, the computer device 700 includes a processing component 722, which further includes one or more processors, and memory resources represented by a memory 732 for storing instructions executable by the processing component 722, such as application programs. The application programs stored in the memory 732 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 722 is configured to execute the instructions so as to perform the steps of the automated testing method described above.
The computer device 700 may also include a power supply component 726 configured to perform power management of the computer device 700, a wired or wireless network interface 750 configured to connect the computer device 700 to a network, and an input/output (I/O) interface 758. The computer device 700 may operate based on an operating system stored in the memory 732, such as Windows Server, Mac OS X, Unix, Linux, or FreeBSD. It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, their order of execution is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise several sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different times, and whose order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
It should be understood that each functional unit in the embodiments of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules.
The foregoing is only a partial embodiment of the present application, and it should be noted that, for a person skilled in the art, several improvements and modifications can be made without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (9)

1. An automated testing method, comprising:
acquiring a plurality of test cases according to preset test requirements;
creating a plurality of target containers for executing the test cases on a preset distributed system;
matching the data size of the test case with the processing performance of the target container, and distributing the test case to the matched target container for operation;
acquiring a log file for recording the running process of the test case, wherein the log file comprises the test result of the test case;
if an abnormal test result that exceeds a preset standard reference range exists in the test report, determining a test case identifier associated with the abnormal test result;
placing the target test case corresponding to the test case identifier into another target container for execution, and executing the target test case a second time in the other target container to obtain a secondary test result;
determining abnormality information causing the abnormality of the test result based on the abnormal test result and the secondary test result; the other target container is a container other than the first target container among the plurality of target containers; the first target container is the target container that executed the target test case for the first time and obtained the abnormal test result; the matching of the data size of the test case with the processing performance of the target container comprises the following steps:
dividing the plurality of test cases into batches according to the number of the target containers, wherein the number of test cases in each batch is the same as the number of the target containers;
for each batch, acquiring the data size of each test case, and sorting the test cases by data size;
for each batch, retrieving a ranking of the target containers sorted in advance by their processing performance;
for each batch, matching each test case to the target container with the same rank.
2. The automated test method of claim 1, wherein creating a plurality of target containers for executing the test case on a preset distributed system comprises:
acquiring node information of a distributed system in which a target container is located;
reading the system resource occupancy rate of each node on the distributed system according to the node information;
determining the processing performance of a target container corresponding to each node according to the system resource occupancy rate of each node;
generating configuration files corresponding to all nodes according to the test requirements and the processing performance, and creating a target container according to the configuration files.
3. The automated testing method of claim 2, wherein after the step of distributing the test cases to run in the matching target containers, further comprising:
collecting the system resource occupancy rate of each node to obtain the system resource occupancy rate;
and correcting the processing performance of the target container according to the system resource occupancy rate.
4. The automated test method of claim 1, wherein after the step of assigning test cases to run in matching target containers, further comprising:
extracting a test result and a corresponding test case identifier from the log file;
and integrating the test result and the test case identifier to generate a test report and sending the test report.
5. The automated test method of claim 1, wherein after obtaining the secondary test result, the method further comprises:
and determining abnormal information which causes the abnormal test result according to the secondary test result.
6. The automated test method of claim 5, wherein determining anomaly information that causes the test result to be anomalous based on the secondary test result comprises:
and when the secondary test result also exceeds the standard reference range, determining that the test case corresponding to the test result is abnormal.
7. An automated testing apparatus, comprising:
the test case acquisition module is used for acquiring a plurality of test cases according to preset test requirements;
the target container creating module is used for creating a plurality of target containers for executing the test cases on a preset distributed system;
the distribution operation module is used for matching the data size of the test cases with the processing performance of the target containers and distributing the test cases to the matched target containers for operation; acquiring a log file recording the running process of the test cases, wherein the log file comprises the test results of the test cases; if an abnormal test result that exceeds a preset standard reference range exists in the test report, determining a test case identifier associated with the abnormal test result; placing the target test case corresponding to the test case identifier into another target container for execution, and executing the target test case a second time in the other target container to obtain a secondary test result; and determining abnormality information causing the abnormality of the test result based on the abnormal test result and the secondary test result; the other target container is a container other than the first target container among the plurality of target containers; the first target container is the target container that executed the target test case for the first time and obtained the abnormal test result;
the distribution operation module is specifically used for dividing the plurality of test cases into batches according to the number of the target containers, wherein the number of test cases in each batch is the same as the number of the target containers;
for each batch, acquiring the data size of each test case, and sorting the test cases by data size;
for each batch, retrieving a ranking of the target containers sorted in advance by their processing performance;
for each batch, matching each test case to the target container with the same rank.
8. A computer readable storage medium for storing computer instructions which, when run on a computer, cause the computer to perform the steps of the automated test method of any of claims 1 to 6.
9. A computer device, the computer device comprising:
one or more processors;
storage means for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the steps of the automated test method of any of claims 1 to 6.
CN201911053638.0A 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment Active CN110765026B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911053638.0A CN110765026B (en) 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment


Publications (2)

Publication Number | Publication Date
CN110765026A (en) | 2020-02-07
CN110765026B (en) | 2023-08-01

Family

ID=69335079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911053638.0A Active CN110765026B (en) 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN110765026B (en)


Also Published As

Publication number Publication date
CN110765026A (en) 2020-02-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 801-2, floor 8, building 3, No. 22, Ronghua Middle Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing
Applicant after: Wanghai Kangxin (Beijing) Technology Co.,Ltd.
Address before: Room 07, Room 2, Building B, 12 Hongda North Road, Beijing Daxing District, Beijing
Applicant before: BEIJING NEUSOFT VIEWHIGH TECHNOLOGY Co.,Ltd.
GR01 Patent grant