CN118132406A - Test method, test device, electronic equipment and storage medium - Google Patents

Test method, test device, electronic equipment and storage medium

Info

Publication number
CN118132406A
CN118132406A (application CN202211504261.8A)
Authority
CN
China
Prior art keywords
test
data
determining
environment
training sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211504261.8A
Other languages
Chinese (zh)
Inventor
还新新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN202211504261.8A priority Critical patent/CN118132406A/en
Priority to PCT/CN2023/104877 priority patent/WO2024113860A1/en
Publication of CN118132406A publication Critical patent/CN118132406A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0475Generative networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/094Adversarial learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Quality & Reliability (AREA)
  • Computer Hardware Design (AREA)
  • Debugging And Monitoring (AREA)

Abstract

Embodiments of the present disclosure relate to a test method, a test device, electronic equipment, and a storage medium. The test method includes: receiving a test request, where the test request carries a test information set; determining a test intention corresponding to the test request; determining, according to the test intention, file information required for constructing a test environment and test data to be applied to the test environment; constructing the test environment according to the file information and the test information set; and applying the test data to the test environment to obtain a test result corresponding to the test request. In this way, the embodiments of the present disclosure avoid investing excessive manpower in preparing test environments and test data, and improve test efficiency.

Description

Test method, test device, electronic equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of data processing, in particular to a testing method, a testing device, electronic equipment and a storage medium.
Background
At present, when testing network management software, the test environment and the test data it requires are usually prepared and applied manually, following the design of the test scenario. Preparing the test environment and test data manually, however, results in low test efficiency.
Disclosure of Invention
In view of the above, to solve the above technical problems or some of them, embodiments of the present disclosure provide a test method, a test device, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a test method, including:
receiving a test request, wherein the test request carries a test information set;
determining a test intention corresponding to the test request;
determining, according to the test intention, file information required for constructing a test environment and test data to be applied to the test environment;
constructing the test environment according to the file information and the test information set;
and applying the test data to the test environment to obtain a test result corresponding to the test request.
In an optional embodiment, the building the test environment according to the file information and the test information set includes:
determining first hardware resource information currently remaining in a hardware resource pool;
determining, according to the file information and the test information set, second hardware resource information required for constructing the test environment from the first hardware resource information;
generating an environment construction script corresponding to the file information;
and running the environment construction script according to the second hardware resource information, the file information, and the test information set to construct the test environment.
In an optional implementation manner, the determining, according to the file information and the test information set, second hardware resource information required for building the test environment from the first hardware resource information includes:
generating a resource detection script according to the file information and the test information set;
running the resource detection script to detect the first hardware resource information to obtain a target detection result;
and, when the target detection result indicates that hardware resources are sufficient, determining, according to the file information and the test information set, second hardware resource information required for constructing the test environment from the first hardware resource information.
In an alternative embodiment, the test data required for the application to the test environment is determined by:
determining, according to the test intention, identification information corresponding to the test data required by the test environment;
determining the data type corresponding to the test data according to the identification information, where the data types include a first data type and a second data type, the first data type being used to characterize basic configuration class data and the second data type being used to characterize non-basic configuration class data;
and determining, according to the data type, the test data required for application to the test environment.
In an alternative embodiment, determining the test data required for application to the test environment according to the data type includes:
when the data type is the first data type, executing a query operation on a data warehouse according to the identification information corresponding to the first data type to obtain the test data required by the test environment;
when the data type is the second data type, executing the query operation on the data warehouse according to the identification information corresponding to the second data type to obtain training sample data;
inputting the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data;
and determining the training sample data and the target generation data as the test data required for application to the test environment.
In an optional embodiment, the method further includes: determining, according to the test intention, a first data volume corresponding to the identification information, where the first data volume is used to characterize the required quantity of test data of the second data type;
and determining the training sample data and the target generation data as the test data required for application to the test environment includes:
determining a second data volume of the training sample data and the target generation data;
when the second data volume does not reach the first data volume, updating the training sample data with the training sample data and the target generation data, and repeating the target data generation step until the combined data volume of the original (non-updated) training sample data and the target generation data output by the machine learning model reaches the first data volume;
and determining the original training sample data and the target generation data whose combined data volume reaches the first data volume as the test data required by the test environment.
In an alternative embodiment, the machine learning model includes a generation model and a discrimination model;
the inputting of the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data, includes:
inputting the training sample data into the generation model so that the generation model outputs initial generation data;
inputting the initial generation data and the training sample data into the discrimination model so that the discrimination model outputs the credibility between the initial generation data and the training sample data;
and determining, according to the credibility and a preset cost function, target generation data corresponding to the training sample data from the initial generation data.
In an alternative embodiment, determining, according to the test intention, the file information required for constructing a test environment and the test data to be applied to the test environment includes:
executing a query operation on a pre-constructed knowledge graph according to the test intention to obtain the file information required for constructing the test environment and the test data to be applied to the test environment.
In a second aspect, embodiments of the present disclosure provide a test apparatus, comprising:
a receiving module, configured to receive a test request, where the test request carries a test information set;
a determining module, configured to determine the test intention corresponding to the test request;
the determining module being further configured to determine, according to the test intention, file information required for constructing a test environment and test data to be applied to the test environment;
a construction module, configured to construct the test environment according to the file information and the test information set;
and an application module, configured to apply the test data to the test environment to obtain a test result corresponding to the test request.
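As a purely illustrative sketch (not part of the disclosed embodiments), the four modules above could be arranged as a single class; every name, dictionary key, and the keyword-based intent check below are hypothetical stand-ins for the unspecified implementations:

```python
class TestApparatus:
    """Hypothetical sketch of the four-module test apparatus (illustrative only)."""

    # Receiving module: accept a test request carrying a test information set.
    def receive(self, request):
        assert "test_info_set" in request, "request must carry a test information set"
        return request

    # Determining module: derive the test intention (a keyword match stands in
    # for real semantic analysis and intent recognition).
    def determine_intent(self, request):
        return "performance_test" if "performance" in request["text"].lower() else "function_test"

    # Construction module: combine file information and the test information set
    # into a mock environment description.
    def build_environment(self, file_info, test_info_set):
        return {"files": file_info, **test_info_set, "status": "ready"}

    # Application module: apply the test data to the environment and report.
    def apply(self, environment, test_data):
        return {"environment": environment["status"], "cases_run": len(test_data)}

apparatus = TestApparatus()
req = apparatus.receive({"text": "run a performance test", "test_info_set": {"scale": "small"}})
intent = apparatus.determine_intent(req)
env = apparatus.build_environment(["blueprint.yaml"], req["test_info_set"])
result = apparatus.apply(env, ["case_1", "case_2"])
```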
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: the device comprises a processor and a memory, wherein the processor is used for executing a test program stored in the memory so as to realize the test method.
In a fourth aspect, embodiments of the present disclosure provide a storage medium storing one or more programs executable by one or more processors to implement the test method as described above.
The test method provided by the embodiments of the present disclosure includes: receiving a test request sent by a client, where the test request carries a test information set; determining a test intention corresponding to the test request; determining, according to the test intention, file information required for constructing a test environment and test data to be applied to the test environment; constructing the test environment according to the file information and the test information set; and applying the test data to the test environment to obtain a test result corresponding to the test request. In this way, the embodiments of the present disclosure obtain the test intention by performing intention analysis on the test request, construct the test environment and prepare the test data in combination with the test intention, and finally obtain the test result from the constructed environment and the prepared data. No excessive manpower is needed to prepare test environments and test data, and test efficiency is improved.
Drawings
FIG. 1 is a flow chart of a test method according to an embodiment of the disclosure;
FIG. 2 is a schematic diagram of a generative adversarial network provided by embodiments of the present disclosure;
FIG. 3 is a schematic structural diagram of a testing device according to an embodiment of the present disclosure;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure;
In the above figures:
10. a receiving module; 20. a determining module; 30. constructing a module; 40. an application module;
400. an electronic device; 401. a processor; 402. a memory; 4021. an operating system; 4022. an application program; 403. a user interface; 404. a network interface; 405. a bus system.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present disclosure. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of protection of the present disclosure.
To facilitate understanding of the embodiments of the present disclosure, specific embodiments are further explained below with reference to the drawings; these embodiments are not intended to limit the embodiments of the present disclosure.
Referring to fig. 1, fig. 1 is a flow chart of a test method according to an embodiment of the disclosure. The test method provided by the embodiment of the disclosure comprises the following steps:
s101: and receiving a test request, wherein the test request carries a test information set.
In this embodiment, the execution body is a server. The test request is sent to the server by a client, which obtains the test request from a tester and forwards it to the server. The client may be provided with a recording device: when there is a test requirement, the client collects the tester's voice signal through the recording device and obtains the test request after processing the voice signal. Alternatively, a text input interface is displayed on the client screen; the user enters the content to be tested through this interface, and the input is processed to obtain the test request. The test information set includes multiple items of test information, including but not limited to: the test site, the test environment scale, the test function scenario, and the test environment database type. The test information set is used to construct a test environment for testing network management software.
S102: Determining the test intention corresponding to the test request.
In this embodiment, the test intention is determined in order to identify the files needed to construct the test environment and the test data to be applied to it. Before step S102 is executed, after the test request sent by the client is received, the test information set carried in the request is checked to verify whether the test information it contains is complete and whether the input semantics are complete; if either is incomplete, the client prompts the user about the input error and asks for re-entry. Specifically, after semantic analysis is performed on the test request, intention recognition is performed to determine the test intention; existing intention-recognition techniques may be used, and details are not repeated here.
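The completeness check and the intention-recognition step might be mocked as follows; the required keys and the keyword vocabulary are invented for illustration and are not specified by the disclosure:

```python
def check_test_info(test_info_set, required_keys=("site", "scale", "scenario", "db_type")):
    """Return the missing items so the client can prompt the tester to re-enter them."""
    return [key for key in required_keys if key not in test_info_set]

def recognize_intent(request_text):
    """Toy keyword lookup standing in for real semantic analysis and intent recognition."""
    keywords = {
        "performance": "performance_test",
        "alarm": "alarm_function_test",
        "config": "configuration_test",
    }
    for keyword, intent in keywords.items():
        if keyword in request_text.lower():
            return intent
    return "unknown"
```

With these stubs, a request whose information set lacks `scenario` and `db_type` would trigger a re-entry prompt, while a request mentioning "performance" would map to the hypothetical `performance_test` intent.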
S103: according to the test intention, determining file information required for constructing the test environment and test data required for being applied to the test environment.
In the present embodiment, the file information required for constructing the test environment includes, but is not limited to: blueprint files, resource files, and database type files. Specifically, the file information and the test data may be determined as follows.
Executing a query operation on a pre-constructed knowledge graph according to the test intention to obtain the file information required for constructing the test environment and the test data to be applied to the test environment.
In this embodiment, in order to facilitate determining file information required for building a test environment and test data required for application to the test environment, a corresponding knowledge graph is pre-built, and the knowledge graph is queried according to the determined test intention, so that the file information required for building the test environment and the test data required for application to the test environment can be obtained.
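The knowledge-graph lookup can be pictured as a mapping from a test intention to the required files and test-data identifiers; the structure and every entry below are hypothetical:

```python
# Hypothetical knowledge graph: test intention -> file information and
# identifiers of the test data to apply to the environment.
KNOWLEDGE_GRAPH = {
    "performance_test": {
        "files": ["blueprint.yaml", "resources.json", "db_type.cfg"],
        "data_ids": ["perf_kpi", "mr_sample"],
    },
}

def query_knowledge_graph(intent):
    """Return (file_info, test_data_ids) for the given test intention."""
    node = KNOWLEDGE_GRAPH.get(intent)
    if node is None:
        raise KeyError(f"no knowledge-graph entry for intent {intent!r}")
    return node["files"], node["data_ids"]
```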
S104: Constructing the test environment according to the file information and the test information set.
In this embodiment, an environment construction script corresponding to the test environment can be generated from the file information; hardware planning is performed according to the file information and the test information; and the environment construction script is run on the basis of the obtained file information, the test information set, and the hardware resource plan, thereby constructing the test environment. Specifically, the test environment may be constructed as follows.
Determining first hardware resource information currently remaining in a hardware resource pool;
determining, according to the file information and the test information set, second hardware resource information required for constructing the test environment from the first hardware resource information;
generating an environment construction script corresponding to the file information;
and running the environment construction script according to the second hardware resource information, the file information, and the test information set, so as to construct the test environment.
In this embodiment, when the first hardware resource information is sufficient, the second hardware resource information required for constructing the test environment can be determined from it according to the file information and the test information set. The environment construction script is generated by a script generator. If the test information set includes environment version information, the corresponding environment version is extracted from the product library according to that information. Specifically, when the test information set includes environment version information, the environment construction script may be executed to construct the test environment according to the second hardware resources, the database type file, and the environment version.
In this embodiment, the first hardware resource information can be checked against the file information and the test information set to detect whether it meets the requirements for constructing the test environment; when it does, the second hardware resource information required for constructing the test environment is determined from the first hardware resource information according to the file information and the test information set. Specifically, the second hardware resource information may be determined as follows.
Generating a resource detection script according to the file information and the test information set;
running the resource detection script to detect the first hardware resource information to obtain a target detection result;
and, when the target detection result indicates that hardware resources are sufficient, determining, according to the file information and the test information set, second hardware resource information required for constructing the test environment from the first hardware resource information.
In this embodiment, the target detection result is either "hardware resources sufficient" or "hardware resources insufficient". When resources are sufficient, the second hardware resource information required for constructing the test environment can be determined from the first hardware resource information. When resources are insufficient, no environment construction script is generated, and an error prompt is returned to the client to notify the tester.
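The detect-then-allocate flow above might be sketched like this, with the resource pool reduced to a dictionary of counters (the resource names and amounts are invented for illustration):

```python
def detect_resources(pool, required):
    """Mock 'resource detection script': compare the remaining pool with the need."""
    enough = all(pool.get(name, 0) >= amount for name, amount in required.items())
    return "sufficient" if enough else "insufficient"

def build_environment(pool, required, file_info, test_info_set):
    """Construct the environment only when the detection result is 'sufficient'."""
    if detect_resources(pool, required) != "sufficient":
        # No construction script is generated; an error prompt goes back to the client.
        return {"error": "insufficient hardware resources"}
    second_resources = dict(required)  # the second hardware resource information
    # Running the environment construction script is mocked as assembling a dict.
    return {"resources": second_resources, "files": file_info, **test_info_set, "status": "ready"}
```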
In this embodiment, in order to more accurately determine the data required for application to the test environment, the test data may be determined according to the test intention in the following manner.
Determining, according to the test intention, identification information corresponding to the test data required by the test environment;
determining the data type corresponding to the test data according to the identification information, where the data types include a first data type and a second data type, the first data type being used to characterize basic configuration class data and the second data type being used to characterize non-basic configuration class data;
and determining, according to the data type, the test data required for application to the test environment.
In this embodiment, basic configuration class data includes, but is not limited to, configuration data, industrial parameter data, and log data; it can be understood as basic, non-real-time data. Non-basic configuration class data includes, but is not limited to, performance data and MR (measurement report) data; it can be understood as data with strong real-time requirements, a large time span, and a high data volume. Basic configuration class data can be obtained directly from the data warehouse. Non-basic configuration class data can also be obtained from the data warehouse, but only a small quantity is stored there; when the test environment requires a large volume of non-basic configuration class data, the warehouse alone cannot satisfy the test requirement. Therefore, the data type required by the test environment is determined according to the test intention, and the test data is obtained from the data warehouse according to the data type.
Specifically, the identification information is used for identifying the test data and the data types of the test data, the data types of different test data are distinguished through the identification information, and different test data acquisition modes are adopted according to the data types so as to acquire the test data corresponding to the different data types.
More specifically, the test data required for application to the test environment may be determined according to the data type in the following manner.
When the data type is the first data type, executing a query operation on the data warehouse according to the identification information corresponding to the first data type to obtain the test data required by the test environment;
when the data type is the second data type, executing the query operation on the data warehouse according to the identification information corresponding to the second data type to obtain training sample data;
inputting the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data;
and determining the training sample data and the target generation data as the test data required for application to the test environment.
In this embodiment, for basic configuration class data, the query operation can be executed directly against the data warehouse to obtain the basic configuration class test data required by the test environment. For non-basic configuration class test data, a large volume of such data can be produced from the small amount of non-basic configuration class data in the data warehouse combined with a machine learning model, so that the resulting test data better resembles the actual usage scenario of the network management software. The machine learning model is a generative adversarial network (GAN) model; through this pre-constructed model, a larger volume of non-basic configuration class test data can be obtained from a small quantity of such data.
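The dispatch on data type could be sketched as below; the identifier sets are hypothetical examples of the identification information:

```python
BASIC_IDS = {"config_base", "param_base", "log_base"}  # first data type: basic configuration class
NON_BASIC_IDS = {"perf_kpi", "mr_sample"}              # second data type: performance / MR data

def data_type_of(identifier):
    """Map identification information to a data type, selecting the acquisition path."""
    if identifier in BASIC_IDS:
        return "basic"       # fetched directly from the data warehouse
    if identifier in NON_BASIC_IDS:
        return "non_basic"   # small warehouse sample; may need GAN-based augmentation
    raise ValueError(f"unknown identifier: {identifier}")
```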
In this embodiment, it should be noted that if the data volume of the non-basic configuration class test data obtained from the data warehouse already satisfies the test requirement, the data can be taken directly from the warehouse without generating additional test data through the machine learning model. Whether the warehouse data volume satisfies the test requirement is judged against the data volume of non-basic configuration class test data determined from the test intention. Specifically, the training sample data and the target generation data may be determined as the test data required for the test environment as follows.
Determining, according to the test intention, a first data volume corresponding to the identification information, where the first data volume is used to characterize the required quantity of test data of the second data type;
determining a second data volume of the training sample data and the target generation data;
when the second data volume does not reach the first data volume, updating the training sample data with the training sample data and the target generation data, and repeating the target data generation step until the combined data volume of the original (non-updated) training sample data and the target generation data output by the machine learning model reaches the first data volume;
and determining the original training sample data and the target generation data whose combined data volume reaches the first data volume as the test data required by the test environment.
In this embodiment, the first data volume is the volume of non-basic configuration class test data required by the test environment. It should be noted that the volume of non-basic configuration class test data stored in the data warehouse is generally small and usually cannot meet the test requirement; when it can already reach the first data volume, however, there is no need to execute the step of inputting the training sample data into the pre-constructed machine learning model to obtain target generation data, and the training sample data itself is the non-basic configuration class test data that meets the requirement. When the warehouse data volume cannot reach the first data volume, the step of inputting the training sample data into the pre-constructed machine learning model is executed so that the model outputs target generation data corresponding to the training sample data. If the combined quantity of the generated target data and the training sample data still cannot reach the first data volume, the data generation is iterated repeatedly until non-basic configuration class test data of the first data volume is obtained.
During iterative data generation, the training sample data fed into the machine learning model in the previous round, together with the target generation data it output, are used as the training sample data for the current round, which in turn produces new target generation data. This repeats until the combined volume of the output target generation data and the original training sample data (i.e., the training sample data of the first round, before any update) reaches the first data volume, at which point iteration stops, and the output target generation data together with the original training sample data are used as the non-basic configuration class data required for testing.
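The iteration described above can be sketched with the model replaced by a stub generator; `first_volume` plays the role of the first data volume, and all names are illustrative:

```python
def augment_until(first_volume, initial_samples, generate):
    """Repeat the generation step until the combined volume of the original
    training samples and the generated data reaches first_volume."""
    original = list(initial_samples)   # training samples before any update
    generated = []
    training = list(initial_samples)
    while len(original) + len(generated) < first_volume:
        new_data = generate(training)        # model outputs target generation data
        generated.extend(new_data)
        training = training + new_data       # updated training samples for the next round
    return original, generated[: first_volume - len(original)]

# Toy stand-in for the GAN: each round yields one new record per training record.
mock_generate = lambda batch: [record + 1000 for record in batch]
originals, augmented = augment_until(10, [1, 2, 3], mock_generate)
```

Here two rounds suffice: the first round generates 3 records, the second 6 more, and the result is trimmed to the 7 still needed beyond the 3 originals.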
Specifically, since the machine learning model adopted is a generative adversarial network model, and the training sample data initially fed into it is real performance or MR data collected from the field, the target generation data output by the model is more realistic; applying this output to the constructed test environment improves the accuracy of the final test result.
In this embodiment, the machine learning model (i.e., the generative adversarial network model) includes a generation model and a discrimination model. The machine learning model can therefore output the target generation data corresponding to the training sample data in the following manner:
Inputting training sample data into the generation model so that the generation model outputs initial generation data;
Inputting the initial generation data and the training sample data into a discrimination model, so that the discrimination model outputs the credibility between the initial generation data and the training sample data;
And determining target generation data corresponding to the training sample data from the initial generation data according to the reliability and the preset cost function.
In this embodiment, the small amount of performance or MR data initially obtained from the field is referred to as real data and is fed to the input end of the generative adversarial network, that is, the input end of the generation model. The generation model automatically generates sample data from the input real data; both the real data and the generated sample data are input to the discrimination model, which outputs a credibility for each input, and the consistency and similarity of the generated data are judged by this credibility.
Specifically, referring to FIG. 2, the generative adversarial network model may be implemented as follows. A neural network is used to construct a generative adversarial network model comprising a generation model and a discrimination model; training sample data is selected according to requirements and input into the model to obtain a characteristic function matching the expected value and expected deviation, and a generation model and a discrimination model of higher performance are obtained through continuous training. Here G(z) denotes the generation model and z denotes random noise, i.e., random data produced by a Gaussian distribution model or another probability distribution model; in this embodiment the random noise represents the original data, and the model G converts the random noise z into data G(z). D denotes the discrimination model, which outputs for any input x a real number D(x) between 0 and 1 representing the credibility of x. With Pd and Pg denoting the distributions of the real data and the generated data respectively, the objective function of the discrimination model is: max_D E_{x~Pd}[log D(x)] + E_{x~Pg}[log(1-D(x))]. The optimization objective of the whole system is the preset cost function V(G, D): min_G max_D E_{x~Pd}[log D(x)] + E_{x~Pg}[log(1-D(x))]. Here max denotes maximization, E_{x~P} denotes the expectation over data drawn from distribution P, and log is the logarithm function. Continuously iterating these two objectives improves the performance of both the generation model and the discrimination model. The optimization proceeds as alternating iterations over D and G: first G is held fixed while D is optimized; after a period, D is held fixed while G is optimized; this repeats until the whole process converges.
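In standard GAN notation (the formulas below restate what this embodiment's objective functions appear to intend), the discrimination model's objective and the overall cost function are:

```latex
% Discrimination model objective: score real data high, generated data low
\max_D \; \mathbb{E}_{x \sim P_d}[\log D(x)] + \mathbb{E}_{x \sim P_g}[\log(1 - D(x))]

% Overall minimax optimization of the preset cost function V(G, D)
\min_G \max_D V(G, D) = \mathbb{E}_{x \sim P_d}[\log D(x)]
                      + \mathbb{E}_{x \sim P_g}[\log(1 - D(x))]
```

The generator G minimizes the same quantity the discriminator D maximizes, which is exactly the alternating fix-G-optimize-D / fix-D-optimize-G procedure described above.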
S105: and applying the test data to a test environment to obtain a test result corresponding to the test request.
In this embodiment, the obtained basic-configuration and non-basic-configuration data meeting the test requirements are applied to the test environment for testing, which improves both the test efficiency and the accuracy of the obtained test result.
According to the testing method provided by this embodiment, intent analysis is performed on the test request to obtain the test intention; the test environment is constructed and the test data prepared in combination with the test intention; and the final test result is obtained from the constructed environment and the prepared data. The test environment and test data thus no longer require excessive manpower to prepare, which improves test efficiency.
Two examples are shown below, specifically describing the testing procedure, as follows.
Example one: a communication operator A needs to upgrade its field network-management software to version B, which adds two new functions C and D; the field environment scale is E. For this scenario, a test environment must be installed and test data prepared for verification during the development stage. The steps are as follows:
1. The test request is input by voice or text. It must contain the basic information of the office site, scale, scenario type, and the functions to be tested. For this scenario, the following test request may be input: "test the C and D functions of the E-scale version B of office site A; prepare a test environment and test data";
2. The validity of the test request is detected first; the test intention corresponding to the request is then determined, and suggestions for environment construction and test data preparation are given, including:
1) Test environment construction suggestion: which blueprint files, resource files, etc. the scenario's test environment needs, and how to plan the use of existing hardware resources;
2) Test data preparation suggestion: which test data the functions under test need — function C needs network element configuration data and performance data, while function D needs network element configuration data, engineering parameter data, two weeks of performance data, and two weeks of MR data;
3. The script generator generates an environment construction script according to the test environment construction suggestion given by the test intention and starts to construct the test environment;
4. The basic data is imported directly according to the data preparation suggestion given by the test intention analysis;
5. According to the data preparation suggestion given by the test intention analysis, a generation model and a discrimination model of a neural network are built to form a generative adversarial network; a small amount of field performance data and decoded MR data from the data warehouse (i.e., real data) is input, and training starts to generate simulation data. Since the existing field data in the data warehouse covers only one day, the simulation data and performance data output by the first round of training are fed back into the generative adversarial network until the output data amount meets the functional test requirement;
6. The obtained data is imported into the constructed test environment to obtain the corresponding test result.
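The six steps above can be sketched as one pipeline. Everything here is a hypothetical stand-in (the request-parsing regex, the stub builders) for the components the example describes, not the actual implementation:

```python
import re

def parse_request(text):
    """Crude intent extraction from a free-text test request (illustrative only)."""
    m = re.search(r"test (?:the )?(\w+) and (\w+) functions", text)
    return {"functions": list(m.groups())} if m else {"functions": []}

def run_test(request, build_env, prepare_data, apply_data):
    """Parse the request, build the environment, prepare data, run the test."""
    intent = parse_request(request)
    env = build_env(intent)         # step 3: environment construction script
    data = prepare_data(intent)     # steps 4-5: base import + GAN-generated data
    return apply_data(env, data)    # step 6: import data, collect the result

result = run_test(
    "test C and D functions of the E-scale B version of office site A",
    build_env=lambda intent: "env",
    prepare_data=lambda intent: ["record"],
    apply_data=lambda env, data: (env, len(data)),
)
```

In a real system each lambda would be a full component; the point is only the request → intent → environment → data → result ordering.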
Example two: an internet company has developed new functions A and B, and a test environment must first be built and test data prepared to test them.
1. Input a test request: new functions A and B need to be tested; prepare a test environment and test data;
2. Determine the test intention corresponding to the test request: the validity of the test request is detected first, the corresponding test intention is then determined, and suggestions for environment construction and test data preparation are given, including:
1) Test environment construction suggestion: which blueprint files, resource files, etc. the function tests of A and B need, and how to plan the use of existing hardware resources;
2) Test data preparation suggestion: function A needs the user's C data with a required data amount of D, and function B needs the user's D and E data with a required data amount of F; a small amount of C, D, and E data exists in the data warehouse, but the amount does not meet the requirement, so a large amount of simulation data must be generated with the generative adversarial network;
3. The script generator generates an environment construction script according to the test environment construction suggestion given by the test intention and starts to construct the test environment;
4. A generative adversarial network is constructed according to the suggestion given by the test intention; a small amount of real data is input, and data meeting the data amount requirement is generated;
5. The obtained data is imported into the constructed test environment to obtain the corresponding test result.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a testing device according to an embodiment of the disclosure. The test device provided by the embodiment of the disclosure comprises a receiving module 10, a determining module 20, a constructing module 30 and an application module 40, wherein the receiving module 10 is used for receiving a test request, and the test request carries a test information set; a determining module 20, configured to determine a test intention corresponding to the test request; the determining module 20 is further configured to determine file information required for constructing a testing environment and test data required for the testing environment according to the testing intention; a construction module 30, configured to construct the test environment according to the file information and the test information set; and the application module 40 is configured to apply the test data to the test environment to obtain a test result corresponding to the test request.
In this embodiment, the construction module 30 is further configured to:
determining first hardware resource information currently remained in a hardware resource pool;
Determining second hardware resource information required for constructing the test environment from the first hardware resource information according to the file information and the test information set;
Generating an environment construction script corresponding to the file information;
And running the environment construction script according to the second hardware resource information, the file information and the test information set to construct the test environment.
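A minimal sketch of generating and running an environment construction script (the script content and file handling are illustrative assumptions; a real blueprint-driven script would be far richer and would also consume the hardware resource information and the test information set):

```python
import os
import subprocess
import tempfile

def build_environment(file_info):
    """Render a tiny shell script from the blueprint info and run it."""
    script = "#!/bin/sh\necho building %s\n" % file_info["blueprint"]
    fd, path = tempfile.mkstemp(suffix=".sh")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(script)          # the "environment construction script"
        done = subprocess.run(["sh", path], capture_output=True,
                              text=True, check=True)
        return done.stdout.strip()   # build log stands in for the built environment
    finally:
        os.remove(path)

result = build_environment({"blueprint": "scene-B.yaml"})
```

The `blueprint` key and the echoed build log are placeholders for whatever the generated script actually provisions.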
In this embodiment, the construction module 30 is further configured to:
Generating a resource detection script according to the file information and the test information set;
running the resource detection script to detect the first hardware resource information to obtain a target detection result;
And under the condition that the target detection result is that the hardware resources are sufficient, determining second hardware resource information required for constructing the test environment from the first hardware resource information according to the file information and the test information set.
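The detection step can be illustrated as a simple comparison of required versus remaining resources (the resource names and dictionary shape are assumptions for illustration):

```python
def detect_resources(pool, required):
    """Return whether the remaining pool covers the requirement, plus any shortfall."""
    lacking = {k: required[k] - pool.get(k, 0)
               for k in required if pool.get(k, 0) < required[k]}
    return ("sufficient" if not lacking else "insufficient", lacking)

# first hardware resource information vs. what the blueprint needs
status, gap = detect_resources({"cpu": 32, "mem_gb": 128},
                               {"cpu": 16, "mem_gb": 64})
```

Only when the result is "sufficient" would the construction module go on to pick the second hardware resource information from the pool.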
In this embodiment, the determining module 20 is further configured to:
Determining identification information corresponding to test data required by the test environment according to the test intention;
Determining a data type corresponding to the test data according to the identification information; wherein the data types include a first data type and a second data type; the first data type is used for representing basic configuration class data, and the second data type is used for representing non-basic configuration class data;
And determining test data required for application to the test environment according to the data type.
In this embodiment, the determining module 20 is further configured to:
when the data type is the first data type, executing query operation of a data warehouse according to the identification information corresponding to the first data type so as to obtain the test data required by the test environment;
when the data type is the second data type, executing the query operation of the data warehouse according to the identification information corresponding to the second data type so as to obtain training sample data;
inputting the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data;
The training sample data and the target generation data are determined as the test data required for application to the test environment.
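The dispatch on data type described above might be organized as follows; `query_warehouse` and `generate_with_model` are hypothetical stand-ins for the data warehouse query and the machine learning model:

```python
FIRST_TYPE = "basic-configuration"
SECOND_TYPE = "non-basic-configuration"

def prepare_test_data(data_type, ident, query_warehouse, generate_with_model):
    """Basic-configuration data comes straight from the warehouse; non-basic
    data is the warehouse samples plus model-generated records."""
    if data_type == FIRST_TYPE:
        return query_warehouse(ident)
    samples = query_warehouse(ident)               # training sample data
    return samples + generate_with_model(samples)  # plus target generation data

basic = prepare_test_data(FIRST_TYPE, "ne-config", lambda i: ["c1"], None)
mixed = prepare_test_data(SECOND_TYPE, "perf",
                          lambda i: ["s1"], lambda s: ["g1", "g2"])
```

The identification information (`ident`) selects what to query; the second branch returns both the samples and the generated data, matching the determining module's behavior.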
In this embodiment, the determining module 20 is further configured to:
And determining a first data volume corresponding to the identification information according to the test intention, wherein the first data volume is used for representing the quantity of the test data corresponding to the second data type.
In this embodiment, the determining module 20 is further configured to:
determining a second amount of data for the training sample data and the target generation data;
when the second data volume does not reach the first data volume, updating the training sample data according to the training sample data and the target generation data, and repeating the target data generation step until the combined second data volume of the target generation data output by the machine learning model and the never-updated training sample data reaches the first data volume;
And determining the target generation data corresponding to the training sample data and the second data volume which are not updated and correspond to the first data volume as the test data required by the test environment.
In this embodiment, the machine learning model includes a generation model and a discrimination model.
In this embodiment, the determining module 20 is further configured to:
inputting the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data, wherein the method comprises the following steps:
Inputting the training sample data into the generation model so that the generation model outputs initial generation data;
Inputting the initial generation data and the training sample data into the discrimination model so that the discrimination model outputs the credibility between the initial generation data and the training sample data;
And determining target generation data corresponding to the training sample data from the initial generation data according to the credibility and a preset cost function.
In this embodiment, the determining module 20 is further configured to:
And executing the query operation of the pre-constructed knowledge graph according to the test intention to obtain file information required by constructing a test environment and test data required by being applied to the test environment.
According to the testing device provided by this embodiment, intent analysis is performed on the test request to obtain the test intention; the test environment is constructed and the test data prepared in combination with the test intention; and the final test result is obtained from the constructed environment and the prepared data. The test environment and test data thus no longer require excessive manpower to prepare, which improves test efficiency.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 400 shown in fig. 4 includes: at least one processor 401, a memory 402, at least one network interface 404, and other user interfaces 403. The various components in the electronic device 400 are coupled together by a bus system 405, which is used to enable connection and communication between these components. In addition to a data bus, the bus system 405 includes a power bus, a control bus, and a status signal bus; however, for clarity of illustration, the various buses are all labeled as bus system 405 in fig. 4.
The user interface 403 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen, etc.).
It is to be appreciated that the memory 402 in embodiments of the present disclosure may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM), and direct Rambus random access memory (DRRAM). The memory 402 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 402 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 4021 and application programs 4022.
The operating system 4021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application programs 4022 include various application programs, such as a media player (Media Player) and a browser (Browser), for implementing various application services. A program implementing the method of the embodiment of the present disclosure may be included in the application program 4022.
In the embodiments of the present disclosure, the processor 401 is configured to execute the method steps provided by the method embodiments by calling a program or an instruction stored in the memory 402, specifically, a program or an instruction stored in the application program 4022, for example, including: receiving a test request, wherein the test request carries a test information set; determining a test intention corresponding to the test request; determining file information required for constructing a test environment and test data required for being applied to the test environment according to the test intention; constructing a test environment according to the file information and the test information set; and applying the test data to a test environment to obtain a test result corresponding to the test request.
The method disclosed in the embodiments of the present disclosure may be applied to the processor 401 or implemented by the processor 401. The processor 401 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 401 or by instructions in the form of software. The processor 401 may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present disclosure. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present disclosure may be embodied directly in hardware, in a software module executed by a decoding processor, or in a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in the memory 402, and the processor 401 reads the information in the memory 402 and, in combination with its hardware, performs the steps of the above method.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units for performing the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The electronic device provided in this embodiment may be the electronic device shown in fig. 4, and can perform all the steps of the testing method shown in fig. 1 to achieve the technical effects of that method; for details, refer to the description of fig. 1, which is not repeated here for brevity.
The disclosed embodiments also provide a storage medium (computer-readable storage medium) that stores one or more programs. The storage medium may comprise volatile memory, such as random access memory; nonvolatile memory, such as read-only memory, flash memory, a hard disk, or a solid state disk; or a combination of the above types of memory.
The one or more programs stored in the storage medium are executable by one or more processors to implement the testing method performed on the test device side as described above.
The processor is used for executing the test program stored in the memory to realize the following steps of the test method executed on the test equipment side: receiving a test request, wherein the test request carries a test information set; determining a test intention corresponding to the test request; determining file information required for constructing a test environment and test data required for being applied to the test environment according to the test intention; constructing a test environment according to the file information and the test information set; and applying the test data to a test environment to obtain a test result corresponding to the test request.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative elements and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. The software module may be disposed in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing description of the embodiments illustrates the general principles of the present invention and is not intended to limit the invention to the particular embodiments disclosed; any modifications, equivalents, improvements, and the like that fall within the spirit and principles of the present invention are intended to be included within its scope.

Claims (11)

1. A method of testing, comprising:
receiving a test request, wherein the test request carries a test information set;
determining a test intention corresponding to the test request;
determining file information required for constructing a test environment and test data required for being applied to the test environment according to the test intention;
Constructing the test environment according to the file information and the test information set;
and applying the test data to the test environment to obtain a test result corresponding to the test request.
2. The method of claim 1, wherein said constructing said test environment from said file information and said set of test information comprises:
determining first hardware resource information currently remained in a hardware resource pool;
Determining second hardware resource information required for constructing the test environment from the first hardware resource information according to the file information and the test information set;
Generating an environment construction script corresponding to the file information;
And running the environment construction script according to the second hardware resource information, the file information and the test information set to construct the test environment.
3. The method according to claim 2, wherein determining second hardware resource information required for constructing the test environment from the first hardware resource information according to the file information and the test information set includes:
Generating a resource detection script according to the file information and the test information set;
running the resource detection script to detect the first hardware resource information to obtain a target detection result;
And under the condition that the target detection result is that the hardware resources are sufficient, determining second hardware resource information required for constructing the test environment from the first hardware resource information according to the file information and the test information set.
4. The method of claim 1, wherein the test data required for application to the test environment is determined by:
Determining identification information corresponding to test data required by the test environment according to the test intention;
Determining a data type corresponding to the test data according to the identification information; wherein the data types include a first data type and a second data type; the first data type is used for representing basic configuration class data, and the second data type is used for representing non-basic configuration class data;
And determining test data required for application to the test environment according to the data type.
5. The method of claim 4, wherein determining test data required for application to the test environment based on the data type comprises:
when the data type is the first data type, executing query operation of a data warehouse according to the identification information corresponding to the first data type so as to obtain the test data required by the test environment;
when the data type is the second data type, executing the query operation of the data warehouse according to the identification information corresponding to the second data type so as to obtain training sample data;
inputting the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data;
The training sample data and the target generation data are determined as the test data required for application to the test environment.
6. The method according to claim 5, further comprising:
Determining a first data volume corresponding to the identification information according to the test intention, wherein the first data volume is used for representing the quantity of the test data corresponding to the second data type;
the determining the training sample data and the target generation data as the test data required for application to the test environment includes:
determining a second amount of data for the training sample data and the target generation data;
When the second data volume does not reach the first data volume, updating the training sample data according to the training sample data and the target generation data, and repeating the target data generation step until the second data volume of the training sample data, which is output by the machine learning model and corresponds to the target generation data and is not updated, reaches the first data volume;
And determining the target generation data corresponding to the training sample data and the second data volume which are not updated and correspond to the first data volume as the test data required by the test environment.
7. The method of claim 5, wherein the machine learning model comprises a generative model and a discriminant model;
inputting the training sample data into a pre-constructed machine learning model, so that the machine learning model outputs target generation data corresponding to the training sample data, wherein the method comprises the following steps:
Inputting the training sample data into the generation model so that the generation model outputs initial generation data;
Inputting the initial generation data and the training sample data into the discrimination model so that the discrimination model outputs the credibility between the initial generation data and the training sample data;
And determining target generation data corresponding to the training sample data from the initial generation data according to the credibility and a preset cost function.
8. The method of claim 1, wherein determining file information required to build a test environment and test data required to apply to the test environment based on the test intent comprises:
And executing the query operation of the pre-constructed knowledge graph according to the test intention to obtain file information required by constructing a test environment and test data required by being applied to the test environment.
9. A test device, comprising:
The receiving module is used for receiving a test request, wherein the test request carries a test information set;
the determining module is used for determining the test intention corresponding to the test request;
the determining module is further used for determining file information required by building a test environment and test data required by the test environment according to the test intention;
The construction module is used for constructing the test environment according to the file information and the test information set;
And the application module is used for applying the test data to the test environment to obtain a test result corresponding to the test request.
10. An electronic device, comprising: a processor and a memory, the processor being configured to execute a test program stored in the memory to implement the test method of any one of claims 1 to 8.
11. A storage medium storing one or more programs which are executable by one or more processors to implement the test method of any one of claims 1 to 8.
CN202211504261.8A 2022-11-28 2022-11-28 Test method, test device, electronic equipment and storage medium Pending CN118132406A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211504261.8A CN118132406A (en) 2022-11-28 2022-11-28 Test method, test device, electronic equipment and storage medium
PCT/CN2023/104877 WO2024113860A1 (en) 2022-11-28 2023-06-30 Test method and apparatus, and electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211504261.8A CN118132406A (en) 2022-11-28 2022-11-28 Test method, test device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN118132406A true CN118132406A (en) 2024-06-04

Family

ID=91239444

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211504261.8A Pending CN118132406A (en) 2022-11-28 2022-11-28 Test method, test device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN118132406A (en)
WO (1) WO2024113860A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102651700A (en) * 2011-02-28 2012-08-29 Sap股份公司 Management test automation
CN108459960A (en) * 2017-12-29 2018-08-28 中国平安财产保险股份有限公司 Method of automatic configuration, device, equipment and the storage medium of test environment
CN112286779B (en) * 2019-07-23 2024-04-09 腾讯科技(深圳)有限公司 Test task processing method and device, storage medium and computer equipment
CN113900925A (en) * 2021-09-06 2022-01-07 特赞(上海)信息科技有限公司 Test environment building and utilizing method, device, equipment and storage medium
CN114064473A (en) * 2021-11-12 2022-02-18 上汽通用五菱汽车股份有限公司 Vehicle machine system testing method and system, vehicle and computer readable storage medium

Also Published As

Publication number Publication date
WO2024113860A1 (en) 2024-06-06

Similar Documents

Publication Publication Date Title
US10642721B2 (en) Generation of automated testing scripts by converting manual test cases
US11074162B2 (en) System and a method for automated script generation for application testing
US20070061641A1 (en) Apparatus and method for generating test driver
US20090217246A1 (en) Evaluating Software Programming Skills
CN109542780B (en) Test method, test device and storage medium for natural language processing application
CN112416778A (en) Test case recommendation method and device and electronic equipment
CN109614325B (en) Method and device for determining control attribute, electronic equipment and storage medium
CN115562656A (en) Page generation method and device, storage medium and computer equipment
CN113434395A (en) Automatic generation method, device, equipment and medium of test case
US11422917B2 (en) Deriving software application dependency trees for white-box testing
CN115757014A (en) Power consumption testing method and device
CN115658452A (en) Buried point checking method, buried point checking device, readable storage medium and electronic equipment
CN113220597B (en) Test method, test device, electronic equipment and storage medium
CN117407312A (en) Application testing method, device, computer equipment and storage medium
CN116467219A (en) Test processing method and device
CN115795059A (en) Threat modeling method and system for agile development
CN118132406A (en) Test method, test device, electronic equipment and storage medium
CN116204396A (en) Test method and device for performance of analytical database
CN114416596A (en) Application testing method and device, computer equipment and storage medium
CN109597638B (en) Method and device for solving data processing and equipment linkage based on real-time computing engine
CN110647314B (en) Skill generation method and device and electronic equipment
CN117539793B (en) Method, device and storage medium for automatically testing UI (user interface) of browser
CN117971679A (en) Intelligent contract test data generation method, device, equipment and storage medium
CN113434409A (en) Micro-service test method and device
CN116955140A (en) SDK test method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication