WO2023103640A1 - Test case generation method and apparatus, electronic device, and storage medium - Google Patents

Test case generation method and apparatus, electronic device, and storage medium

Info

Publication number
WO2023103640A1
Authority
WO
WIPO (PCT)
Prior art keywords
product
sub
standard data
test
test cases
Prior art date
Application number
PCT/CN2022/128119
Other languages
English (en)
French (fr)
Inventor
陈惠娟
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2023103640A1 publication Critical patent/WO2023103640A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3676 Test management for coverage analysis
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Definitions

  • the embodiments of the present application relate to the field of software testing, and in particular to a method, device, electronic device and storage medium for generating test cases.
  • test design is an important part of the test work.
  • the test plan and test cases output in this link determine the direction and content of test execution.
  • the completeness and redundancy of test plans and test cases directly determine the quality and efficiency of test work.
  • the content that test cases need to cover should come from at least two aspects: the requirement specification and the code implementation. If the test cases focus only on the software code, identifying test points and designing use cases based on the code, it is difficult to find behaviors that have been specified but not implemented by the software, and 100% code test coverage cannot guarantee the usability of the software. Conversely, if the test cases focus only on the requirement specification, identifying test points and designing use cases based on the requirements, it is difficult to find behaviors that are implemented by the software but not specified in the requirements (for example, a Trojan horse virus), and there may be serious redundancy among the test cases.
  • the completeness and redundancy of test cases are generally measured by statistical code coverage after the test cases are executed, and then corrected by updating the test cases.
  • Common code coverage includes: line coverage, branch coverage, condition coverage, path coverage, etc.
  • however, this method can only judge the completeness and redundancy of test cases from the perspective of the code; it cannot measure them from the perspective of the requirement specification. This measure is particularly ineffective for large-scale wireless communication systems, which involve a large number of industry protocol specifications, and this in turn reduces the effectiveness of the finally generated test cases.
  • the main purpose of the embodiments of the present application is to propose a test case generation method, apparatus, electronic device and storage medium, aiming to measure the completeness and redundancy of test cases from the two perspectives of code implementation and requirement specification, and to improve the effectiveness of the test cases.
  • the embodiments of the present application provide a method for generating test cases, including: obtaining the standardized input and output data of each software entity of the product to be tested as product standard data; determining the characteristics of the product to be tested according to the functions of the product to be tested and the value of those functions to the user, where the characteristics represent product function and value; and, for each characteristic, obtaining the sub-functions used to realize the characteristic and generating test cases for each sub-function, where the sub-functions and the test cases of the sub-functions are described by the product standard data.
  • the embodiments of the present application also propose a test case generation apparatus, including: an acquisition module configured to acquire the standardized input and output data of each software entity of the product to be tested as product standard data; a determination module configured to determine the characteristics of the product to be tested according to the functions of the product to be tested and the value of those functions to the user, where the characteristics represent product function and value; and a generation module configured to obtain, for each characteristic, the sub-functions used to realize the characteristic and to generate test cases for each sub-function, where the sub-functions and the test cases of the sub-functions are described by the product standard data.
  • the embodiments of the present application also propose an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor so that the at least one processor can execute the method for generating test cases described above.
  • the embodiment of the present application also proposes a computer-readable storage medium storing a computer program, and when the computer program is executed by a processor, the method for generating test cases as described above is implemented.
  • in this test case generation method, before the test cases are generated, a product standard data set is generated according to the standardized input and output data of each software entity of the product to be tested. When the test cases are generated, all the characteristics contained in the product to be tested are determined according to the functions of the product to be tested and the value of those functions to the user; the sub-functions needed to realize each characteristic are then determined; each sub-function is described by the pre-generated product standard data; and test cases described by the product standard data can thus be generated automatically.
  • because the test cases are generated from product standard data that is in turn generated from the standardized input and output data of each software entity of the product under test, the generated test cases are mapped and linked to the functions of the product under test. Because multiple characteristics of the product to be tested are generated, the sub-functions needed for each characteristic are obtained, and test cases are generated for the sub-functions, the completeness of the test cases' coverage of the expected content of the requirements can be measured. The completeness and redundancy of the test cases can therefore be measured from the two angles of code implementation and requirement specification, which improves the effectiveness of the finally generated test cases.
  • Fig. 1 is the generation method flowchart of the test case in the embodiment of the present application.
  • Fig. 2 is a schematic diagram of the product standard data generation process in the embodiment of the present application.
  • FIG. 3 is a schematic diagram of timing interaction of DRX characteristics in an embodiment of the present application.
  • Fig. 4 is a schematic diagram of the sub-function splitting of the characteristics in the embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a test case generation device in another embodiment of the present application.
  • Fig. 6 is a schematic structural diagram of an electronic device in another embodiment of the present application.
  • existing test cases and test case generation methods can only judge the completeness and redundancy of test cases from the perspective of the code; they cannot measure the completeness and redundancy of test cases from the perspective of the requirement specification, so the generated test cases and the measurement method are less effective. Therefore, how to establish test cases whose completeness and redundancy can be measured from the perspectives of both the code and the requirement specification is an urgent problem to be solved.
  • the embodiments of the present application provide a method for generating test cases, including: obtaining the standardized input and output data of each software entity of the product to be tested as product standard data; determining each feature of the product to be tested according to the functions of the product to be tested and the value of those functions to the user, where a feature represents the function and value of the product; and, for each feature, obtaining each sub-function used to implement the feature and generating test cases for each sub-function, where the sub-functions and their test cases are all described by the product standard data.
  • a first aspect of the embodiments of the present application provides a method for generating test cases; refer to FIG. 1 for the specific flow of the method.
  • the method for generating test cases is applied to a terminal device with analysis capabilities, such as a computer, tablet, mobile phone or other electronic device. This embodiment takes application in a computer as an example for illustration. The method for generating test cases includes at least, but is not limited to, the following steps:
  • Step 101: acquire the standardized input and output data of each software entity of the product to be tested as product standard data.
  • the computer when the computer generates the test cases of the product to be tested, it first obtains the software entities contained in the product to be tested, and then uses the standardized input and output data of each software entity of the product to be tested as the product standard data describing the test case.
  • specifically, the computer obtains the standardized input and output data of each software entity of the product to be tested as product standard data by: decomposing the implementation architecture of the product to be tested into software entities; and extracting the standardized input and output data of each software entity from the software product version package. Each software entity obtained after decomposition includes one of the following, or any combination thereof: system, subsystem, module, and submodule.
  • before the computer acquires the product standard data, it needs to obtain the overall structure and levels at which the product under test realizes its functions, and decompose the implementation architecture of the product under test into structured, hierarchical software entities, obtaining software entities of various granularities.
  • in this way, the product standard data can support the requirement of generating test cases for each sub-function, and makes it convenient to measure, through the generated test cases, the completeness of coverage of the expected content of the requirements.
  • the specific flow diagram of computer-generated product standard data can be referred to Figure 2.
  • the product to be tested is split into several subsystems.
  • for example, if the product to be tested is a communication product, it can be split into subsystems such as configuration management, control plane, user plane, wireless scheduling, and platform.
  • each subsystem is then split further to obtain its modules (for example, the control plane subsystem of a communication product is further split into modules such as cell management, physical resource allocation, and process management), yielding software entities of the communication product at different granularities.
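The decomposition above can be sketched as a small entity tree; the class name, the `leaves()` helper, and the level labels are illustrative, not part of the patent.

```python
from dataclasses import dataclass, field

@dataclass
class SoftwareEntity:
    """A node in the decomposed implementation architecture
    (system, subsystem, module, or submodule)."""
    name: str
    level: str  # "system" | "subsystem" | "module" | "submodule"
    children: list = field(default_factory=list)

    def leaves(self):
        """Yield the finest-granularity entities under this node."""
        if not self.children:
            yield self
        for child in self.children:
            yield from child.leaves()

# Decomposition of a communication product, following the example above.
product = SoftwareEntity("communication product", "system", [
    SoftwareEntity("configuration management", "subsystem"),
    SoftwareEntity("control plane", "subsystem", [
        SoftwareEntity("cell management", "module"),
        SoftwareEntity("physical resource allocation", "module"),
        SoftwareEntity("process management", "module"),
    ]),
    SoftwareEntity("user plane", "subsystem"),
    SoftwareEntity("wireless scheduling", "subsystem"),
    SoftwareEntity("platform", "subsystem"),
])

leaf_names = [entity.name for entity in product.leaves()]
```

The leaves of this tree are the finest-granularity software entities for which standardized input and output data would be extracted.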
  • next, the computer sorts out the content and rules of the interaction between each software entity and its peripheral environment, and, according to the sorted-out rules, extracts the standardized input and output data of each software entity from the software product version package as product standard data.
  • the computer extracts the standardized input and output data of each software entity from the software product version package, including: extracting the standardized input and output data of each software entity from the software product version package according to the category of the input and output data interacting with the outside;
  • the categories include one or any combination of the following: configuration parameters, key indicators, abnormal alarms, protocol cells, and software interfaces.
  • before the computer extracts the standardized input and output data of each software entity, it classifies the product standard data to be extracted; according to its purpose and the way it is managed in software products, the data can be divided into categories such as configuration parameters, key indicators, abnormal alarms, protocol cells, and software interfaces.
  • in practice, the main product standard data that can be extracted includes: configuration parameters, protocol cells, software interfaces, test logs, counters, key indicators, alarms, and so on.
  • the respective data attributes are sorted out, and the storage format of each type of data is determined in combination with the attributes.
  • the data attributes of key indicators include: indicator number, indicator name, measurement type, indicator meaning, calculation formula, unit, etc.
  • extraction rules for automatic extraction from version packages and code bases are defined. The automatic extraction of this standardized data and its attributes from the version data package is completed according to these rules, and the data is then stored according to the storage address for that type of data, forming a standard input and output data set for the expected content of the requirement specification of the communication system under test.
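As a rough illustration of grouping extracted data into such a per-category set, consider the following sketch; the record format and all example names and values are hypothetical, not taken from the patent.

```python
# Categories of product standard data, as listed above.
CATEGORIES = ("configuration parameter", "key indicator", "abnormal alarm",
              "protocol cell", "software interface")

def build_standard_data_set(records):
    """Group (category, name, attributes) records extracted from a
    version package into a per-category product standard data set."""
    data_set = {category: [] for category in CATEGORIES}
    for category, name, attributes in records:
        if category in data_set:
            data_set[category].append({"name": name, **attributes})
    return data_set

# Hypothetical records, with attributes like those listed for key indicators.
records = [
    ("key indicator", "ul_throughput",
     {"number": "KI-001", "measurement_type": "gauge", "unit": "Mbps"}),
    ("configuration parameter", "drx_cycle", {"unit": "ms"}),
]
standard_data = build_standard_data_set(records)
```

The resulting dictionary plays the role of the standard input and output data set against which completeness is later measured.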
  • Step 102: determine the characteristics of the product under test according to the functions of the product under test and the value of those functions to the user.
  • specifically, the functions provided by the product to be tested and the value of those functions to users are analyzed separately; the functions are then classified according to the specific functions and their value to users, splitting out a series of characteristics of the product to be tested, where a characteristic represents product function and value.
  • for example, typical characteristics of a communication system include: providing an uplink power control service to achieve the effects of ensuring demodulation, reducing interference, and saving power; and providing a DRX (discontinuous reception) configuration service to achieve a balance between service delay and terminal power saving.
  • the computer can also clarify the effect of each characteristic; the implementation scheme of each characteristic is described by the decomposed systems, subsystems, modules and submodules and the product standard data that interacts with them, from which the product standard data involved in each characteristic is determined.
  • when requirements change, the changed content is assigned to one or more of the characteristics that have already been determined, and new characteristics can be extracted if necessary, so as to ensure that the functions and values of the software product are described through the characteristic system. The software entities and product standard data of the affected characteristics are revised according to the changes in the new requirements. At the same time, the product standard data collection of a requirement can also be maintained at the granularity of the requirement.
  • for example, for a new requirement involving DRX (Discontinuous Reception), the requirement can be assigned to the characteristic "providing a DRX configuration service to achieve a balance between service delay and terminal power saving", and the product standard data involved in this characteristic is revised according to the change in this requirement.
  • Step 103: for each characteristic, obtain each sub-function used to realize the characteristic, and generate test cases for each sub-function.
  • specifically, after the computer has divided the product to be tested into characteristics, it analyzes, according to the functions and values of each characteristic, the functions that need to be completed by each of the software entities already split out, and splits the characteristic into sub-functions step by step.
  • the sub-functions build a systematic test framework for the product to be tested, and test cases are designed for each sub-function in the test framework, where each sub-function of the product to be tested and the test cases of each sub-function are described by the product standard data.
  • by generating test cases for the sub-functions of the characteristics, the completeness of the test cases' coverage of the expected content of the requirements can be measured, and the completeness and redundancy of the test cases can then be measured from the perspectives of both code implementation and requirement specification.
  • in one example, when generating test cases, the computer draws a timing interaction diagram of the subsystems and modules involved in each characteristic, and splits out the sub-functions of each characteristic in combination with the drawn timing interaction diagram.
  • in this way, a systematic test framework of the communication system under test is constructed. Test cases are designed for each sub-function in the test architecture, and the input and output of each sub-function and its use cases are described by the above-mentioned product standard data.
  • for example, the timing interaction diagram of the characteristic "providing a DRX (discontinuous reception) configuration service to achieve a balance between service delay and terminal power saving" (referred to as characteristic DRX) is shown in FIG. 3.
  • the computer splits the sub-functions of the characteristic step by step, following the order of the software bearer entities from system -> multiple subsystems -> single subsystem -> single module; a schematic diagram of the sub-function splitting of each characteristic is shown in FIG. 4. The characteristic is split down to the finest-granularity sub-function level, and these sub-functions make up the test architecture for the characteristic. Use cases are designed for each sub-function, and the input and output of each sub-function and its use cases are described by the product standard data. All characteristics and their sub-functions constitute the systematic test framework of the communication system under test, and the use cases of all sub-functions form its systematic test case set.
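The characteristic -> sub-function -> use case structure described above might be represented as follows; all sub-function names, data items, and case identifiers here are invented for illustration.

```python
# A hypothetical representation of one characteristic and its test architecture.
feature_drx = {
    "feature": "DRX configuration service",
    "sub_functions": {
        "configure_drx_cycle": {
            "bearer": {"control plane/cell management"},
            "inputs": {"drx_cycle", "inactivity_timer"},  # product standard data
            "outputs": {"drx_config_ack"},
            "test_cases": ["TC_DRX_001", "TC_DRX_002"],
        },
        "apply_drx_schedule": {
            "bearer": {"wireless scheduling"},
            "inputs": {"drx_config_ack"},
            "outputs": {"ue_sleep_ratio"},
            "test_cases": ["TC_DRX_003"],
        },
    },
}

# The systematic test case set is the union of all sub-function use cases.
case_set = [case
            for sub_function in feature_drx["sub_functions"].values()
            for case in sub_function["test_cases"]]
```

Because every input and output is a named item of product standard data, coverage and redundancy checks over this structure reduce to set operations, as the following sections show.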
  • if, during the design of characteristic sub-functions and use cases, the computer finds that the extracted product standard data is insufficient, it can further decompose the software entities of the product to be tested and add specific product standard data based on the standardized input and output data of the newly added software entities; if it finds that the functions of some subsystems or modules cannot be assigned to the extracted characteristic set, the functions and values of the product to be tested can be further expanded to obtain new specific characteristics.
  • after the computer generates the test cases for each sub-function, the method also includes: checking the test completeness of the product to be tested according to the product standard data, and/or checking the redundancy of the generated test cases according to the product standard data; and supplementing the test cases according to the result of the completeness check, and/or integrating the test cases according to the result of the redundancy check.
  • after the computer generates initial test cases for each sub-function based on the product standard data, it can check the test completeness of the product to be tested based on the product standard data in a preset order; alternatively, in the process of continuous integration of product requirements and continuous design of use cases, it can automatically extract the latest standardized data set of each version, compare it with the standardized data and attributes covered by the current version's use cases, check the test completeness of the product to be tested, and supplement the test cases where completeness is insufficient. While checking the test completeness, the computer can also check the redundancy among the generated test cases according to the product standard data and integrate the redundant test cases. Because test completeness and test case redundancy are detected during the test case generation process on the basis of the product standard data, the test case architecture and the test cases can be improved and optimized during design, improving the efficiency of use case refinement and optimization.
  • the computer checks the test completeness of the product to be tested based on the product standard data using one of the following, or any combination thereof: checking the completeness of the overall test framework according to the quantity of product standard data covered by sub-functions versus the quantity of all product standard data; checking the completeness of the overall test cases according to the quantity of product standard data covered by test cases versus the quantity of all product standard data; checking the completeness of a characteristic's test framework according to the quantity of product standard data covered by the characteristic's sub-functions versus the quantity of all product standard data involved in the characteristic; checking the completeness of a characteristic's test cases according to the quantity of product standard data covered by the characteristic's test cases versus the quantity of all product standard data involved in the characteristic; checking the completeness of a requirement's test architecture according to the quantity of product standard data covered by the requirement's sub-functions versus the quantity of all product standard data involved in the requirement; and checking the completeness of a requirement's test cases according to the quantity of product standard data covered by the requirement's use cases versus the quantity of all product standard data involved in the requirement.
  • based on the product standard data, the completeness of the test can be measured from multiple angles by calculations such as the following:
  • completeness of the requirement test architecture = quantity of product standard data covered by the sub-functions of the requirement / quantity of all product standard data involved in the requirement;
  • completeness of the requirement test cases = quantity of product standard data covered by the use cases of the requirement / quantity of all product standard data involved in the requirement.
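The ratios above can be computed directly as set coverage; the data item names in this sketch are hypothetical.

```python
def completeness(covered, universe):
    """Quantity of covered product standard data divided by the quantity
    of all product standard data involved (0.0 for an empty universe)."""
    covered, universe = set(covered), set(universe)
    if not universe:
        return 0.0
    return len(covered & universe) / len(universe)

# Hypothetical product standard data involved in one requirement.
all_data = {"drx_cycle", "inactivity_timer", "drx_config_ack", "ue_sleep_ratio"}
covered_by_sub_functions = {"drx_cycle", "drx_config_ack", "ue_sleep_ratio"}
covered_by_use_cases = {"drx_cycle", "drx_config_ack"}

architecture_completeness = completeness(covered_by_sub_functions, all_data)  # 0.75
case_completeness = completeness(covered_by_use_cases, all_data)              # 0.5
```

The same function serves all six checks listed above; only the choice of covered set and universe changes.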
  • the computer can also supplement sub-functions or test cases when the completeness of the test does not meet requirements. For example, for each item of the various kinds of product standard data of the communication system under test, such as configuration parameters, protocol cells, software interfaces, counters, key indicators, and alarms, the number of sub-functions and use cases that reference the item as input or output is counted in turn. When the completeness of the overall test architecture or test cases is low, the uncovered product standard data is examined, the characteristics to which the unreferenced product standard data belongs are determined according to the characteristic splitting scheme, sub-functions are split out and supplemented for those characteristics, and test cases are supplemented for the supplemented sub-functions, completing test coverage of the product standard data that is currently not covered.
  • the computer detects the redundancy of the generated test cases according to the product standard data by: detecting the similarity between sub-functions according to the bearer entities of the sub-functions and the product standard data; and judging that redundancy exists among the test cases of sub-functions whose similarity is greater than a preset threshold.
  • when the computer detects the redundancy of test cases, it characterizes redundancy directly by the functional similarity between the sub-functions corresponding to the test cases. Using the bearer entities of the sub-functions and their product standard data, it calculates the similarity between different sub-functions according to a similarity calculation algorithm, identifies sub-functions whose similarity is higher than the preset threshold, judges that redundancy exists among the test cases of those sub-functions, and integrates the sub-functions and use cases above the preset threshold to remove redundancy, reducing the redundancy of the test cases.
  • by checking the similarity of sub-functions, test cases that are likely to be mutually redundant are found by identifying similar sub-functions, and those test cases are then integrated to reduce the redundancy of the test cases.
  • in the similarity calculation, S_Input represents the input similarity, S_Output represents the output similarity, and S_Bearer represents the bearer entity similarity.
  • the input similarity, output similarity, and bearer entity similarity can be calculated by the following formulas:
  • S_Input = |M1_Input ∩ M2_Input| / |M1_Input ∪ M2_Input|;
  • S_Output = |M1_Output ∩ M2_Output| / |M1_Output ∪ M2_Output|;
  • S_Bearer = |M1_Bearer ∩ M2_Bearer| / |M1_Bearer ∪ M2_Bearer|.
  • M1_Input and M2_Input represent the input product standard data sets of sub-functions M1 and M2, respectively;
  • M1_Output and M2_Output represent the output product standard data sets of sub-functions M1 and M2, respectively;
  • M1_Bearer and M2_Bearer represent the sets of software bearer entities of sub-functions M1 and M2, respectively.
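Each of the three similarities is a Jaccard ratio over the corresponding sets. The sketch below combines them with equal weights into one sub-function similarity; that equal weighting, and the example data, are assumptions for illustration — the patent does not specify how the three values are combined.

```python
def jaccard(a, b):
    """|a ∩ b| / |a ∪ b|; by convention, two empty sets count as identical."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def sub_function_similarity(m1, m2, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Combine S_Input, S_Output and S_Bearer into one score
    (equal weighting is an illustrative assumption)."""
    w_in, w_out, w_bearer = weights
    return (w_in * jaccard(m1["inputs"], m2["inputs"])
            + w_out * jaccard(m1["outputs"], m2["outputs"])
            + w_bearer * jaccard(m1["bearer"], m2["bearer"]))

# Two hypothetical sub-functions sharing inputs and bearer entity but not outputs.
m1 = {"inputs": {"drx_cycle"}, "outputs": {"drx_config_ack"},
      "bearer": {"cell management"}}
m2 = {"inputs": {"drx_cycle"}, "outputs": {"ue_sleep_ratio"},
      "bearer": {"cell management"}}
similarity = sub_function_similarity(m1, m2)  # 2/3: identical inputs and bearer
```

Sub-function pairs whose score exceeds the preset threshold would then be flagged as candidates for use case integration.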
  • after the computer generates the test cases for each sub-function, the method also includes: identifying the cross-influences between sub-functions according to the relationships between the product standard data of different sub-functions, and supplementing test cases to cover those cross-influences. Specifically, for the sub-functions in the test architecture, mutual cross-influences can be identified based on the relationships between their corresponding product standard data; after the cross-influences between sub-functions are detected, supplementary test cases are used to cover the intersections of sub-functions that are currently not covered, improving the completeness of the test cases.
  • M1_Output ∩ M2_Input ≠ ∅, indicating that part of the input of sub-function M2 is determined by M1; after sub-function M1 changes, the use cases need to be revised synchronously to cover its impact on sub-function M2;
  • M1_Input_Resource ∩ M2_Input_Resource ≠ ∅, where M1_Input_Resource and M2_Input_Resource represent the resource-type product standard inputs of sub-functions M1 and M2; this formula indicates that sub-function M1 and sub-function M2 share some resources, and when one of the sub-functions increases or decreases its occupancy of the corresponding resources, the use cases need to be revised synchronously to cover the impact on the other sub-function;
  • M1_Output ∩ M2_Output ≠ ∅, indicating that some product standard outputs are jointly determined by sub-function M1 and sub-function M2; the interaction between the operation timing and operation ratio of M1 and M2 on this part of the output needs to be analyzed, and use cases designed to cover it.
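The three conditions can be checked mechanically as set intersections; the field names and example data below are hypothetical.

```python
def cross_influences(m1, m2):
    """Report which of the three cross-influence conditions above hold
    between sub-functions m1 and m2 (field names are illustrative)."""
    findings = []
    if set(m1["outputs"]) & set(m2["inputs"]):
        findings.append("output->input: revise M2 use cases when M1 changes")
    if set(m1.get("resources", ())) & set(m2.get("resources", ())):
        findings.append("shared resource: revise use cases when occupancy changes")
    if set(m1["outputs"]) & set(m2["outputs"]):
        findings.append("joint output: cover timing/ratio interaction of M1 and M2")
    return findings

# M1's output feeds M2's input, and both occupy the same hypothetical resource.
m1 = {"inputs": set(), "outputs": {"drx_config_ack"}, "resources": {"pdcch"}}
m2 = {"inputs": {"drx_config_ack"}, "outputs": {"ue_sleep_ratio"},
      "resources": {"pdcch"}}
issues = cross_influences(m1, m2)  # first two conditions hold
```

Each reported finding would correspond to a supplementary test case covering the intersection of the two sub-functions.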
  • in one example, the method further includes: when the product standard data changes, supplementing the test cases according to the changed product standard data.
  • after generating the test cases, the computer needs to maintain the test case set.
  • the requirements and expectations of the product under test may change across versions, causing data in the various kinds of product standard data to be added or removed and data attributes to change. The test cases must be maintained so that they can support new requirements while keeping their redundancy as low as possible.
  • to this end, each time a new version is produced, the computer extracts the current product standard data from the software version package or code library according to the pre-defined extraction rules, and compares it with the previously extracted data.
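A sketch of comparing the standard data sets of two versions to drive this maintenance; the version contents are invented for illustration.

```python
def standard_data_diff(previous, current):
    """Report which product standard data items were added or removed
    between two versions, so affected test cases can be supplemented
    or retired."""
    previous, current = set(previous), set(current)
    return {"added": current - previous, "removed": previous - current}

# Hypothetical standard data sets of two consecutive versions.
v1_data = {"drx_cycle", "inactivity_timer"}
v2_data = {"drx_cycle", "short_drx_cycle"}
diff = standard_data_diff(v1_data, v2_data)
# diff["added"] == {"short_drx_cycle"}; diff["removed"] == {"inactivity_timer"}
```

Added items point to characteristics needing supplementary sub-functions and use cases; removed items point to use cases that may have become redundant.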
  • another aspect of the embodiments of the present application relates to an apparatus for generating test cases; referring to FIG. 5, it includes:
  • an acquisition module 501, configured to acquire the standardized input and output data of each software entity of the product to be tested as product standard data;
  • a determination module 502, configured to determine each characteristic of the product to be tested according to the functions of the product to be tested and the value of those functions to the user, where a characteristic represents the function and value of the product;
  • a generation module 503, configured to obtain, for each characteristic, each sub-function used to realize the characteristic, and to generate test cases for each sub-function.
  • this embodiment is an apparatus embodiment corresponding to the method embodiment, and this embodiment can be implemented in cooperation with the method embodiment.
  • the relevant technical details mentioned in the method embodiments are still valid in this embodiment, and will not be repeated here in order to reduce repetition.
  • the related technical details mentioned in this embodiment can also be applied in the method embodiment.
  • modules involved in this embodiment are logical modules.
  • a logical unit can be a physical unit, or a part of a physical unit, or multiple physical units. Combination of units.
  • units that are not closely related to solving the technical problem proposed by the present invention are not introduced in this embodiment, but this does not mean that there are no other units in this embodiment.
  • FIG. 6: the electronic device includes: at least one processor 601; and a memory 602 communicatively connected to the at least one processor 601, where the memory 602 stores instructions executable by the at least one processor 601, and the instructions are executed by the at least one processor 601 so that the at least one processor 601 can execute the method for generating test cases described in any one of the above method embodiments.
  • the memory 602 and the processor 601 are connected by a bus; the bus may include any number of interconnected buses and bridges, and connects one or more processors 601 and the various circuits of the memory 602 together.
  • the bus may also connect various other circuits such as peripherals, voltage regulators, and power management circuits, all of which are well known in the art and therefore are not further described herein.
  • the bus interface provides an interface between the bus and the transceiver.
  • the transceiver may be a single element or multiple elements, such as multiple receivers and transmitters, providing a means for communicating with various other devices over a transmission medium.
  • data processed by the processor 601 is transmitted over the wireless medium through the antenna; further, the antenna also receives data and transmits the data to the processor 601.
  • the processor 601 is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions, while the memory 602 may be used to store data used by the processor 601 when performing operations.
  • another aspect of the embodiments of the present application also provides a computer-readable storage medium storing a computer program.
  • the above method embodiments are implemented when the computer program is executed by a processor.
  • the storage medium includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)
  • Stored Programmes (AREA)

Abstract

The present application discloses a method and apparatus for generating test cases, an electronic device, and a storage medium. The method includes: acquiring standardized input and output data of each software entity of a product under test as product standard data; determining each characteristic of the product under test according to the functions of the product under test and the value of those functions to the user, where a characteristic represents product function and value; for each characteristic, obtaining the sub-functions used to realize the characteristic and generating test cases for each sub-function; where the sub-functions and their test cases are all described with the product standard data.

Description

Method and Apparatus for Generating Test Cases, Electronic Device, and Storage Medium
Related Application
This application claims priority to Chinese patent application No. 202111493037.9, filed on December 8, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the field of software testing, and in particular to a method and apparatus for generating test cases, an electronic device, and a storage medium.
Background
With the continuous development and evolution of communication technology, the complexity and quality requirements of current software systems keep rising, and for large wireless communication system software this change is especially pronounced. Against this background, testing, as an important means of discovering software faults and of evaluating and improving software quality, is itself becoming increasingly important in quality and efficiency. Test design is a key link in testing work: the test schemes and test cases output by this link determine the direction and content of test execution, and their completeness and redundancy directly determine the quality and efficiency of the testing work.
Regarding what test cases should test, a general consensus in the testing field is that the content to be covered by test cases must come from at least two sources: the requirement specification and the code implementation. If test cases focus only on the software code, identifying test points and designing cases based on the code, it is difficult to discover behavior that the requirements specify but the software does not implement; even 100% code test coverage cannot guarantee that the software is usable. Conversely, if test cases focus only on the requirement specification, identifying test points and designing cases based on the requirements, it is difficult to discover behavior that the requirements do not specify but the software implements (for example, a Trojan virus), and severe redundancy may also exist among test cases.
In traditional testing, the completeness and redundancy of test cases are generally measured by collecting code coverage statistics after case execution and are corrected by subsequent case updates. Common code coverage metrics include line coverage, branch coverage, condition coverage, and path coverage. However, this approach can judge case completeness and redundancy only from the code perspective and cannot measure them from the perspective of the requirement specification. Especially for large wireless communication systems, which internally contain a large number of industry protocol specifications, this measurement approach is of limited effectiveness, which in turn leads to poor effectiveness of the finally generated test cases.
Summary
The main purpose of the embodiments of the present application is to propose a method and apparatus for generating test cases, an electronic device, and a storage medium, aiming to measure the completeness and redundancy of test cases from the two perspectives of code implementation and requirement specification and to improve the effectiveness of test cases.
To achieve the above purpose, an embodiment of the present application provides a method for generating test cases, including: acquiring standardized input and output data of each software entity of a product under test as product standard data; determining each characteristic of the product under test according to the functions of the product under test and the value of those functions to the user, where a characteristic represents product function and value; for each characteristic, obtaining the sub-functions used to realize the characteristic and generating test cases for each sub-function; where the sub-functions and their test cases are all described with the product standard data.
To achieve the above purpose, an embodiment of the present application also provides an apparatus for generating test cases, including: an acquisition module configured to acquire standardized input and output data of each software entity of a product under test as product standard data; a determining module configured to determine each characteristic of the product under test according to the functions of the product under test and the value of those functions to the user, where a characteristic represents product function and value; a generating module configured to obtain, for each characteristic, the sub-functions used to realize the characteristic and to generate test cases for each sub-function; where the sub-functions and their test cases are all described with the product standard data.
To achieve the above purpose, an embodiment of the present application also provides an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; where the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the method for generating test cases as described above.
To achieve the above purpose, an embodiment of the present application also provides a computer-readable storage medium storing a computer program, where the method for generating test cases as described above is implemented when the computer program is executed by a processor.
In the method for generating test cases provided by the embodiments of the present application, before generating test cases, a product standard data set is generated according to the standardized input and output data of each software entity of the product under test. During test case generation, all characteristics contained in the product under test are determined according to the functions of the product and the value of those functions to the user; the sub-functions needed to realize each characteristic are then determined, each sub-function is described with the pre-generated product standard data, and test cases described with the product standard data are generated for each sub-function. Generating test cases based on product standard data derived from the standardized input and output data of each software entity establishes a mapping link between the generated test cases and the functions of the product under test. Determining multiple characteristics of the product under test according to the product functions and their value to the user, obtaining the sub-functions each characteristic requires, and generating test cases for those sub-functions makes the coverage completeness of the test cases with respect to the expected requirement content measurable. The completeness and redundancy of the test cases can thus be measured from the two perspectives of code implementation and requirement specification, improving the effectiveness of the finally generated test cases.
Brief Description of the Drawings
One or more embodiments are exemplarily illustrated by the figures in the corresponding drawings; these exemplary illustrations do not constitute a limitation on the embodiments.
FIG. 1 is a flowchart of a method for generating test cases in an embodiment of the present application;
FIG. 2 is a schematic diagram of the product standard data generation flow in an embodiment of the present application;
FIG. 3 is a schematic diagram of the timing interaction of the DRX characteristic in an embodiment of the present application;
FIG. 4 is a schematic diagram of the sub-function decomposition of a characteristic in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an apparatus for generating test cases in another embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device in another embodiment of the present application.
Detailed Description
As can be seen from the background, traditional testing approaches and test case generation methods can judge case completeness and redundancy only from the code perspective and cannot measure them from the perspective of the requirement specification; the generated test cases and the measurement approach are of limited effectiveness. Therefore, how to establish test cases whose completeness and redundancy can be measured from both the code perspective and the requirement-specification perspective is a problem that urgently needs to be solved.
To solve the above problem, an embodiment of the present application provides a method for generating test cases, including: acquiring standardized input and output data of each software entity of a product under test as product standard data; determining each characteristic of the product under test according to the functions of the product under test and the value of those functions to the user, where a characteristic represents product function and value; for each characteristic, obtaining the sub-functions used to realize the characteristic and generating test cases for each sub-function; where the sub-functions and their test cases are all described with the product standard data.
In the method for generating test cases provided by the embodiments of the present application, before generating test cases, a product standard data set is generated according to the standardized input and output data of each software entity of the product under test. During test case generation, all characteristics contained in the product under test are determined according to the functions of the product and the value of those functions to the user; the sub-functions needed to realize each characteristic are then determined, each sub-function is described with the pre-generated product standard data, and test cases described with the product standard data are generated for each sub-function. Generating test cases based on product standard data derived from the standardized input and output data of each software entity establishes a mapping link between the generated test cases and the functions of the product under test. Determining multiple characteristics of the product under test according to the product functions and their value to the user, obtaining the sub-functions each characteristic requires, and generating test cases for those sub-functions makes the coverage completeness of the test cases with respect to the expected requirement content measurable. The completeness and redundancy of the test cases can thus be measured from the two perspectives of code implementation and requirement specification, improving the effectiveness of the finally generated test cases.
To make the purposes, technical solutions, and advantages of the embodiments of the present application clearer, the embodiments of the present application are described in detail below with reference to the drawings. However, those of ordinary skill in the art can understand that many technical details are presented in the embodiments to help the reader better understand the present application. Even without these technical details and the various changes and modifications based on the following embodiments, the technical solutions claimed in the present application can still be realized. The division of the following embodiments is for convenience of description and should not constitute any limitation on the specific implementation of the present application; the embodiments can be combined with and refer to each other on the premise of no contradiction.
The implementation details of the method for generating test cases described in the present application are specifically explained below with reference to specific embodiments. The following content provides implementation details only for ease of understanding and is not necessary for implementing the solution.
A first aspect of the embodiments of the present application provides a method for generating test cases; for the specific flow, refer to FIG. 1. In some embodiments, the method for generating test cases is applied to a terminal device with analysis capability, such as a computer, tablet, or mobile phone. This embodiment takes application in a computer as an example. The method for generating test cases at least includes, but is not limited to, the following steps:
Step 101: acquire standardized input and output data of each software entity of the product under test as product standard data.
Specifically, when generating test cases for the product under test, the computer first obtains the software entities contained in the product under test, and then uses the standardized input and output data of each software entity of the product under test as the product standard data for describing test cases.
In one example, the computer acquiring standardized input and output data of each software entity of the product under test as product standard data includes: decomposing the implementation architecture of the product under test into software entities; extracting the standardized input and output data of each software entity from the software product version package according to the input and output data that each decomposed software entity exchanges with the outside; where the decomposed software entities include one of the following or any combination thereof: system, subsystem, module, sub-module. Specifically, when acquiring the product standard data, the computer first obtains the overall architecture and hierarchy by which the product under test realizes its functions and performs a structured, hierarchical software-entity decomposition of the implementation architecture to obtain software entities of various granularities. By decomposing software entities according to the product architecture and deriving the product standard data from the input and output data that each software entity exchanges with the outside, the product standard data can support the generation of test cases for each sub-function, which facilitates measuring the coverage completeness of the expected requirement content by the generated test cases.
For example, referring to FIG. 2 for the specific flow of generating product standard data, the computer first splits the product under test into several subsystems according to its overall architecture; for example, if the product under test is a communication product, it can be split into subsystems such as configuration management, control plane, user plane, radio scheduling, and platform. Each subsystem is then further split into several modules (for example, the control-plane subsystem of a communication product is further split into modules such as cell management, physical resource allocation, and procedure management), yielding software entities of the communication product under test at different granularities. Since every software entity must interact with the system environment, there are necessarily protocols, interfaces, constraints, and other rules governing that interaction. The computer sorts out the content and rules of each software entity's interaction with the system and, according to the sorted-out rules, extracts the standardized input and output data of each software entity from the software product version package as the product standard data.
Further, the computer extracting the standardized input and output data of each software entity from the software product version package includes: extracting the standardized input and output data of each software entity according to the category to which the externally exchanged input and output data belongs; where the categories include one of the following or any combination thereof: configuration parameters, key performance indicators, abnormal alarms, protocol information elements, software interfaces. Specifically, when extracting the standardized input and output data of each software entity, the computer classifies the product standard data to be extracted; by purpose and by the way they are implemented and managed in the software product, they can be divided into categories such as configuration parameters, key performance indicators, abnormal alarms, protocol information elements, and software interfaces. For each category of standardized data, the computer derives its extraction rules from the software product version package, sorts out attributes such as value range, value rules, default value, unit, and the tested object to which it belongs according to specific rules, and then automatically extracts and stores the standardized data and their attributes sorted out by those rules, forming a standardized input/output data set of the expected requirement-specification content of each software entity.
For example, for a communication system product, the main extractable product standard data include: configuration parameters, protocol information elements, software interfaces, test logs, counters, key performance indicators, and alarms. For each category of product standard data, the respective data attributes are sorted out, and the storage format of each category is determined in combination with the attributes; for example, the data attributes of a key performance indicator include indicator number, indicator name, measurement type, indicator meaning, calculation formula, and unit. For each category of product standard data and its attributes, rules for automatic extraction from the version package and code library are derived. The standardized data and their attributes are automatically extracted from the version data package according to the specific rules and then stored according to the storage address of that category of data, forming a standard input/output data set of the expected requirement-specification content of the communication system under test.
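The per-category extraction and storage step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the record fields mirror the key-indicator attributes listed above (number, name, unit, calculation formula), and all concrete names and values are invented.

```python
from dataclasses import dataclass

# Hypothetical record for one item of product standard data.
@dataclass(frozen=True)
class StandardDataItem:
    category: str      # e.g. "kpi", "config_param", "alarm"
    item_id: str       # indicator number / parameter id
    name: str
    unit: str = ""
    formula: str = ""  # calculation formula, if any

def group_by_category(items):
    """Store extracted items per category, as the text describes."""
    store = {}
    for item in items:
        store.setdefault(item.category, []).append(item)
    return store

# Invented example items standing in for data extracted from a version package.
items = [
    StandardDataItem("kpi", "K001", "RRC setup success rate", unit="%",
                     formula="succ/att*100"),
    StandardDataItem("config_param", "P100", "drxInactivityTimer", unit="ms"),
    StandardDataItem("alarm", "A042", "cell out of service"),
]
store = group_by_category(items)
```

In a real system the items would come from rule-based parsers over the version package and code library rather than being constructed by hand.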
Step 102: determine each characteristic of the product under test according to the functions of the product under test and the value of those functions to the user.
Specifically, after generating the product standard data from the standardized input and output data of each software entity, the computer examines the functions provided by the product under test and the value of each of those functions to the user, classifies the functions according to the specific functions and their value to the user, and splits out a series of characteristics of the product under test, where a characteristic represents product function and value.
For example, typical characteristics of a communication system include: providing an uplink power control service to guarantee demodulation performance while reducing interference and saving power; providing a DRX (discontinuous reception) configuration service to balance service latency and terminal power saving. After splitting out the characteristics of the product under test, the computer can also make explicit that, to achieve the effect of each characteristic, the implementation scheme of each characteristic is described by the decomposed systems, subsystems, modules, and sub-modules and the product standard data they exchange, thereby determining the product standard data involved in each characteristic.
It is worth mentioning that when a new requirement arises for the product under test, the changed content of the requirement is assigned, according to its effect, to one or several already-determined characteristics; when necessary, a new characteristic can also be extracted, thereby ensuring that the functions and value of the software product are all described through the characteristic system. The software entities and product standard data of the characteristics affected by the new requirement are revised according to its changes. Meanwhile, the product standard data set of a requirement can also be maintained at requirement granularity. For example, if a new requirement requires adjusting the DRX (discontinuous reception) configuration parameters for power-insensitive terminals, the requirement can be assigned to the characteristic "provide a DRX (discontinuous reception) configuration service to balance service latency and terminal power saving", and the product standard data involved in that characteristic are revised according to the changes of the requirement.
Step 103: for each characteristic, obtain the sub-functions used to realize the characteristic, and generate test cases for each sub-function.
Specifically, after dividing the product under test into characteristics, the computer parses, according to the function and value of each characteristic, the functions that each decomposed software entity needs to complete, splits out the sub-functions of the characteristic level by level, builds a systematized test architecture of the product under test from the characteristics and their sub-functions, and designs test cases for every sub-function in that test architecture, where the sub-functions of the product under test and their test cases are all described with the product standard data. By generating test cases for the sub-functions of the characteristics, the coverage completeness of the test cases with respect to the expected requirement content becomes measurable, so that the completeness and redundancy of the test cases can be measured from the two perspectives of code implementation and requirement specification.
For example, if the product under test is a communication system product, when generating test cases the computer draws, for each characteristic, a timing interaction diagram of the subsystems and modules involved and, combining the drawn diagram, splits out the sub-functions of each characteristic level by level. These characteristics and their sub-functions build the systematized test architecture of the communication system under test. Test cases are designed for every sub-function in the test architecture, and the inputs and outputs of every sub-function and its cases are described through the above product standard data. For example, the timing interaction diagram of the characteristic "provide a DRX (discontinuous reception) configuration service to balance service latency and terminal power saving" (characteristic DRX for short) is shown in FIG. 3. Based on the drawn timing interaction diagram, the computer splits out the sub-functions of the characteristic level by level in the order of software bearer entities from system -> multiple subsystems -> single subsystem -> single module; the sub-function decomposition of a characteristic is shown schematically in FIG. 4, with the characteristic split down to the most fine-grained sub-function level. These sub-functions form the test architecture system of the characteristic. Cases are designed for every sub-function, and the inputs and outputs of every sub-function and its cases are described through the product standard data. All characteristics and their sub-functions form the systematized test architecture of the communication system under test, and the cases of all sub-functions form its systematized test case set.
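The level-by-level decomposition described above can be pictured as a tree whose leaves are the finest-grained sub-functions, each of which then receives test cases. A minimal sketch, with an invented decomposition of the DRX characteristic (the entity and sub-function names are illustrative, not taken from FIG. 4):

```python
# Characteristic -> bearer entities -> sub-functions, as a nested dict;
# empty dicts mark the finest-grained sub-function level.
tree = {
    "DRX": {
        "system": {
            "control_plane": {
                "configure_drx_cycle": {},
                "deliver_drx_to_ue": {},
            },
            "scheduler": {
                "pause_scheduling_in_sleep": {},
            },
        }
    }
}

def leaf_subfunctions(node, path=()):
    """Walk the decomposition tree and return the finest-grained sub-functions,
    each identified by its full path from the characteristic downward."""
    leaves = []
    for name, children in node.items():
        if children:
            leaves.extend(leaf_subfunctions(children, path + (name,)))
        else:
            leaves.append(path + (name,))
    return leaves

subfuncs = leaf_subfunctions(tree)  # the case-design targets
```

The union of such trees over all characteristics is the systematized test architecture, and the cases attached to the leaves form the test case set.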
It is worth mentioning that if, during the design of characteristic sub-functions and cases, the computer finds that the extracted product standard data are insufficient, it can further decompose the product under test into software entities and add specific product standard data according to the standardized input and output data of the newly added entities; if it finds that the functions of some subsystems/modules cannot be decomposed into the extracted characteristic set, it can further expand the functions of the product under test and the value of those functions, thereby obtaining newly added specific characteristics.
In another example, after the computer generates test cases for each sub-function, the method further includes: checking the test completeness of the product under test according to the product standard data, and/or checking the redundancy of the generated test cases according to the product standard data; supplementing the test cases according to the result of the completeness check, and/or consolidating the test cases according to the result of the redundancy check. After generating initial test cases for each sub-function based on the product standard data, the computer can check the test completeness of the product under test based on the product standard data in a preset order, or, directly in the process of continuous requirement merging and continuous case design, automatically extract the latest standardized data set of each version, compare it with the standardized data and attributes covered by the cases of the current version, check the test completeness of the product under test, and supplement the test cases when completeness is insufficient. While checking test completeness, the computer can also check the redundancy among the generated test cases according to the product standard data and consolidate redundant test cases. By checking test completeness and test case redundancy during test case generation based on the product standard data, the test case architecture and the test cases can be improved and optimized during the design process, improving the efficiency of test case improvement and optimization.
In another example, the computer's check of the test completeness of the product under test according to the product standard data includes one of the following or any combination thereof: checking the completeness of the test architecture according to the amount of product standard data covered by the sub-functions and the total amount of product standard data; checking the completeness of the test cases according to the amount of product standard data covered by the test cases and the total amount of product standard data; checking the completeness of a characteristic's test architecture according to the amount of product standard data covered by the characteristic's sub-functions and the total amount of product standard data involved in the characteristic; checking the completeness of a characteristic's test cases according to the amount of product standard data covered by the characteristic's test cases and the total amount of product standard data involved in the characteristic; checking the completeness of a requirement's test architecture according to the amount of product standard data covered by the product requirement's sub-functions and the total amount of product standard data involved in the product requirement; checking the completeness of a requirement's test cases according to the amount of product standard data covered by the product requirement's test cases and the total amount of product standard data involved in the product requirement. Checking the completeness of the test architecture and cases in multiple dimensions directly based on the product standard data realizes completeness verification from multiple angles.
Specifically, when checking test completeness during the test case design process or after the initial test case design is completed, the computer can measure test completeness from multiple angles with the following calculations. Against the total amount of product standard data, the completeness of the test architecture and of the test case set is checked: completeness of the test architecture = amount of product standard data covered by the sub-functions / total amount of product standard data; test case completeness = amount of product standard data covered by the cases / total amount of product standard data. Against the total amount of product standard data involved in a single characteristic, the completeness of the characteristic's test architecture and test cases is checked: completeness of the characteristic's test architecture = amount of product standard data covered by the characteristic's sub-functions / total amount of product standard data involved in the characteristic; characteristic test case completeness = amount of product standard data covered by the characteristic's cases / total amount of product standard data involved in the characteristic. Against the total amount of product standard data involved in a single requirement, the completeness of the requirement's test architecture and test cases is checked: completeness of the requirement's test architecture = amount of product standard data covered by the requirement's sub-functions / total amount of product standard data involved in the requirement; requirement test case completeness = amount of product standard data covered by the requirement's cases / total amount of product standard data involved in the requirement.
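Each completeness metric above is the same ratio applied to a different scope: the amount of covered product standard data divided by the amount of product standard data in scope (all data, one characteristic's data, or one requirement's data). A minimal sketch with invented identifiers:

```python
def completeness(covered, scope):
    """covered, scope: sets of product-standard-data identifiers.
    Returns the fraction of in-scope data that is covered."""
    if not scope:
        return 1.0  # an empty scope is trivially complete (a design choice)
    return len(covered & scope) / len(scope)

# Invented data: four standard data items, two covered by the current cases.
all_data = {"P100", "K001", "A042", "IE7"}
covered_by_cases = {"P100", "K001"}

overall = completeness(covered_by_cases, all_data)   # test case completeness
drx_scope = {"P100", "IE7"}                          # data of one characteristic
drx_cov = completeness(covered_by_cases, drx_scope)  # characteristic completeness
```

The architecture-level metrics are the same computation with the data covered by sub-functions in place of the data covered by cases.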
It is worth mentioning that after checking the test completeness, the computer can also supplement sub-functions or test cases when the completeness does not meet requirements. For example, for each item of the various categories of product standard data of the communication system under test (configuration parameters, protocol information elements, software interfaces, counters, key performance indicators, alarms, etc.), the number of sub-functions and cases that reference the item as input/output is counted in turn. When the completeness of the overall test architecture or of the test cases is low, the not-yet-covered product standard data are examined in detail; according to the characteristic decomposition scheme, the characteristic to which the not-yet-referenced product standard data belong is determined, sub-functions are split out and supplemented for that characteristic, and test cases are supplemented for the supplemented sub-functions, completing test coverage of the currently uncovered product standard data.
In another example, the computer checking the redundancy of the generated test cases according to the product standard data includes: checking the similarity between sub-functions according to the bearer entities of the sub-functions and the product standard data; determining that the test cases of sub-functions whose similarity exceeds a preset threshold are redundant. Specifically, when checking test case redundancy, the computer characterizes redundancy directly by the functional similarity between the sub-functions corresponding to the test cases. Using the bearer entities of the sub-functions and their product standard data, it calculates the similarity between different sub-functions with a similarity algorithm, identifies sub-functions whose similarity exceeds the preset threshold, determines that redundancy exists among the test cases of those sub-functions, and consolidates and de-duplicates the sub-functions exceeding the threshold and their cases, reducing test case redundancy. Checking sub-function similarity according to the bearer entities and product standard data identifies similar sub-functions and thereby the test cases that are very likely to be highly redundant, so that the test cases can be consolidated to reduce redundancy.
When computing the similarity between sub-functions, the computer can compute the similarity of sub-function M1 and sub-function M2 with the following similarity algorithm: S=S_Input+S_Output+S_Bearer, where S_Input denotes the input similarity, S_Output the output similarity, and S_Bearer the bearer-entity similarity. The input similarity, output similarity, and bearer-entity similarity can be computed respectively by the following formulas:
S_Input=(M1_Input∩M2_Input)÷(M1_Input∪M2_Input);
S_Output=(M1_Output∩M2_Output)÷(M1_Output∪M2_Output);
S_Bearer=(M1_Bearer∩M2_Bearer)÷(M1_Bearer∪M2_Bearer);
where M1_Input and M2_Input denote the input product standard data sets of sub-functions M1 and M2 respectively, M1_Output and M2_Output denote the output product standard data sets of sub-functions M1 and M2 respectively, and M1_Bearer and M2_Bearer denote the software bearer entity sets of sub-functions M1 and M2 respectively.
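The similarity computation above can be sketched as follows, reading each ratio as a Jaccard index over the sizes of the intersection and union (an interpretation of the set formulas, not stated explicitly in the text). S then ranges from 0 to 3, and a threshold on S flags likely-redundant sub-functions. All identifiers are invented:

```python
def jaccard(a, b):
    """|a ∩ b| / |a ∪ b| for two sets; 0.0 when both are empty."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def subfunction_similarity(m1, m2):
    """S = S_Input + S_Output + S_Bearer for two sub-function descriptions,
    each a dict with 'input', 'output', and 'bearer' sets of standard data."""
    return (jaccard(m1["input"], m2["input"])
            + jaccard(m1["output"], m2["output"])
            + jaccard(m1["bearer"], m2["bearer"]))

m1 = {"input": {"P100", "IE7"}, "output": {"K001"}, "bearer": {"ctrl"}}
m2 = {"input": {"P100"}, "output": {"K001"}, "bearer": {"ctrl"}}
s = subfunction_similarity(m1, m2)  # 0.5 + 1.0 + 1.0 = 2.5
```

A pair scoring above a preset threshold (say 2.0 out of 3) would be reviewed for consolidation of its cases.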
In another example, after the computer generates test cases for each sub-function, the method further includes: identifying cross-impacts between sub-functions according to the relationships among the product standard data of different sub-functions; supplementing test cases according to the identified cross-impacts. Specifically, for the sub-functions in the test architecture, their mutual cross-impacts can be identified based on the relationships among their corresponding product standard data. Therefore, during the test case design process or after the creation of the initial test case set is completed, the computer can check the cross-impacts between sub-functions and supplement test cases to cover the currently uncovered sub-function intersection points, improving the completeness of the test cases.
When the computer checks the cross-impacts between sub-functions, the typical cross-impacts checked include the following:
M1_Output∩M2_Input≠Φ, indicating that part of the input of sub-function M2 is determined by M1; after sub-function M1 changes, the cases must be revised synchronously to cover its impact on sub-function M2. M1_Input_Resource∩M2_Input_Resource≠Φ, where M1_Input_Resource and M2_Input_Resource denote the resource-class product standard inputs of sub-functions M1 and M2 respectively; this formula indicates that sub-function M1 and sub-function M2 share some resources, and when one of the sub-functions increases or decreases its occupancy of the corresponding resources, the cases must be revised synchronously to cover the impact on the other sub-function. M1_Output∩M2_Output≠Φ, indicating that some product standard outputs are jointly determined by sub-function M1 and sub-function M2; the interaction between the operation timing and operation ratio of M1 and M2 on this part of the output must be analyzed, and cases designed to cover it.
It is worth mentioning that when checking the cross-impacts between sub-functions, the computer can also add other cross-impact algorithms according to the features of different software products; this embodiment does not limit the cross-impact algorithms employed.
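The three typical cross-impact checks amount to set-overlap tests on the sub-functions' product standard data. A minimal illustration with invented data (the extra 'input_resource' set stands for the resource-class inputs):

```python
def cross_impacts(m1, m2):
    """Return the cross-impact kinds detected between two sub-functions,
    each described by 'input', 'output', and 'input_resource' sets."""
    impacts = []
    if m1["output"] & m2["input"]:
        impacts.append("output_feeds_input")  # M1_Output ∩ M2_Input ≠ Φ
    if m1["input_resource"] & m2["input_resource"]:
        impacts.append("shared_resource")     # resource inputs overlap
    if m1["output"] & m2["output"]:
        impacts.append("joint_output")        # outputs jointly determined
    return impacts

m1 = {"input": {"cfg"}, "output": {"grant"}, "input_resource": {"prb"}}
m2 = {"input": {"grant"}, "output": {"kpi"}, "input_resource": {"prb"}}
found = cross_impacts(m1, m2)
```

Each detected kind would prompt a supplementary case covering the corresponding intersection point.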
In another example, after the computer generates test cases for each sub-function, the method further includes: supplementing the test cases according to the changed product standard data when the product standard data change. Specifically, after generating the test cases, the computer also needs to maintain the test case set; the product under test may undergo changes in expected requirement content across versions, which causes data to be added or removed in the various categories of product standard data and data attributes to change. To ensure that the test cases support the new requirements while keeping test case redundancy as low as possible, the computer extracts the current product standard data from the software version package or code library according to pre-defined extraction rules; each time a new category of product standard data is extracted, it is compared with the historically stored data to identify which attributes of which data have changed, and then, according to the additions and removals of data in the various categories and the changes in data attributes, the identified change points trigger revision of the test architecture and cases. Updating, supplementing, or deleting test cases according to the changes in the product standard data ensures accurate testing of newly added functions while keeping test case redundancy as low as possible.
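The version-maintenance step above amounts to diffing the newly extracted standard data against the stored snapshot and treating additions, removals, and attribute changes as triggers for revising the test architecture and cases. A minimal sketch with invented records (id -> attribute dict):

```python
def diff_standard_data(old, new):
    """Compare two snapshots of product standard data and report the change
    points: added ids, removed ids, and per-id sets of changed attributes."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = {
        k: {a for a in set(old[k]) | set(new[k]) if old[k].get(a) != new[k].get(a)}
        for k in set(old) & set(new) if old[k] != new[k]
    }
    return added, removed, changed

old = {"P100": {"default": "10ms"}, "K001": {"unit": "%"}}
new = {"P100": {"default": "20ms"}, "A042": {"severity": "major"}}
added, removed, changed = diff_standard_data(old, new)
```

Each entry in the three results would map, via the characteristic decomposition, to the sub-functions and cases that need supplementing or deletion.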
Another aspect of the embodiments of the present application relates to an apparatus for generating test cases. Referring to FIG. 5, it includes:
an acquisition module 501 configured to acquire standardized input and output data of each software entity of the product under test as product standard data;
a determining module 502 configured to determine each characteristic of the product under test according to the functions of the product under test and the value of those functions to the user, where a characteristic represents product function and value;
a generating module 503 configured to obtain, for each characteristic, the sub-functions used to realize the characteristic, and to generate test cases for each sub-function;
where the sub-functions and their test cases are all described with the product standard data.
It is easy to see that this embodiment is an apparatus embodiment corresponding to the method embodiment, and the two can be implemented in cooperation with each other. The relevant technical details mentioned in the method embodiment remain valid in this embodiment and, to reduce repetition, are not repeated here. Correspondingly, the related technical details mentioned in this embodiment can also be applied in the method embodiment.
It is worth mentioning that the modules involved in this embodiment are logical modules. In practical applications, a logical unit may be a physical unit, a part of a physical unit, or a combination of multiple physical units. In addition, to highlight the innovative part of the present invention, units not closely related to solving the technical problem proposed by the present invention are not introduced in this embodiment, but this does not mean that no other units exist in this embodiment.
Another aspect of the embodiments of the present application also provides an electronic device. Referring to FIG. 6, it includes: at least one processor 601; and a memory 602 communicatively connected to the at least one processor 601, where the memory 602 stores instructions executable by the at least one processor 601, and the instructions are executed by the at least one processor 601 so that the at least one processor 601 can execute the method for generating test cases described in any one of the above method embodiments.
The memory 602 and the processor 601 are connected by a bus. The bus may include any number of interconnected buses and bridges and connects one or more processors 601 and the various circuits of the memory 602 together. The bus may also connect various other circuits such as peripherals, voltage regulators, and power management circuits, all of which are well known in the art and therefore are not further described herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or multiple elements, such as multiple receivers and transmitters, providing a unit for communicating with various other apparatuses over a transmission medium. Data processed by the processor 601 is transmitted over the wireless medium through an antenna; further, the antenna also receives data and transmits the data to the processor 601.
The processor 601 is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions, while the memory 602 may be used to store data used by the processor 601 when performing operations.
Another aspect of the embodiments of the present application also provides a computer-readable storage medium storing a computer program. The above method embodiments are implemented when the computer program is executed by a processor.
That is, those skilled in the art can understand that all or part of the steps in the methods of the above embodiments can be implemented by a program instructing the relevant hardware; the program is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application. The aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those of ordinary skill in the art can understand that the above embodiments are specific embodiments for realizing the present application, and in practical applications various changes in form and detail can be made to them without departing from the spirit and scope of the present application.

Claims (11)

  1. A method for generating test cases, comprising:
    acquiring standardized input and output data of each software entity of a product under test as product standard data (101);
    determining each characteristic of the product under test according to functions of the product under test and the value of the functions to a user, the characteristic representing product function and value (102);
    for each characteristic, obtaining sub-functions used to realize the characteristic, and generating test cases for each of the sub-functions (103);
    wherein the sub-functions and the test cases of the sub-functions are all described with the product standard data.
  2. The method for generating test cases according to claim 1, wherein the acquiring standardized input and output data of each software entity of the product under test as product standard data (101) comprises:
    decomposing an implementation architecture of the product under test into software entities;
    extracting the standardized input and output data of each software entity from a software product version package according to input and output data that each decomposed software entity exchanges with the outside;
    wherein the decomposed software entities comprise one of the following or any combination thereof: a system, a subsystem, a module, and a sub-module.
  3. The method for generating test cases according to claim 2, wherein the extracting the standardized input and output data of each software entity from a software product version package comprises:
    extracting the standardized input and output data of each software entity from the software product version package according to a category to which the input and output data exchanged with the outside belongs;
    wherein the category comprises one of the following or any combination thereof: configuration parameters, key performance indicators, abnormal alarms, protocol information elements, and software interfaces.
  4. The method for generating test cases according to any one of claims 1 to 3, wherein after the generating test cases for each of the sub-functions (103), the method further comprises:
    checking test completeness of the product under test according to the product standard data, and/or checking redundancy of the generated test cases according to the product standard data;
    supplementing the test cases according to a result of the completeness check, and/or consolidating the test cases according to a result of the redundancy check.
  5. The method for generating test cases according to claim 4, wherein the checking test completeness of the product under test according to the product standard data comprises one of the following or any combination thereof:
    checking completeness of a test architecture according to the amount of product standard data covered by the sub-functions and the total amount of product standard data;
    checking completeness of the test cases according to the amount of product standard data covered by the test cases and the total amount of product standard data;
    checking completeness of a characteristic test architecture according to the amount of product standard data covered by the sub-functions of the characteristic and the total amount of product standard data involved in the characteristic;
    checking completeness of characteristic test cases according to the amount of product standard data covered by the test cases of the characteristic and the total amount of product standard data involved in the characteristic;
    checking completeness of a requirement test architecture according to the amount of product standard data covered by the sub-functions of a product requirement and the total amount of product standard data involved in the product requirement;
    checking completeness of requirement test cases according to the amount of product standard data covered by the test cases of the product requirement and the total amount of product standard data involved in the product requirement.
  6. The method for generating test cases according to claim 4, wherein the checking redundancy of the generated test cases according to the product standard data comprises:
    checking similarity between sub-functions according to bearer entities of the sub-functions and the product standard data;
    determining that test cases of sub-functions whose similarity is greater than a preset threshold are redundant.
  7. The method for generating test cases according to any one of claims 1 to 3, wherein after the generating test cases for each of the sub-functions (103), the method further comprises:
    identifying cross-impacts between sub-functions according to relationships among the product standard data of different sub-functions;
    supplementing test cases according to the identified cross-impacts.
  8. The method for generating test cases according to any one of claims 1 to 3, wherein after the generating test cases for each of the sub-functions (103), the method further comprises:
    supplementing test cases according to the changed product standard data when the product standard data change.
  9. An apparatus for generating test cases, comprising:
    an acquisition module (501) configured to acquire standardized input and output data of each software entity of a product under test as product standard data;
    a determining module (502) configured to determine each characteristic of the product under test according to functions of the product under test and the value of the functions to a user, the characteristic representing product function and value;
    a generating module (503) configured to obtain, for each characteristic, sub-functions used to realize the characteristic, and to generate test cases for each of the sub-functions;
    wherein the sub-functions and the test cases of the sub-functions are all described with the product standard data.
  10. An electronic device, comprising:
    at least one processor (601); and
    a memory (602) communicatively connected to the at least one processor (601); wherein
    the memory (602) stores instructions executable by the at least one processor (601), and the instructions are executed by the at least one processor (601) so that the at least one processor (601) can execute the method for generating test cases according to any one of claims 1 to 8.
  11. A computer-readable storage medium storing a computer program, wherein the method for generating test cases according to any one of claims 1 to 8 is implemented when the computer program is executed by a processor.
PCT/CN2022/128119 2021-12-08 2022-10-28 Test case generation method and apparatus, electronic device, and storage medium WO2023103640A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111493037.9A CN116302902A (zh) 2021-12-08 2021-12-08 Test case generation method and apparatus, electronic device, and storage medium
CN202111493037.9 2021-12-08

Publications (1)

Publication Number Publication Date
WO2023103640A1 true WO2023103640A1 (zh) 2023-06-15

Family

ID=86729585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/128119 WO2023103640A1 (zh) 2021-12-08 2022-10-28 Test case generation method and apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN116302902A (zh)
WO (1) WO2023103640A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117971705A (zh) * 2024-03-28 2024-05-03 Intelligent interface automated testing system and method based on customized traffic insight
CN117951018B (zh) * 2024-01-26 2024-06-28 Standard compliance testing method and apparatus for a software radio operating system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916225A (zh) * 2010-09-02 2010-12-15 Function coverage testing method for graphical user interface software
CN102360331A (zh) * 2011-10-09 2012-02-22 Method for automatically generating test programs based on formal description
CN103279415A (zh) * 2013-05-27 2013-09-04 Embedded software testing method based on combinatorial testing
CN105260300A (zh) * 2015-09-24 2016-01-20 Service testing method based on an application platform for the general taxonomy of accounting standards
US20180095867A1 (en) * 2016-10-04 2018-04-05 Sap Se Software testing with minimized test suite
CN111752833A (zh) * 2020-06-23 2020-10-09 Software quality system exit method and apparatus, server, and storage medium

Also Published As

Publication number Publication date
CN116302902A (zh) 2023-06-23

Similar Documents

Publication Publication Date Title
CN109491894A (zh) Interface testing method and device
CN110554958B (zh) Graph database testing method, system, device, and storage medium
JPWO2012157471A1 (ja) Anomaly detection system for detecting anomalies in a plurality of control systems
CN111818136A (zh) Data processing method and apparatus, electronic device, and computer-readable medium
CN110135590B (zh) Information processing method and apparatus, medium, and electronic device
CN110737689B (zh) Data standard compliance checking method, apparatus, system, and storage medium
CN112181767A (zh) Method, apparatus, and storage medium for determining software system anomalies
CN111782900B (zh) Abnormal service detection method and apparatus, electronic device, and storage medium
CN111240876A (zh) Microservice fault locating method and apparatus, storage medium, and terminal
CN111651595A (zh) Abnormal log processing method and apparatus
CN115509797A (zh) Fault category determination method, apparatus, device, and medium
CN112306833A (zh) Application crash statistics method and apparatus, computer device, and storage medium
CN116414717A (zh) Traffic-replay-based automated testing method, apparatus, device, medium, and product
CN110716912B (zh) SQL performance detection method and server
WO2023103640A1 (zh) Test case generation method and apparatus, electronic device, and storage medium
CN110727565B (zh) Network device platform information collection method and system
CN114513334B (zh) Risk management method and risk management apparatus
WO2023060740A1 (zh) Data processing and testing methods, apparatus, device, and storage medium
CN109257348A (zh) Cluster vulnerability mining method and apparatus based on an industrial control system
CN114461407A (zh) Data processing method and apparatus, distribution server, system, and storage medium
CN114116866A (zh) Data acquisition method and apparatus, terminal device, and storage medium
CN115396280B (zh) Alarm data processing method, apparatus, device, and storage medium
CN117648718B (zh) Data-source-based business object display method and apparatus, electronic device, and medium
CN116401113B (zh) Environment verification method, apparatus, and medium for a heterogeneous many-core architecture accelerator card
CN112837040B (zh) Power data management method and system applied to a smart grid

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22903064

Country of ref document: EP

Kind code of ref document: A1