CN107622014B - Test report generation method and device, readable storage medium and computer equipment - Google Patents


Info

Publication number: CN107622014B
Authority: CN (China)
Prior art keywords: test, type, detail information, problem, information
Legal status: Active (granted)
Application number: CN201710864491.8A
Other languages: Chinese (zh)
Other versions: CN107622014A
Inventor: 郑胜雄
Current assignee: Guangzhou Shiyuan Electronics Technology Co Ltd; Guangzhou Shirui Electronics Co Ltd
Original assignee: Guangzhou Shiyuan Electronics Technology Co Ltd; Guangzhou Shirui Electronics Co Ltd
Application filed by Guangzhou Shiyuan Electronics Technology Co Ltd and Guangzhou Shirui Electronics Co Ltd
Priority: CN201710864491.8A; published as CN107622014A, granted as CN107622014B

Landscapes

  • Stored Programmes (AREA)

Abstract

A test report generation method and apparatus, a readable storage medium, and a computer device are provided. The method comprises the following steps: acquiring a test project number and querying the label names that contain the project number; querying all problem files marked with those label names and acquiring the problem detail information in each problem file; and analyzing the problem detail information and writing the analysis results into the corresponding fields of a preset test report template to generate the test report. The embodiment of the invention automatically records and analyzes test result data such as bug data, test case execution status, and man-hours, greatly improving efficiency compared with the prior art.

Description

Test report generation method and device, readable storage medium and computer equipment
Technical Field
The present invention relates to the field of computer technology, and in particular to a test report generation method and apparatus, a readable storage medium, and a computer device.
Background
At present, software under development must be tested and optimized, and testers must complete each test and deliver a reasonably complete test report based on its results. In each test cycle, every tester has to record the progress of the test content and the problems found during testing, so when many developers and testers participate and the amount of content under optimization is large, generating the test report for each stage takes a great deal of time.
Typically, a test report covers the test environment, test version, testers, time consumed, bug status, and the execution status of the project's test cases. In the prior art, the reports produced by different members of a testing department vary, and there is no uniform template, which makes them inconvenient to manage. Test result data such as bug data, test case execution status, and man-hours are recorded manually and either not analyzed at all or analyzed by hand, which is time-consuming.
Disclosure of Invention
In view of the above, it is necessary to provide a test report generation method, apparatus, readable storage medium, and computer device that address the inefficiency of test report generation in the prior art.
The embodiment of the invention provides a test report generation method comprising the following steps:
acquiring a test project number and querying the label names that contain the project number;
querying all problem files marked with those label names, and acquiring the problem detail information in each problem file;
and analyzing the problem detail information and writing the analysis results into the corresponding fields of a preset test report template to generate the test report.
By marking the information required by the test report with labels, the embodiment of the invention associates test tasks, defects, and test plans with the test project number. When the project number entered by a user is acquired, the problem detail information of every problem file corresponding to that number can be fetched directly. The acquired information is then calculated, compared, and analyzed to produce an intuitive project test report with the corresponding data analysis. The embodiment automatically records and analyzes test result data, greatly improving efficiency over manual collation, and because every tester's report is generated from the same template, the reports are uniform in layout and easy to manage.
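The three claimed steps can be sketched end to end in a few lines. This is an illustrative sketch only, not the patented implementation: the dict-based issue structure and the "labels" field name are assumptions made for the example.

```python
# Minimal sketch of the claimed three-step flow. In the embodiments the issues
# come from an issue tracker such as JIRA; here they are plain dicts, and the
# field names are illustrative assumptions.
def generate_test_report(project_number, issues, template):
    # Step 1: find the label names that contain the test project number.
    labels = {l for issue in issues for l in issue["labels"] if project_number in l}
    # Step 2: keep only the problem files marked with those label names.
    details = [issue for issue in issues if set(issue["labels"]) & labels]
    # Step 3: "analyze" and write the result into a preset report template.
    return template.format(project=project_number, issue_count=len(details))

issues = [
    {"labels": ["PRJ-7-defects"], "summary": "crash on save"},
    {"labels": ["unrelated"], "summary": "not part of this project"},
]
print(generate_test_report("PRJ-7", issues, "Report {project}: {issue_count} issue(s)"))
```

Unlabeled issues drop out in step 2, mirroring the claim that problem files without labels do not participate in report generation.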
Further, in the test report generation method, the step of analyzing the problem detail information comprises:
determining the type of each problem file according to the problem type information in its problem detail information;
and analyzing the problem detail information of each type of problem file according to a preset report generation rule.
Further, in the test report generation method, the step of analyzing the problem detail information of each type of problem file according to a preset report generation rule comprises:
when the problem file is of the defect type, determining the defect category and cumulatively counting each category of defect to obtain the total number of defects in each category;
when the problem file is of the test task type, extracting from the problem detail information the assignee of the test task and the man-hours consumed in completing it, and cumulatively summing the man-hours to obtain the total testing man-hours;
and when the problem file is of the test plan type, extracting the test case execution results from the problem detail information and calculating the probability that the test cases find a problem.
Further, in the test report generation method, the step of determining the defect category comprises:
determining newly added defects according to the problem creation time information and the test project creation time information in the problem detail information;
and determining reopened defects and closed defects according to the current status information and the reopen time information in the problem detail information.
Further, in the test report generation method, the step of calculating the probability that the test cases find a problem comprises:
querying all defects associated with the test cases and counting them to obtain the number of associated defects;
and calculating the ratio of the number of associated defects to the total number of newly added defects to obtain the probability that the test cases find a problem.
Further, in the test report generation method, the step of extracting the test case execution results from the problem detail information comprises:
acquiring the test phase information of the test project and the test cycle list in the problem detail information;
and querying the test cycle list for the test cycle whose name matches the test phase information, then acquiring, by that test cycle's ID, the execution results of all test cases and the number of defects associated with them.
An embodiment of the present invention further provides a test report generation device, comprising:
a query module for acquiring the test project number and querying the label names that contain the project number;
an acquisition module for querying all problem files marked with those label names and acquiring the problem detail information in each problem file;
an analysis module for analyzing the problem detail information;
and a test report generation module for writing the analysis results into the corresponding fields of a preset test report template to generate the test report.
Further, in the test report generation device, the analysis module comprises:
a determination module for determining the type of each problem file according to the problem type information in the problem detail information;
and an analysis submodule for analyzing the problem detail information of each type of problem file according to a preset report generation rule.
Further, in the test report generation device, the analysis submodule comprises:
a defect analysis module for determining the defect category when the problem file is of the defect type and cumulatively counting each category of defect to obtain the total number of defects in each category;
a test task analysis module for extracting, when the problem file is of the test task type, the assignee of the test task and the man-hours consumed in completing it from the problem detail information, and cumulatively summing the man-hours to obtain the total testing man-hours;
and a test plan analysis module for extracting, when the problem file is of the test plan type, the test case execution results from the problem detail information and calculating the probability that the test cases find a problem.
Further, in the test report generation device, the step of determining the defect category by the defect analysis module comprises:
determining newly added defects according to the problem creation time information and the test project creation time information in the problem detail information;
and determining reopened defects and closed defects according to the current status information and the reopen time information in the problem detail information.
Further, in the test report generation device, the step of calculating the probability that the test cases find a problem by the test plan analysis module comprises:
querying all defects associated with the test cases and counting them to obtain the number of associated defects;
and calculating the ratio of the number of associated defects to the total number of newly added defects to obtain the probability that the test cases find a problem.
Further, in the test report generation device, the step of extracting the test case execution results from the problem detail information by the test plan analysis module comprises:
acquiring the test phase information of the test project and the test cycle list in the problem detail information;
and querying the test cycle list for the test cycle whose name matches the test phase information, then acquiring, by that test cycle's ID, the execution results of all test cases and the defects associated with them.
Embodiments of the present invention also provide a readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the above method.
An embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the method are implemented.
Drawings
FIG. 1 is a flow chart of a test report generation method in a first embodiment of the present invention;
FIG. 2 is a flow chart of a test report generation method according to a second embodiment of the present invention;
FIG. 3 is a flowchart of a test report generation apparatus according to a third embodiment of the present invention;
fig. 4 is a block diagram of the analysis module in fig. 3.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
These and other aspects of embodiments of the invention will be apparent with reference to the following description and attached drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the ways in which the principles of the embodiments of the invention may be practiced, but it is understood that the scope of the embodiments of the invention is not limited correspondingly. On the contrary, the embodiments of the invention include all changes, modifications and equivalents coming within the spirit and terms of the claims appended hereto.
JIRA is a project and issue tracking tool widely used for defect tracking, customer service, requirements gathering, process approval, task tracking, project tracking, agile management, and similar work. In the embodiment of the invention, the testing process of a software development project is tracked through JIRA, and the test report is generated from the recorded problem information. Through JIRA, the user records in detail the version, test environment, tester information, test case execution status, test defects, and other information of the test project, which makes it easy for developers to follow the progress of each project's testing, exchange information with one another, and manage each test project.
Referring to fig. 1, a test report generating method according to a first embodiment of the invention includes steps S11-S13.
Step S11: acquire a test project number and query the label names that contain the project number.
The JIRA platform tracks project testing at the level of individual issues. On the JIRA platform, the user creates a problem file for each issue arising in the project testing process and edits its problem detail information in the JIRA interface. When the first problem file for a project is created, the project's basic information, such as the project number, project version number, test environment, test phase, and the creation time of the project test application, is entered in the JIRA interface. Subsequent problem files for the project can then inherit the basic project information of the first one. In this way, test analysis reports can be compiled automatically per version, or per phase, of the testing process, greatly reducing blind spots in subsequent testing.
On the JIRA platform the user can give each issue a title, normally a brief description of the issue, and fill in its problem detail information in the JIRA interface, including the problem type, problem creation time, status, resolution, labels, assignee, test cases, and so on.
This embodiment builds a program for generating project test reports on top of the JIRA system. It runs on a server, such as a computer, and generates the project test report automatically from the data in the JIRA system, so the data is accurate, the format is uniform, and the user's working efficiency improves. After acquiring the project number entered by the user, the program queries the label names that contain that number. The label names determine which problem files the test report needs; the data to be written into the test report template is taken from the problem detail information of those files, while unlabeled problem files do not participate in report generation. Accordingly, when creating an issue that should be written into a test report, the user edits a label name for it that includes the project test number, so that the test report for the corresponding test project can be generated.
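As a sketch of the label lookup just described (an illustration under assumed data, not JIRA's actual API), step S11 amounts to a substring filter over the label names recorded on the platform:

```python
def find_labels_containing(project_number, all_labels):
    """Step S11 sketch: return every label name containing the project number."""
    return [label for label in all_labels if project_number in label]

# Hypothetical label names a tester might have edited onto JIRA issues.
labels = ["PRJ-2017-09-defects", "PRJ-2017-09-plan", "PRJ-2017-10-defects"]
print(find_labels_containing("PRJ-2017-09", labels))
```

In a real deployment the label list would be fetched through the JIRA API rather than hard-coded.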
Step S12: query all problem files marked with those label names, and acquire the problem detail information in each problem file.
After the tester inputs the test application number, the version, test environment, test phase, and test time of the test project (the time defaults to the time testing started) can be obtained through that number, and all problem files marked with the matching label names can be queried. The problem detail information in each problem file is then obtained directly through the JIRA API (Application Programming Interface); this detail information is the information the test report requires.
Step S13: analyze the problem detail information, and write the analysis results into the corresponding fields of a preset test report template to generate the test report.
On the basis of the JIRA system, the required test report template, that is, the layout of the test report together with the required test data, charts, and so on, is set in advance. A test report generation rule is preset for this template, namely a rule for analyzing the acquired problem detail information.
After obtaining the problem detail information of every labeled problem file, the server analyzes each file's information according to the rule set for the report template; for example, it determines the defect categories from the problem detail information and counts the number of each kind of defect, the testing man-hours, or the test case execution status. The analysis results are written into the corresponding positions of the test report template, thereby generating the test report.
This embodiment marks the problem files required by the test report with labels; for example, the label function of the JIRA platform can be used to associate the problem detail information in the problem files, such as test tasks, defects, and test plans, with the test project number. When the project number entered by the user is acquired, the problem detail information of all problem files corresponding to that number is queried. The acquired information is calculated, compared, and analyzed to produce an intuitive project test report with the corresponding data analysis. The embodiment automatically records and analyzes test result data such as bug data, test case execution status, and man-hours, greatly improving efficiency over manual collation, and the test report every tester obtains has a uniform layout, which makes management easier.
Referring to fig. 2, a method for generating a test report according to a second embodiment of the present invention includes the steps of:
Step S21: acquire a test project number and query the label names that contain the project number.
Step S22: query all problem files marked with those label names, and acquire the problem detail information in each problem file.
The implementation of steps S21 and S22 is the same as in the first embodiment and is not repeated here.
Step S23: determine the type of each problem file according to the problem type information in its problem detail information.
When the user creates a problem file on the JIRA platform, the problem type of the file is set in its problem details. Common problem types are the defect type, the test task type, and the test plan type. Each problem type requires different data and a different analysis process, so a report generation rule is set according to the user's needs and each type of problem file is analyzed accordingly. When the problem file is of the defect type, the issue it records is a program bug, and bugs are divided into newly added bugs, reopened bugs, and closed bugs: a newly added bug is a problem newly found while testing the current version, while reopened and closed bugs concern problems resolved in a previous version. When the problem file is of the test task type, the analysis mainly concerns the total man-hours the testing consumed. When the problem file is of the test plan type, the analysis targets the execution status of the test cases. The server traverses all labeled problem files, determines the type of each from its problem type information, and performs the corresponding data processing for that type.
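The per-type routing described above can be sketched as a simple dispatch table. The issue layout and the type names ("Bug", "Task", "Test Plan") are assumptions for illustration, not fixed JIRA values:

```python
# Step S23 sketch: route each labeled problem file to a type-specific analyzer.
def analyze_issue(issue, handlers):
    issue_type = issue["fields"]["issuetype"]  # e.g. "Bug", "Task", "Test Plan"
    handler = handlers.get(issue_type)
    # Unknown types fall through: they do not participate in the report.
    return handler(issue) if handler is not None else None

handlers = {
    "Bug": lambda issue: ("defect analysis", issue["key"]),
    "Task": lambda issue: ("man-hour analysis", issue["key"]),
    "Test Plan": lambda issue: ("test case analysis", issue["key"]),
}
print(analyze_issue({"key": "PRJ-1", "fields": {"issuetype": "Bug"}}, handlers))
```

The dict-of-handlers shape keeps the report generation rule configurable, matching the claim that the rule is preset per user requirements.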
Step S24: when the problem file is of the defect type, determine the defect category and cumulatively count each category of defect to obtain the total number of defects in each category.
All labeled problem files are traversed, and for each problem file of the defect type the defect category is determined, that is, whether the defect is a newly added bug, a reopened bug, or a closed bug. The number of defects in every category is then counted: the numbers of newly added, reopened, and closed bugs across all labeled problem files.
In a specific implementation, the problem creation time and the test project creation time in the problem detail information are compared first; when the problem file was created after the test project, the defect is determined to be newly added and the newly added count is incremented by 1.
Next, the status information in the problem detail information is queried.
When the current status of the problem file is unresolved, the problem detail information is checked for reopen time information; if it is present, the defect is determined to be reopened and the reopened count is incremented by 1.
When the current status of the problem file is resolved, the defect is determined to be closed and the closed count is incremented by 1.
Accumulating the counts for each category yields the numbers of newly added, reopened, and closed bugs.
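The classification rules of step S24 can be sketched as one function plus a counting loop. The field names ("created", "status", "reopened_at") and the ISO-date strings are assumptions for illustration; a real JIRA instance would expose the equivalents through its API fields:

```python
def classify_defect(issue, project_created_at):
    """Sketch of step S24's rules for one defect-type problem file."""
    if issue["created"] > project_created_at:
        return "new"          # created after the test project: newly added bug
    if issue["status"] == "unresolved":
        # an unresolved defect with a recorded reopen time is a reopened bug
        return "reopened" if issue.get("reopened_at") else "other"
    return "closed"           # resolved defects count as closed bugs

counts = {"new": 0, "reopened": 0, "closed": 0, "other": 0}
defects = [
    {"created": "2017-09-22", "status": "unresolved"},
    {"created": "2017-09-10", "status": "unresolved", "reopened_at": "2017-09-21"},
    {"created": "2017-09-05", "status": "resolved"},
]
for d in defects:
    counts[classify_defect(d, "2017-09-20")] += 1
print(counts)
```

The "other" bucket is an addition of this sketch for defects the stated rules leave unclassified; the patent text does not say how such cases are handled.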
Step S25: when the problem file is of the test task type, extract from the problem detail information the assignee of the test task and the man-hours consumed in completing it, and cumulatively sum the man-hours to obtain the total testing man-hours.
All labeled problem files are traversed; for the problem files of the test task type, the assignees of all test tasks are collected and listed as the testers in the test report. The man-hours consumed by each test task are gathered during the same traversal and summed cumulatively to obtain the total testing man-hours. If a test task contains subtasks, its subtasks are traversed and the man-hours of all subtasks are summed to obtain the man-hours of that task. The user can thus see both the testers and the testing man-hours for the project.
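The man-hour accumulation, including the subtask traversal, can be sketched as a small recursive sum. The "hours"/"subtasks"/"assignee" structure is assumed for illustration; JIRA stores logged work in its own worklog fields:

```python
def task_hours(task):
    """Step S25 sketch: man-hours of one test task, including its subtasks."""
    return task.get("hours", 0) + sum(task_hours(s) for s in task.get("subtasks", []))

tasks = [
    {"assignee": "alice", "hours": 8},
    {"assignee": "bob", "subtasks": [{"hours": 3}, {"hours": 5}]},
]
testers = {t["assignee"] for t in tasks}    # testers listed in the report
total = sum(task_hours(t) for t in tasks)   # total testing man-hours
print(testers, total)
```

The recursion handles arbitrarily nested subtasks, which also covers the flat case the text describes.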
Step S26: when the problem file is of the test plan type, extract the test case execution results from the problem detail information and calculate the probability that the test cases find a problem.
A tester generally designs a number of test cases for a test project to exercise its functions. After the user creates the problem file on the JIRA platform, JIRA records the execution result of each test case (success, failure, blocked, or unsupported). When executing a test case successfully uncovers a bug in the test project, the test case is associated with the bug it found. The number of defects the executed test cases discovered therefore indicates how effective the cases are, and whether each test case can really find bugs in the project.
Within this step, calculating the probability that the test cases find a problem comprises:
step S261: querying all defects associated with the test cases, and counting them to obtain the number of associated defects;
step S262: calculating the ratio of the number of associated defects to the total number of newly added defects to obtain the probability that the test cases find a problem.
When a problem of the test plan type is queried, the number of defects associated with the test cases is counted from the problem detail information in the problem file to obtain the number of associated defects. Dividing the number of associated defects by the total number of newly added defects yields the probability that the test cases as a whole find a problem.
In a specific implementation, when a problem file is judged to be of the test plan type, the test phase information of the test project is obtained together with the test cycle list in the file's problem detail information. The test cycle whose name matches the test phase is looked up in the list, and by that test cycle's ID the execution results of all test cases (success, failure, blocked, or unsupported) and the defects associated with them are obtained, and the number of associated defects is calculated. When the traversal of the problems is finished, the ratio of the number of associated defects to the total number of newly added defects gives the probability that the test cases find a problem.
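The ratio from steps S261–S262 is a one-line computation once the associated and newly added defects are known. The defect keys below are hypothetical, and the zero-denominator guard is an addition of this sketch (the patent does not state how that edge case is handled):

```python
def case_discovery_rate(associated_defects, new_defects):
    """Step S262 sketch: defects linked to test cases over all new defects."""
    if not new_defects:
        return 0.0  # assumed behavior when no new defects were recorded
    return len(associated_defects) / len(new_defects)

new_bugs = ["PRJ-3", "PRJ-5", "PRJ-8", "PRJ-9"]
linked = ["PRJ-3", "PRJ-8", "PRJ-9"]  # defects associated with executed test cases
print(case_discovery_rate(linked, new_bugs))
```

A value near 1.0 would suggest the designed test cases account for most of the newly found bugs; a low value suggests bugs are being found outside the planned cases.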
Step S27: write the name and total count of each defect category, the tester information, the total testing man-hours, the test case execution results, and the probability that the test cases find a problem into the test template to generate the test report.
In a specific implementation, the test report may contain both text and charts; that is, the problem analysis results of the test project are generated and displayed as text information and chart information.
It will be understood that the user can flexibly change the test report template and the information the current test report requires. For example, the project version number, test environment, test phase, and the creation time of the project test application can be extracted from the problem detail information and written into the test template.
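A template-filling step like step S27 can be sketched with Python's built-in string formatting. The template text and its field names are illustrative assumptions, not the patented template:

```python
# Step S27 sketch: write analysis results into corresponding template fields.
TEMPLATE = (
    "Test report for {project}\n"
    "Testers: {testers}\n"
    "Total man-hours: {hours}\n"
    "New/reopened/closed bugs: {new}/{reopened}/{closed}\n"
    "Case discovery rate: {rate:.0%}\n"
)

report = TEMPLATE.format(
    project="PRJ-2017-09", testers="alice, bob", hours=16,
    new=4, reopened=1, closed=3, rate=0.75,
)
print(report)
```

Because the template is just data, swapping it for a different layout changes the report without touching the analysis code, which matches the flexibility described above.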
In this embodiment, the problem files of a test project are divided into the defect type, the test task type, and the test plan type; the three types are analyzed separately to obtain the total count of each defect category, the tester information, the total testing man-hours, the test case execution results, and the probability that the test cases find a problem; and the analysis results are written into a test report template to generate the corresponding test report. Classifying and analyzing the problem files makes report generation efficient and the resulting report intuitive and matched to the user's needs. Based on the JIRA platform API, with the corresponding data bindings, a complete test report is generated from nothing more than the test application number, which greatly improves efficiency compared with manual collation.
Referring to fig. 3, a test report generation device according to an embodiment of the present invention comprises:
a query module 100 for acquiring the test project number and querying the label names that contain the project number;
an acquisition module 200 for querying all problem files marked with those label names and acquiring the problem detail information in each problem file;
an analysis module 300 for analyzing the problem detail information;
and a test report generation module 400 for writing the analysis results into the corresponding fields of a preset test report template to generate the test report.
Further, as shown in fig. 4, the analysis module 300 comprises:
a determination module 310 for determining the type of each problem file according to the problem type information in the problem detail information;
and an analysis submodule 320 for analyzing the problem detail information of each type of problem file according to a preset report generation rule.
Further, in the above test report generation device, the analysis submodule 320 comprises:
a defect analysis module 321 for determining the defect category when the problem file is of the defect type and cumulatively counting each category of defect to obtain the total number of defects in each category;
a test task analysis module 322 for extracting, when the problem file is of the test task type, the assignee of the test task and the man-hours consumed in completing it from the problem detail information, and cumulatively summing the man-hours to obtain the total testing man-hours;
and a test plan analysis module 323 for extracting, when the problem file is of the test plan type, the test case execution results from the problem detail information and calculating the probability that the test cases find a problem.
Further, in the test report generation device, the step of determining the defect category by the defect analysis module 321 comprises:
determining newly added defects according to the problem creation time information and the test project creation time information in the problem detail information;
and determining reopened defects and closed defects according to the current status information and the reopen time information in the problem detail information.
Further, in the test report generation device, the step of calculating the probability that the test cases find a problem by the test plan analysis module 323 comprises:
querying all defects associated with the test cases and counting them to obtain the number of associated defects;
and calculating the ratio of the number of associated defects to the total number of newly added defects to obtain the probability that the test cases find a problem.
Further, in the test report generation device, the step of extracting the test case execution results from the problem detail information by the test plan analysis module 323 comprises:
acquiring the test phase information of the test project and the test cycle list in the problem detail information;
and querying the test cycle list for the test cycle whose name matches the test phase information, then acquiring, by that test cycle's ID, the execution results of all test cases and the defects associated with them.
The test report generating apparatus of this embodiment may be configured to execute the technical solution of any one of the method embodiments shown in fig. 1 to fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
The present invention also provides a readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the steps of the method of any one of the embodiments shown in fig. 1 to fig. 2 above.
The invention also provides a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method of any one of the embodiments shown in fig. 1 to fig. 2 above.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, for instance via optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments express only several embodiments of the present invention, and their description is specific and detailed, but it should not therefore be construed as limiting the scope of the present invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A method for generating a test report, comprising:
acquiring a test item number, and querying a tag name containing the item number, wherein the tag name is a mark added to a question for which a test report needs to be generated;
querying all the question files marked with the tag name, and acquiring the question detail information in each question file;
analyzing the question detail information, and writing the analysis result into a preset test report template to generate the test report, wherein
the step of analyzing the question detail information includes:
determining the type of the question file according to the question type information in the question detail information;
and analyzing the question detail information of each type of question file according to a preset report generation rule;
wherein
the step of analyzing the question detail information of each type of question file according to the preset report generation rule includes:
when the question file is of a defect type, determining the defect type, and cumulatively counting each type of defect to obtain the total number of each type of defect;
when the question file is of a test task type, extracting the sponsor information of the test task and the man-hours consumed in completing the test task from the question detail information, and accumulating the man-hours to obtain the total test man-hours;
and when the question file is of a test plan type, extracting the test case execution results from the question detail information, and calculating the probability that a test case finds a problem.
2. The test report generation method of claim 1, wherein the step of determining the defect type comprises:
determining newly added defects according to the question creation time information and the test item creation time information in the question detail information;
and determining reopened defects and closed defects according to the current state information and the reopen time information in the question detail information.
3. The method of claim 2, wherein the step of calculating the probability that the test cases find a problem comprises:
querying all the defects associated with the test cases, and counting them to obtain the number of associated defects;
and calculating the ratio of the number of associated defects to the total number of newly added defects to obtain the probability value of a problem being found by the test cases.
4. The test report generation method of claim 1, wherein the step of extracting the test case execution results from the question detail information comprises:
acquiring the test phase information of the test item, and acquiring the test cycle list in the question detail information;
and querying the test cycle list for the test cycle whose name matches the test phase information, and acquiring the execution results of all test cases and the defects associated with the test cases according to the ID of that test cycle.
5. A test report generation apparatus, comprising:
the query module is configured to acquire a test item number and query a tag name containing the item number, wherein the tag name is a mark added to a question for which a test report needs to be generated;
the acquisition module is configured to query all the question files marked with the tag name, and acquire the question detail information in each question file;
the analysis module is configured to analyze the question detail information;
and the test report generating module is configured to write the analysis result into a preset test report template to generate the test report, wherein
The analysis module includes:
the determining module is configured to determine the type of the question file according to the question type information in the question detail information;
and the analysis submodule is configured to analyze the question detail information of each type of question file according to a preset report generation rule;
the analysis submodule includes:
the defect analysis module, configured to, when the question file is of a defect type, determine the defect type and cumulatively count each type of defect to obtain the total number of each type of defect;
the test task analysis module, configured to, when the question file is of a test task type, extract the sponsor information of the test task and the man-hours consumed in completing the test task from the question detail information, and accumulate the man-hours to obtain the total test man-hours;
and the test plan analysis module, configured to, when the question file is of a test plan type, extract the test case execution results from the question detail information and calculate the probability that a test case finds a problem.
6. A readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1-4.
7. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1-4 when executing the program.
CN201710864491.8A 2017-09-22 2017-09-22 Test report generation method and device, readable storage medium and computer equipment Active CN107622014B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710864491.8A CN107622014B (en) 2017-09-22 2017-09-22 Test report generation method and device, readable storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710864491.8A CN107622014B (en) 2017-09-22 2017-09-22 Test report generation method and device, readable storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN107622014A CN107622014A (en) 2018-01-23
CN107622014B true CN107622014B (en) 2021-04-06

Family

ID=61090749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710864491.8A Active CN107622014B (en) 2017-09-22 2017-09-22 Test report generation method and device, readable storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN107622014B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110221955A (en) * 2018-03-01 2019-09-10 宝沃汽车(中国)有限公司 Test report generating means
CN108594015A (en) * 2018-04-17 2018-09-28 中北大学 Cable static impedance auto testing instrument and test method
CN108804314A (en) * 2018-05-23 2018-11-13 北京五八信息技术有限公司 Installation kit test method, device, equipment and computer readable storage medium
CN109032919A (en) * 2018-05-31 2018-12-18 深圳壹账通智能科技有限公司 APP flux monitoring method, system, computer equipment and storage medium
CN108776642B (en) * 2018-06-01 2023-04-11 平安普惠企业管理有限公司 Test report generation method and device, computer equipment and storage medium
CN108804333B (en) * 2018-06-14 2021-11-02 郑州云海信息技术有限公司 Method and system for associating test case with CQ
CN110737577B (en) * 2018-07-20 2022-11-04 北京奇虎科技有限公司 Test defect data storage method and device
CN108959101A (en) * 2018-07-27 2018-12-07 郑州云海信息技术有限公司 Test result processing method, device, equipment and memory software testing system
CN109491815A (en) * 2018-10-17 2019-03-19 深圳壹账通智能科技有限公司 Based on multistage data creation method, device and computer equipment
CN111178376B (en) * 2018-11-13 2023-06-20 鸿富锦精密电子(成都)有限公司 Automatic classification device, automatic classification method, and computer-readable storage medium
CN109828919A (en) * 2019-01-18 2019-05-31 深圳壹账通智能科技有限公司 Test report automatic generation method, device, computer equipment and storage medium
CN109872230B (en) * 2019-01-23 2024-04-02 平安科技(深圳)有限公司 Test method and device of financial data analysis system, medium and electronic equipment
CN109902012A (en) * 2019-02-28 2019-06-18 苏州浪潮智能科技有限公司 A kind of automation generates the method and device of server test report
CN110109834A (en) * 2019-04-30 2019-08-09 贝壳技术有限公司 A kind of test report automatically generating device and method
CN111930606A (en) * 2019-05-13 2020-11-13 阿里巴巴集团控股有限公司 Automatic generation method of data processing flow test report and related device
CN110704093A (en) * 2019-09-18 2020-01-17 上海麦克风文化传媒有限公司 Method and system for processing operation feedback online fault
CN111143216A (en) * 2019-12-27 2020-05-12 京东数字科技控股有限公司 Quality report generation method, quality report generation device, quality report generation equipment and computer readable storage medium
CN111858354A (en) * 2020-07-23 2020-10-30 远光软件股份有限公司 Method and device for automatically generating test report, storage medium and electronic equipment
CN112214398B (en) * 2020-09-16 2022-03-22 武汉木仓科技股份有限公司 Method and equipment for acquiring execution efficiency and test efficiency of test case in xmind tool
CN112463534B (en) * 2020-11-26 2022-11-11 歌尔科技有限公司 Daily newspaper generating method, device, equipment and medium
CN113778847A (en) * 2020-12-03 2021-12-10 北京沃东天骏信息技术有限公司 Test report generation method and device
CN113128981A (en) * 2021-05-18 2021-07-16 中国农业银行股份有限公司 Project management method and system
CN113568829A (en) * 2021-07-05 2021-10-29 Oppo广东移动通信有限公司 External field test method and device and storage medium
CN113742213A (en) * 2021-07-13 2021-12-03 北京关键科技股份有限公司 Method, system, and medium for data analysis
CN113642306A (en) * 2021-07-29 2021-11-12 一汽奔腾轿车有限公司 Management method and management system for test problems of electrical function test
CN113643128B (en) * 2021-08-31 2024-02-27 中国银行股份有限公司 Automatic testing method and device for bank products
CN114218962B (en) * 2021-12-16 2022-08-19 哈尔滨工业大学 Artificial intelligent emergency semantic recognition system and recognition method for solid waste management information

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447898A (en) * 2008-11-19 2009-06-03 中国人民解放军信息安全测评认证中心 Test system used for network safety product and test method thereof
CN101471819A (en) * 2007-12-29 2009-07-01 ***通信集团公司 Test system, test method, management domain and operation domain
CN103323714A (en) * 2013-06-20 2013-09-25 国家电网公司 Automatic test method based on report template technology in intelligent substation test system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101471819A (en) * 2007-12-29 2009-07-01 ***通信集团公司 Test system, test method, management domain and operation domain
CN101447898A (en) * 2008-11-19 2009-06-03 中国人民解放军信息安全测评认证中心 Test system used for network safety product and test method thereof
CN103323714A (en) * 2013-06-20 2013-09-25 国家电网公司 Automatic test method based on report template technology in intelligent substation test system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JIRA User Guide; Li Zhongwu; https://wenku.***.com/view/690285a57f1922791688e893.html; 2015-04-15; pages 1-13 *

Also Published As

Publication number Publication date
CN107622014A (en) 2018-01-23

Similar Documents

Publication Publication Date Title
CN107622014B (en) Test report generation method and device, readable storage medium and computer equipment
CN106844217B (en) Method and device for embedding point of applied control and readable storage medium
D'Ambros et al. Visualizing co-change information with the evolution radar
US9354867B2 (en) System and method for identifying, analyzing and integrating risks associated with source code
US20120016701A1 (en) Intelligent timesheet assistance
CN112817865A (en) Coverage precision test method and system based on componentized distributed system
Dragan et al. Using stereotypes to help characterize commits
CN103646090A (en) Application program recommendation method and optimization method and device for program starting speeds
CN108710571B (en) Method and device for generating automatic test code
CN111125068A (en) Metadata management method and system
JPH0683598A (en) Job flow specification automatic generating method
CN111190814B (en) Method and device for generating software test case, storage medium and terminal
CN111274136B (en) Onboard software test management system and test process management method
CN110825725B (en) Data quality checking method and system based on double helix management
Sosnowski et al. Analysing problem handling schemes in software projects
Ostrand et al. A Tool for Mining Defect-Tracking Systems to Predict Fault-Prone Files.
CN113326193A (en) Applet testing method and device
CN115699042A (en) Collaborative system and method for validating analysis of device failure models in crowd-sourced environments
Johannsen et al. Supporting knowledge elicitation and analysis for business process improvement through a modeling tool
CN112035308A (en) Method and device for generating system interface test table
CN114064157B (en) Automatic flow implementation method, system, equipment and medium based on page element identification
CN112925856B (en) Entity relationship analysis method, entity relationship analysis device, entity relationship analysis equipment and computer storage medium
CN113419739B (en) Node map difference detection method and device, electronic equipment and storage medium
CN117972115B (en) Method, equipment and medium for constructing process automation rule base
CN106021209A (en) Source data tracking management system and comprehensive compilation management system for technical publications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant