CN112882957B - Test task validity checking method and device - Google Patents

Test task validity checking method and device

Info

Publication number
CN112882957B
Authority
CN
China
Prior art keywords
test
task
target
case
log
Prior art date
Legal status
Active
Application number
CN202110343526.XA
Other languages
Chinese (zh)
Other versions
CN112882957A (en)
Inventor
付静
冷炜
高蕊
Current Assignee
China Citic Bank Corp Ltd
Original Assignee
China Citic Bank Corp Ltd
Priority date
Filing date
Publication date
Application filed by China Citic Bank Corp Ltd
Priority to CN202110343526.XA
Publication of CN112882957A
Application granted
Publication of CN112882957B


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 — Error detection; Error correction; Monitoring
    • G06F 11/36 — Preventing errors by testing or debugging software
    • G06F 11/3668 — Software testing
    • G06F 11/3672 — Test management
    • G06F 11/3692 — Test management for test results analysis


Abstract

The application relates to the technical field of software testing, and in particular to a test task validity checking method and device. The method comprises the following steps: determining a target test task to be checked, and extracting the test case serial number of the target test task recorded in the test result; extracting, according to the serial number, the test log of the target test task from the tested system, and extracting the key elements of the test log; comparing, according to a preset analysis rule, the key elements respectively with the test case abstract information of the target test task and the result information of the target test task; and determining the validity of the test result of the target test task according to the comparison result. The scheme of the application addresses the current difficulty of verifying whether a tester's test results are valid.

Description

Test task validity checking method and device
Technical Field
The invention relates to the technical field of software testing, in particular to a test task validity checking method and device.
Background
In the test execution stage of the software testing life cycle, a test executor runs tests according to the test cases and the test plan. Usually, after a case has finished executing in the test environment, the tester must manually set the test task state to "passed" in the case result registration list on the case management platform. The case execution result is therefore reported entirely by the test executor, so executions may be missed or misreported. A few existing schemes address this manual-recording problem, but none provides a theory or method of analytical verification from the perspective of checking the validity of test execution. How to automatically judge the validity of test execution results during the test execution stage is thus an important open problem that affects software quality.
Disclosure of Invention
The object of the present application is to solve at least one of the technical drawbacks mentioned above. The technical scheme adopted by the application is as follows:
in a first aspect, an embodiment of the present application discloses a test task validity checking method, where the method includes:
determining a target test task to be checked, and extracting the test case serial number of the target test task recorded in the test result;
extracting, according to the serial number, the test log of the target test task from the tested system, and extracting the key elements of the test log;
comparing, according to a preset analysis rule, the key elements respectively with the test case abstract information and the test task result information of the target test task;
and determining the validity of the test result of the target test task according to the comparison result.
Further, the test log key elements include, but are not limited to:
Task code of test case, test case time stamp, number of test cases, serial number of test cases;
the test case summary information of the target test task includes, but is not limited to: test case transaction codes and test case numbers;
The target test task result information includes, but is not limited to: test case serial number, test case completion time, test result.
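As a rough illustration, the three groups of information above can be modeled as plain records. The field names below are hypothetical — the application names the elements but prescribes no schema:

```python
from dataclasses import dataclass

@dataclass
class LogKeyElements:          # key elements extracted from the test log
    task_code: str             # task (transaction) code of the test case
    timestamp: str             # test case time stamp
    case_count: int            # number of test cases in the log
    serial_numbers: list       # serial number(s) of the test cases

@dataclass
class CaseSummary:             # test case abstract (summary) information
    transaction_code: str      # test case transaction code
    case_count: int            # number of test cases

@dataclass
class TaskResult:              # target test task result information
    serial_number: str         # test case serial number
    completion_time: str       # recorded test case completion time
    result: str                # recorded test result
```

The analysis rules below then amount to field-by-field comparisons between a `LogKeyElements` record and the corresponding `CaseSummary` and `TaskResult` records.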
Further, according to a preset analysis rule, comparing the key element with the test case abstract information of the target test task includes: and when the task code in the test log is different from the task code in the test case abstract information of the target test task, determining that the test result of the target test task is invalid.
Further, comparing the key element with the test task result information according to a preset analysis rule includes:
When the task code in the test log is the same as the task code in the test case abstract information of the target test task, checking whether the test case time stamp in the test log is consistent with the recorded test case completion time in the target test task result information;
And if they are inconsistent, determining that the test result of the target test task is invalid.
Further, comparing the key elements with the test case abstract information and the test task result information of the target test task according to a preset analysis rule includes:
When the task code in the test log is the same as the task code in the test case abstract information of the target test task and the test case time stamp in the test log is consistent with the recorded test case completion time in the test task result information, the number of the test cases in the test log is extracted;
When the number of test cases recorded in the test case abstract information of the target test task and the number of test cases in the test log are both 1, the test result of the target test task is determined to be valid.
Further, when the number of the test cases recorded in the test case abstract information of the target test task is different from the number of the test cases in the test log, the test result of the target test task is determined to be invalid.
Further, when the number of test cases recorded in the test case abstract information of the target test task is determined to be the same as the number of test cases in the test log and greater than 1, the serial numbers of all the test cases in the test log are obtained;
when the serial numbers of the test cases in the test log are the same, the abstract information of the test cases of the target test task is analyzed, and if the test cases in the test log are confirmed to be different checkpoints of the same task, the test result of the target test task is determined to be valid.
Further, when the serial numbers of the test cases in the test log are different, the test request messages of the test cases with different serial numbers are respectively extracted;
when the effective fields in the test request message of each test case differ from the effective fields of the test request message of every other test case, the test result of the target test task is determined to be valid;
and when the effective fields in the test request message of a test case are the same as the effective fields of the test request message of any other test case, the test result of the target test task is determined to be invalid.
In a second aspect, an embodiment of the present application provides a test task validity checking apparatus, including: the device comprises a determining module, an extracting module, an analyzing module and a judging module, wherein,
The determining module is used for determining a target test task to be checked;
The extraction module is used for extracting a test case serial number of the target test task recorded in the test result; the extraction module is used for extracting a test log of the target test task from the tested system according to the serial number and extracting key elements of the test log;
the analysis module is used for comparing the key elements with the test case abstract information and the test task result information of the target test task respectively according to a preset analysis rule;
and the judging module is used for determining the validity of the test result of the target test task according to the comparison result.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory;
the memory is used for storing operation instructions;
the processor is configured to execute the method described in any one of the foregoing embodiments by calling the operation instruction.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as in any of the embodiments described above.
The embodiment of the application provides a test task validity checking scheme, which checks the validity (also called the authenticity) of a test execution result by analyzing the test log of the tested system in the test environment and identifying the test evidence of the corresponding test case in the log, thereby addressing the current difficulty of verifying whether a tester's test result is truly valid.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments of the present application will be briefly described below.
FIG. 1 is a schematic flow chart of a test task validity checking method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a test task validity checking device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application.
It will be appreciated by those of skill in the art that, unless expressly stated otherwise, the singular forms "a," "an," and "the" are intended to include the plural forms as well. The terms "first," "second," etc. are used only for clarity of description and are not intended to limit the objects themselves; of course, the objects qualified by "first" and "second" may be the same or different terminals, devices, or users. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The term "and/or" as used herein includes all or any combination of one or more of the associated listed items.
Furthermore, it should be understood that in embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates three possible relationships; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may each be singular or plural. The character "/" generally indicates an "or" relationship between the surrounding objects. "At least one of the following items" or similar expressions refers to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where each of a, b, and c may be singular or plural.
In the following embodiments, a "task code" refers to an identification of an interface of a task system or a task server that carries a certain type of task. For example, the type a task requested from the server is performed through the type a task interface of the server, and the same task interface has only one task code, but the same task interface can generate a plurality of specific tasks of the same type, that is, all tasks through the type a task interface have the same task code, but each specific task is an independent task, so each task has a task serial number. It should be noted that the embodiments described below may be used in any software testing field, but are particularly applicable to the field of banking software or system testing. The method is applied to the field of bank financial transaction software testing, and correspondingly, the task code is embodied as a transaction code, and the task serial number is embodied as a transaction serial number. For example, the transfer transaction requested from the server is performed through the transfer transaction interface of the server, and the same transaction interface has only one transaction code, but the same transaction interface can perform a plurality of transactions, namely, all transfer transactions through the transfer transaction interface have the same transaction code, but each transfer transaction is an independent service, so each transaction has a transaction service serial number, which can be simply called a transaction serial number.
Furthermore, the relation among the test cases, the test scenes and the test tasks needs to be described. One test task corresponds to a task code, at least one test case is arranged under one test task, one test case corresponds to one test scene, each test case corresponds to key element information such as abstract and the like, and after the test executive personnel finishes execution, the test case task serial number (abbreviated as test case serial number or transaction serial number) and the test case execution completion time are recorded. The relationship between the test cases and the task code and the transaction serial number can be divided into the following cases:
(1) One test case corresponds to one test task, namely, one test case corresponds to the same task code and the same serial number;
(2) One test task corresponds to a plurality of test cases; in this case the test cases correspond to the same task code, but each test case task has a different serial number. Here the test cases verify different scenarios (or different test case tasks) of the same type of task, so the transaction must be initiated multiple times under different scenarios. For example, if the test case is "the single transfer amount of bank A must not exceed 500,000", three different scenarios must be tested: "exactly 500,000", "below 500,000", and "above 500,000".
(3) One test task corresponds to a plurality of test cases, but the test cases all correspond to the same serial number. Here the test cases check different test points of the same task: the task is initiated only once, and the plurality of test cases are executed by checking different test points. For example, if the test case is "3000 yuan can be transferred out of account A", only one transaction task is needed to verify two different test points, namely "account A" and "3000 yuan".
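The three case/serial-number relationships above can be sketched with illustrative records; the task code `TRF001` and the serial numbers are invented for the example:

```python
transfer_task = "TRF001"  # hypothetical task (transaction) code

# (1) One test case under one task: one task code, one serial number.
case_1 = [{"task_code": transfer_task, "serial": "S-0001"}]

# (2) Several cases (scenarios) under one task: same code, different serials.
case_2 = [
    {"task_code": transfer_task, "serial": "S-0002", "scenario": "exactly 500,000"},
    {"task_code": transfer_task, "serial": "S-0003", "scenario": "below 500,000"},
    {"task_code": transfer_task, "serial": "S-0004", "scenario": "above 500,000"},
]

# (3) Several cases checking different points of one initiated task:
#     same code, same serial number.
case_3 = [
    {"task_code": transfer_task, "serial": "S-0005", "checkpoint": "account is A"},
    {"task_code": transfer_task, "serial": "S-0005", "checkpoint": "amount is 3000"},
]

assert len({c["serial"] for c in case_2}) == 3  # distinct serials in case (2)
assert len({c["serial"] for c in case_3}) == 1  # one shared serial in case (3)
```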
Fig. 1 shows a flow chart of test task validity check provided by an embodiment of the present application, and as shown in fig. 1, the method mainly may include:
s101, determining a target test task to be checked, and extracting a test case serial number of the target test task recorded in a test result;
S102, extracting a test log of a target test task from a tested system according to a serial number, and extracting key elements of the test log;
In an embodiment of the present application, the test log key elements include, but are not limited to: task code of test cases, test case time stamp, number of test cases, test case serial number.
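A minimal sketch of the key-element extraction in S102, assuming a hypothetical `key=value` log-line layout — the application does not specify a log format, so both the regular expression and the dict keys are illustrative:

```python
import re

# Hypothetical log-line layout, purely for illustration.
LOG_LINE = re.compile(
    r"task_code=(?P<task_code>\w+)\s+"
    r"serial=(?P<serial>[\w\-]+)\s+"
    r"ts=(?P<ts>[\d\- :T]+)"
)

def extract_key_elements(log_text, serial_number):
    """Locate the log entry for the recorded serial number, then gather the
    key elements over all entries that share its task code."""
    entries = [m.groupdict() for m in LOG_LINE.finditer(log_text)]
    anchor = next((e for e in entries if e["serial"] == serial_number), None)
    if anchor is None:
        return None  # no log evidence for this serial number
    task_entries = [e for e in entries if e["task_code"] == anchor["task_code"]]
    return {
        "task_code": anchor["task_code"],
        "timestamp": anchor["ts"],
        "case_count": len(task_entries),
        "serials": [e["serial"] for e in task_entries],
    }
```

Returning `None` when the serial number is absent mirrors the idea that a result with no matching log evidence cannot be validated at all.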
S103, according to a preset analysis rule, the key elements are respectively compared with the test case abstract information and the test task result information of the target test task;
In an embodiment of the present application, the test case summary information for the target test task includes, but is not limited to: test case transaction codes and test case numbers; target test task result information includes, but is not limited to: test case serial number, test case completion time, test result.
S104, determining the validity of the test result of the target test task according to the comparison result. Here, an invalid result covers testing that did not comply, or only partially complied, with the test cases of the test task, including test cases that were not executed, executed with omissions, or executed incorrectly.
On the basis of the above embodiment, further, comparing the key element with the test case abstract information of the target test task according to the preset analysis rule includes: and when the task code in the test log is different from the task code in the test case abstract information of the target test task, determining that the test result of the target test task is invalid.
In an alternative embodiment of the present application, the comparing the key element with the test task result information according to the preset analysis rule includes:
step 1, when a task code in a test log is the same as a task code in test case abstract information of a target test task, checking whether a test case time stamp in the test log is consistent with the completion time of a recorded test case in target test task result information;
and step 2, if they are inconsistent, determining that the test result of the target test task is invalid.
On the basis of the above embodiment, further, according to a preset analysis rule, comparing the key elements with the test case summary information and the test task result information of the target test task respectively includes:
Step 1, when a task code in a test log is the same as a task code in test case abstract information of a target test task, and a test case time stamp in the test log is consistent with the completion time of a recorded test case in test task result information, the number of the test cases in the test log is extracted;
And step 2, when the number of test cases recorded in the test case abstract information of the target test task and the number of test cases in the test log are both 1, the test result of the target test task is determined to be valid.
On the basis of the embodiment, when the number of the test cases recorded in the test case abstract information of the target test task is different from the number of the test cases in the test log, the test result of the target test task is determined to be invalid.
On the basis of the above embodiment, in a preferred embodiment, the method further includes:
Step 1, when the number of test cases recorded in the test case abstract information of the target test task is determined to be the same as the number of test cases in the test log and greater than 1, the serial numbers of all the test cases in the test log are acquired;
Step 2, when the serial numbers of the test cases in the test log are the same, the abstract information of the test cases of the target test task is analyzed; if the test cases in the test log are confirmed to be different checkpoints of the same task, the test result of the target test task is determined to be valid.
On the basis of the above embodiment, in a preferred embodiment, the method further includes:
step 1, when the serial numbers of the test cases in the test log are different, the test request messages of the test cases with different serial numbers are respectively extracted;
step 2, when the effective fields in the test request message of each test case differ from the effective fields of the test request message of every other test case, the test result of the target test task is determined to be valid;
and step 3, when the effective fields in the test request message of a test case are the same as the effective fields of the test request message of any other test case, the test result of the target test task is determined to be invalid.
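The chained rules of S103/S104 can be sketched as a single decision function. The dict keys and the `different_checkpoints` flag (standing in for the abstract-information analysis in the same-serial case) are assumptions for illustration, not the application's data model:

```python
def check_validity(log, summary, result, request_fields=None):
    """Chained analysis rules; all keys are illustrative.

    request_fields maps each serial number to the effective fields of its
    test request message (needed only in the different-serial case).
    """
    if log["task_code"] != summary["task_code"]:
        return False                  # task codes differ -> invalid
    if log["timestamp"] != result["completion_time"]:
        return False                  # log time vs recorded completion time
    if log["case_count"] != summary["case_count"]:
        return False                  # case counts differ -> invalid
    if log["case_count"] == 1:
        return True                   # both counts are 1 -> valid
    serials = log["serials"]
    if len(set(serials)) == 1:
        # Same serial: valid only if the cases are different checkpoints of
        # one task (reduced here to a pre-computed flag).
        return bool(summary.get("different_checkpoints", False))
    # Different serials: the effective fields of every request message must
    # differ from those of every other request message.
    fields = [request_fields[s] for s in serials]
    return all(fields[i] != fields[j]
               for i in range(len(fields))
               for j in range(i + 1, len(fields)))
```

The early-return ordering reproduces the order of the rules above: each check is reached only when all preceding comparisons have passed.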
According to the embodiment of the application, the test log is extracted according to the key element (the serial number) in the executed test cases recorded by the testers; the evidence in the test log that can serve as the basis for test case execution is then compared, as key element information, against the test case abstract information and the execution result information of the test task, so as to check whether the test task results reported as completed by the testers are truly valid, whether any executions were omitted, and so on. In an alternative embodiment, a test task validity check report may be generated according to the comparison result and the determination result described above.
Based on the test task validity checking method shown in fig. 1, on the other hand, the embodiment of the application provides a test task validity checking device, where the device is shown in fig. 2, and the device may include: a 201 determination module, a 202 extraction module, a 203 analysis module and a 204 judgment module; wherein,
The 201 determining module is used for determining a target test task to be checked;
the 202 extraction module is used for extracting a test case serial number of a target test task recorded in a test result; the extraction module is used for extracting a test log of a target test task from the tested system according to the serial number and extracting key elements of the test log;
The 203 analysis module is used for comparing the key elements with the test case abstract information and the test task result information of the target test task respectively according to a preset analysis rule;
and the 204 judgment module is used for determining the validity of the test result of the target test task according to the comparison result.
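A rough sketch of the four-module device of fig. 2, one method per module; the `case_platform` and `system_under_test` objects and their `get_task`/`get_log` calls are hypothetical APIs, and only the task-code and timestamp comparisons are shown in the analysis module:

```python
class TestTaskValidityChecker:
    """Four cooperating modules, one method per module (Fig. 2 sketch)."""

    def __init__(self, case_platform, system_under_test):
        self.case_platform = case_platform          # hypothetical case-management API
        self.system_under_test = system_under_test  # hypothetical tested-system API

    def determine(self, task_id):
        # Determining module: pick the target test task to be checked.
        return self.case_platform.get_task(task_id)

    def extract(self, task):
        # Extraction module: pull log key elements by the recorded serial number.
        return self.system_under_test.get_log(task["serial_number"])

    def analyze(self, key_elements, task):
        # Analysis module: compare key elements against the recorded task info.
        return {
            "code_match": key_elements["task_code"] == task["task_code"],
            "time_match": key_elements["timestamp"] == task["completion_time"],
        }

    def judge(self, comparison):
        # Judgment module: the result is valid only if every comparison passed.
        return all(comparison.values())
```

Keeping each module a separate method mirrors the claim structure, so the full rule chain of the method embodiment could be slotted into `analyze` without touching the other modules.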
It will be appreciated that the constituent modules of the test task validity checking apparatus in this embodiment implement the respective steps of the method in the embodiment shown in fig. 1. These functions can be realized by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. The modules can be software and/or hardware, and each module can be implemented independently or several modules can be integrated. For a functional description of the above modules, refer to the corresponding description of the method in the embodiment shown in fig. 1; likewise, the advantages achieved by the above modules correspond to those of the method provided above and are not repeated here.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the specific structure of the test task validity checking device. In other embodiments of the application, the test task validity checking means may comprise more or less components than shown, or certain components may be combined, certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The embodiment of the application provides electronic equipment, which comprises a processor and a memory;
A memory for storing operation instructions;
and the processor is used for executing the test task validity checking method provided in any embodiment of the application by calling the operation instruction.
As an example, fig. 3 shows a schematic structural diagram of an electronic device to which an embodiment of the present application is applied, and as shown in fig. 3, the electronic device 300 includes: a processor 301 and a memory 303. Wherein the processor 301 is coupled to the memory 303, such as via a bus 302. Optionally, the electronic device 300 may also include a transceiver 304. It should be noted that, in practical application, the transceiver 304 is not limited to one. It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the specific structure of the electronic device 300. In other embodiments of the application, electronic device 300 may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. Optionally, the electronic device may further comprise a display screen 305 for displaying images or receiving user operation instructions if necessary.
The processor 301 is applied to the embodiment of the present application, and is configured to implement the method shown in the above embodiment of the method. Transceiver 304 may include a receiver and a transmitter, with transceiver 304 being employed in embodiments of the present application to perform functions that enable an electronic device of embodiments of the present application to communicate with other devices.
The processor 301 may run the test task validity checking method provided by the embodiment of the present application, so as to reduce the operation complexity of the user, improve the intelligent degree of the terminal device, and improve the user experience. The processor 301 may include different devices, for example, when the CPU and the GPU are integrated, the CPU and the GPU may cooperate to execute the test task validity checking method provided by the embodiment of the present application, for example, a part of algorithms in the test task validity checking method are executed by the CPU, and another part of algorithms are executed by the GPU, so as to obtain a faster processing efficiency.
Bus 302 may include a path to transfer information between the components. Bus 302 may be a PCI (Peripheral Component Interconnect) bus or an EISA (Extended Industry Standard Architecture) bus, or the like. Bus 302 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 3, but this does not mean there is only one bus or one type of bus.
Optionally, the memory 303 is used for storing application program codes for executing the inventive arrangements, and is controlled by the processor 301 for execution. The processor 301 is configured to execute application code stored in the memory 303 to implement the test task validity checking method provided in any one of the embodiments of the present application.
The memory 303 may also store one or more computer programs corresponding to the test task validity checking method provided by the embodiment of the present application. The one or more computer programs are stored in the memory 303 and configured to be executed by the one or more processors 301, the one or more computer programs comprising instructions that can be used to perform the various steps in the respective embodiments described above.
Of course, the code of the test task validity checking method provided by the embodiment of the application can also be stored in the external memory. In this case, the processor 301 may run the code of the test task validity checking method stored in the external memory through the external memory interface, and the processor 301 may control the running of the test task validity checking flow.
The display screen 305 includes a display panel. In some embodiments, the electronic device 300 may include 1 or N display screens 305, N being a positive integer greater than 1. The display screen 305 may be used to display information entered by a user or provided to a user, as well as various graphical user interfaces (GUIs). For example, the display screen 305 may display photographs, videos, web pages, or files.
The electronic device provided by the embodiment of the present application is suitable for any embodiment of the above method, so the beneficial effects that can be achieved by the electronic device can refer to the beneficial effects in the corresponding method provided above, and will not be described herein.
The embodiment of the application provides a computer readable storage medium, and a computer program is stored on the computer readable storage medium, and when the program is executed by a processor, the test task validity checking method shown in the embodiment of the method is realized.
The computer readable storage medium provided by the embodiment of the present application is applicable to any of the above embodiments of the method, and therefore, the beneficial effects achieved by the method can refer to the beneficial effects provided in the corresponding method, and are not described herein.
The embodiments of the present application also provide a computer program product which, when run on a computer, causes the computer to perform the above-mentioned related steps to implement the method in the above-mentioned embodiments. The computer program product provided by the embodiment of the present application is applicable to any of the above embodiments of the method, and therefore, the advantages achieved by the computer program product can refer to the advantages provided in the corresponding method, and are not described herein.
According to the test task validity checking scheme disclosed by the embodiment of the application, a target test task to be checked is determined, and the test case serial number of the target test task recorded in the test result is extracted; according to the serial number, the test log of the target test task is extracted from the tested system, and the key elements of the test log are extracted; according to a preset analysis rule, the key elements are compared respectively with the test case abstract information of the target test task and the result information of the target test task; and the validity of the test result of the target test task is determined according to the comparison results. The scheme of the application thus addresses the current difficulty of verifying the validity of testers' test results.
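The comparison flow summarized above can be illustrated with a minimal sketch. The function and all record field names below (`task_code`, `case_timestamp`, `completion_time`, and so on) are illustrative assumptions for this sketch, not names used in the application:

```python
# Hypothetical sketch of the validity-check rules: compare the test log's
# key elements against the case abstract information and the task result
# information, then decide whether the recorded test result is valid.

def check_task_validity(log, summary, result):
    """Return True if the test result looks valid under the comparison rules."""
    # Task code in the log must match the code in the case abstract information.
    if log["task_code"] != summary["task_code"]:
        return False
    # Log timestamp must match the recorded test case completion time.
    if log["case_timestamp"] != result["completion_time"]:
        return False
    # Recorded case count must match the number of cases found in the log.
    if summary["case_count"] != len(log["cases"]):
        return False
    # A single matching case is accepted as valid.
    if len(log["cases"]) == 1:
        return True
    serials = [c["serial"] for c in log["cases"]]
    if len(set(serials)) == 1:
        # Identical serial numbers: valid only if the abstract information
        # confirms these are different checkpoints of the same task
        # (represented here by an assumed boolean flag).
        return summary.get("same_task_checkpoints", False)
    # Distinct serial numbers: every request message must differ from every
    # other in its valid fields, otherwise the result is invalid.
    fields = [tuple(sorted(c["request"].items())) for c in log["cases"]]
    return len(set(fields)) == len(fields)
```

For example, two cases with the same task code and timestamp but identical request fields would be rejected as a repeated submission, while cases whose valid fields differ would pass.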
The foregoing is merely a specific embodiment of the present application, and the scope of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes and substitutions within the technical scope disclosed by the application, and such changes, substitutions, modifications and alterations shall also fall within the protection scope of the application. Therefore, the protection scope of the application shall be subject to the protection scope of the claims.

Claims (7)

1. A test task validity checking method, the method comprising:
determining a target test task to be checked, and extracting a test case serial number of the target test task recorded in a test result;
According to the serial number, extracting a test log of the target test task from the tested system, and extracting key elements of the test log, wherein the key elements of the test log include, but are not limited to: the task code of the test case, the test case timestamp, the number of test cases, and the serial number of the test case;
According to a preset analysis rule, comparing the key elements respectively with the test case abstract information of the target test task and the test task result information of the target test task, wherein the test case abstract information of the target test task includes, but is not limited to: the test case transaction code and the number of test cases; the target test task result information includes, but is not limited to: the recorded test case completion time; and the step of comparing the key elements respectively with the test case abstract information of the target test task and the target test task result information comprises the following steps:
when the task code in the test log is the same as the task code in the test case abstract information of the target test task, and the test case timestamp in the test log is consistent with the recorded test case completion time in the test task result information, extracting the number of test cases in the test log;
when the number of test cases recorded in the test case abstract information of the target test task and the number of test cases in the test log are both 1, determining that the test result of the target test task is valid;
when it is determined that the number of test cases recorded in the test case abstract information of the target test task is the same as the number of test cases in the test log and greater than 1, obtaining the serial numbers of all the test cases in the test log;
when the serial numbers of the test cases in the test log are the same, analyzing the test case abstract information of the target test task, and determining that the test result of the target test task is valid if it is confirmed that the test cases in the test log are different checkpoints of the same task;
when the serial numbers of the test cases in the test log are different, respectively extracting the test request messages of the test cases with different serial numbers;
when the valid field in the test request message of each test case is different from the valid field of the test request message of any other test case, determining that the test result of the target test task is valid;
when the valid field in the test request message of each test case is the same as the valid field of the test request message of any other test case, determining that the test result of the target test task is invalid; and
And determining the validity of the test result of the target test task according to the comparison result.
2. The test task validity checking method according to claim 1, wherein comparing the key elements with the test case abstract information of the target test task according to a preset analysis rule includes:
And when the task code in the test log is different from the task code in the test case abstract information of the target test task, determining that the test result of the target test task is invalid.
3. The test task validity checking method according to claim 1, wherein comparing the key elements with the test task result information according to a preset analysis rule includes:
when the task code in the test log is the same as the task code in the test case abstract information of the target test task, checking whether the test case timestamp in the test log is consistent with the recorded test case completion time in the target test task result information;
and if they are inconsistent, determining that the test result of the target test task is invalid.
4. The test task validity checking method according to claim 1, wherein when it is determined that the number of test cases recorded in the test case digest information of the target test task is different from the number of test cases in the test log, it is determined that the test result of the target test task is invalid.
5. A test task validity checking device, the device comprising: a determining module, an extraction module, an analysis module and a judging module, wherein,
The determining module is used for determining a target test task to be checked;
The extraction module is used for extracting a test case serial number of the target test task recorded in the test result; the extraction module is further configured to extract a test log of the target test task from the tested system according to the serial number, and extract key elements of the test log, where the key elements of the test log include, but are not limited to: task code of test case, test case time stamp, number of test cases, serial number of test cases;
The analysis module is configured to compare the key elements respectively with the test case abstract information of the target test task and the test task result information according to a preset analysis rule, wherein the test case abstract information of the target test task includes, but is not limited to: the test case transaction code and the number of test cases; the target test task result information includes, but is not limited to: the recorded test case completion time; and comparing the key elements respectively with the test case abstract information of the target test task and the target test task result information comprises:
when the task code in the test log is the same as the task code in the test case abstract information of the target test task, and the test case timestamp in the test log is consistent with the recorded test case completion time in the test task result information, extracting the number of test cases in the test log;
when the number of test cases recorded in the test case abstract information of the target test task and the number of test cases in the test log are both 1, determining that the test result of the target test task is valid;
when it is determined that the number of test cases recorded in the test case abstract information of the target test task is the same as the number of test cases in the test log and greater than 1, obtaining the serial numbers of all the test cases in the test log;
when the serial numbers of the test cases in the test log are the same, analyzing the test case abstract information of the target test task, and determining that the test result of the target test task is valid if it is confirmed that the test cases in the test log are different checkpoints of the same task;
when the serial numbers of the test cases in the test log are different, respectively extracting the test request messages of the test cases with different serial numbers;
when the valid field in the test request message of each test case is different from the valid field of the test request message of any other test case, determining that the test result of the target test task is valid;
when the valid field in the test request message of each test case is the same as the valid field of the test request message of any other test case, determining that the test result of the target test task is invalid; and
And the judging module is used for determining the validity of the test result of the target test task according to the comparison result.
6. An electronic device comprising a processor and a memory;
the memory is used for storing operation instructions;
The processor is configured to execute the method of any one of claims 1-4 by invoking the operation instruction.
7. A computer readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when executed by a processor, implements the method of any of claims 1-4.
CN202110343526.XA 2021-03-30 2021-03-30 Test task validity checking method and device Active CN112882957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110343526.XA CN112882957B (en) 2021-03-30 2021-03-30 Test task validity checking method and device


Publications (2)

Publication Number Publication Date
CN112882957A CN112882957A (en) 2021-06-01
CN112882957B true CN112882957B (en) 2024-05-24

Family

ID=76040273

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110343526.XA Active CN112882957B (en) 2021-03-30 2021-03-30 Test task validity checking method and device

Country Status (1)

Country Link
CN (1) CN112882957B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113961464A (en) * 2021-10-28 2022-01-21 中国银行股份有限公司 Test case demand coverage inspection method and device
CN115629950B (en) * 2022-12-19 2023-04-28 深圳联友科技有限公司 Extraction method of performance test asynchronous request processing time point

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110727567A (en) * 2019-09-09 2020-01-24 平安证券股份有限公司 Software quality detection method and device, computer equipment and storage medium
CN111639022A (en) * 2020-05-16 2020-09-08 中信银行股份有限公司 Transaction testing method and device, storage medium and electronic device
CN112052170A (en) * 2020-09-03 2020-12-08 中国银行股份有限公司 Automatic detection method and device, storage medium and electronic equipment




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant