CN113986746A - Performance test method and device and computer readable storage medium

Performance test method and device and computer readable storage medium

Info

Publication number
CN113986746A
CN113986746A (application CN202111307439.5A)
Authority
CN
China
Prior art keywords
test
behavior
message
client
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111307439.5A
Other languages
Chinese (zh)
Inventor
陈肇权
马泽政
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202111307439.5A
Publication of CN113986746A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 - Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a performance testing method and a device thereof and a computer readable storage medium, relating to the field of financial technology. The testing method comprises the following steps: after acquiring the link instruction, creating a communication link path between the client and the server based on the server IP information; receiving a test script message through the communication link path, and parsing the test script message to obtain a test script, wherein the test script comprises at least one test step, and the behavior type of the test behavior contained in each test step comprises at least one of the following: an operation behavior of a user and a calling behavior of an application program; sequentially executing the test steps to obtain a test result message; and sending the test result message to the server. The invention solves the technical problems in the related art that the performance test is carried out only on the server side and the performance of the client side cannot be evaluated, which reduces test quality and efficiency and degrades the user experience.

Description

Performance test method and device and computer readable storage medium
Technical Field
The invention relates to the technical field of performance testing, in particular to a performance testing method and device and a computer readable storage medium.
Background
Software performance is an important index of software quality evaluation, so performance testing plays an important role in software quality assurance in software development engineering. In the related art, performance testing is often implemented based on a GS model, i.e., a "pressure Generator (Generator) - Server under test (Server)" model. Under this model, the pressure generator evaluates the performance quality of the server side of the project under test under multiple concurrent, high-frequency requests by generating pressure, according to predefined behavior and using preconfigured data, against the server side where the project under test is deployed. In the GS model, the pressure generator uses multiple concurrent socket short connections to encapsulate specific communication packets and simulate user requests, generating pressure directly on the server under test; the related predefined behavior and preconfigured data are called the "test script".
Unlike traditional standalone non-networked software, networked software generally consists of clients deployed on user devices and servers that provide services centrally. For example, for mobile phone software, a client generally refers to APP software installed on a user mobile phone, and in a performance test implementation process based on a GS model, a test tool is generally used as a pressure generator to simulate a service request sent by the mobile phone APP to a server, and directly initiate concurrency pressure on a background service, where a performance concern is mainly performance of the background service under multiple concurrent requests. However, the performance test implementation process based on the traditional GS model has several disadvantages as follows:
(1) in the test process, only the performance of the centralized service on the server side is evaluated and performance evaluation of the client side is lacking, so problems caused by client-side performance defects, such as slow terminal response and high terminal resource consumption, cannot be discovered;
(2) in terms of test method, the simulation of the transaction request cuts in at the communication layer and omits the part of the flow performed on the user terminal, so the time consumed by terminal operations is missing from the time-consumption evaluation of a function, and the test result differs from the user's actual experience;
(3) when a test tool is used to simulate concurrency, requests are generally initiated from a small number of servers, so the simulated communication link scenario differs from the real one in aspects such as IP diversity, base station distribution and connection drift, which easily causes scenario simulation distortion.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a performance testing method and device and a computer readable storage medium, which are used for at least solving the technical problems that in the related technology, performance testing is only carried out based on a server side, and the performance of a client side cannot be evaluated, so that the testing quality and efficiency are reduced, and the user experience is reduced.
According to an aspect of the embodiments of the present invention, a performance testing method is provided, which is applied to a preset client, and includes: obtaining a link instruction, wherein the link instruction at least comprises: server IP information; establishing a communication link path with a server based on the server IP information; receiving a test script message through the communication link path, and performing analysis processing operation on the test script message to obtain a test script, wherein the test script comprises: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program; sequentially executing the test steps to obtain a test result message, wherein the test result message carries test execution results and statistical result data of different test behaviors; and sending the test result message to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client in the process of executing the test script to obtain a performance test result.
Optionally, the step of obtaining the link instruction includes: monitoring whether a new link instruction exists in a message queue or not by adopting a preset link subscription mode; and pulling the link instruction from the message queue after monitoring that a new link instruction exists in the message queue.
Optionally, the step of performing parsing operation on the test script packet to obtain a test script includes: and analyzing the test script message to obtain the test script comprising at least one test step, wherein the test step comprises the test behavior and test data used by the test behavior.
Optionally, after the analyzing and processing operation is performed on the test script packet to obtain the test script, the performance testing method further includes: storing a variable list defined by the test data as a mapping table in a key-value form, wherein keys in the mapping table are variable names, and values are candidate value lists of each variable; for each test step, checking whether the test data has variable reference; and if the variable is referenced in the test data, replacing the referenced variable with a preset candidate value based on the mapping table.
Optionally, the step of replacing the referenced variable with a preset candidate value based on the mapping table includes: setting a circular pointer; when the first replacement step is executed, selecting the first element of the candidate value list as the candidate value, and increasing the pointer value of the circular pointer by a preset value; when the second replacement step is executed, selecting the second element of the candidate value list as the candidate value, and increasing the pointer value of the circular pointer by the preset value; continuing to execute the replacing step until the pointer value of the circular pointer is equal to the number of elements in the candidate value list; and if the pointer value of the circular pointer is equal to the number of elements in the candidate value list, restoring the pointer value of the circular pointer to the initial value.
Optionally, after the testing steps are sequentially executed to obtain a test result message, the performance testing method further includes: performing structured disassembly on the test data and the response data, and storing the disassembly result in a variable pool, wherein the response data refers to the response parameters obtained by monitoring the preset client after it executes the test behavior.
Optionally, the operation behavior of the user includes at least one of: a click operation, a double-click operation, a long-press operation and a swipe operation; the type of the application program comprises at least one of the following: an operating system, a Software Development Kit (SDK) and an application APP.
Optionally, after the testing steps are sequentially executed to obtain a test result message, the performance testing method further includes: performing assertion detection on the test execution result of the test step, wherein the type of assertion detection comprises at least one of the following types: numeric values, character strings and behavior actions; when the type of the assertion detection is a numeric value and/or a character string, replacing the variable reference with a test actual value in the variable pool, and calculating the test expected value through four arithmetic operations to obtain a calculated numerical value; judging whether the test actual value and the calculated value of the test expected value meet a test expected relationship; if the test expected relationship is met, determining that the test execution result is successful; when the type of the assertion detection is a behavior action, judging whether the behavior action meets the expected behavior based on a probe in the client; and if the expected behavior is met, determining that the test execution result is successful.
Optionally, before performing assertion detection on the test execution result of the testing step, the performance testing method further includes: after receiving an operation instruction, collecting information data in the preset client, wherein the information data at least comprises: log data; and performing edge calculation on the information data, and performing summary statistics on different dimensions based on the calculated information data to obtain statistical result data.
According to another aspect of the embodiments of the present invention, there is also provided a performance testing method applied to a server, where the server is connected to a preset client, and the preset client executes the performance testing method according to any one of claims 1 to 9, including: pushing a link instruction to a message queue, wherein the link instruction at least comprises: server IP information for establishing a communication link path between a client and the server; sending a test script message to the client by adopting the established communication link path, and acquiring resource consumption information of the client in the process of executing the test script, wherein the test script carried in the test script message comprises: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program; receiving a test result message returned by the client, wherein the test result message carries test execution results and statistical result data of different test behaviors; and counting the terminal performance data of different testing dimensions based on the testing result message and the resource consumption information to obtain a performance testing result.
Optionally, before pushing the link instruction to the message queue, the performance testing method further includes: reading a pre-written test script, wherein the test steps in the test script comprise: the test behavior and test data used by the test behavior, the test data including at least: business data used by the application; and packaging the test script into a message with a specified format to obtain the test script message.
Optionally, before pushing the link instruction to the message queue, the performance testing method further includes: waking up or putting to sleep the client based on pre-counted dynamic data of each client, wherein the dynamic data at least comprises: concurrency, response time and throughput.
Optionally, the resource consumption information includes at least one of: function time-consumption statistics, function call counts, client CPU consumption and memory consumption, wherein the test dimension comprises at least one of the following: request count, single-request time consumption and throughput information.
According to another aspect of the embodiments of the present invention, there is provided a performance testing apparatus, applied to a preset client, including: an obtaining unit, configured to obtain a link instruction, where the link instruction at least includes: server IP information; a creating unit, configured to create a communication link path with a server based on the server IP information; a first receiving unit, configured to receive a test script packet through the communication link path, and perform parsing operation on the test script packet to obtain a test script, where the test script includes: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program; the execution unit is used for sequentially executing the test steps to obtain a test result message, wherein the test result message carries test execution results and statistical result data of different test behaviors; and the sending unit is used for sending the test result message to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client in the process of executing the test script to obtain a performance test result.
Optionally, the obtaining unit includes: the first monitoring module is used for monitoring whether a new link instruction exists in the message queue by adopting a preset link subscription mode; and the first pulling unit is used for pulling the link instruction from the message queue after monitoring that a new link instruction exists in the message queue.
Optionally, the first receiving unit includes: the first analysis module is used for analyzing the test script message to obtain the test script comprising at least one test step, wherein the test step comprises the test behavior and test data used by the test behavior.
Optionally, the performance testing apparatus further includes: the first storage unit is used for storing a variable list defined by the test data into a mapping table in a key-value form after the test script message is analyzed and processed to obtain a test script, wherein keys in the mapping table are variable names, and values are candidate value lists of each variable; a first checking module, configured to check, for each of the testing steps, whether the test data includes a variable reference; and the first replacement module is used for replacing the referenced variable into a preset candidate value based on the mapping table if the variable is referenced in the test data.
Optionally, the first replacement module comprises: the first setting submodule is used for setting a circular pointer; a first selection submodule, configured to select a first element of the candidate value list as a candidate value and increase a pointer value of the circular pointer by a preset value when the first replacement step is performed; a second selection submodule, configured to select a second element of the candidate value list as a candidate value and increase the pointer value of the circular pointer by the preset value when the second replacement step is performed; a first execution submodule, configured to continue to execute the replacing step until the pointer value of the circular pointer is equal to the number of elements in the candidate value list; and the second setting submodule is used for restoring the pointer value of the circular pointer to the initial value if the pointer value of the circular pointer is equal to the number of elements in the candidate value list.
Optionally, the performance testing apparatus further includes: and the first disassembling module is used for performing structured disassembling on the test data and the response data after the test steps are sequentially executed to obtain a test result message, and storing a disassembling result into a variable pool, wherein the response data refers to response parameters after the test behavior is executed by the preset client side.
Optionally, the operation behavior of the user includes at least one of: a click operation, a double-click operation, a long-press operation and a swipe operation; the type of the application program comprises at least one of the following: an operating system, a Software Development Kit (SDK) and an application APP.
Optionally, the performance testing apparatus further includes: the first detection module is configured to perform assertion detection on the test execution result of the test step after the test steps are sequentially executed to obtain a test result message, where the type of the assertion detection includes at least one of: numeric values, character strings and behavior actions; the first calculation module is used for replacing the variable reference with a test actual value in the variable pool when the type of the assertion detection is a numeric value and/or a character string, and calculating the test expected value through four arithmetic operations to obtain a calculated numerical value; the first judgment module is used for judging whether the test actual value and the calculated value of the test expected value meet a test expected relationship; the first determining module is used for determining that the test execution result is successful if the test expected relationship is met; the second judgment module is used for judging whether the behavior action meets the expected behavior based on a probe in the client when the type of the assertion detection is a behavior action; and the second determination module is used for determining that the test execution result is successful if the expected behavior is met.
Optionally, the performance testing apparatus further includes: a first collecting module, configured to collect information data in the preset client after receiving an operation instruction before performing assertion detection on a test execution result of the testing step, where the information data at least includes: log data; and the first statistical module is used for carrying out edge calculation on the information data and carrying out summary statistics on different dimensions based on the calculated information data to obtain statistical result data.
According to another aspect of the embodiments of the present invention, there is also provided a performance testing apparatus, applied to a server, where the server is connected to a preset client, and the preset client executes the performance testing method according to any one of claims 1 to 9, including: a pushing unit, configured to push a link instruction to a message queue, where the link instruction at least includes: server IP information for establishing a communication link path between a client and the server; the issuing unit is configured to issue a test script message to the client by using the established communication link path, and acquire resource consumption information of the client in a process of executing the test script, where a test script carried in the test script message includes: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program; a second receiving unit, configured to receive a test result message returned by the client, where the test result message carries test execution results and statistical result data of different test behaviors; and the statistical unit is used for counting the terminal performance data of different testing dimensions based on the testing result message and the resource consumption information to obtain a performance testing result.
Optionally, the performance testing apparatus further includes: the device comprises a first reading module, a second reading module and a third reading module, wherein the first reading module is used for reading a pre-written test script before pushing a link instruction to a message queue, and the test step in the test script comprises the following steps: the test behavior and test data used by the test behavior, the test data including at least: business data used by the application; and the first packaging module is used for packaging the test script into a message with a specified format to obtain the test script message.
Optionally, the performance testing apparatus further includes: a first wake-up module, configured to wake up or sleep a client based on pre-counted dynamic data of each client before pushing a link instruction to a message queue, where the dynamic data at least includes: concurrency, response time, throughput.
Optionally, the resource consumption information includes at least one of: function time-consumption statistics, function call counts, client CPU consumption and memory consumption, wherein the test dimension comprises at least one of the following: request count, single-request time consumption and throughput information.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program, and when the computer program runs, the apparatus where the computer-readable storage medium is located is controlled to execute any one of the performance testing methods described above.
According to the method and the device, the performance data of the client in different testing dimensions are collected in the testing process, the performance defects of slow software response, unsmooth running and the like of the client can be found in time, and the consumption degree of the client resources is evaluated.
In this disclosure, after the link instruction is obtained, a communication link path between the client and the server is created based on the server IP information, the test script message is received through the communication link path, and the test script message is parsed to obtain the test script, where the test script includes at least one test step and the behavior type of the test behavior contained in each test step includes at least one of the following: the operation behavior of the user and the calling behavior of the application program; the test steps are executed in sequence to obtain a test result message, and the test result message is sent to the server, where the server counts terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client during execution of the test script to obtain a performance test result. In this application, the client can execute the test script in the environment of a plurality of user terminals by simulating the operation behavior of a user and the calling behavior of an application program, so that a concurrent performance test scenario is realized, the influence of the actual network and the client on performance indicators can be effectively simulated, and the test quality and efficiency are improved, which solves the technical problems in the related art that the performance test is carried out only on the server side and the performance of the client side cannot be evaluated, reducing test quality and efficiency and degrading user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of an alternative performance testing system according to an embodiment of the present invention;
FIG. 2 is a flow diagram of an alternative performance testing method according to an embodiment of the present invention;
FIG. 3 is a flow diagram of another alternative performance testing method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an alternative full link performance testing method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an alternative performance testing apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an alternative performance testing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
To facilitate understanding of the invention by those skilled in the art, some terms or nouns referred to in the embodiments of the invention are explained below:
Edge computing: providing the nearest-end service nearby by adopting an open platform that integrates network, computing, storage and application core capabilities on the side close to the object or the data source.
The following embodiments of the present invention may be applied to various scenarios requiring performance testing of software/applications, where the client/terminal on which the software/application runs includes but is not limited to: cell phones, iPads, PCs, tablets, etc.
In the present invention, fig. 1 is a schematic diagram of an alternative performance testing system according to an embodiment of the present invention, which includes: a server, a plurality of clients and an information storage unit, wherein the server includes a master control driving unit, and each client includes an agent execution unit, a program calling unit and an information collection unit. The master control driving unit further includes a script management module, a master control scheduling module, a terminal communication module 1 and a result summarizing module; the agent execution unit further includes a terminal communication module 2, a script instantiation module, a test execution module and an assertion checking module; and the information collection unit further includes a communication interaction module, a log collection module, a resource collection module, an information statistics module and a front-end temporary storage database.
The invention defines a performance test system consisting of a unified master control driving unit and agent execution units distributed over a plurality of user terminals. It can be applied to software/application performance testing, and in particular to performance testing that needs to evaluate the time consumption of the full link of a business function and to discover performance defects at the client and communication link layers. The agent execution units execute the business function in the environment of a plurality of real user terminals by actually operating the interface actions of the terminal software or calling (CALL) client programs, thereby realizing concurrent performance test scenarios and effectively solving the problems that traditional performance test methods cannot cover client-side processing and cannot simulate complex communication environments. Meanwhile, the invention defines an information collection unit on the client, which collects performance data and logs of the mobile phone during the test and provides technical support for discovering client performance defects such as slow response and lag and for evaluating client resource consumption. By introducing edge computing, operations such as variable replacement and assertion checking can be completed on the client, which greatly reduces server resource consumption and helps further improve request sending and receiving efficiency.
Example one
In accordance with an embodiment of the present invention, there is provided a performance testing method embodiment, it is noted that the steps illustrated in the flowchart of the figure may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
FIG. 2 is a flow chart of an alternative performance testing method according to an embodiment of the present invention, as shown in FIG. 2, the method comprising the steps of:
step S201, acquiring a link instruction, where the link instruction at least includes: server IP information.
Step S202, based on the server IP information, a communication link path between the client and the server is created.
Step S203, receiving the test script message through the communication link path, and parsing the test script message to obtain a test script, wherein the test script comprises: at least one test step, and the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program.
And step S204, executing the test steps in sequence to obtain a test result message, wherein the test result message carries test execution results and statistical result data of different test behaviors.
And S205, sending the test result message to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information collected from the client during execution of the test script to obtain a performance test result.
Through the steps, after the link instruction is obtained, a communication link path between the client and the server is established based on the server IP information, the test script message is received through the communication link path, and the test script message is parsed to obtain the test script, wherein the test script comprises: at least one test step, and the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program; the test steps are executed in sequence to obtain a test result message, and the test result message is sent to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client during execution of the test script to obtain a performance test result. In the embodiment of the invention, the client can execute the test script in the environment of a plurality of user terminals by simulating the operation behavior of the user and the calling behavior of the application program, so that a concurrent performance test scenario is realized, the influence of the actual network and the client on performance indicators can be effectively simulated, and the test quality and efficiency are improved, which solves the technical problems in the related art that the performance test is carried out only on the server side and the performance of the client side cannot be evaluated, reducing test quality and efficiency and degrading user experience.
The following will explain the embodiments of the present invention in detail with reference to the above steps.
The following steps are applied to a preset client, where the preset client may be a user terminal, including but not limited to: a mobile phone terminal, an iPad, a tablet, a PC, etc.; the mobile phone terminal is taken as an example for the detailed description below.
Step S201, acquiring a link instruction, where the link instruction at least includes: server IP information.
In this embodiment of the present invention, the client acts as a message CONSUMER (Consumer) and actively pulls (polls) the "CONNECT" broadcast instruction (i.e., the link instruction) issued by the master control driving unit, where the link instruction includes the server IP information, and the server IP information is used to create a first type of connection between the master control server and the client (this embodiment is schematically illustrated with a TCP long connection).
Optionally, the step of obtaining the link instruction includes: monitoring whether a new link instruction exists in a message queue or not by adopting a preset link subscription mode; and after monitoring that a new link instruction exists in the message queue, pulling the link instruction from the message queue.
In the embodiment of the invention, all active agent execution units monitor the message queue in a long-connection subscription mode (namely, the preset link subscription mode), and after detecting that a new message exists, actively pull the link instruction from the message queue and perform subsequent processing.
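For illustration only, the consumer-side flow of pulling the CONNECT instruction and opening the long connection could be sketched in Java as follows. An in-memory BlockingQueue stands in for the actual message middleware, a plain TCP socket stands in for the long connection, and the "CONNECT " message layout and the port number are assumptions rather than details given in this text:

    import java.net.Socket;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class AgentConnector {
        // Stand-in for the subscribed message queue carrying CONNECT broadcast instructions.
        private final BlockingQueue<String> messageQueue = new LinkedBlockingQueue<>();

        /** Blocks until a CONNECT instruction appears, then opens the TCP long connection. */
        public Socket awaitConnectInstruction() throws Exception {
            while (true) {
                String instruction = messageQueue.take();           // pull the broadcast instruction
                if (instruction.startsWith("CONNECT ")) {
                    String serverIp = instruction.substring("CONNECT ".length()).trim();
                    return new Socket(serverIp, 9000);               // 9000: assumed test-control port
                }
            }
        }
    }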
Step S202, based on the server IP information, a communication link path between the client and the server is created.
Step S203, receiving the test script message through the communication link path, and parsing the test script message to obtain a test script, wherein the test script comprises: at least one test step, and the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program.
In the embodiment of the present invention, a created TCP long connection is used to receive a test script message issued by a master control driving unit, and a scheduling script instantiation module completes edge calculation such as script analysis and data instantiation to obtain a test script, where the test script may include multiple test steps, each test step includes a specific test behavior, and the test behavior types include: the operation behavior of the user, the calling behavior of the application program, and the like, for example, the operation behavior of the user may be represented as a test behavior for a "simulated action" class, and the test execution module may actually simulate the operation behavior of the real user on the mobile phone by means of the program calling unit, including but not limited to: click, enter, swipe, double click, etc. The calling behavior of the application program can be expressed as testing behavior for a 'program calling' class, and the test execution module can call programs of a mobile phone operating system, an SDK (software development kit) and an APP by means of a program calling unit and transmits test data.
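As a rough illustration of how a test step might be dispatched by behavior type ("simulated action" versus "program call"), the Java sketch below uses assumed class, field and method names; the actual interfaces of the agent execution unit and the program calling unit are not specified at this level of detail in the text:

    enum BehaviorType { SIMULATED_ACTION, PROGRAM_CALL }

    class TestStep {
        BehaviorType type;
        String action;        // e.g. "click", "input", "swipe" for simulated actions
        String target;        // UI control, or callable program entry for program calls
        String testData;      // data passed to the input action or the called function
    }

    class TestExecutor {
        void execute(TestStep step) {
            switch (step.type) {
                case SIMULATED_ACTION:
                    // delegate to the program calling unit to replay the user gesture
                    simulateUserAction(step.action, step.target, step.testData);
                    break;
                case PROGRAM_CALL:
                    // invoke an OS / SDK / APP function directly and pass the test data
                    callProgram(step.target, step.testData);
                    break;
            }
        }
        void simulateUserAction(String action, String target, String data) { /* platform specific */ }
        void callProgram(String entry, String data) { /* platform specific */ }
    }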
Optionally, the step of performing parsing operation on the test script message to obtain the test script includes: and analyzing the test script message to obtain a test script comprising at least one test step, wherein the test step comprises test behaviors and test data used by the test behaviors.
In the embodiment of the present invention, the script instantiation module may implement edge calculation by using the computing resources of the mobile phone, and instantiate the general test script message issued by the master control driving unit into a specific, unique test script usable at the mobile phone end through steps such as behavior parsing, variable value replacement and test data combination (i.e., performing the parsing operation on the test script message), where the test script may include multiple test steps, and each test step may include a test behavior and the test data used by that behavior. For example, a JSON-formatted test script transmitted by the terminal communication module 2 may be received and parsed into specific test steps, each containing a specific test behavior and the test data used by that behavior; meanwhile, a test result JSON message may be initialized and START_TIME assigned the current timestamp.
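A minimal Java sketch of this instantiation step, assuming the org.json library and a script message whose steps sit under an ACTION_LIST node (the ACTION_LIST and START_TIME names appear in this text; the class and method names are illustrative):

    import org.json.JSONArray;
    import org.json.JSONObject;
    import java.util.ArrayList;
    import java.util.List;

    public class ScriptInstantiator {
        private JSONObject result;      // test result message, completed after execution

        /** Parses the JSON test script message into steps and initializes the result message. */
        public List<JSONObject> parse(String scriptMessage) {
            JSONObject script = new JSONObject(scriptMessage);
            JSONArray actions = script.getJSONArray("ACTION_LIST"); // node name from the patent text

            result = new JSONObject();
            result.put("START_TIME", System.currentTimeMillis());   // stamp the current timestamp

            List<JSONObject> steps = new ArrayList<>();
            for (int i = 0; i < actions.length(); i++) {
                steps.add(actions.getJSONObject(i));                 // each step: a behavior + its data
            }
            return steps;
        }

        public JSONObject getResult() { return result; }
    }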
Optionally, after the analyzing and processing operation is performed on the test script message to obtain the test script, the performance testing method further includes: storing a variable list defined by the test data as a mapping table in a key-value form, wherein keys in the mapping table are variable names, and values are candidate value lists of each variable; for each test step, checking whether the test data has variable reference; and if the variable is referenced in the test data, replacing the referenced variable with a preset candidate value based on the mapping table.
In the embodiment of the invention, for each test step it can be checked whether a variable is referenced in the test data, wherein a variable refers to a value that has a certain business meaning, is not fixed, needs to be used in the test process and changes in real time according to a predetermined rule. The script instantiation module stores the variable list defined by the test data as a mapping table in key-value form; when each step is parsed, the test behavior and the test data are traversed, and whenever the variable reference syntax appears, the variable is replaced with a specific candidate value (namely, the preset candidate value). For example, the request message PARAM_LIST node has two variables, KEYWORD and TIME; KEYWORD has three candidate values and TIME has two candidate values, and the variables and their candidate values can be stored as a mapping table in key-value form.
Optionally, the step of replacing the referenced variable with a preset candidate value based on the mapping table includes: setting a circular pointer; when the first replacement step is executed, selecting the first element of the candidate value list as the candidate value, and increasing the pointer value of the circular pointer by a preset value; when the second replacement step is executed, selecting the second element of the candidate value list as the candidate value, and increasing the pointer value of the circular pointer by the preset value; continuing to execute the replacing step until the pointer value of the circular pointer is equal to the number of elements in the candidate value list; and if the pointer value of the circular pointer is equal to the number of elements in the candidate value list, restoring the pointer value of the circular pointer to the initial value.
In the embodiment of the invention, the script instantiation module can maintain a mapping table that stores the variables and their candidate values. The KEY of the mapping table is the name of the variable, and the VALUE is an ArrayList that holds the list of candidate values for each variable. The script instantiation module can traverse the ACTION_LIST node of the request message, which involves variable references in the input actions and variable assignments respectively; for each variable, the script instantiation module replaces the parameter with an element of the ArrayList corresponding to the variable on each execution. The detailed steps are as follows: set a circular pointer; on the first replacement, select the first element of the ArrayList (e.g., KEYWORD=weather, TIME=2021-05-17) and increase the pointer by a preset value (e.g., plus 1); on the second replacement, select the second element (e.g., KEYWORD=rain, TIME=2021-03-01), and so on; when the pointer value is equal to the number of elements in the ArrayList, reset the pointer value to the initial value (i.e., restore the pointer value of the circular pointer to the initial value once it equals the number of elements in the candidate list). Taking the Cartesian product of the values of the ArrayLists of several associated variables in the script yields the full combination of values of the associated variables.
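The cyclic candidate-value selection could be sketched as follows, assuming the mapping table is a HashMap from variable name to an ArrayList of candidate values and that one circular pointer is kept per variable (all identifiers are illustrative):

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.Map;

    public class VariableCandidates {
        // KEY: variable name; VALUE: ArrayList holding the candidate values of that variable.
        private final Map<String, ArrayList<String>> candidates = new HashMap<>();
        // One circular pointer per variable, advanced on every replacement.
        private final Map<String, Integer> pointers = new HashMap<>();

        public void define(String name, ArrayList<String> values) {
            candidates.put(name, values);
            pointers.put(name, 0);
        }

        /** Returns the next candidate value for a referenced variable, cycling through the list. */
        public String nextValue(String name) {
            ArrayList<String> values = candidates.get(name);
            int p = pointers.get(name);
            String value = values.get(p);        // pick the element the pointer currently points to
            p = p + 1;                           // increase the pointer by the preset value (1 here)
            if (p == values.size()) {
                p = 0;                           // restore to the initial value once exhausted
            }
            pointers.put(name, p);
            return value;
        }
    }

With the KEYWORD example above, successive replacements would yield "weather", then "rain", then the third candidate, and then cycle back to "weather".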
And step S204, executing the test steps in sequence to obtain a test result message, wherein the test result message carries test execution results and statistical result data of different test behaviors.
In the embodiment of the present invention, the test execution module may complete test execution at the mobile phone end according to the test script (that is, sequentially execute the test steps to obtain the test result message), where the test execution result may be given by the test execution module according to the assertion check result, and when all assertion checks of each test step pass, the test execution result is successful, otherwise, the test execution result is failed, and the statistical result data may include: the method comprises the steps of calling times of business functions, average time consumption of business function execution, statistics of APP function time consumption and TOP 10 ranking, calling times of functions and TOP 10 ranking, average CPU consumption and peak consumption, and average memory consumption and peak consumption.
And the test execution module can complete the initial test result JSON message obtained by initialization according to the test execution result; when completing the message, END_TIME is assigned the current timestamp, START_TIME is subtracted from END_TIME to obtain the processing time consumption of the request, and the result is assigned to the ELAPSED_TIME node of the message.
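Continuing the same assumptions, completing the result message might look like this (the field names START_TIME, END_TIME and ELAPSED_TIME come from the text; the class is illustrative):

    import org.json.JSONObject;

    public class ResultCompleter {
        /** Fills END_TIME and derives ELAPSED_TIME = END_TIME - START_TIME on the result message. */
        public static void complete(JSONObject result) {
            long endTime = System.currentTimeMillis();
            result.put("END_TIME", endTime);
            long startTime = result.getLong("START_TIME");
            result.put("ELAPSED_TIME", endTime - startTime);   // processing time of this request
        }
    }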
Optionally, after the testing steps are sequentially executed to obtain the test result message, the performance testing method further includes: performing structured disassembly on the test data and the response data, and storing the disassembly result in a variable pool, wherein the response data refers to the response parameters obtained by monitoring the preset client after it executes the test behavior.
In the embodiment of the present invention, the test execution module may perform structured disassembly on the test data and the response data, and store the contents of the packet header, packet body fields and so on in a variable pool, using "req_<variable name>" and "resp_<variable name>" as KEY values, where the response data refers to the response parameters obtained by monitoring the preset client after it executes the test behavior, and may be uploaded to the server during the test process.
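A rough sketch of this structured disassembly, assuming flat JSON request and response bodies and a simple map as the variable pool; only the req_/resp_ key prefixes come from the text, the rest is an assumption:

    import java.util.Map;
    import org.json.JSONObject;

    public class StructuredDisassembler {
        /** Flattens request and response fields into the variable pool with req_/resp_ prefixed keys. */
        public static void disassemble(JSONObject request, JSONObject response,
                                       Map<String, Object> variablePool) {
            for (String key : request.keySet()) {
                variablePool.put("req_" + key, request.get(key));    // e.g. req_BALANCE
            }
            for (String key : response.keySet()) {
                variablePool.put("resp_" + key, response.get(key));  // e.g. resp_BALANCE, resp_httpcode
            }
        }
    }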
Optionally, the operation behavior of the user includes at least one of: a click operation, a double-click operation, a long-press operation and a swipe operation; the type of the application program comprises at least one of the following: an operating system, a Software Development Kit (SDK) and an application APP.
In the embodiment of the invention, the program calling unit is the execution unit that actually simulates user behavior at the mobile phone end and calls the mobile phone operating system and APP programs. It is responsible for responding to calls from the agent execution unit, simulating user operations such as clicking, inputting, double clicking and long pressing, calling operating system and APP underlying functions, and completing the execution of the test script at the mobile phone end; it comprises a behavior simulation module and a function calling module.
The behavior simulation module can run on an operating system driving layer in an injection mode, simulate interactive behaviors sent to the operating system bottom layer by hardware sensors such as a screen and a key and drive the operating system to interact with APP response.
The function calling module can run between the operating system and the mobile phone APP and interact with the mobile phone APP in a HOOK manner. The function calling module responds to the agent execution unit, and completes the script's calls to mobile phone program functions by calling the operating system API, the technical support layer SDK (for example, Objective-C for iOS and Java for Android), page presentation layer functions (for example, functions provided by H5 and JavaScript), the application APP, and so on.
Optionally, after the testing steps are sequentially executed to obtain the test result message, the performance testing method further includes: performing assertion detection on a test execution result of the testing step, wherein the type of assertion detection comprises at least one of the following types: numeric values, strings, behavioral actions; when the type of assertion detection is a numerical value and/or a character string, replacing variable reference with a test actual value in a variable pool, and calculating a test expected value through four operations to obtain a calculated numerical value; judging whether the calculated values of the test actual value and the test expected value meet the test expected relationship or not; if the test expected relationship is met, determining that the test execution result is successful; when the type of assertion detection is behavior action, judging whether the behavior action meets the expected behavior based on a probe in the client; and if the expected behavior is met, determining that the test execution result is successful.
In the embodiment of the present invention, the assertion checking module may check the execution result of the test script, the assertion may be a boolean expression including an actual value, an expected relationship, and an expected value, and the actual value and the expected value may be a numerical value (or a numerical expression) or a character string, or may be a behavior action of a mobile phone terminal.
For numeric values and/or strings, assertion checking may involve four arithmetic operations and variable references; one example of an assertion is as follows:
$<resp_BALANCE>==$<req_BALANCE>+100;
where $<resp_BALANCE> is the actual value, == is the expected relationship, $<req_BALANCE>+100 is the expected value, and variable references are written in the $<variable name> format.
When the assertion checking module checks an assertion of the numeric and/or string class, the variable references can be replaced with the actual data (namely, the test actual values) in the variable pool, then the numerical values of the actual value and the expected value are calculated through four arithmetic operations, and finally it is judged whether the numerical values of the actual value and the expected value meet the expected relationship given by the assertion; if so, the assertion check succeeds, otherwise it fails.
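A deliberately restricted Java sketch of the numeric assertion check: it only handles assertions shaped like $<resp_BALANCE>==$<req_BALANCE>+100 (a single "==" comparison with addition on either side), whereas the text describes general four-operation expressions and further expected relationships; all identifiers are illustrative:

    import java.util.Map;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class AssertionChecker {
        private static final Pattern VAR_REF = Pattern.compile("\\$<([A-Za-z0-9_]+)>");

        /** Replaces every $<variable> reference with its actual value from the variable pool. */
        static String substitute(String expr, Map<String, Object> variablePool) {
            Matcher m = VAR_REF.matcher(expr);
            StringBuffer out = new StringBuffer();
            while (m.find()) {
                Object v = variablePool.get(m.group(1));
                m.appendReplacement(out, Matcher.quoteReplacement(String.valueOf(v)));
            }
            m.appendTail(out);
            return out.toString();
        }

        /** Checks an assertion "actual == expected" where each side is "a" or "a+b". */
        static boolean check(String assertion, Map<String, Object> variablePool) {
            String[] sides = substitute(assertion, variablePool).split("==");
            return eval(sides[0]) == eval(sides[1]);
        }

        /** Evaluates "a" or "a+b" (the four operations of the text reduced to addition here). */
        private static double eval(String expr) {
            double sum = 0;
            for (String term : expr.split("\\+")) {
                sum += Double.parseDouble(term.trim());
            }
            return sum;
        }
    }

For instance, with a variable pool containing resp_BALANCE=1100 and req_BALANCE=1000, check("$<resp_BALANCE>==$<req_BALANCE>+100", pool) returns true.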
For the behavior action of the mobile phone end, the assertion check judges whether the expected behavior action occurs or not through a Boolean expression and a predefined function, for example:
appDialog("process success").visible == 'true';
when the assertion checking module checks the assertion of the behavior action class, whether the expected interface control and action meet the expected behavior, for example, whether the action is called up or grayed out, can be judged through the probes of the mobile phone operating system and the APP layer, if yes, the assertion checking is successful, otherwise, the assertion checking is failed.
For example, for an assertion of $<resp_httpcode> == 200, the assertion checking module may obtain the HTTP response code from the variable pool, replace the variable reference, and the check succeeds when HTTP returns 200 and fails when HTTP returns non-200 (e.g., HTTP 500).
Optionally, before performing assertion detection on the test execution result of the testing step, the performance testing method further includes: after receiving the operation instruction, collecting information data in a preset client, wherein the information data at least comprises: log data; and performing edge calculation on the information data, and performing summary statistics on different dimensions based on the calculated information data to obtain statistical result data.
In the embodiment of the invention, the information collection unit is responsible for collecting performance data and logs of the mobile phone end during the test; after edge calculation at the mobile phone end, they are sent to the master control driving unit, which provides technical support for analyzing slow response and lag at the mobile phone end and for evaluating mobile phone resource consumption. After receiving the RUN instruction sent by the master control driving unit (namely, after receiving the operation instruction), the agent execution unit asynchronously starts the information collection unit, which collects information (namely, collects the information data in the preset client), completes the statistics and sends them to the master control driving unit. After receiving the STOP instruction sent by the master control driving unit, the agent execution unit calls the information collection unit to stop collecting information, complete the whole-process information statistics and send them to the master control driving unit. The log data can be collected by the log collection module, which defines a general log format and interface; after the mobile phone APP inherits the interface to implement log output, business functions are run and interacted with, log information is output to a file at the mobile phone end through instrumentation, and the log collection module periodically reads each log file, parses the increment and stores it in the front-end temporary storage database.
Optionally, the resource consumption information includes at least one of: function time consumption statistics, function call times, client CPU consumption (CPU average consumption and peak consumption), and memory consumption (memory average consumption and peak consumption).
After the information data are collected, the information statistics module can perform edge calculation on the log data stored in the front-end temporary storage database at the mobile phone end, carry out summary statistics over different dimensions, and upload the results to the master control driving unit through the communication interaction module.
The data included in the test result message, such as business function call counts, average business function execution time, APP function time-consumption statistics and TOP-N ranking (N being any natural number), and function call counts and TOP-N ranking, are summarized and counted over different dimensions to obtain the statistical result data. The summary statistics dimensions are stored in the front-end temporary storage database as configurable rules, and can be added or modified according to the specific device model, operating system version and performance analysis requirements without limitation.
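As an illustration of this kind of edge-side summary statistics, the sketch below aggregates call counts and average elapsed time per function and produces a TOP-N ranking; the log record layout and all names are assumptions, not the patent's own data model:

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class EdgeStatistics {
        /** One collected log record: which function ran and how long it took. */
        public record FunctionLog(String function, long elapsedMillis) {}

        /** Summarizes call count and average elapsed time per function, ranked by average time. */
        public static List<String> summarize(List<FunctionLog> logs, int topN) {
            Map<String, long[]> agg = new HashMap<>();   // function -> {callCount, totalElapsed}
            for (FunctionLog log : logs) {
                long[] a = agg.computeIfAbsent(log.function(), k -> new long[2]);
                a[0]++;
                a[1] += log.elapsedMillis();
            }
            List<String> ranking = new ArrayList<>();
            agg.entrySet().stream()
               .sorted(Comparator.comparingDouble(
                       (Map.Entry<String, long[]> e) -> -((double) e.getValue()[1] / e.getValue()[0])))
               .limit(topN)
               .forEach(e -> ranking.add(e.getKey() + ": calls=" + e.getValue()[0]
                                         + ", avgMs=" + e.getValue()[1] / e.getValue()[0]));
            return ranking;
        }
    }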
And S205, sending the test result message to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information collected from the client during execution of the test script to obtain a performance test result.
In the embodiment of the present invention, the test result message is sent to the server, where the test result message includes the test execution results and statistical result data of different test behaviors; after receiving the test result message, the server may count terminal performance data in different test dimensions based on the resource consumption information collected during execution of the test script and the test result message, so as to obtain the performance test result, where the test dimensions include, but are not limited to: request count, single-request time consumption, throughput and other dimensions. The request count can be obtained by retrieving the number of request messages and the number of response messages of the current test in the information storage unit; for single-request time consumption, the ELAPSED_TIME attribute of a response message, namely the time consumed to process the request, is read, the result summarizing module traverses the information storage unit, calculates the time consumption of each response message in the current test, and takes the Average Value (AVG), namely the average time consumption of a single test; for throughput, the result summarizing module may traverse the information storage unit, group the END_TIME fields at second-level granularity, SUM the number of requests completed per second, namely the Throughput (TPS) of that second, and count the average throughput and the variation trend of the throughput over the time series according to the per-second throughput information.
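A small Java sketch of this server-side summarization, assuming each response message is available as an org.json object carrying the ELAPSED_TIME and END_TIME fields named in the text (the class and method names are illustrative):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import org.json.JSONObject;

    public class ResultSummarizer {
        /** Average single-request time consumption over all response messages of the current test. */
        public static double averageElapsed(List<JSONObject> responses) {
            return responses.stream()
                            .mapToLong(r -> r.getLong("ELAPSED_TIME"))
                            .average()
                            .orElse(0);
        }

        /** Requests completed per second (TPS), grouped by END_TIME at second-level granularity. */
        public static Map<Long, Long> throughputPerSecond(List<JSONObject> responses) {
            Map<Long, Long> tps = new HashMap<>();
            for (JSONObject r : responses) {
                long second = r.getLong("END_TIME") / 1000;    // truncate the timestamp to seconds
                tps.merge(second, 1L, Long::sum);              // SUM the requests finished that second
            }
            return tps;
        }
    }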
The embodiment of the invention is suitable for the field of terminal software performance test, and is suitable for the field of performance test concerning program client performance and network architecture by defining and using the performance test request, scheduling and initiating the distributed pressure of the multi-terminal machine, executing the client of the test script, calculating the edge of data and assertion, acquiring and summarizing performance information and the like, can effectively simulate the influence of an actual network and the terminal machine on performance indexes, reduces the input cost of the performance test, and improves the test quality and efficiency.
Example two
FIG. 3 is a flow chart of an alternative performance testing method according to an embodiment of the present invention, as shown in FIG. 3, the method including the steps of:
Step S301, pushing a link instruction to a message queue, where the link instruction at least includes: server IP information, where the server IP information is used for establishing a communication link path between the client and the server.
Step S302, the established communication link path is adopted to send a test script message to the client, and resource consumption information of the client in the process of executing the test script is collected, wherein the test script carried in the test script message comprises at least one test step, each test step comprises a test behavior, and the behavior type of the test behavior comprises at least one of the following: an operation behavior of a user and a calling behavior of an application program.
Step S303, receiving a test result message returned by the client, where the test result message carries test execution results and statistical result data of different test behaviors.
Step S304, counting the terminal performance data of different testing dimensions based on the testing result message and the resource consumption information to obtain a performance testing result.
Through the above steps, the link instruction can be pushed to the message queue, the established communication link path is adopted to send the test script message to the client, and the resource consumption information of the client in the process of executing the test script is acquired, wherein the test script carried in the test script message comprises at least one test step, each test step comprises a test behavior, and the behavior type of the test behavior comprises at least one of the following: an operation behavior of a user and a calling behavior of an application program; a test result message returned by the client is then received, and the terminal performance data of different test dimensions are counted based on the test result message and the resource consumption information to obtain a performance test result. In the embodiment of the invention, the performance test result can be obtained by sending the test script message to the client, collecting the resource consumption information of the client in the process of executing the test script, and combining the test result message returned by the client, so that performance evaluation of the client can be realized and the influence of the actual network and the client on performance indexes can be effectively simulated, thereby improving the test quality and efficiency and solving the technical problems in the related art that the performance test is only carried out on the server side, the performance of the client cannot be evaluated, the test quality and efficiency are reduced, and the user experience is reduced.
The following will explain the embodiments of the present invention in detail with reference to the above steps.
The following steps are applied to a server, the server is connected with a preset client, and the preset client executes the performance testing method in the first embodiment.
In this embodiment of the present invention, before pushing the link instruction to the message queue, the performance testing method further includes: reading a pre-written test script, wherein the test steps in the test script comprise: test behaviors and test data used by the test behaviors, the test data including at least: business data used by the application; and packaging the test script into a message with a specified format to obtain a test script message.
In the embodiment of the invention, the script management module is responsible for analyzing a pre-compiled test script and storing the test script in the information storage unit, wherein the test script comprises a plurality of test steps, and the test steps comprise test behaviors and test data used by the test behaviors:
the test behavior is a series of operations of the function to be tested, and describes the use behavior of the function to be tested by a real user. In the test process, the operation of multiple clients is simulated through the playback of test behaviors, and the parallel request pressure of a server with a function to be tested is realized, wherein the test behaviors can be the calling of a specific program in a terminal and the collective action of specific functions of software, and include but are not limited to: click, double click, enter, swipe, long press, etc.
The test data is data which is used in the test process, has certain business meaning and meets the requirements of cases. The test data may be business data entered or selected during use of the software, and interacts with the software through test actions, and in the script management module, the test data may be defined and managed using formatted strings, and string formats include, but are not limited to: forms KEY-VALUE collections, JSON, XML, and the like.
And the master control scheduling module can read the structured test script from the information storage unit, encapsulate the test script into a request message in a JSON format, call the terminal machine communication module in a multithreading mode, and send the request message to the client.
Optionally, before pushing the link instruction to the message queue, the performance testing method further includes: based on the dynamic data of each client terminal counted in advance, the client terminal is awakened or dormant, wherein the dynamic data at least comprises the following components: concurrency, response time, throughput.
In the embodiment of the invention, the master control scheduling module can wake up the agent execution unit of the client through the terminal machine communication module to execute the test script. In the performance test process, the master control scheduling module wakes up or sleeps a specific number of clients according to dynamic data of the clients such as concurrency, response time and throughput. For example, if a certain scene needs at least 12 mobile phones to participate in the test and the expected throughput needs to reach 30 transactions/sec, then after the test is started, the master control scheduling module acquires the TCP connections of 12 mobile phones from the thread pool according to the preset concurrency, pushes the test script to the mobile phone agent execution units as connection parameters, and issues a RUN instruction. In the test process, the master control scheduling module monitors and summarizes the performance indexes of each mobile phone's responses in real time, and increases or decreases the participating mobile phones as required; for example, if the real-time throughput is greater than 30 transactions/sec, part of the participating test mobile phones are selected and sent a STOP instruction to stop executing the script.
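To make the scheduling behavior above concrete, the following is a minimal Java sketch of how a master control scheduling module might wake or sleep participating phones based on the summarized real-time throughput; the class and helper names (ConcurrencyScheduler, AgentConnection) are illustrative assumptions and not part of the embodiment.

```java
// Illustrative sketch only: helper names (ConcurrencyScheduler, AgentConnection, adjust)
// are hypothetical and not taken from the patent text.
import java.util.ArrayDeque;
import java.util.Deque;

public class ConcurrencyScheduler {
    private static final double TARGET_TPS = 30.0;   // expected throughput, transactions/sec
    private static final int MIN_PHONES = 12;        // minimum participating phones

    private final Deque<AgentConnection> active = new ArrayDeque<>();
    private final Deque<AgentConnection> idle = new ArrayDeque<>();

    /** Called periodically with the throughput summarized from client responses. */
    public void adjust(double currentTps) {
        if (currentTps < TARGET_TPS && !idle.isEmpty()) {
            AgentConnection conn = idle.pop();        // wake one more phone
            conn.send("RUN");
            active.push(conn);
        } else if (currentTps > TARGET_TPS && active.size() > MIN_PHONES) {
            AgentConnection conn = active.pop();      // remove one phone from the test
            conn.send("STOP");
            idle.push(conn);
        }
    }

    /** Minimal stand-in for a pooled TCP long connection to an agent execution unit. */
    interface AgentConnection {
        void send(String instruction);
    }
}
```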
Step S301, pushing a link instruction to a message queue, where the link instruction at least includes: server IP information, where the server IP information is used for establishing a communication link path between the client and the server.
In the embodiment of the present invention, the master control scheduling module of the master control driving unit is the broadcast message PRODUCER. When waking up all agent execution units, the master control scheduling module pushes a "CONNECT" instruction (i.e., a link instruction) to the message queue in a push manner; all active agent execution units monitor the message queue in a long-connection subscription manner, actively pull (poll) when a newly added message is detected, and establish a communication link path between the client and the server based on the server IP information carried in the link instruction.
Step S302, the established communication link path is adopted to send a test script message to the client, and resource consumption information of the client in the process of executing the test script is collected, wherein the test script carried in the test script message comprises at least one test step, each test step comprises a test behavior, and the behavior type of the test behavior comprises at least one of the following: an operation behavior of a user and a calling behavior of an application program.
In the embodiment of the present invention, after the client establishes the communication link path with the server, the server uses the communication link path to issue the test script message to the client, and during the test the server can collect from the client the resource consumption information obtained after edge computation by the information acquisition unit, including but not limited to: function time consumption statistics, function call counts, client CPU and memory consumption, and the like. This message interaction is implemented by TCP short connections between the master control driving unit and the information acquisition unit: a TCP connection is established before each interaction, and the connection is destroyed after the interaction is completed.
Step S303, receiving a test result message returned by the client, where the test result message carries test execution results and statistical result data of different test behaviors.
In the embodiment of the invention, the server side can receive the test result messages (including test execution results, statistical result data and the like) sent by the client side in the test process and/or after the test is completed, the message interaction is realized by TCP long connection (namely a communication link path), and the messages are managed in the main control drive unit in a connection pool mode.
Step S304, counting the terminal performance data of different testing dimensions based on the testing result message and the resource consumption information to obtain a performance testing result.
Optionally, the resource consumption information includes at least one of: function time consumption statistics, function calling times, a client CPU and memory consumption, wherein the test dimension comprises at least one of the following: number of requests, single time consuming, throughput information.
In the embodiment of the present invention, after receiving the test result message, the server may count the terminal performance data in different test dimensions based on the resource consumption information collected during execution of the test script and the test result message, so as to obtain the performance test result, where the test dimensions include, but are not limited to, the request quantity, the single time consumption, the throughput and other dimensions. The request quantity may be obtained by retrieving the number of request messages and the number of response messages of the current test in the information storage unit. For the single time consumption, the ELAPSED_TIME attribute of a response message, namely the time consumed to process the request, is read; the result summarizing module traverses the information storage unit, calculates the time consumption of each response message in the current test, and takes the average value (AVG), namely the average time consumption of a single test. For the throughput, the result summarizing module may traverse the information storage unit, group the END_TIME fields at second-level granularity, sum (SUM) the number of requests completed per second, namely the throughput (TPS) of that second, and count the average throughput and the variation trend of the throughput over the time series according to the per-second throughput information.
In the embodiment of the invention, through a performance test scheduling system composed of a unified master control driving unit and agent execution units distributed on a plurality of clients, the interface actions of actually operating the client or calling (CALL) a client program can be simulated, and business functions can be executed in an environment of a plurality of real clients, thereby realizing a concurrent performance test scenario. This solves the problems that the traditional performance test method cannot cover client-side processing and cannot simulate a complex communication environment, and improves the test quality and efficiency.
Example three
As shown in fig. 1, an optional performance testing system in an embodiment of the present invention includes: a server side, a plurality of clients and an information storage unit, wherein the server side includes: a master control driving unit, and each client includes: an agent execution unit, a program calling unit and an information acquisition unit, specifically as follows:
On the server side, the master control driving unit is mainly included, where the master control driving unit further includes: a script management module, a master control scheduling module, a terminal machine communication module 1 and a result summarizing module. The specific functions of each module are as follows:
(1) the script management module is responsible for analyzing the test script written by research personnel and storing the test script in the information storage unit, and in the embodiment of the invention, the defined test script comprises the following components: test behavior and test data:
1) the test behavior is a series of operations of the function to be tested, and describes the use behavior of the function to be tested by the real user. In the testing process, the operation of multiple clients and the parallel request pressure of a server with a function to be tested are simulated through the playback of testing behaviors, wherein the testing behaviors are the calling of specific programs of the clients and the collective action of specific functions of the clients, and include but are not limited to: click, enter, swipe, etc.
2) The test data is the data which is used in the test process, has certain business meaning and meets the case requirement, the test data is the business data which is input or selected in the software using process, the test data interacts with the client software through the test behavior, in the script management module, the test data uses the definition and management of the formatted character string, and the character string format includes but is not limited to: forms KEY-VALUE collections, JSON, XML, and the like.
(2) The master control scheduling module is responsible for pressure master control of performance test, client scheduling and cooperative management and receiving of execution results.
1) And the master control scheduling module reads the structured test script from the information storage unit, encapsulates the test script into a request message in a JSON format, calls the terminal machine communication module 1 in a multithreading mode and issues the request message to the client.
2) The master control scheduling module wakes up the agent execution unit of the client to execute the test script through the terminal machine communication module 1, and in the performance test process, the master control scheduling module can wake up or sleep a specific number of clients according to the dynamic performances of concurrency, response time, throughput and the like.
3) And the master control scheduling module calls the terminal machine communication module 1 to continuously subscribe and monitor the communication response of the proxy execution unit, receives the JSON format response message returned by the proxy execution unit, and stores the response message in the information storage unit after structured disassembly.
The following is an example of a functional description of a master control scheduling module, where multiple behaviors (i.e., multiple test steps) for a certain APP search function are defined in a test script:
A. clicking the "Moudu" APP on the client desktop, opening the APP and displaying the Moudu home page (corresponding to the URL www.moudu.com);
B. clicking a query box and inputting a retrieval keyword;
C. calling a hidden domain assignment function to assign a retrieval date;
D. clicking a search button, retrieving a result by a server side, and returning the result to the client side for displaying;
meanwhile, the test script defines a plurality of sets of test data, where the search keyword has three sets of data and the search date has two sets of data, and six sets of test data are obtained after Cartesian product combination (a sketch of this combination is given below).
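The Cartesian product combination mentioned above can be illustrated with a short Java sketch; the concrete keyword and date values are assumptions used only for illustration.

```java
// Sketch of the test-data combination described above; variable names (KEYWORD, TIME)
// follow the example, and the concrete values are assumptions.
import java.util.ArrayList;
import java.util.List;

public class TestDataCombination {
    public static void main(String[] args) {
        List<String> keywords = List.of("weather", "rain", "news");     // three candidate values
        List<String> dates = List.of("2021-05-17", "2021-03-01");       // two candidate values

        List<String[]> combinations = new ArrayList<>();
        for (String keyword : keywords) {
            for (String date : dates) {
                combinations.add(new String[] { keyword, date });       // Cartesian product
            }
        }
        System.out.println(combinations.size() + " sets of test data"); // prints "6 sets of test data"
    }
}
```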
First, the master control scheduling module can read the test script and package it into a request message.
The master control scheduling module calls the terminal machine communication module 1, a connection instruction is pushed in a request queue of the terminal machine communication module 1 and carries a local IP, an agent execution unit of the online client side pulls a request message from the request queue of the terminal machine communication module 1, TCP long connection is actively established with the master control scheduling module, and the master control scheduling module manages the connection in a thread pool mode.
Before the performance test, the master control scheduling module may pre-configure dimensional indexes such as concurrency, throughput and single time consumption; for example, scene A requires at least 12 clients to participate in the test, and the expected throughput needs to reach 30 transactions/sec. After the test is started, the master control scheduling module acquires the TCP connections of 12 clients from the thread pool according to the preset concurrency, pushes the test script to the client agent execution units as connection parameters, and issues a RUN instruction. In the testing process, the master control scheduling module monitors and summarizes the performance indexes returned by the clients in real time, and increases or decreases the participating clients as required. For example, if the real-time throughput is greater than 30 transactions/sec, some of the participating clients are selected and sent a STOP instruction to stop executing the script.
And the master control scheduling module acquires the execution result response of the agent execution unit from the TCP long connection, calls JSON analysis for structured disassembly, and stores the JSON analysis in the information storage unit.
(3) The terminal machine communication module 1 is responsible for data interaction between the main control driving unit and the agent execution unit and between the main control driving unit and the information acquisition unit, and specifically comprises three types of data:
1) Broadcast messaging, for example where the master control driving unit initiates the wake-up of the agent execution units in all clients; this type of messaging may be implemented by a message queue in "publish-subscribe" mode, e.g., KAFKA, ActiveMQ, etc. Taking KAFKA as an example, the terminal machine communication module 1 maintains a KAFKA message queue, the master control scheduling module of the master control driving unit acts as the broadcast message PRODUCER, and the agent execution units act as CONSUMERs; when waking up all agent execution units, the master control scheduling module pushes a "CONNECT" instruction to the message queue in a push manner, all active agent execution units monitor the message queue in a long-connection subscription manner, and actively pull (poll) and perform subsequent processing when a newly added message is detected.
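As an illustration of this "publish-subscribe" broadcast, the following hedged Java sketch uses the standard Apache Kafka client API; the topic name, bootstrap address and instruction payload are assumptions, and each agent uses its own consumer group so that every client receives the broadcast.

```java
// Hedged sketch of the CONNECT broadcast; topic name and payload format are assumptions.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.UUID;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ConnectBroadcast {
    // Master control scheduling module side: push the CONNECT instruction to the queue.
    static void broadcastConnect(String bootstrap, String masterIp) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("perf-test-broadcast", "CONNECT;" + masterIp));
        }
    }

    // Agent execution unit side: subscribe and poll for newly added instructions.
    static void listenForConnect(String bootstrap) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("group.id", "agent-" + UUID.randomUUID());  // own group: every agent sees the broadcast
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("perf-test-broadcast"));
            // In practice this poll would run in a loop for the lifetime of the agent.
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                // On "CONNECT;<ip>", create the TCP long connection to the master control unit.
                System.out.println("received instruction: " + record.value());
            }
        }
    }
}
```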
2) Master control scheduling instructions, which, for the master control driving unit, include the RUN instruction and STOP instruction for the test script of a specific agent execution unit, as well as the test script information issued before running; and, for the agent execution unit, include the test statistical information and execution result responses uploaded during the test and after the test is finished. This message interaction is implemented by TCP long connections and is managed in the master control driving unit in a connection pool manner.
3) Client state monitoring information, which includes the log statistical information and resource consumption information acquired from the client during the test and edge-computed by the information acquisition unit, including but not limited to: function time consumption statistics, function call counts, client CPU and memory consumption, and the like. This message interaction is implemented by TCP short connections between the master control driving unit and the information acquisition unit: a TCP connection is established before each interaction, and the connection is destroyed after the interaction is completed.
(4) The result summarizing module is responsible for summarizing and counting the result of the current test according to the attribute information of the response message, and can count the performance test effect from the dimensions of request quantity, single time consumption, throughput and the like, wherein:
1) the request quantity can be obtained by searching the number of the request messages and the number of the response messages of the current test in the information storage unit;
2) for the single time consumption, the ELAPSED_TIME attribute of the response message, which is the processing time of the request, can be read; the result summarizing module traverses the information storage unit, calculates the time consumption of each response message in the current test, and takes the average value (AVG), namely the average time consumption of a single test;
3) for the throughput, the result summarizing module traverses the information storage unit, groups the END_TIME fields at second-level granularity, sums (SUM) the number of requests completed per second, namely the throughput (TPS) of that second, and counts the average throughput and the variation trend of the throughput over the time series according to the per-second throughput information.
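A compact Java sketch of the summarization in items 1)-3) might look as follows; the ResponseRecord type and its field names are assumptions that mirror the ELAPSED_TIME and END_TIME attributes described above.

```java
// Illustrative summarization sketch; ResponseRecord is a hypothetical stand-in for a
// structured response message stored in the information storage unit.
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ResultSummarizer {
    record ResponseRecord(long elapsedTimeMs, long endTimeEpochSec) {}

    static void summarize(List<ResponseRecord> responses) {
        // 1) request quantity: number of response messages of the current test
        int requestCount = responses.size();

        // 2) single time consumption: average of ELAPSED_TIME over all responses (AVG)
        double avgElapsedMs = responses.stream()
                .mapToLong(ResponseRecord::elapsedTimeMs)
                .average()
                .orElse(0);

        // 3) throughput: group END_TIME at second granularity and count requests per second (TPS)
        Map<Long, Long> tpsPerSecond = responses.stream()
                .collect(Collectors.groupingBy(ResponseRecord::endTimeEpochSec, Collectors.counting()));
        double avgTps = tpsPerSecond.values().stream().mapToLong(Long::longValue).average().orElse(0);

        System.out.printf("requests=%d, avg elapsed=%.1f ms, avg TPS=%.1f%n",
                requestCount, avgElapsedMs, avgTps);
    }
}
```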
At the client, mainly include agent execution unit, procedure call unit, information acquisition unit, wherein, agent execution unit is the test execution unit of client, is responsible for responding the unified dispatch of master control drive unit, according to the test script that master control drive unit issued, actually carries out test operation, gathers and sends test statistical information and request to the master control drive unit and returns the result, simultaneously, agent execution unit utilizes the power of calculation of client end, is responsible for edge calculation such as test data, test result, include and not be limited to: variable value replacement, test data generation, assertion checking, and simple statistics of results of multiple loop executions. The agent execution unit completes multiple executions of the test script to obtain result quantization information of each test execution of the client, including but not limited to: success or failure of the test, execution start time, execution end time, and the like.
The agent execution unit further includes: the terminal machine communication module 2, the script instantiation module, the test execution module and the assertion check module have the following specific functions:
(1) The terminal machine communication module 2 is responsible for the data interaction communication between the agent execution unit and the master control driving unit, and specifically includes:
1) connecting to the message queue in "publish-subscribe" mode as a message CONSUMER, actively pulling (poll) the "CONNECT" broadcast instruction issued by the master control driving unit, and actively creating a TCP long connection between the master control server and the client according to the server IP information contained in the master control driving unit instruction.
2) Receiving the test script message issued by the master control driving unit over the created TCP long connection, and scheduling the script instantiation module to finish edge computation such as script parsing and data instantiation; receiving the RUN instruction and STOP instruction of the master control driving unit, and scheduling the test execution module to finish the test start and stop; and, during the test and after the test is finished, sending the test statistical information and the test result response to the master control driving unit.
The specific functions of the terminal communication module 2 are basically the same as those of the terminal communication module 1 in the main control drive unit, and the description thereof will not be repeated here.
(2) The script instantiation module fully utilizes computing resources of the client to implement edge computing, and instantiates a universal test script message issued by the main control drive unit into a specific and unique test script of the client through the steps of behavior analysis, variable value replacement, test data combination and the like:
1) behavior analysis: the JSON-format test script received from the terminal machine communication module 2 is parsed into specific test steps, each test step including a specific test behavior and the test data definition used by the behavior; meanwhile, the test result JSON message may be initialized and START_TIME assigned as the current timestamp.
2) And for each test step, checking whether the test data has variable reference, wherein the variable refers to data which has certain business meaning and needs to be used in the test process, is not a fixed constant value, but a variable value and changes in real time according to a preset rule. And the script instantiation module stores a variable list defined by the test data into a mapping table in a KEY-VALUE form, traverses the test behavior and the test data when analyzing in each step, and replaces the variable with a specific candidate VALUE when a variable reference grammar appears.
For example, the request message PARAM_LIST node has two variables, namely KEYWORD and TIME; KEYWORD has three candidate VALUEs and TIME has two candidate VALUEs. The script instantiation module maintains a HashMap mapping table that stores the variables and their candidate VALUEs: the KEY of the mapping table is the variable name, and the VALUE is an ArrayList in which the candidate value list of each variable is stored.
The script instantiation module traverses the ACTION_LIST nodes of the request message; in the example of the master control scheduling module function description, step B and step C involve variable references in the input action and the variable assignment respectively, and for each variable the script instantiation module replaces the parameter with an element of the ArrayList corresponding to that variable. A pointer is maintained: the first element of the ArrayList is selected for the first replacement (e.g., KEYWORD = weather, TIME = 2021-05-17), the pointer is then incremented (e.g., plus 1), the second element is selected for the second replacement (e.g., KEYWORD = rain, TIME = 2021-03-01), and so on; when the pointer value equals the number of elements in the ArrayList, it is reset to the initial value (e.g., 0). By taking the Cartesian product of the ArrayList values of the associated variables in the script, the full value combination of the associated variables can be calculated.
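The mapping table and cyclic pointer described above can be sketched in Java as follows; the candidate values are illustrative assumptions.

```java
// Sketch of the variable replacement: variable name -> candidate value list, with a cyclic pointer.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class VariableReplacer {
    private final Map<String, ArrayList<String>> candidates = new HashMap<>();
    private final Map<String, Integer> pointers = new HashMap<>();

    void define(String name, List<String> values) {
        candidates.put(name, new ArrayList<>(values));
        pointers.put(name, 0);
    }

    /** Returns the next candidate value for a referenced variable, cycling back to the start. */
    String nextValue(String name) {
        ArrayList<String> values = candidates.get(name);
        int pointer = pointers.get(name);
        String value = values.get(pointer);
        pointer++;                                  // advance the pointer after each replacement
        if (pointer == values.size()) {
            pointer = 0;                            // reset to the initial value
        }
        pointers.put(name, pointer);
        return value;
    }

    public static void main(String[] args) {
        VariableReplacer replacer = new VariableReplacer();
        replacer.define("KEYWORD", List.of("weather", "rain", "news"));
        replacer.define("TIME", List.of("2021-05-17", "2021-03-01"));
        System.out.println(replacer.nextValue("KEYWORD") + " / " + replacer.nextValue("TIME"));
        System.out.println(replacer.nextValue("KEYWORD") + " / " + replacer.nextValue("TIME"));
    }
}
```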
(3) The test execution module is responsible for actually completing test execution at the client according to the test script:
1) for the test behavior of "simulation action", the test execution module actually simulates the operation behavior of the real user to the client by means of the program call unit, including but not limited to: click, enter, swipe, select, etc.
2) And for the test behavior of the 'program call' class, the test execution module calls programs of the client operating system, the SDK and the APP by virtue of the program call unit and transmits test data.
For example, in one example of the general control scheduler module functional specification, the test script defines 4 test steps in sequence, including 3 "simulated action" behaviors and 1 "program call" behaviors.
Step A: the behavior type is a simulation action, and the specific action is "CLICK_APP" with the APP named "Moudu Search"; the test execution module simulates clicking the area of the screen where "Moudu Search" is located, and the Moudu APP is opened and automatically accesses the home page.
Step B: the behavior type is a simulation action, and the specific action is "INPUT"; the test execution module simulates keyboard input, at the software cursor (the search box), of the character string corresponding to INPUT_CONTENT.
Step C: the behavior type is a program call, and the specific action is FORM assignment; the test execution module retrieves the current page, finds the field whose FIELD_NAME equals "qrydate" in the FORM, and directly assigns the field value to the character string corresponding to VALUE.
Step D: the behavior type is a simulation action, and the specific action is "CLICK"; the test execution module simulates clicking the button whose FIELD_NAME is "search" on the screen, and the client APP submits the query request and obtains the return.
3) The test execution module performs structured disassembly of the test data and the response data, and stores contents such as the message header and message body fields in the variable pool, using "req_<variable name>" and "resp_<variable name>" as the KEY values.
4) The test execution module calls the assertion checking module to check, based on the collected function operation data, whether the assertions defined by the test script succeed. For example, in the example of the master control scheduling module function description, step A and step D both define the assertion $<resp_httpcode> == '200', where $<resp_httpcode> is the HTTP response code; since the HTTP execution of each test step returns 200, the assertion checks pass.
5) The test execution module gives the test execution result according to the assertion check results: when all assertion checks of every step pass, the test execution result is success; otherwise, the test execution result is failure.
6) The test execution module completes the initialized test result JSON message according to the test execution result and responds to the master control driving unit over the TCP long connection. When the message is completed, END_TIME is assigned as the current timestamp, START_TIME is subtracted from END_TIME to obtain the processing time of the request, which is assigned to the ELAPSED_TIME node of the message, and the test execution result and the check result of each assertion in each test step are recorded in the message.
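A minimal sketch of completing the test result message in item 6) is given below; the field names follow the text (START_TIME, END_TIME, ELAPSED_TIME), while the exact message layout is an assumption.

```java
// Hedged sketch: the map stands in for the test result JSON message before serialization.
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TestResultMessage {
    static Map<String, Object> complete(long startTime, boolean allAssertionsPassed,
                                        List<Map<String, Object>> stepResults) {
        long endTime = System.currentTimeMillis();         // assign END_TIME as the current timestamp
        Map<String, Object> message = new LinkedHashMap<>();
        message.put("START_TIME", startTime);
        message.put("END_TIME", endTime);
        message.put("ELAPSED_TIME", endTime - startTime);  // processing time of the request
        message.put("RESULT", allAssertionsPassed ? "success" : "failure");
        message.put("STEPS", stepResults);                 // per-step assertion check results
        return message;                                    // serialized to JSON and sent over the TCP long connection
    }
}
```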
(4) The assertion checking module is in charge of checking the execution result of the test script, the assertion in the embodiment of the invention is a Boolean expression and consists of an actual value, an expected relationship and an expected value, and the actual value and the expected value are numerical expressions or character strings and can also be behavior actions of a client.
1) For numerical values and character strings, the assertion check may contain the four arithmetic operations and variable references. The following is an example of an assertion:
$<resp_BALANCE>==$<req_BALANCE>+100;
where $<resp_BALANCE> is the actual value, == is the expected relationship, $<req_BALANCE>+100 is the expected value, and variable references are written in the $<variable name> format.
When the assertion checking module checks an assertion of the numerical or character class, it first replaces the variable references with the actual data in the variable pool, then evaluates the actual value and the expected value by the four arithmetic operations, and finally judges whether the numerical values of the actual value and the expected value satisfy the expected relationship given by the assertion; if so, the assertion check is true, otherwise it is false.
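As a much-reduced illustration of this check, the following Java sketch substitutes variable references from the variable pool and evaluates an assertion of the form "a == b + c"; a real checker would support the full four arithmetic operations, so this is an assumption-laden simplification rather than the module's actual implementation.

```java
// Reduced sketch: supports only "<actual> == <expected> + <constant>" assertions.
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NumericAssertionChecker {
    private static final Pattern VAR_REF = Pattern.compile("\\$<([A-Za-z0-9_]+)>");

    /** Replaces every $<name> reference with its actual value from the variable pool. */
    static String substitute(String expression, Map<String, String> variablePool) {
        Matcher matcher = VAR_REF.matcher(expression);
        StringBuilder out = new StringBuilder();
        while (matcher.find()) {
            matcher.appendReplacement(out, Matcher.quoteReplacement(variablePool.get(matcher.group(1))));
        }
        matcher.appendTail(out);
        return out.toString();
    }

    /** Checks an assertion of the form "<actual> == <expected> + <constant>". */
    static boolean check(String assertion, Map<String, String> variablePool) {
        String[] sides = substitute(assertion, variablePool).split("==");
        String[] terms = sides[1].split("\\+");
        double actual = Double.parseDouble(sides[0].trim());
        double expected = Double.parseDouble(terms[0].trim()) + Double.parseDouble(terms[1].trim());
        return actual == expected;                 // true when the expected relationship holds
    }

    public static void main(String[] args) {
        Map<String, String> pool = Map.of("resp_BALANCE", "1100", "req_BALANCE", "1000");
        System.out.println(check("$<resp_BALANCE> == $<req_BALANCE> + 100", pool));  // prints true
    }
}
```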
2) For the client's behavior action, the assertion check determines whether the expected behavior action occurs through a boolean expression and a predefined function. For example:
app.dialog("process success").visible == 'true';
When the assertion checking module checks an assertion of the behavior action class, it judges, through probes in the client operating system and the APP layer, whether the expected interface control and action satisfy the expected behavior, for example whether the control is brought up or grayed out; if the expected behavior is satisfied, the assertion check is true, otherwise it is false.
For example, for the assertion $<resp_httpcode> == '200', the assertion checking module obtains the HTTP response code from the variable pool, replaces the variable reference, checks true when HTTP returns 200, and checks false when HTTP returns non-200 (e.g., HTTP 500).
The client also comprises a program calling unit, wherein the program calling unit is an execution unit which actually simulates the user behavior of the client and calls an operating system and an APP program of the client, and is responsible for responding to the calling of the proxy execution unit, simulating user operations such as clicking, inputting and sliding, calling the operating system and the APP bottom-layer functions, and finishing the execution of the test script on the client.
The program calling unit also comprises a behavior simulation module and a function calling module:
(1) the behavior simulation module runs on an operating system driving layer in an injection mode, simulates interaction behaviors sent to the operating system bottom layer by hardware sensors such as a screen and a key and drives the operating system to interact with APP response.
(2) The function calling module runs between the operating system and the client APP and interacts with the client APP in a HOOK manner; it responds to the agent execution unit and completes the operations in which the script calls client program functions by calling the operating system API, the technical support layer SDK (such as Objective-C for iOS and JAVA for Android) and the page presentation layer functions (such as those provided by H5 and JavaScript).
The client also comprises an information acquisition unit which is responsible for acquiring performance data and logs of the client in the test process, and the performance data and the logs are uploaded to the main control driving unit after edge calculation of the client, so that technical support is provided for analyzing slow response and unsmooth running of the client and evaluating resource consumption of the client. After the agent execution unit receives the RUN instruction sent by the main control drive unit, the information acquisition unit is started asynchronously to acquire information and complete statistics and send the information to the main control drive unit, and after the agent execution unit receives the STOP instruction sent by the main control drive unit, the information acquisition unit is called to STOP acquiring information, complete the whole process information statistics and send the information to the main control drive unit.
The information acquisition unit also comprises a communication interaction module, a log acquisition module, a resource acquisition module, an information statistics module and a front-end temporary storage database, and the specific functions of the modules are as follows:
(1) The communication interaction module is responsible for handling the communication interaction between the information acquisition unit and the master control driving unit. The interaction is implemented by TCP short connections: before each interaction, the communication interaction module actively establishes a TCP connection with the master control driving unit, and the connection is destroyed after the interaction is completed.
(2) The log collection module defines a general log format and interface; after the client APP inherits the interface to implement log output, instrumentation points are embedded before and after business function operations and interaction actions to output log information to a client-side file, and the log collection module periodically reads each log file, parses the increment and stores it in the front-end temporary storage database.
For example, the log collection module dynamically monitors the execution actions of the keyword retrieval function through APP instrumentation (buried points) and outputs a log file.
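The general log interface and the incremental reading described above might be sketched as follows; the interface shape and the offset-based increment handling are assumptions for illustration.

```java
// Hedged sketch of a general log interface and incremental log-file reading.
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class LogCollection {
    /** General log interface the client APP would implement for instrumented output. */
    public interface GeneralLogger {
        void log(String functionName, String action, long timestampMillis);
    }

    /** Periodically reads only the newly appended portion of a log file (the "increment"). */
    static class IncrementalLogReader {
        private long lastOffset = 0;

        String readIncrement(String logFilePath) throws IOException {
            try (RandomAccessFile file = new RandomAccessFile(logFilePath, "r")) {
                long length = file.length();
                if (length <= lastOffset) {
                    return "";                          // nothing new since the last read
                }
                file.seek(lastOffset);
                byte[] buffer = new byte[(int) (length - lastOffset)];
                file.readFully(buffer);
                lastOffset = length;                    // remember where the next read should start
                // The returned text would be parsed and staged in the front-end temporary storage database.
                return new String(buffer, StandardCharsets.UTF_8);
            }
        }
    }
}
```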
(3) The resource acquisition module continuously calls a resource acquisition command of the operating system through a plurality of monitoring threads, acquires client resource data and stores the client resource data in a front-end temporary storage database, and the resource data acquired by the resource acquisition module comprises but is not limited to: CPU, memory, network read-write quantity, etc.
For example, taking an Android client as an example, the resource acquisition module defines a monitoring process that continuously calls the "vmstat 1" command and acquires CPU consumption data in dimensions such as user, sys and wa.
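A hedged sketch of such a monitoring process is shown below; the vmstat column positions follow the common Linux layout and may differ across devices and Android builds.

```java
// Hedged sketch: invoke "vmstat 1" and extract the us/sy/wa CPU columns.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CpuSampler {
    public static void main(String[] args) throws IOException {
        Process process = new ProcessBuilder("vmstat", "1").redirectErrorStream(true).start();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] columns = line.trim().split("\\s+");
                if (columns.length >= 17 && columns[0].matches("\\d+")) {   // skip the header lines
                    String user = columns[12];   // us: user CPU time
                    String sys = columns[13];    // sy: system CPU time
                    String wa = columns[15];     // wa: I/O wait
                    System.out.println("user=" + user + " sys=" + sys + " wa=" + wa);
                    // The samples would be stored in the front-end temporary storage database here.
                }
            }
        }
    }
}
```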
(4) The information statistics module performs edge computation on the log data and resource data stored in the front-end temporary storage database on the client, performs summary statistics in multiple dimensions, and uploads the results to the master control driving unit through the communication interaction module, where the summary statistics dimensions include, but are not limited to: the number of business function calls, the average execution time of business functions, the TOP 10 ranking of APP function time consumption, the TOP 10 ranking of function call counts, the CPU average consumption and peak consumption, and the memory average consumption and peak consumption. The summary statistics dimensions are stored in the front-end temporary storage database as configurable rules, and can be added or modified, without limitation, according to the specific device model, operating system version and performance analysis requirements.
The embodiment of the invention provides a performance test system, which is suitable for the field of performance test of terminal software and the field of performance test concerning the performance of a program client and a network architecture, can effectively simulate the influence of an actual network and an end machine on performance indexes, reduce the investment cost of performance test, and improve the test quality and efficiency by defining and using a performance test request, scheduling and initiating the distributed pressure of a multi-end machine, executing the client of a test script, calculating the edge of data and assertion, acquiring and summarizing performance information and the like.
Example four
Fig. 4 is a schematic diagram of an alternative full link performance testing method according to an embodiment of the present invention, as shown in fig. 4, including the following steps:
the method comprises the following steps: analyzing the test script:
in the embodiment of the invention, the test script is structurally analyzed and disassembled into the test behavior and the test data, and the test behavior and the test data are stored in the information storage unit.
Step two: waking up and connecting the test client:
in the embodiment of the invention, the main control drive unit broadcasts the link instruction through the message queue, and the proxy execution unit of the online client responds and actively creates the TCP connection.
Step three: scheduling multiple clients to generate concurrency pressure:
in the embodiment of the invention, the main control drive unit reads the structured test script, encapsulates the test script into the script message in the JSON format, selects to run the client end set according to the pressure, pushes the script message down to the client end, and can adjust the number of the tested client ends according to the actual pressure in the test process.
Step four: instantiation of the test script:
in the embodiment of the invention, the script instantiation module analyzes the request message in the JSON format into a specific test step, and replaces variable references in the request message into actual data.
Step five: and (3) executing a performance test:
in the embodiment of the invention, the agent execution unit simulates action behaviors and calls the client functions according to the script definition to complete the performance test execution on the client.
Step six: edge side variable save and assertion check:
in the embodiment of the invention, the agent execution unit carries out structured disassembly on the actually used test data and response data and stores the test data and the response data into the variable pool. And for each assertion, replacing variables, calculating the value of an expression, capturing expected interface controls and actions, evaluating whether the actual value and the expected value meet an expected relationship, and giving a conclusion whether the assertion passes the inspection.
Step seven: calculating an edge side test result:
in the embodiment of the invention, information such as time consumption of processing, test execution results and the like is calculated at a client side (edge side) to be tested, a JSON message of the test result is generated and supplemented, and the JSON message is transmitted to the main control drive unit through a TCP long connection.
Step eight: collecting and summarizing edge terminal information:
in the embodiment of the invention, the information acquisition unit acquires the log information of the service function operation and the interactive action and the resource information of the client during the test and when the test is finished, and the log information and the resource information of the client are uploaded to the main control driving unit after multi-dimensional summary statistics.
Step nine: summarizing and counting test results:
in the embodiment of the invention, the master control driving unit can summarize and count the performance test effect from dimensions such as the number of pressure transactions, single-transaction time consumption and throughput.
The embodiment of the invention provides a performance test method, which is suitable for the field of performance test of terminal software and the field of performance test concerning the performance of a program client and a network architecture, and can effectively simulate the influence of an actual network and an end machine on performance indexes, reduce the investment cost of performance test and improve the test quality and efficiency by defining and using a performance test request, scheduling and initiating the distributed pressure of a multi-end machine, executing the client of a test script, calculating the edge of data and assertion, acquiring and summarizing performance information and the like.
Example five
The performance testing apparatus provided in this embodiment includes a plurality of implementation units, and each implementation unit corresponds to each implementation step in the first embodiment.
FIG. 5 is a schematic diagram of an alternative performance testing apparatus according to an embodiment of the present invention, as shown in FIG. 5, the testing apparatus may include: an obtaining unit 50, a creating unit 51, a first receiving unit 52, an executing unit 53, a sending unit 54, wherein,
an obtaining unit 50, configured to obtain a link instruction, where the link instruction at least includes: server IP information;
a creating unit 51 for creating a communication link path with the server based on the server IP information;
the first receiving unit 52 is configured to receive the test script message through the communication link path, and perform parsing operation on the test script message to obtain a test script, where the test script includes: at least one test step, each test step comprising a behavior type of the test behavior comprising at least one of: the operation behavior of the user and the calling behavior of the application program;
an executing unit 53, configured to sequentially execute the testing steps to obtain a testing result message, where the testing result message carries testing execution results and statistical result data of different testing behaviors;
and a sending unit 54, configured to send the test result message to the server, where the server counts terminal performance data of different test dimensions based on the test result message and resource consumption information of the client during execution of the test script, and obtains a performance test result.
In the above testing apparatus, the link instruction can be obtained through the obtaining unit 50; a communication link path with the server is established through the creating unit 51 based on the server IP information; the test script message is received through the first receiving unit 52 using the communication link path, and the test script message is parsed to obtain the test script, where the test script includes at least one test step, each test step includes a test behavior, and the behavior type of the test behavior includes at least one of the following: an operation behavior of a user and a calling behavior of an application program; the test steps are sequentially executed through the execution unit 53 to obtain a test result message, and the test result message is sent to the server through the sending unit 54, where the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client in the process of executing the test script to obtain the performance test result. In the embodiment of the invention, the client can execute the test script in an environment of a plurality of user terminals by simulating the operation behavior of the user and the calling behavior of the application program, so that a concurrent performance test scenario is realized, the influence of the actual network and the client on performance indexes can be effectively simulated, and the test quality and efficiency are improved, thereby solving the technical problems in the related art that the performance test is only carried out on the server side, the performance of the client cannot be evaluated, the test quality and efficiency are reduced, and the user experience is reduced.
Optionally, the obtaining unit includes: the first monitoring module is used for monitoring whether a new link instruction exists in the message queue by adopting a preset link subscription mode; and the first pulling unit is used for pulling the link instruction from the message queue after monitoring that a new link instruction exists in the message queue.
Optionally, the first receiving unit includes: and the first analysis module is used for analyzing the test script message to obtain a test script comprising at least one test step, wherein the test step comprises test behaviors and test data used by the test behaviors.
Optionally, the performance testing apparatus further includes: a first storage unit, used for storing, after the test script message is parsed to obtain the test script, a variable list defined by the test data into a mapping table in key-value form, wherein the keys in the mapping table are variable names and the values are the candidate value list of each variable; a first checking module, used for checking, for each test step, whether a variable is referenced in the test data; and a first replacement module, used for replacing, if a variable is referenced in the test data, the referenced variable with a preset candidate value based on the mapping table.
Optionally, the first replacement module includes: the first setting submodule is used for setting a circulation pointer; the first selection submodule is used for selecting a first element of the candidate value list as a candidate value and increasing the pointer value of the circular pointer by a preset value when the first replacement step is executed; the second selection submodule is used for selecting a second element of the candidate value list as a candidate value and increasing the pointer value of the circular pointer by a preset value when the second replacement step is executed; a first execution submodule, configured to continue to execute the replacing step until a pointer value of the cyclic pointer is equal to the number of elements in the candidate value list; and the second setting submodule is used for setting the pointer value of the circular pointer to be restored to the initial value if the pointer value of the circular pointer is equal to the number of elements in the candidate value list.
Optionally, the performance testing apparatus further includes: and the first disassembling module is used for performing structured disassembling on the test data and the response data after the test steps are sequentially executed to obtain the test result message, and storing the disassembling result into the variable pool, wherein the response data refers to the response parameters after the test behavior of the preset client is monitored.
Optionally, the operation behavior of the user includes at least one of: click operation, double click operation, long press operation and stroking operation; the type of application includes at least one of: an operating system, a Software Development Kit (SDK) and an application APP.
Optionally, the performance testing apparatus further includes: the first detection module is used for performing assertion detection on the test execution result of the test step after the test step is sequentially executed to obtain a test result message, wherein the type of assertion detection comprises at least one of the following types: numeric values, strings, behavioral actions; the first calculation module is used for replacing variable quotation with a test actual value in a variable pool when the type of assertion detection is a numerical value and/or a character string, and calculating a test expected value through four arithmetic operations to obtain a calculated numerical value; the first judgment module is used for judging whether the calculated values of the test actual value and the test expected value meet the test expected relationship; the first determining module is used for determining that the test execution result is successful if the test expected relationship is met; the second judgment module is used for judging whether the behavior action meets the expected behavior or not based on the probe in the client when the type of the assertion detection is the behavior action; and the second determination module is used for determining that the test execution result is successful if the expected behavior is met.
Optionally, the performance testing apparatus further includes: the first acquisition module is used for acquiring information data in a preset client after receiving an operation instruction before assertion detection is performed on a test execution result of the test step, wherein the information data at least comprises: log data; and the first statistical module is used for performing edge calculation on the information data, and performing summary statistics on different dimensions based on the calculated information data to obtain statistical result data.
The above-mentioned testing device may further include a processor and a memory, and the above-mentioned acquiring unit 50, the creating unit 51, the first receiving unit 52, the executing unit 53, the sending unit 54, and the like are all stored in the memory as program units, and the processor executes the above-mentioned program units stored in the memory to implement the corresponding functions.
The processor comprises a kernel, and the kernel calls a corresponding program unit from the memory. The kernel can be set to be one or more, and the test result message is sent to the server by adjusting the kernel parameters.
The memory may include volatile memory in a computer readable medium, Random Access Memory (RAM) and/or nonvolatile memory such as Read Only Memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip.
The present application further provides a computer program product adapted to perform a program for initializing the following method steps when executed on a data processing device: after acquiring the link instruction, establishing a communication link path between the server and the server based on the server IP information, receiving a test script message through the communication link path, and analyzing and processing the test script message to obtain a test script, wherein the test script comprises: at least one test step, each test step comprising a behavior type of the test behavior comprising at least one of: and sequentially executing the testing steps by the operation behavior of the user and the calling behavior of the application program to obtain a testing result message, and sending the testing result message to the server, wherein the server counts the terminal performance data of different testing dimensions to obtain a performance testing result based on the testing result message and the resource consumption information of the client in the process of executing the testing script.
Example six
The performance testing apparatus provided in this embodiment includes a plurality of implementation units, and each implementation unit corresponds to each implementation step in the second embodiment.
FIG. 6 is a schematic diagram of an alternative performance testing apparatus according to an embodiment of the present invention, as shown in FIG. 6, the testing apparatus may include: a pushing unit 60, a sending unit 61, a second receiving unit 62, and a counting unit 63, wherein,
a pushing unit 60, configured to push a link instruction to the message queue, where the link instruction at least includes: server IP information, the server IP information is used for establishing a communication link path between the client and the server;
the issuing unit 61 is configured to issue a test script message to the client by using the established communication link path, and acquire resource consumption information of the client in the process of executing the test script, where the test script carried in the test script message includes: at least one test step, each test step comprising a behavior type of the test behavior comprising at least one of: the operation behavior of the user and the calling behavior of the application program;
a second receiving unit 62, configured to receive a test result message returned by the client, where the test result message carries test execution results and statistical result data of different test behaviors;
and the counting unit 63 is configured to count the terminal performance data of different test dimensions based on the test result message and the resource consumption information, so as to obtain a performance test result.
In the above testing apparatus, the link instruction can be pushed to the message queue through the pushing unit 60; the test script message is issued to the client through the issuing unit 61 using the established communication link path, and the resource consumption information of the client in the process of executing the test script is collected, where the test script carried in the test script message includes at least one test step, each test step includes a test behavior, and the behavior type of the test behavior includes at least one of the following: an operation behavior of a user and a calling behavior of an application program; the test result message returned by the client is received through the second receiving unit 62, and the counting unit 63 counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information to obtain the performance test result. In the embodiment of the invention, the test script message is sent to the client for execution, the resource consumption information of the client in the process of executing the test script is collected, and the performance test result is obtained by statistics in combination with the test result message returned by the client, so that performance evaluation of the client can be realized, the influence of the actual network and the client on performance indexes can be effectively simulated, and the test quality and efficiency are improved, thereby solving the technical problems in the related art that the performance test is only carried out on the server side, the performance of the client cannot be evaluated, the test quality and efficiency are reduced, and the user experience is reduced.
Optionally, the performance testing apparatus further includes: a first reading module and a first packaging module, where the first reading module is configured to read a pre-written test script before the link instruction is pushed to the message queue, the test step in the test script includes a test behavior and test data used by the test behavior, and the test data at least includes business data used by the application; and the first packaging module is configured to package the test script into a message with a specified format to obtain the test script message.
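For example, the reading and packaging steps might look like the following sketch; the on-disk JSON format and the "msg_type"/"payload" envelope are assumptions, since the patent only requires "a message with a specified format".

```python
import json

def package_test_script(script_path: str) -> str:
    # First reading module: read the pre-written test script (assumed to
    # be stored as JSON).
    with open(script_path, "r", encoding="utf-8") as f:
        test_script = json.load(f)
    # First packaging module: wrap the script in a message with a
    # specified format; the envelope fields are an illustrative choice.
    return json.dumps({"msg_type": "TEST_SCRIPT", "payload": test_script})
```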
Optionally, the performance testing apparatus further includes: a first wake-up module, configured to wake up or put to sleep each client based on the pre-counted dynamic data of each client before the link instruction is pushed to the message queue, where the dynamic data at least includes: concurrency, response time, and throughput.
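A minimal sketch of such a wake-or-sleep decision follows; the threshold values and field names are illustrative assumptions rather than values prescribed by the patent.

```python
def should_wake(dynamic_data: dict,
                max_concurrency: int = 50,
                max_response_ms: float = 500.0) -> bool:
    # Wake a client only if its pre-counted dynamic data shows spare
    # capacity; otherwise leave it dormant.
    overloaded = (dynamic_data["concurrency"] >= max_concurrency
                  or dynamic_data["response_time"] >= max_response_ms)
    return not overloaded

print(should_wake({"concurrency": 12, "response_time": 80.0, "throughput": 150}))  # True
```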
Optionally, the resource consumption information includes at least one of the following: function time consumption statistics, function call times, and client CPU and memory consumption; and the test dimensions include at least one of the following: number of requests, single-request time consumption, and throughput information.
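By way of illustration, the three example test dimensions can be derived from per-request elapsed times as in the sketch below; the function and field names are assumptions.

```python
def summarize_dimensions(elapsed_ms):
    # Derive request count, single-request time consumption, and
    # throughput from a list of per-request elapsed times (milliseconds).
    total_s = sum(elapsed_ms) / 1000.0
    count = len(elapsed_ms)
    return {"request_count": count,
            "avg_single_ms": sum(elapsed_ms) / count if count else 0.0,
            "throughput_rps": count / total_s if total_s > 0 else 0.0}

print(summarize_dimensions([12.0, 8.5, 15.2]))
```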
The above testing apparatus may further include a processor and a memory. The pushing unit 60, the issuing unit 61, the second receiving unit 62, the statistical unit 63, and the like are all stored in the memory as program units, and the processor executes the program units stored in the memory to implement the corresponding functions.
The processor includes a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels may be provided, and the terminal performance data of different test dimensions is counted by adjusting kernel parameters to obtain the performance test result.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
The present application further provides a computer program product adapted to perform, when executed on a data processing device, a program that initializes the following method steps: pushing the link instruction to a message queue; issuing a test script message to the client by using the established communication link path and acquiring resource consumption information of the client in the process of executing the test script, where the test script carried in the test script message includes at least one test step and the behavior type of the test behavior contained in each test step includes at least one of the operation behavior of the user and the calling behavior of the application program; receiving a test result message returned by the client; and counting the terminal performance data of different test dimensions based on the test result message and the resource consumption information to obtain a performance test result.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program, and when the computer program runs, the apparatus where the computer-readable storage medium is located is controlled to execute any one of the performance testing methods described above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (16)

1. A performance testing method is applied to a preset client, and comprises the following steps:
obtaining a link instruction, wherein the link instruction at least comprises: server IP information;
establishing a communication link path with a server based on the server IP information;
receiving a test script message through the communication link path, and performing analysis processing operation on the test script message to obtain a test script, wherein the test script comprises: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program;
sequentially executing the test steps to obtain a test result message, wherein the test result message carries test execution results and statistical result data of different test behaviors;
and sending the test result message to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client in the process of executing the test script to obtain a performance test result.
2. The method of claim 1, wherein the step of obtaining the link instruction comprises:
monitoring whether a new link instruction exists in a message queue or not by adopting a preset link subscription mode;
and pulling the link instruction from the message queue after monitoring that a new link instruction exists in the message queue.
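As a non-limiting illustration of this subscription-and-pull step, the sketch below uses Python's queue.Queue as a stand-in for real message middleware; the polling interval and helper name are assumptions.

```python
import json
import queue

def wait_for_link_instruction(mq: queue.Queue, poll_s: float = 1.0) -> dict:
    # Monitor the subscribed queue; pull and decode the link instruction
    # once a new message appears.
    while True:
        try:
            raw = mq.get(timeout=poll_s)
        except queue.Empty:
            continue                 # keep monitoring the queue
        return json.loads(raw)
```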
3. The method according to claim 1, wherein the step of performing parsing operation on the test script message to obtain the test script comprises:
and analyzing the test script message to obtain the test script comprising at least one test step, wherein the test step comprises the test behavior and test data used by the test behavior.
4. The method according to claim 3, wherein after the parsing operation is performed on the test script packet to obtain the test script, the performance testing method further comprises:
storing a variable list defined by the test data as a mapping table in a key-value form, wherein keys in the mapping table are variable names, and values are candidate value lists of each variable;
for each test step, checking whether the test data has variable reference;
and if the variable is referenced in the test data, replacing the referenced variable with a preset candidate value based on the mapping table.
5. The method according to claim 4, wherein the step of replacing the referenced variable with a preset candidate value based on the mapping table comprises:
setting a circular pointer;
when the replacement step is executed for the first time, selecting the first element of the candidate value list as the candidate value, and increasing the pointer value of the circular pointer by a preset value;
when the replacement step is executed for the second time, selecting the second element of the candidate value list as the candidate value, and increasing the pointer value of the circular pointer by a preset value;
continuing to execute the replacing step until the pointer value of the circular pointer is equal to the number of elements in the candidate value list;
and if the pointer value of the circular pointer is equal to the number of elements in the candidate value list, setting the pointer value of the circular pointer to be restored to an initial value.
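As a non-limiting sketch of the variable replacement described in claims 4 and 5, the code below keeps a candidate-value mapping table and a per-variable circular pointer that advances on each replacement and wraps once every candidate has been used; the "${var}" reference syntax and the sample data are assumptions.

```python
import re

mapping = {"account": ["A001", "A002", "A003"]}    # variable -> candidate list
pointer = {name: 0 for name in mapping}            # circular pointer per variable

def replace_variables(test_data: str) -> str:
    def pick(match):
        name = match.group(1)
        values = mapping[name]
        value = values[pointer[name]]
        # Advance the circular pointer; reset it after the last candidate.
        pointer[name] = (pointer[name] + 1) % len(values)
        return value
    return re.sub(r"\$\{(\w+)\}", pick, test_data)

print(replace_variables('{"acct": "${account}"}'))   # uses A001
print(replace_variables('{"acct": "${account}"}'))   # uses A002
```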
6. The performance testing method according to claim 3, wherein after the testing steps are sequentially executed to obtain the test result message, the performance testing method further comprises:
and performing structured disassembly on the test data and the response data, and storing a disassembly result into a variable pool, wherein the response data refers to response parameters after the preset client is monitored to execute the test behavior.
7. The performance testing method of claim 1, wherein the operation behavior of the user comprises at least one of: a click operation, a double-click operation, a long-press operation, and a swipe operation; and the type of the application program comprises at least one of the following: an operating system, a software development kit (SDK), and an application (APP).
8. The performance testing method according to claim 1, wherein after the testing steps are sequentially executed to obtain the test result message, the performance testing method further comprises:
performing assertion detection on the test execution result of the test step, wherein the type of assertion detection comprises at least one of the following types: numeric values, strings, behavioral actions;
when the type of assertion detection is a numerical value and/or a character string, replacing variable references with test actual values from a variable pool, and calculating the test expected value through the four arithmetic operations to obtain a calculated value; judging whether the test actual value and the calculated test expected value meet the test expected relationship; and if the test expected relationship is met, determining that the test execution result is successful;
when the type of assertion detection is behavior action, judging whether the behavior action meets expected behavior based on a probe in a client; and if the expected behavior is met, determining that the test execution result is successful.
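A minimal sketch of the numerical branch of this assertion detection follows; the "<value> <operator> <value>" expression form, the supported operators, and the sample variable pool are assumptions introduced only to make the steps concrete.

```python
import operator

RELATIONS = {"==": operator.eq, ">=": operator.ge, "<=": operator.le,
             ">": operator.gt, "<": operator.lt}
ARITHMETIC = {"+": operator.add, "-": operator.sub,
              "*": operator.mul, "/": operator.truediv}

def assert_numeric(actual, expected_expr, relation, variable_pool):
    # Replace variable references with actual values from the variable
    # pool, evaluate the expected value with the four arithmetic
    # operations, then check the expected relationship.
    tokens = [str(variable_pool.get(t, t)) for t in expected_expr.split()]
    left, op, right = tokens
    expected = ARITHMETIC[op](float(left), float(right))
    return RELATIONS[relation](actual, expected)

print(assert_numeric(120.0, "base + 20", "==", {"base": 100}))   # True
```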
9. The performance testing method of claim 1, wherein prior to assertion detection of the test execution result of the testing step, the performance testing method further comprises:
after receiving an operation instruction, collecting information data in the preset client, wherein the information data at least comprises: log data;
and performing edge calculation on the information data, and performing summary statistics on different dimensions based on the calculated information data to obtain statistical result data.
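By way of illustration, the client-side edge calculation and per-dimension summary could resemble the following sketch; the record fields and grouping key are assumptions.

```python
from collections import defaultdict

def summarize_log_records(records):
    # Group collected log records by behavior type and compute a count
    # and average duration for each group.
    grouped = defaultdict(list)
    for record in records:
        grouped[record["behavior_type"]].append(record["elapsed_ms"])
    return {behavior: {"count": len(times),
                       "avg_ms": sum(times) / len(times)}
            for behavior, times in grouped.items()}

print(summarize_log_records([
    {"behavior_type": "user_operation", "elapsed_ms": 30.0},
    {"behavior_type": "application_call", "elapsed_ms": 12.5},
]))
```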
10. A performance testing method applied to a server, wherein the server is connected to a preset client, and the preset client executes the performance testing method according to any one of claims 1 to 9, including:
pushing a link instruction to a message queue, wherein the link instruction at least comprises: server IP information for establishing a communication link path between a client and the server;
sending a test script message to the client by adopting the established communication link path, and acquiring resource consumption information of the client in the process of executing the test script, wherein the test script carried in the test script message comprises: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program;
receiving a test result message returned by the client, wherein the test result message carries test execution results and statistical result data of different test behaviors;
and counting the terminal performance data of different testing dimensions based on the testing result message and the resource consumption information to obtain a performance testing result.
11. The method of claim 10, wherein prior to pushing the chaining instruction to the message queue, the performance testing method further comprises:
reading a pre-written test script, wherein the test steps in the test script comprise: the test behavior and test data used by the test behavior, the test data including at least: business data used by the application;
and packaging the test script into a message with a specified format to obtain the test script message.
12. The method of claim 10, wherein prior to pushing the chaining instruction to the message queue, the performance testing method further comprises:
based on the dynamic data of each client terminal counted in advance, the client terminal is awakened or dormant, wherein the dynamic data at least comprises the following components: concurrency, response time, throughput.
13. The method of claim 10, wherein the resource consumption information comprises at least one of: function time consumption statistics, function call times, and client CPU and memory consumption, and wherein the test dimension comprises at least one of the following: number of requests, single-request time consumption, and throughput information.
14. A performance testing apparatus, applied to a preset client, comprising:
an obtaining unit, configured to obtain a link instruction, where the link instruction at least includes: server IP information;
a creating unit, configured to create a communication link path with a server based on the server IP information;
a first receiving unit, configured to receive a test script packet through the communication link path, and perform parsing operation on the test script packet to obtain a test script, where the test script includes: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program;
the execution unit is used for sequentially executing the test steps to obtain a test result message, wherein the test result message carries test execution results and statistical result data of different test behaviors;
and the sending unit is used for sending the test result message to the server, wherein the server counts the terminal performance data of different test dimensions based on the test result message and the resource consumption information of the client in the process of executing the test script to obtain a performance test result.
15. A performance testing apparatus, applied to a server, where the server is connected to a preset client, and the preset client executes the performance testing method according to any one of claims 1 to 9, including:
a pushing unit, configured to push a link instruction to a message queue, where the link instruction at least includes: server IP information for establishing a communication link path between a client and the server;
the issuing unit is configured to issue a test script message to the client by using the established communication link path, and acquire resource consumption information of the client in a process of executing the test script, where a test script carried in the test script message includes: at least one test step, wherein the behavior type of the test behavior contained in each test step comprises at least one of the following: the operation behavior of the user and the calling behavior of the application program;
a second receiving unit, configured to receive a test result message returned by the client, where the test result message carries test execution results and statistical result data of different test behaviors;
and the statistical unit is used for counting the terminal performance data of different testing dimensions based on the testing result message and the resource consumption information to obtain a performance testing result.
16. A computer-readable storage medium, comprising a stored computer program, wherein the computer program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the performance testing method of any one of claims 1 to 13.
CN202111307439.5A 2021-11-05 2021-11-05 Performance test method and device and computer readable storage medium Pending CN113986746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111307439.5A CN113986746A (en) 2021-11-05 2021-11-05 Performance test method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111307439.5A CN113986746A (en) 2021-11-05 2021-11-05 Performance test method and device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN113986746A true CN113986746A (en) 2022-01-28

Family

ID=79746825

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111307439.5A Pending CN113986746A (en) 2021-11-05 2021-11-05 Performance test method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113986746A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114756474A (en) * 2022-04-27 2022-07-15 苏州睿芯集成电路科技有限公司 Method and device for generating random vector in CPU verification and electronic equipment
CN114756474B (en) * 2022-04-27 2023-07-21 苏州睿芯集成电路科技有限公司 Method and device for generating random vector in CPU verification and electronic equipment
CN116170354A (en) * 2023-02-28 2023-05-26 重庆长安汽车股份有限公司 Network performance test method, device, equipment and medium
CN116170354B (en) * 2023-02-28 2024-05-14 重庆长安汽车股份有限公司 Network performance test method, device, equipment and medium
CN117452873A (en) * 2023-12-26 2024-01-26 宁波和利时信息安全研究院有限公司 Communication method, device, equipment and storage medium
CN117452873B (en) * 2023-12-26 2024-03-15 宁波和利时信息安全研究院有限公司 Communication method, device, equipment and storage medium
CN117714327A (en) * 2024-02-05 2024-03-15 神州灵云(北京)科技有限公司 Method, system, equipment and medium for tracking performance index of full-link service request

Similar Documents

Publication Publication Date Title
CN113986746A (en) Performance test method and device and computer readable storage medium
CN107577805B (en) Business service system for log big data analysis
US8677324B2 (en) Evaluating performance of an application using event-driven transactions
CN104954453A (en) Data mining REST service platform based on cloud computing
US20170048120A1 (en) Systems and Methods for WebSphere MQ Performance Metrics Analysis
CN110750458A (en) Big data platform testing method and device, readable storage medium and electronic equipment
CN112559301B (en) Service processing method, storage medium, processor and electronic device
CN110147327B (en) Multi-granularity-based web automatic test management method
JP2016100005A (en) Reconcile method, processor and storage medium
JP2021502658A (en) Key-based logging for processing structured data items using executable logic
CN111352903A (en) Log management platform, log management method, medium, and electronic device
Wang Stream processing systems benchmark: Streambench
CN112217886B (en) Distributed system testing method and system, message production and consumption module
US20230385187A1 (en) Testing of a resource manager of an application management system
US8839208B2 (en) Rating interestingness of profiling data subsets
CN111177237A (en) Data processing system, method and device
CN111949521B (en) Software performance test method and device
Sfaxi et al. Babel: a generic benchmarking platform for Big Data architectures
CN105095070A (en) Method and system for obtaining QQ group data base on test assembly of browser
Han et al. Bigdatabench-mt: A benchmark tool for generating realistic mixed data center workloads
CN114610597A (en) Pressure testing method, device, equipment and storage medium
CN111949493A (en) Inference application-based power consumption testing method and device for edge AI server
CN116089490A (en) Data analysis method, device, terminal and storage medium
CN109933506A (en) Server big data method of evaluating performance, system and electronic equipment and storage medium
Simmhan et al. Data management in dynamic environment-driven computational science

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination