CN110825636A - Matching algorithm performance test method, device, equipment, system and medium - Google Patents


Info

Publication number
CN110825636A
CN110825636A (application CN201911070954.9A; granted publication CN110825636B)
Authority
CN
China
Prior art keywords
matching
application
test
user
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911070954.9A
Other languages
Chinese (zh)
Other versions
CN110825636B (en)
Inventor
汤涛
吴建伟
肖磊
陈瑞坤
王常红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911070954.9A priority Critical patent/CN110825636B/en
Publication of CN110825636A publication Critical patent/CN110825636A/en
Application granted granted Critical
Publication of CN110825636B publication Critical patent/CN110825636B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6009Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a performance test method for a matching algorithm, comprising the following steps: acquiring application running data of a target area; converting the running data into user application behavior data; performing user behavior playback according to the user application behavior data in a test environment configured with the matching algorithm to be tested, and collecting a test log generated in the test environment; and determining the performance of the matching algorithm to be tested according to the application log and the test log. The matching algorithm is thus tested by simulating the matching requests of real target users, so the predicted performance fits the target user population, the evaluation result is accurate, and it can guide further optimization or configuration of the matching algorithm. The application also discloses a corresponding apparatus, device, system and storage medium.

Description

Matching algorithm performance test method, device, equipment, system and medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a computer storage medium for testing performance of a matching algorithm.
Background
At present, much game software supports network battles, for example multiplayer online battle arena (MOBA) games and first-person shooter (FPS) games, most of which support battles among a plurality of players. Supporting player battles relies on a matching algorithm to assemble battle teams from a pool of players so that the fighting capabilities of the two teams are balanced and the game offers good competitiveness and balance.
In game development and optimization, an important problem is to tune the matching algorithm to the actual environment so as to provide an algorithm with excellent performance for that application environment. The matching algorithm matches a battle team for a player in real time according to the player's request during the game; its performance is mainly expressed by matching efficiency and matching quality. During optimization, a performance test scheme is therefore required to test the matching algorithm and determine whether its performance is good enough to be put directly into application.
At present, a common performance test scheme for matching algorithms has robots forge a large number of client requests to simulate a large-scale matching scene, and judges the performance of the matching algorithm by collecting and analyzing server response data.
However, since robot behavior is completely random and cannot fit the target users, the simulated matching results differ from those of the live production environment, the evaluation has little reference value, and the final performance evaluation result is not reliable.
Disclosure of Invention
The embodiment of the application provides a performance test method for a matching algorithm, which collects the application behaviors of real users and simulates the matching requests of real users in a target area based on those behaviors, so as to test the performance of the matching algorithm in the target area. The test method thereby improves the validity of the test result.
The first aspect of the present application provides a method for testing performance of a matching algorithm, where the method includes:
acquiring application running data of a target area, wherein the application running data comprises an application log generated in an application environment;
converting the application running data into user application behavior data, wherein the user application behavior data is used for representing attribute data which affects the fighting performance of the user and corresponds to the user in the application operation process;
under a test environment configured with a matching algorithm to be tested, carrying out user behavior playback according to the user application behavior data, and collecting a test log generated under the test environment;
and determining the performance of the matching algorithm to be tested according to the test log and the application log.
A second aspect of the present application provides a device for testing performance of a matching algorithm, the device comprising:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring application running data of a target area, and the application running data comprises an application log generated in an application environment;
the data conversion module is used for converting the application running data into user application behavior data, the user application behavior data representing attribute characteristic data corresponding to target behaviors implemented by users in the target area;
the testing module is used for playing back user behaviors according to the user application behavior data in a testing environment configured with a matching algorithm to be tested and collecting a testing log generated in the testing environment;
and the performance evaluation module is used for determining the performance of the matching algorithm to be tested according to the application log and the test log.
A third aspect of the present application provides a system for testing performance of a matching algorithm, the system comprising:
the data acquisition server is used for acquiring application running data of a target area, wherein the application running data comprises an application log generated in an application environment;
the data conversion server is used for converting the application running data into user application behavior data, and the user application behavior data is used for representing attribute data which affects the fighting performance of the user and corresponds to the user in the application operation process; the matching request initiating server under the test environment is used for sending a matching request to the matching server under the test environment based on the user application behavior data simulation user behavior;
the matching server is used for responding to the matching request through a pre-configured matching algorithm to be tested and sending the generated log data serving as a test log to the performance evaluation equipment;
and the performance evaluation equipment is used for determining the performance of the matching algorithm to be tested according to the application log and the test log.
A fourth aspect of the present application provides an apparatus, comprising:
a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program to implement the matching algorithm performance testing method according to the first aspect.
A fifth aspect of the present application provides a computer-readable storage medium for storing a computer program which, when executed, implements the matching algorithm performance testing method of the first aspect described above.
A sixth aspect of the present application provides a computer program product comprising instructions which, when executed, implement the matching algorithm performance testing method of the first aspect described above.
According to the technical scheme, the performance test method for the matching algorithm has the following advantages:
the embodiment of the application provides a performance test method of a matching algorithm, which comprises the steps of collecting application running data of a target area, converting the application running data into user application behavior data, wherein the user application behavior data is used for representing attribute characteristic data corresponding to a user when the target area implements a target behavior; therefore, historical behavior data of a user operating an application in a target area can be collected, the user application behavior data is subsequently used in a subsequent testing process, user behavior playback is carried out according to the user application behavior data in a testing environment configured with a matching algorithm to be tested, a testing log generated in the testing environment is collected, and finally the performance of the matching algorithm to be tested is determined according to the application log generated in a historical real environment and the testing log generated in the testing environment. Because the method collects the real user application behavior data in the target area and simulates the user behavior under the test environment by utilizing the real user application behavior data to realize the performance test of the matching algorithm, the matching algorithm is put close to the real application environment to complete the whole test, and the reliability of the test result obtained by the method is higher, so that the operation of a new matching algorithm can be effectively promoted, and the operation risk can be effectively avoided.
Drawings
Fig. 1 is an exemplary diagram of an application scenario of a matching algorithm performance testing method according to an embodiment of the present application;
fig. 2 is a flowchart of a performance testing method of a matching algorithm according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating an acquisition manner of application operation data according to an embodiment of the present application;
fig. 4 is a schematic diagram of a conversion process of application running data provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a testing environment provided by an embodiment of the present application;
fig. 6 is a schematic diagram of an application result of a matching algorithm performance testing method provided in the embodiment of the present application;
fig. 7 is a schematic diagram of an application result of a matching algorithm performance testing method according to an embodiment of the present application;
fig. 8 is a schematic diagram of an application result of a matching algorithm performance testing method according to an embodiment of the present application;
fig. 9 is an exemplary diagram of an application scenario of a method for testing performance of a matching algorithm according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a performance testing apparatus for a matching algorithm according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a performance testing apparatus for a matching algorithm according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a matching algorithm performance testing apparatus according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of a matching algorithm performance testing apparatus according to an embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a matching algorithm performance testing apparatus according to an embodiment of the present disclosure;
fig. 15 is a schematic structural diagram of a matching algorithm performance testing apparatus according to an embodiment of the present disclosure;
fig. 16 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of a matching algorithm performance testing system according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
At present, a large number of client requests are forged by robots to simulate a large-scale matching scene and thereby test the performance of a matching algorithm; because the simulated behavior does not match the actual behavior of users, the reliability of the test result is unstable.
In practical application, the matching algorithm testing method provided by this application can run on any equipment with data processing capability, such as a server or a terminal device. In terms of hardware deployment, it can run on an independent server or in a server-cluster scenario. For convenience, the following description takes the independent server as an example; the processing logic on a cluster server or terminal device is similar and will not be repeated.
Next, referring to fig. 1, an application scenario of the matching algorithm performance testing method provided in the embodiment of the present application is explained with reference to fig. 1, and first, application requirements of the application scenario are briefly introduced.
Generally, during the development or operation of a game application, a corresponding matching algorithm is configured for each application area. Before a matching algorithm is put into application, its future performance in the target area needs to be tested, and the test result determines whether it is deployed directly: if the result shows that the performance meets the requirements, the algorithm can be put into application directly; if the result shows that the performance does not yet meet the requirements, the algorithm needs to be adjusted and the adjusted algorithm tested again.
Specifically, the application scenario shown in fig. 1 includes a database 102 for storing application running data and a server 101 configured to execute the performance test method of a matching algorithm provided by the present application. The server 101 obtains the application running data of a target area from the database 102; for example, the running data generated by a game in actual application includes a first matching server log recorded by the matching server in the game application environment and the corresponding network quality data. Through data conversion, the application running data is converted into user application behavior data, i.e. the attribute characteristic data corresponding to users in the target area during actual play, such as the user's personal attribute data, the team attribute data of the team to which the user belongs, and the user's network environment data. The converted user application behavior data is equivalent to the actual behavior of users in the real application, so the server can simulate that behavior in the test environment to run a performance test of the matching algorithm under test, and can evaluate the algorithm's performance by comparing the current test log with the application log generated in the real environment.
In this application scenario, the performance test method for the matching algorithm can realize the performance test of the algorithm efficiently and reliably, so as to meet actual software development and test requirements and keep pace with the product's iteration rate.
Next, the matching algorithm performance testing method provided in the embodiment of the present application is described in detail from the server's perspective. Referring to fig. 2, the method comprises:
s201: and acquiring application running data of the target area.
The target area is the area, among the plurality of areas released during application operation, whose matching algorithm needs to be tested or updated. For example, in the initial period of a game's operation, the overall launch region is generally divided into a plurality of smaller areas according to total number of users, daily active users (DAU), or administrative divisions. If a game is to be launched in China, the launch region may be divided into areas such as "Central China", "East China" and "North China", and a corresponding matching algorithm is set for each area to support the battle service of game players. When the matching algorithm corresponding to a certain area needs to be tested, that area is used as the target area and its application running data is acquired.
For another example, if a game application is launched in Southeast Asia, the region can be divided into areas such as "Singapore" and "Malaysia" for operation. Because the number of active players, the network environment, the distribution of player skill and other factors differ greatly between areas, developers can configure different matching algorithms for different areas during deployment, and before a matching algorithm is launched its performance must first be tested to determine whether it meets the business requirements. Alternatively, during operation, if monitoring shows that the current matching algorithm of a certain area performs poorly, researchers need to provide a new matching algorithm for that area; its performance must be tested first, and once it meets the requirements the new algorithm can be put into use in that area, which is then the target area.
That is to say, whenever any one of the areas divided during game operation needs a matching algorithm performance test, that area is taken as the target area and its application running data is acquired. If the performance of the same matching algorithm needs to be tested in a plurality of areas, each of those areas is taken as a target area in turn, and the application running data of each is acquired.
The application running data of the target area is the running data generated by users operating the game application in the target area. It may include the related logs generated in the actual game application environment, recorded as the application log, which may include: matching request logs, matching success logs, single-game settlement logs, network quality logs, and so on.
The matching request log records the user's personal attribute information, team attribute information, matching request time and similar information; the matching success log records information such as the player's matching duration and matching success time; the single-game settlement log records information such as the game duration and game end time of a battle played on the basis of the team matching result.
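The three log types just described can be modeled as simple records; the field names below are assumptions chosen to mirror the description, not a schema from the patent:

```python
from dataclasses import dataclass

@dataclass
class MatchRequestLog:
    """One matching request: who asked, with what attributes, and when."""
    user_id: str
    personal_attrs: dict   # e.g. player rank, MMR
    team_attrs: dict       # e.g. team number, team level
    request_time: int      # epoch seconds

@dataclass
class MatchSuccessLog:
    """One successful match: how long matching took and when it completed."""
    user_id: str
    battle_id: int
    match_duration: int    # seconds spent waiting for a match
    success_time: int      # epoch seconds

@dataclass
class SettlementLog:
    """One finished single game: how long it lasted and when it was settled."""
    user_id: str
    battle_id: int
    game_duration: int     # seconds the single game lasted
    settlement_time: int   # epoch seconds at game end
```

These records are the raw material the reverse lookup described later walks backward through.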
The process of collecting the application running data is briefly described with reference to fig. 3. As shown in fig. 3, in the game application environment, the running data generated by the game in each area is transmitted to a log server over the user datagram protocol (UDP); the logs on the log server are then stored into a data warehouse in batches, partitioned by log type and area; finally, when the matching algorithm of a certain area needs to be tested, the logs can be imported into the test environment through an IDE tool or the scheduling system of the TDW (data warehouse) to run the performance test of the matching algorithm.
In addition, in some scenarios the number of users in the target area is very large, so the volume of application running data generated there is also very large. Directly using all of it in the subsequent test would put great data pressure on the servers involved and make the test inefficient. To balance test efficiency against test quality, a subset of data with high value to the test process can optionally be screened out of the target area's running data. For example, the data reflecting the strongest user activity can be collected: only the running data of the two daily periods with the highest and the lowest number of online users may be collected; or the data may be sampled in proportion to user level, or staged random samples may be drawn according to user login time. The sampled data then participates in the subsequent test, improving test efficiency while preserving test quality.
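One way the screening above could look in practice, assuming records carry an hour-of-day and a player level (both the record shape and the quota mechanism are illustrative):

```python
import random

def sample_running_data(records, peak_hours, trough_hours, level_quota=None, seed=0):
    """Keep only records from the daily peak/trough windows; optionally
    subsample per player level according to a quota (kept fraction per level)."""
    rng = random.Random(seed)  # seeded so a test run is reproducible
    windows = set(peak_hours) | set(trough_hours)
    kept = [r for r in records if r["hour"] in windows]
    if level_quota:
        kept = [r for r in kept if rng.random() < level_quota.get(r["level"], 1.0)]
    return kept
```

A quota such as `{1: 0.1, 2: 0.5}` would keep roughly 10% of level-1 records and 50% of level-2 records, approximating sampling "according to the user grade proportion".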
S202: and converting the application running data into user application behavior data.
The application running data records the user's behavior at each stage of the game, but to provide data of practical value for the subsequent test, a data chain that fully embodies the value of the matching algorithm must be selected from it. The running-behavior data chain that embodies the value of the matching algorithm comprises the data generated along: the user initiates a matching request → the server responds to the matching request → matching succeeds → a single game is played based on the matching result → the single game is settled.
On this basis, in order to quickly sort the valuable data chains out of the application running data, the embodiment of the application also provides a reverse lookup manner, which improves data conversion efficiency.
The reverse lookup method searches for a data chain from its end point, that is, along the reverse route "single-game settlement log → matching success log → matching request log". It is based on the fact that some matching requests are never responded to and some matching results are not adopted by the user, so much of the data cannot form a complete chain and cannot be used as basic data in the test environment.
From the data processing perspective, the selection of user behavior data useful for the subsequent test from the application running data is explained below. The user application behavior data is the attribute data, generated during application operation, that characterizes the user and affects battle performance. Specifically: first, a single-game settlement log is acquired from the application running data, and the matching success log whose matching time fits best is looked up in the running data according to the settlement log and the single-game duration; then, according to the found matching success log and the matching duration, the matching request log whose request time fits best is looked up in the application running data; finally, according to the found matching request log, the user's personal attribute information, the team attribute information of the user's team, and the network quality data recorded by the network quality log at the matching request time are determined and taken as the user application behavior data.
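The reverse lookup just described can be sketched as one function that walks a single settlement record back to its originating request. Log records are plain dicts here, "closest" means nearest-but-not-later as in the worked example below, and the network log is assumed to be keyed by request time; all of that is illustrative:

```python
def reverse_lookup(settlement, success_logs, request_logs, network_logs):
    """Walk one chain backward: settlement log -> matching success log ->
    matching request log; return None if the chain is incomplete."""
    # Reference matching success time = settlement time - single-game duration.
    ref_success = settlement["settlement_time"] - settlement["game_duration"]
    candidates = [s for s in success_logs
                  if s["user_id"] == settlement["user_id"]
                  and s["success_time"] <= ref_success]
    if not candidates:
        return None  # no responded match precedes this settlement
    success = max(candidates, key=lambda s: s["success_time"])

    # Reference matching request time = matching success time - matching duration.
    ref_request = success["success_time"] - success["match_duration"]
    req_candidates = [r for r in request_logs
                      if r["user_id"] == settlement["user_id"]
                      and r["request_time"] <= ref_request]
    if not req_candidates:
        return None  # request was never logged; chain is unusable
    request = max(req_candidates, key=lambda r: r["request_time"])

    net = network_logs.get(request["request_time"], {})
    return {"request": request, "success": success,
            "settlement": settlement, "network": net}
```

Chains that return `None` correspond to the unresponded requests and unadopted matching results the description says cannot serve as basic test data.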
The user application behavior data obtained through this conversion is the data that characterizes, for the actual game application, a user who initiated a matching request that was successfully responded to and then played a single game based on the matching result, reflecting the user's operating ability in the game. Specifically, the user application behavior data may include personal attribute information, team attribute information and network quality, such as the user's game rank, start matching time, team number, team level, player MMR (matchmaking rating) value, network quality, and so on.
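The fields just listed suggest a record shape like the following; the field names and the averaging used for team MMR are assumptions for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class UserApplicationBehavior:
    """One replayable behavior record; field names are assumed, not from the patent."""
    user_id: str
    game_rank: int            # user's in-game rank
    start_match_time: int     # epoch seconds at which the request is replayed
    team_number: int
    team_level: int
    mmr: float                # player matchmaking rating
    network_quality: dict = field(default_factory=dict)  # e.g. latency per machine room

def team_mmr(members):
    """Aggregate a team MMR from personal MMR values (averaging is an assumption)."""
    return sum(m.mmr for m in members) / len(members)
```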
In order to more vividly understand the data conversion process, a specific implementation process of the above S202 is described below with reference to fig. 4.
Referring to fig. 4, the server first obtains a single-game settlement log 401 from the application running data and computes the matching success time of the game from the settlement time and the single-game duration recorded in the log. Take the first record in the settlement log as an example: "player A, battle ID 1, single-game duration 900 seconds (15 minutes), settlement time 2018-01-23 18:15:00". The reference matching success time is the settlement time minus the single-game duration: subtracting 15 minutes from 18:15:00 gives 18:00:00 as the most probable matching success time of this game. On this basis, the matching success time closest to the reference time and related to this settlement record is looked up in the matching success log 402; for example, the first record in the matching success log is found: battle ID 1, player A, matching success time 2018-01-23 17:57:00, team number 1, matching duration 10 seconds. The closest matching success time is the one nearest to, and not later than, the reference matching success time.
Then, based on the closest matching success time found above, namely "2018-01-23 17:57:00", the server obtains a reference matching request time by subtracting the matching duration from the matching success time: 17:57:00 minus 10 seconds gives 17:56:50 as the reference matching request time. The server then searches the matching request log for the matching request time that is related to the first record in the matching success log and closest to the reference matching request time, and further determines team attribute information based on the personal attribute information of players on the same team at the same matching request time, for example calculating the team MMR value from the personal MMR values. The user application behavior data also includes speed-test data from the player to each machine room, i.e. the network quality, so that the matching algorithm can match the most suitable opponent for the player. Finally, the user application behavior data is determined with the team as a group and stored according to the matching protocol for subsequent testing.
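The time derivation described above can be sketched as follows; the record fields and the helper function are illustrative assumptions, not part of the patent.

```python
from datetime import datetime, timedelta

def closest_not_later(candidates, reference):
    """Return the candidate time closest to, and not later than, the reference time."""
    eligible = [t for t in candidates if t <= reference]
    return max(eligible) if eligible else None

# First record of the single-game settlement log (field names are illustrative)
settlement = {"player": "A", "battle_id": 1, "duration_s": 900,
              "settle_time": datetime(2018, 1, 23, 18, 15, 0)}

# Reference matching success time = settlement time - single-game duration
ref_success = settlement["settle_time"] - timedelta(seconds=settlement["duration_s"])

# Search the matching success log for the closest time not later than the reference
success_times = [datetime(2018, 1, 23, 17, 57, 0), datetime(2018, 1, 23, 18, 5, 0)]
success_time = closest_not_later(success_times, ref_success)

# Reference matching request time = matching success time - matching duration (10 s)
ref_request = success_time - timedelta(seconds=10)
print(ref_request)  # 2018-01-23 17:56:50
```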
S203: and carrying out user behavior playback according to the user application behavior data in a test environment configured with a matching algorithm to be tested, and collecting a test log generated in the test environment.
After the user application behavior data is obtained, it is used to perform user behavior playback, that is, matching requests are initiated by simulating user behavior so as to test the performance of the matching algorithm. The server initiates matching requests at the corresponding matching request times according to the user personal attribute information, team attribute information and network quality information in the user application behavior data, thereby performing user behavior playback, and the logs recorded by the server throughout the process of responding to the matching requests during the playback are used as the test logs.
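A minimal playback loop might look like the following; the record schema and the `send_request` callback are assumptions for illustration, not the patent's implementation.

```python
def replay(behavior_records, send_request):
    """Replay recorded matching requests in the order of their recorded request times.

    behavior_records: iterable of dicts with illustrative fields such as
    'request_time', 'team_id', 'mmr' and 'network_quality'.
    send_request: callback that delivers one simulated matching request.
    """
    for record in sorted(behavior_records, key=lambda r: r["request_time"]):
        send_request(record)

# Example: requests are replayed in timestamp order regardless of storage order
sent = []
replay([{"request_time": 2, "team_id": 1}, {"request_time": 1, "team_id": 2}],
       sent.append)
print([r["team_id"] for r in sent])  # [2, 1]
```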
Because the user application behavior data is converted from historical application running data recorded by the server while players in the target area were actually playing the game, the user behavior playback performed according to that data is consistent with the real matching habits of the target-area players and is equivalent to their real operations. The testing process therefore matches the real user operation process, which improves the reliability of the test results.
The whole testing process is performed in a testing environment, which may be specifically set up in the manner shown in fig. 5.
Referring to fig. 5, the above test environment may be implemented based on the real application environment of the game application. As shown at 501, the hardware deployment of a typical real application environment of a game application includes a plurality of servers, such as an access server, a lobby server, a forwarding server, a matching server, a room server, and the like. In this case, the real application environment of the game application is used as the test environment to complete the test of the matching algorithm, so that no additional environment is required and the test can be performed using only the servers already deployed in the existing real application environment.
However, in order not to affect the game application process and the player experience, a test environment may instead be set up independently of the real game application environment. Such a test environment only needs to implement the basic logic of the game application and complete the test of the matching algorithm, so it is not required to fully implement all the business logic of the whole application. The environment deployment can thus be simplified: on the basis of the hardware architecture of the real application environment, unnecessary parts are cut out to obtain a simplified hardware architecture that serves as the test environment. As shown at 502, the access server is removed, the lobby server in 501 is replaced by the SSTOOL packet sending tool, and the room server is replaced by a mock server (Mock Svr), which saves cost and speeds up the test.
S204: and determining the performance of the matching algorithm to be tested according to the application log and the test log.
After the test log is obtained through user behavior playback, whether the performance of the matching algorithm of the current test meets the requirements or not can be determined according to the test log recorded in the test process and the application log recorded in the actual application.
The performance of the matching algorithm can be measured by matching quality and matching duration. In a specific implementation, the matching duration parameter in the test log and in the application log can be compared: the difference between the matching durations corresponding to the same matching request in the two logs is calculated, and this duration difference measures the performance of the matching algorithm under test. In addition to evaluation based on matching duration, the matching results of the same matching request reflected by the test log and the application log can be compared, and the matching quality of the matching algorithm under test is measured by comparing the two matching results. When the matching quality reflected by the test log is higher than that reflected by the application log, the matching algorithm under test is determined to perform well.
In some cases, the performance of the matching algorithm may be evaluated in stages, combining its performance in different time periods. For example, for target areas where the number of active users differs greatly across time periods, or target areas whose network environment is affected by the time of day, the differences between the matching durations in the application log and the test log can be compared for different application time periods, yielding the matching durations of the matching algorithm under test in each period, so that attention can be focused on the performance of the matching algorithm in the most important application time periods.
Of course, in practical application, the evaluations along the two dimensions of matching duration and matching quality can be combined: the matching algorithm under test is finally determined to perform well only if the evaluation results of both dimensions indicate good performance; otherwise, its performance is considered insufficient for deployment.
During specific implementation, a time length difference value between the matching time length in the application log and the matching time length in the test log can be determined and used as a matching time length change corresponding to the matching algorithm to be tested; determining the matching quality change corresponding to the matching algorithm to be tested according to the matching result in the application log and the matching result in the test log; and when the matching time length change and the matching quality change corresponding to the matching algorithm to be tested both meet preset conditions, determining that the performance of the matching algorithm to be tested is qualified.
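The two preset conditions described above can be sketched as follows; the log field names and the threshold are illustrative assumptions, not values given by the patent.

```python
def is_qualified(app_log, test_log, max_duration_increase_s=0.0):
    """Qualified when both the matching-duration change and the matching-quality
    change satisfy their preset conditions (fields and threshold illustrative)."""
    # Duration change: test-log duration minus application-log duration for the
    # same matching request; it must not exceed the allowed increase.
    duration_change = test_log["match_duration_s"] - app_log["match_duration_s"]
    # Quality change: a smaller average MMR gap between opponents means higher
    # matching quality, so quality improves when the gap shrinks.
    quality_change = app_log["avg_mmr_gap"] - test_log["avg_mmr_gap"]
    return duration_change <= max_duration_increase_s and quality_change >= 0

print(is_qualified({"match_duration_s": 10, "avg_mmr_gap": 40},
                   {"match_duration_s": 8, "avg_mmr_gap": 25}))  # True
```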
In addition, the application logs and test logs from different areas and different time periods can be used for a comprehensive evaluation of the matching algorithm. As shown in fig. 6, when evaluating the matching algorithm, data analysis is performed on multiple groups of test logs and application logs to obtain multiple sets of report data, and the performance of the matching algorithm is then comprehensively evaluated according to the report data. Finally, when the tested performance of the matching algorithm is unqualified, the matching algorithm is further optimized; when the test result is qualified, the qualified matching algorithm is released to the target area to update the matching algorithm, and the new matching algorithm replaces the original one for matching.
Therefore, the matching algorithm performance testing method provided by the embodiment of the present application converts the running data collected from the target area into user behavior data before testing the matching algorithm. Because the running data is generated when the target area actually operates, the user behavior data obtained by converting it is consistent with the actual behavior of users. Testing the matching algorithm with this user behavior data is therefore equivalent to configuring the matching algorithm directly in the target area for operation, and the test results are highly reliable.
In addition, when the performance of a new matching algorithm in multiple target areas needs to be tested, the running data of the different areas can be converted and tested, and the test results of the multiple areas can then be compared. As shown in fig. 7, when the matching performance of the matching algorithm in three areas, namely target area A, target area B and target area C, needs to be analyzed, the running data of the three areas may be collected and converted into the user application behavior data corresponding to each area for user behavior playback; the matching durations in the test log and the application log corresponding to each area are then compared, and finally the variation trends of the matching durations of the three areas are compared side by side, yielding the histogram shown in fig. 7. As can be seen from the data in fig. 7, the matching duration of the new matching algorithm in the three target areas A, B and C is longer than that of the original matching algorithm, that is, the performance of the new matching algorithm is not good enough, so it cannot yet be put into use.
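A cross-region comparison of this kind can be computed as below; the pairing of per-request durations and the field layout are illustrative assumptions.

```python
def average_duration_change(region_logs):
    """Average matching-duration change (test minus application) per target area.

    region_logs maps a region name to pairs of (application-log duration,
    test-log duration) for the same matching request; the schema is illustrative.
    """
    return {region: sum(test - app for app, test in pairs) / len(pairs)
            for region, pairs in region_logs.items()}

changes = average_duration_change({
    "A": [(10, 13), (10, 15)],   # durations in seconds
    "B": [(12, 14)],
})
print(changes)  # {'A': 4.0, 'B': 2.0}
```

A positive change for every region, as in fig. 7, indicates the new algorithm takes longer to match and is not yet fit for release.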
In some cases, it may be necessary to select the best matching algorithm for the target area from multiple matching algorithms. In this case, the multiple matching algorithms can be tested using the same user application behavior data. As shown in fig. 8, when the difference in matching performance of two matching algorithms to be tested in the target area needs to be compared, one set of user application behavior data may be used to perform user behavior playback for both matching algorithms under test and to collect the test logs.
Then, the distribution of the average MMR difference values in the test logs is analyzed and compared: the MMR differences of the matching algorithms are compared side by side to determine the performance differences of multiple matching algorithms in the same target area, from which the matching algorithm with the best performance in that target area can be determined.
For matching algorithm 1 under test in fig. 8, analyzing the distribution of the corresponding average MMR differences shows that the proportion of matching results is largest when the average MMR difference is 0 to 20 and decreases as the average MMR difference increases, which indicates that when matching is performed with matching algorithm 1 under test, the skill difference between the two sides in the resulting matches is small. Comparing the distributions of the original matching algorithm and matching algorithm 1 under test across the average MMR difference intervals, the MMR differences obtained with matching algorithm 1 under test are mainly distributed in the 0-40 interval, whereas the distribution of the original matching algorithm is relatively even. Compared with the original matching algorithm, matching with matching algorithm 1 under test yields matches in which the skill difference between the two sides is smaller, so the matching quality is better.
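An interval distribution like the one in fig. 8 can be reproduced with a simple binning step; the bin width of 20 and the input values are illustrative choices.

```python
from collections import Counter

def mmr_gap_distribution(gaps, bin_width=20):
    """Share of matches whose average MMR difference falls in each interval
    (0-20, 20-40, ...); bin width and inputs are illustrative."""
    bins = Counter(int(g // bin_width) * bin_width for g in gaps)
    total = len(gaps)
    return {f"{lo}-{lo + bin_width}": count / total
            for lo, count in sorted(bins.items())}

# Example average MMR gaps from one test log (made-up values)
dist = mmr_gap_distribution([5, 12, 18, 27, 44])
print(dist)  # {'0-20': 0.6, '20-40': 0.2, '40-60': 0.2}
```

A distribution concentrated in the low intervals, as for matching algorithm 1 above, indicates closer-skilled opponents and hence better matching quality.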
In order to facilitate understanding of the technical scheme of the present application, the matching algorithm performance testing method provided by the present application is introduced with reference to a specific scenario.
Referring to the scene schematic diagram of a matching algorithm performance testing method shown in fig. 9, the scene includes two main parts, namely an application environment and a test environment. The application environment is understood to be the game application environment, that is, the actual running environment of the game, which includes N terminals 910 configured with game clients and an application server 920. The test environment is the environment in which the performance of a new matching algorithm for the target area is tested; the test environment 950 may be a tailored environment as described above, including a packet sending tool SSTOOL 951, a forwarding server 952, a matching server 953 and a receiving server 954. In a specific implementation, the application environment and the test environment do not affect each other; it is only necessary to store the application running data generated in the application environment for later testing of the matching algorithm. Only when a certain matching algorithm needs to be tested is the application running data generated in the target area obtained and converted into user application behavior data, which is then used to play back user behavior in the test environment and so test the matching algorithm.
Specifically, in an application environment, when a user operates a game interface, the terminal 910 may send a matching request to the application server 920, and the application server 920 receives the matching request, and then matches the player and starts a game. In this process, various data generated by the application server 920 may be recorded in the database 930. When the performance test of the matching algorithm is needed, the conversion server 940 may first obtain the application running data of the target area from the database 930, and then convert the application running data into the user application behavior data and input the user application behavior data into the test environment 950.
In the test environment 950, the packet sending tool SSTOOL 951 may simulate user behavior according to the user application behavior data and send matching requests to the forwarding server 952 according to the actual situation of users in the target area. The forwarding server 952 forwards each matching request to the matching server 953, and the matching server 953 processes the matching request according to the pre-configured matching algorithm under test to obtain a matching result, which it returns to the forwarding server 952. Finally, the forwarding server 952 sends the received matching result to the receiving server 954, completing the simulated matching process, and the data generated during the test is recorded as the test log. The test results can be provided to research and development personnel as basic material, so that they can comprehensively analyze the test logs and the application logs to obtain the performance test results of the matching algorithm under test.
The foregoing provides some specific implementation manners of the matching algorithm performance testing method provided in the embodiment of the present application, and based on this, the present application also provides a corresponding apparatus. The above-mentioned device provided by the embodiments of the present application will be described in terms of functional modularity.
Referring to the schematic structural diagram of the matching algorithm performance testing apparatus shown in fig. 10, the apparatus 1000 includes:
an acquisition module 1010, configured to acquire application running data of a target area, where the application running data includes an application log generated in an application environment;
a data conversion module 1020, configured to convert the application running data into user application behavior data, where the user application behavior data is used to represent attribute data that affects the fighting performance of the user and corresponds to the user in the application operating process;
the test module 1030 is configured to, in a test environment configured with a matching algorithm to be tested, perform user behavior playback according to the user application behavior data, and collect a test log generated in the test environment;
and the performance evaluation module 1040 is configured to determine the performance of the matching algorithm to be tested according to the application log and the test log.
Optionally, referring to fig. 11, fig. 11 is a schematic structural diagram of a performance testing apparatus of a matching algorithm according to an embodiment of the present application, and on the basis of the structure shown in fig. 10, the data conversion module 1020 includes:
the first searching submodule 1021 is used for acquiring single-bureau settlement logs from the application running data, and searching matching success logs with the most matched time in the application running data according to the single-bureau settlement logs and the single-bureau duration;
the second searching sub-module 1022 is configured to search, according to the searched matching success log and the searched matching duration, a matching request log with the best matching time from the application running data;
and the data conversion sub-module 1023 is used for determining the personal attribute information of the user, the team attribute information of the team where the user is located and the network quality data corresponding to the matching request time according to the searched matching request log, and using the determined attribute information of the user, the team attribute information of the team where the user is located and the network quality data as the user application behavior data.
Optionally, referring to fig. 12, fig. 12 is a schematic structural diagram of a performance testing apparatus of a matching algorithm according to an embodiment of the present application, and on the basis of the structure shown in fig. 10, the testing module 1030 includes:
the simulation request submodule 1031 is configured to, in a test environment in which a matching algorithm to be tested is configured, simulate a user behavior according to the user behavior application data and send a matching request to a matching server in the test environment, and respond to the matching request through the matching server in the test environment, where the matching request carries the user behavior data;
and the test result obtaining sub-module 1032 is used for obtaining a first test log generated when the matching server responds to the matching request.
Optionally, referring to fig. 13, fig. 13 is a schematic structural diagram of a performance testing apparatus of a matching algorithm according to an embodiment of the present application, and on the basis of the structure shown in fig. 10, the performance evaluation module 1040 includes:
the first evaluation submodule is used for determining a time length difference value of the matching time length in the application log and the matching time length in the test log, and the time length difference value is used as the matching time length change corresponding to the matching algorithm to be tested;
the second evaluation submodule is used for determining the matching quality change corresponding to the matching algorithm to be tested according to the matching result in the application log and the matching result in the test log;
and the determining submodule is used for determining that the performance of the matching algorithm to be tested is qualified when the matching time length change and the matching quality change corresponding to the matching algorithm to be tested both meet preset conditions.
Optionally, referring to fig. 14, fig. 14 is a schematic structural diagram of a performance testing apparatus for a matching algorithm according to an embodiment of the present application, and on the basis of the structure shown in fig. 10, the apparatus 1000 further includes:
the test result summarizing module 1050 is configured to obtain a plurality of test logs, where different test logs are obtained by testing the same matching algorithm to be tested based on application running data of different target areas;
a first multidimensional performance comparison module 1060, configured to determine, according to the multiple test logs and the application log, a performance difference of the matching algorithm to be tested in different target areas.
Optionally, referring to fig. 15, fig. 15 is a schematic structural diagram of a performance testing apparatus for a matching algorithm according to an embodiment of the present application, and on the basis of the structure shown in fig. 10, the apparatus 1000 further includes:
the test result summarizing module 1070 is configured to obtain a plurality of test logs, where different test logs are obtained by testing different matching algorithms to be tested based on application running data in the same target area;
a second multi-dimensional performance comparison module 1080, configured to determine, according to the multiple test logs and the application log, a performance difference of different matching algorithms to be tested in the same target area.
Based on the specific implementation manner of the method and the device provided by the embodiment of the application, the application also provides equipment for realizing the performance test of the matching algorithm. The following describes the apparatus provided in the embodiments of the present application from the perspective of hardware implementation.
Referring to fig. 16, fig. 16 is a schematic structural diagram of a device provided in this embodiment of the present application, which may be a server, where the server 1600 may have relatively large differences due to different configurations or performances, and may include one or more Central Processing Units (CPUs) 1622 (e.g., one or more processors) and a memory 1632, and one or more storage media 1630 (e.g., one or more mass storage devices) for storing applications 1642 or data 1644. Memory 1632 and storage media 1630 may be transient or persistent storage, among others. The program stored on the storage medium 1630 may include one or more modules (not shown), each of which may include a sequence of instructions operating on a server. Further, central processing unit 1622 may be configured to communicate with storage medium 1630 to execute a series of instruction operations on storage medium 1630 at server 1600.
The server 1600 may also include one or more power supplies 1626, one or more wired or wireless network interfaces 1650, one or more input-output interfaces 1658, and/or one or more operating systems 1641, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 16.
The CPU 1622 is configured to execute the following steps:
acquiring application running data of a target area, wherein the application running data comprises an application log generated in an application environment;
converting the application running data into user application behavior data, wherein the user application behavior data is used for representing attribute data which affects the fighting performance of the user and corresponds to the user in the application operation process;
under a test environment configured with a matching algorithm to be tested, carrying out user behavior playback according to the user application behavior data, and collecting a test log generated under the test environment;
and determining the performance of the matching algorithm to be tested according to the test log and the application log.
Optionally, the CPU 1622 is further configured to execute the steps of any implementation manner of the matching algorithm performance testing method provided in the embodiment of the present application.
The embodiment of the present application further provides a matching algorithm performance testing system. Referring to the schematic structural diagram of the matching algorithm performance testing system shown in fig. 17, the system 1700 includes:
a data collection server 1710, configured to collect application running data of a target area, where the application running data includes an application log generated in an application environment;
for specific implementation of the data acquisition server 1710, reference may be made to the related description of step S201 in the above method embodiment, which is not described herein again;
the data conversion server 1720 is used for converting the application running data into user application behavior data, and the user application behavior data are used for representing attribute data which affect the fighting performance of the user and correspond to the user in the application operation process;
for specific implementation of the data conversion server 1720, reference may be made to the description related to step S202 in the above method embodiment, which is not described herein again;
the matching request initiating server 1730 in the test environment is configured to send a matching request to the matching server in the test environment based on the user application behavior data simulation user behavior;
for specific implementation of the matching request initiation server 1730, refer to the description related to step S203 in the above method embodiment, which is not described herein again;
the matching server 1740 is configured to respond to the matching request through a pre-configured matching algorithm to be tested and send generated log data serving as a test log to the performance evaluation device;
for specific implementation of the matching server 1740, refer to the related description of step S203 in the above method embodiment, which is not described herein again;
the performance evaluation device 1750 is configured to determine, according to the application log and the test log, performance of the matching algorithm to be tested.
For specific implementation of the performance evaluation device 1750, reference may be made to the related description of step S204 in the above method embodiment, and details are not described here again.

The embodiment of the present application further provides a computer-readable storage medium, configured to store program code, where the program code is configured to execute any one implementation of the matching algorithm performance testing method described in the foregoing embodiments.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (15)

1. A performance testing method for a matching algorithm is characterized by comprising the following steps:
acquiring application running data of a target area, wherein the application running data comprises an application log generated in an application environment;
converting the application running data into user application behavior data, wherein the user application behavior data is used for representing attribute data which affects the fighting performance of the user and corresponds to the user in the application operation process;
under a test environment configured with a matching algorithm to be tested, carrying out user behavior playback according to the user application behavior data, and collecting a test log generated under the test environment;
and determining the performance of the matching algorithm to be tested according to the test log and the application log.
2. The method of claim 1, wherein converting the application execution data into user application behavior data comprises:
acquiring a single-round settlement log from the application running data, and searching the application running data, according to the single-round settlement log and the round duration, for the matching success log whose time matches best;
searching the application running data, according to the found matching success log and the matching duration, for the matching request log whose time matches best;
and determining, according to the found matching request log, the user's personal attribute information, the team attribute information of the team the user belongs to, and the network quality data recorded in the network quality log corresponding to the matching request time, and taking the determined information as the user application behavior data.
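As an illustrative sketch (not part of the claims) of the log correlation described in claim 2: the round began at roughly the settlement time minus the round duration, so the best-matching success log is the one whose timestamp lies closest to that instant; the matching request in turn preceded the success by the matching duration. All timestamps and field names below are hypothetical.

```python
def closest_log(logs, target_time):
    # Return the log entry whose timestamp is closest to target_time.
    return min(logs, key=lambda log: abs(log["time"] - target_time))

def correlate(settlement, success_logs, request_logs):
    # The round started roughly at settlement time minus round duration,
    # so the best-matching success log is the one closest to that instant.
    start_time = settlement["time"] - settlement["round_duration"]
    success = closest_log(success_logs, start_time)
    # The matching request preceded the success by the matching duration.
    request_time = success["time"] - success["match_duration"]
    return closest_log(request_logs, request_time)
```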
3. The method of claim 1, wherein the performing, in a test environment configured with a matching algorithm to be tested, user behavior playback according to the user application behavior data and collecting a test log generated in the test environment comprises:
under a test environment configured with a matching algorithm to be tested, simulating user behavior according to the user application behavior data and sending a matching request to a matching server under the test environment, wherein the matching request carries the user application behavior data;
and acquiring a test log generated in the process that the matching server responds to the matching request.
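The replay step of claim 3 can be sketched as follows, again purely as an illustration and not as the claimed implementation. The in-process `FakeMatchServer` is a hypothetical stand-in for the matching server in the test environment, which in practice would receive requests over the network.

```python
class FakeMatchServer:
    # Hypothetical stand-in for the matching server in the test environment;
    # a real deployment would receive matching requests over the network.
    def __init__(self, algorithm):
        self.algorithm = algorithm  # the matching algorithm under test
        self.test_log = []

    def handle(self, request):
        result = self.algorithm(request)
        # The server's record of each handled request becomes the test log.
        self.test_log.append({"request": request, "result": result})
        return result

def replay(behavior_data, server):
    # Simulate each user by replaying their recorded behavior as a
    # matching request carrying the user application behavior data.
    for record in behavior_data:
        server.handle(record)
    return server.test_log
```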
4. The method of claim 1, wherein determining the performance of the matching algorithm to be tested from the application log and the test log comprises:
determining the difference between the matching duration in the application log and the matching duration in the test log as the matching duration change corresponding to the matching algorithm to be tested;
determining the matching quality change corresponding to the matching algorithm to be tested according to the matching result in the application log and the matching result in the test log;
and when both the matching duration change and the matching quality change corresponding to the matching algorithm to be tested meet preset conditions, determining that the performance of the matching algorithm to be tested is qualified.
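An illustrative sketch (not the claimed evaluation) of the qualification check in claim 4: the algorithm passes when the matching duration does not grow beyond a threshold and the matching quality does not drop beyond another. The metric names and default thresholds below are hypothetical.

```python
def algorithm_qualified(app_log, test_log,
                        max_duration_increase=5.0, max_quality_drop=0.05):
    # Duration change: difference between the matching durations recorded
    # in the test log and in the application log (hypothetical fields).
    duration_change = test_log["avg_match_seconds"] - app_log["avg_match_seconds"]
    # Quality change: drop in a hypothetical match-quality score, e.g. one
    # derived from the skill gap between matched teams.
    quality_change = app_log["match_quality"] - test_log["match_quality"]
    # Qualified only when both preset conditions are met.
    return (duration_change <= max_duration_increase
            and quality_change <= max_quality_drop)
```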
5. The method of claim 1, further comprising:
obtaining a plurality of test logs, wherein different test logs are obtained by respectively testing the same matching algorithm to be tested based on application running data of different target areas;
and determining the performance difference of the matching algorithm to be tested on different target areas according to the plurality of test logs and the application logs.
6. The method of claim 1, further comprising:
obtaining a plurality of test logs, wherein different test logs are obtained by respectively testing different matching algorithms to be tested based on application running data of the same target area;
and determining the performance difference of different matching algorithms to be tested on the same target area according to the plurality of test logs and the application log.
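The comparisons in claims 5 and 6 (one algorithm across several target areas, or several algorithms on one target area) reduce to scoring each test log against the same application log. A minimal sketch, with hypothetical field names and using the matching duration as the only metric:

```python
def compare_runs(app_log, test_logs):
    # Summarise how each test run's matching duration differs from the
    # live application log; test_logs maps a run label (e.g. a target area
    # or an algorithm name) to that run's test log.
    return {
        label: log["avg_match_seconds"] - app_log["avg_match_seconds"]
        for label, log in test_logs.items()
    }
```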
7. The method of claim 1, further comprising:
and when the matching algorithm to be tested is qualified by the test, releasing the matching algorithm to be tested to the target area to update the matching algorithm there.
8. A matching algorithm performance testing apparatus, the apparatus comprising:
an acquisition module, configured to acquire application running data of a target area, wherein the application running data comprises an application log generated in an application environment;
a data conversion module, configured to convert the application running data into user application behavior data, wherein the user application behavior data represents attribute data, corresponding to the user during application operation, that affects the user's battle performance;
a testing module, configured to perform user behavior playback according to the user application behavior data in a test environment configured with a matching algorithm to be tested, and to collect a test log generated in the test environment;
and a performance evaluation module, configured to determine the performance of the matching algorithm to be tested according to the application log and the test log.
9. The apparatus of claim 8, wherein the data conversion module comprises:
a first searching submodule, configured to acquire a single-round settlement log from the application running data and to search the application running data, according to the single-round settlement log and the round duration, for the matching success log whose time matches best;
a second searching submodule, configured to search the application running data, according to the found matching success log and the matching duration, for the matching request log whose time matches best;
and a data conversion submodule, configured to determine, according to the found matching request log, the user's personal attribute information, the team attribute information of the team the user belongs to, and the network quality data corresponding to the matching request time, and to take the determined information as the user application behavior data.
10. The apparatus of claim 8, wherein the testing module comprises:
a simulation request submodule, configured to simulate user behavior according to the user application behavior data in the test environment configured with the matching algorithm to be tested, and to send a matching request, carrying the user application behavior data, to a matching server in the test environment, the matching request being responded to by the matching server in the test environment;
and a test result acquisition submodule, configured to acquire the test log generated while the matching server responds to the matching request.
11. The apparatus of claim 8, wherein the performance evaluation module comprises:
a first evaluation submodule, configured to determine the difference between the matching duration in the application log and the matching duration in the test log as the matching duration change corresponding to the matching algorithm to be tested;
a second evaluation submodule, configured to determine the matching quality change corresponding to the matching algorithm to be tested according to the matching result in the application log and the matching result in the test log;
and a determining submodule, configured to determine that the performance of the matching algorithm to be tested is qualified when both the matching duration change and the matching quality change corresponding to the matching algorithm to be tested meet preset conditions.
12. The apparatus of claim 8, further comprising:
a test result summarizing module, configured to acquire a plurality of test logs, wherein different test logs are obtained by respectively testing the same matching algorithm to be tested based on application running data of different target areas;
and a first multi-dimensional performance comparison module, configured to determine the performance difference of the matching algorithm to be tested over different target areas according to the plurality of test logs and the application logs.
13. The apparatus of claim 8, further comprising:
a test result summarizing module, configured to acquire a plurality of test logs, wherein different test logs are obtained by respectively testing different matching algorithms to be tested based on application running data of the same target area;
and a second multi-dimensional performance comparison module, configured to determine the performance difference of the different matching algorithms to be tested over the same target area according to the plurality of test logs and the application log.
14. A matching algorithm performance testing system, the system comprising:
a data acquisition server, configured to acquire application running data of a target area, wherein the application running data comprises an application log generated in an application environment;
a data conversion server, configured to convert the application running data into user application behavior data, wherein the user application behavior data represents attribute data, corresponding to the user during application operation, that affects the user's battle performance;
a matching request initiating server in a test environment, configured to simulate user behavior based on the user application behavior data and to send a matching request to a matching server in the test environment;
the matching server, configured to respond to the matching request through a pre-configured matching algorithm to be tested and to send the generated log data, as a test log, to a performance evaluation device;
and the performance evaluation device, configured to determine the performance of the matching algorithm to be tested according to the application log and the test log.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium is used to store a computer program for performing the method of any of claims 1 to 7.
CN201911070954.9A 2019-11-05 2019-11-05 Matching algorithm performance test method, device, equipment, system and medium Active CN110825636B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911070954.9A CN110825636B (en) 2019-11-05 2019-11-05 Matching algorithm performance test method, device, equipment, system and medium


Publications (2)

Publication Number Publication Date
CN110825636A true CN110825636A (en) 2020-02-21
CN110825636B CN110825636B (en) 2021-03-30

Family

ID=69552580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911070954.9A Active CN110825636B (en) 2019-11-05 2019-11-05 Matching algorithm performance test method, device, equipment, system and medium

Country Status (1)

Country Link
CN (1) CN110825636B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111552634A (en) * 2020-03-30 2020-08-18 深圳壹账通智能科技有限公司 Method and device for testing front-end system and storage medium
CN112738632A (en) * 2020-12-28 2021-04-30 深圳创维-Rgb电子有限公司 Method, device and equipment for optimizing performance of smart television and storage medium
CN113750540A (en) * 2021-09-17 2021-12-07 腾讯科技(成都)有限公司 Game matching method, device, storage medium and computer program product
CN114048087A (en) * 2021-11-10 2022-02-15 腾讯科技(深圳)有限公司 Method and device for testing data transfer performance of equipment
CN114328166A (en) * 2020-09-30 2022-04-12 阿里巴巴集团控股有限公司 AB test algorithm performance information acquisition method and device and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101001183A (en) * 2007-01-10 2007-07-18 网之易信息技术(北京)有限公司 Test method and system for network application software
CN104965784A (en) * 2015-06-16 2015-10-07 广州华多网络科技有限公司 Automatic test method and apparatus
CN105553821A (en) * 2015-12-14 2016-05-04 网易(杭州)网络有限公司 Game battle matching method and device
CN108388508A (en) * 2018-01-29 2018-08-10 华南理工大学 A kind of test cases selection method based on user conversation and hierarchical clustering algorithm
CN109173244A (en) * 2018-08-20 2019-01-11 贵阳动视云科技有限公司 Game running method and device
CN109271325A (en) * 2018-10-26 2019-01-25 携程旅游网络技术(上海)有限公司 Test method, system, electronic equipment and the storage medium of application
CN109885477A (en) * 2018-12-24 2019-06-14 苏州蜗牛数字科技股份有限公司 A kind of game automated testing method
US20190227917A1 (en) * 2018-01-19 2019-07-25 Spirent Communications, Inc. Adaptive system for mobile device testing
CN110321270A (en) * 2018-03-29 2019-10-11 广东神马搜索科技有限公司 Single machine performance test methods, device and server


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
梁力图等: "基于用户会话的Web应用性能测试方法的研究", 《计算机科学》 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111552634A (en) * 2020-03-30 2020-08-18 深圳壹账通智能科技有限公司 Method and device for testing front-end system and storage medium
CN114328166A (en) * 2020-09-30 2022-04-12 阿里巴巴集团控股有限公司 AB test algorithm performance information acquisition method and device and storage medium
CN112738632A (en) * 2020-12-28 2021-04-30 深圳创维-Rgb电子有限公司 Method, device and equipment for optimizing performance of smart television and storage medium
CN113750540A (en) * 2021-09-17 2021-12-07 腾讯科技(成都)有限公司 Game matching method, device, storage medium and computer program product
CN113750540B (en) * 2021-09-17 2023-08-18 腾讯科技(成都)有限公司 Game matching method, game matching device, storage medium and computer program product
CN114048087A (en) * 2021-11-10 2022-02-15 腾讯科技(深圳)有限公司 Method and device for testing data transfer performance of equipment

Also Published As

Publication number Publication date
CN110825636B (en) 2021-03-30

Similar Documents

Publication Publication Date Title
CN110825636B (en) Matching algorithm performance test method, device, equipment, system and medium
US11896905B2 (en) Methods and systems for continuing to execute a simulation after processing resources go offline
CN103838982B (en) Virtual game object generating method and device
CN109513215B (en) Object matching method, model training method and server
Wattimena et al. Predicting the perceived quality of a first person shooter: the Quake IV G-model
CN108553903B (en) Method and device for controlling robot player
CN104965695B (en) The method and apparatus of analog subscriber real-time operation
CN110339575B (en) Method and device for determining cheating users in online game
WO2010030313A1 (en) Metrics-based gaming operations
CN107335220B (en) Negative user identification method and device and server
CN111274151B (en) Game testing method, related device and storage medium
CN110585709A (en) Skill attribute adjusting method and device for virtual role
CN113318448B (en) Game resource display method and device, equipment and model training method
CN111984544B (en) Device performance test method and device, electronic device and storage medium
CN106373014A (en) Method and apparatus for assessing health degree of application
US9630108B2 (en) Method and apparatus for testing game data
CN111701240B (en) Virtual article prompting method and device, storage medium and electronic device
Wong et al. Gamers Private Network Performance Forecasting. From Raw Data to the Data Warehouse with Machine Learning and Neural Nets
CN108771869B (en) Performance test method and device, storage medium and electronic device
Pittman et al. Characterizing virtual populations in massively multiplayer online role-playing games
US20090279443A1 (en) Analyzing system of network traffic according to variable communication's mass and analyzing method thereof
Wu et al. Traffic modeling for massive multiplayer on-line role playing game (MMORPG) in GPRS access network
CN113457158B (en) Virtual character configuration method and device, storage medium and electronic device
Yang et al. Looking into online gaming from measurement perspective
CN113760518A (en) Information processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022561

Country of ref document: HK

GR01 Patent grant