CN117762760B - Method, device and medium for obtaining hardware performance test score of server


Publication number
CN117762760B
CN117762760B (application CN202311811485.8A)
Authority
CN
China
Prior art keywords
test
server
case
tested
score
Prior art date
Legal status
Active
Application number
CN202311811485.8A
Other languages
Chinese (zh)
Other versions
CN117762760A (en)
Inventor
Hua Lei
Wu Di
Wen Tao
Cui Ji
Current Assignee
Jiangsu Boyun Technology Co ltd
Original Assignee
Jiangsu Boyun Technology Co ltd
Application filed by Jiangsu Boyun Technology Co ltd
Priority to CN202311811485.8A
Publication of CN117762760A
Application granted
Publication of CN117762760B


Abstract

The application relates to a method, a device and a medium for obtaining a hardware performance test score of a server, belonging to the technical field of computers. The method comprises the following steps: responding to a test request for at least one server to be tested; acquiring a case set of the server to be tested according to the test request, wherein the case set comprises at least one hardware test case; invoking a task template corresponding to each test case, wherein the task template comprises a test tool and an execution script; and executing the execution script with the test tool in the task template to test the test case corresponding to the task template, until all the test cases in the case set are tested, thereby obtaining the performance test scores of all the test cases in the case set. The method realizes automated testing of servers, solves the technical problem of low efficiency in existing server testing, and, by producing performance test scores, reduces the difficulty of evaluating server performance across different manufacturers and platforms.

Description

Method, device and medium for obtaining hardware performance test score of server
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, and a medium for obtaining a hardware performance test score of a server.
Background
With the development of new technologies such as cloud computing and big data, the performance requirements on the bandwidth and capacity of a server's processor, memory, and data storage are increasingly high, and efficiently evaluating each basic performance indicator of a server is of great significance to both server manufacturers and server users.
Conventional server performance testing requires building a complex compilation environment on the server and manually executing scripts and commands on it; if a large number of servers need to be tested in batches, configuring the scripts takes a great deal of time and effort. Moreover, the testing standards for servers produced by different manufacturers differ, which undoubtedly increases the difficulty for customers of evaluating server performance.
Therefore, how to test server performance quickly and efficiently is a problem that needs to be solved by those skilled in the art.
Disclosure of Invention
The application aims to provide a method for acquiring the hardware performance test score of a server, which can simply and rapidly obtain the performance test result of a server to be tested, improve test efficiency, and reduce the difficulty of evaluating the performance of servers from different manufacturers.
To achieve the above objective, an embodiment of the present application provides a method for obtaining a hardware performance test score of a server, applied to a control platform and comprising the following steps:
responding to a test request of at least one server to be tested, wherein each server to be tested comprises at least one piece of hardware;
acquiring a case set of the server to be tested according to the test request, wherein the case set comprises test cases of the at least one hardware;
invoking a task template corresponding to the test case, wherein the task template comprises a test tool and an execution script;
and executing the execution script with the test tool in the task template to test the test cases corresponding to the task template until all the test cases in the case set are tested, and obtaining the performance test scores of all the test cases in the case set.
Optionally, the case set is provided with a benchmark score;
Each test case in the case set has a weight ratio within the case set, and the sum of the weight ratios of all the test cases in the case set is 1.
Optionally, the method for acquiring the weight ratio of a test case comprises the following steps:
acquiring a performance test result Y_i and a corresponding feature X_i of the test case;
calculating the weight ratio W_i of the test case from Y_i and X_i, where the calculation is parameterized by a coefficient α.
Optionally, the method for acquiring the coefficient α comprises the following steps:
constructing the following loss function L(α):
L(α) = f(α) + λ·α², where f(α) is a data-fitting loss term, λ·α² is a regularization term, and λ is a regularization parameter;
minimizing the loss function to update the value of α according to the following gradient descent formula:
α ← α − η·∂L(α)/∂α, where η is the learning rate;
iteratively updating α until the loss function converges or a specified number of iterations is reached.
Optionally, in the step of executing the execution script with the test tool in the task template to test the test cases corresponding to the task template until all the test cases in the case set are tested, the performance test score of the corresponding test case is output each time a test case finishes testing, so that the performance test scores of all the test cases in the case set are obtained once every test case in the case set has been tested.
Optionally, the method for outputting the performance test score of the corresponding test case each time a test case finishes testing comprises the following steps:
obtaining the benchmark score J of the case set corresponding to the test case, the weight ratio Q of the test case within the case set, and the test result produced by the test tool corresponding to the test case;
standardizing the test result to obtain a standard test result C;
calculating the corresponding performance test score S of the test case using the following formula: S = J × Q × C / 100.
Optionally, the test cases include a disk read-write performance test case;
when the test case is a disk read-write performance test case, standardizing the test result to obtain the standard test result C comprises the following steps:
obtaining a disk read-write standard B;
calculating the standard test result C according to the following formula:
C = (T / B) × 100, where T is the test result produced by the test tool corresponding to the disk read-write performance test case.
In particular, the invention also provides a server screening method, which comprises the following steps:
responding to a screening request for screening out a target server that meets target requirements, and transmitting test requests for testing a plurality of servers to be tested;
obtaining the performance test scores of all the test cases of each server to be tested based on the above hardware performance test method;
and comparing the performance test scores of all the test cases of all the servers to be tested with the target test scores contained in the target requirements, thereby determining the server closest to the target requirements and taking it as the target server.
In particular, the present invention also provides a server performance test device, including:
A memory for storing a computer program;
And a processor for implementing the steps of the method for acquiring the hardware performance test score of the server or the steps of the method for screening the server when the computer program is executed.
The application has the beneficial effects that: after responding to a test request for at least one server to be tested, the method for obtaining the hardware performance test score of a server acquires a case set according to the test request, wherein the case set comprises at least one hardware test case; a task template corresponding to each test case is called, the task template comprising a test tool and an execution script; and the test tool in the task template executes the execution script to test the test case of the task template until all the test cases in the case set are completed, thereby obtaining the performance test scores of all the test cases in the case set. In this scheme, the test task is issued to at least one server to be tested in response to the test request, so that deployment of the test tool and the test script is completed without manually executing scripts on the server, thereby realizing automated testing of the server, effectively improving server performance test efficiency, and saving labor cost.
Obtaining the performance test scores of all the test cases provides a reliable basis for the test results and makes the results more direct; it simplifies server performance testing and comparison, reduces the difficulty for different manufacturers and platforms of evaluating server performance, greatly reduces the difficulty of server selection, and well meets customers' selection needs when purchasing high-performance servers.
The foregoing description is only an overview of the technical solution of the present invention; preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a flow chart of a method for obtaining a hardware performance test score of a server according to an embodiment of the present invention;
Fig. 2 is a flow chart of a server screening method according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a server performance testing apparatus according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a server screening device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solution of the present invention better understood by those skilled in the art, the technical solution of the present invention will be clearly and completely described in the following description of the embodiments of the present invention with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
Those skilled in the art will appreciate that embodiments of the application may be implemented as a method, an apparatus, or a computer program product. Accordingly, the present disclosure may take the form of: entirely hardware, entirely software (including firmware, resident software, microcode, etc.), or a combination of hardware and software.
In the application, a tester performs related operations on the control platform to control the server to be tested to execute the test task. Schematically, the tester sets the test cases required for testing and the corresponding task templates on the control platform, binds the related test cases to case sets, and sets a case set for each server to be tested, thereby setting the test tasks. The server to be tested may be a single server, a server cluster, or a cloud server; the specific composition is not limited by the present application. The control platform may be a computer, or a system or software installed in a computer; the present application is not limited in this respect.
Example 1
As shown in fig. 1, the present embodiment provides a method for obtaining a hardware performance test score of a server, applied to a control platform and comprising the following steps:
Step S101: responding to a test request for at least one server to be tested, wherein each server to be tested comprises at least one piece of hardware;
Step S102: acquiring a case set of the server to be tested according to the test request, wherein the case set comprises at least one hardware test case;
Step S103: invoking a task template corresponding to the test case, wherein the task template comprises a test tool and an execution script;
Step S104: executing the execution script with the test tool in the task template to test the test cases corresponding to the task template until all the test cases in the case set are tested, and obtaining the performance test scores of all the test cases in the case set.
Before step S101, the performance testing method further includes establishing a connection with at least one server to be tested. Specifically, the control platform acquires identification information of the server to be tested and connects to the server according to the identification information. Before connecting to the servers, the tester enters the identification information of each server into the computer where the control platform is located. The identification information may be a serial number, a host name, an IP address, or the like. It can be understood that the control platform can respond to a test request for a server to be tested only after the connection between the control platform and that server has been successfully established.
In step S101, the control platform responds to a test request for at least one server to be tested. The test request carries the identification information of the server to be tested. A tester performs the relevant test operation on the control platform to generate test requests, each of which corresponds to one test task; the control platform responds to the test request and determines the server to be tested selected by the tester according to the identification information carried in the request, so as to test at least one piece of hardware in the server. The at least one piece of hardware may be a CPU, a disk, or the like.
In an exemplary embodiment, after receiving the test request, the server to be tested may also return a success message to the control platform to indicate that the test task was issued successfully. Specifically, the server under test may return the message "success" to the control platform.
In step S102, the case set of the server to be tested is acquired according to the test request, and a test task is set for the server. One test request corresponds to one case set, and one test task belongs to only one case set. The test data carried by the test request comprise the identification information of the server to be tested and the name of the corresponding case set. The control platform acquires the identification information of the server carried in the test request and the test cases included in the corresponding case set, and sets a test task for each server to be tested accordingly. In an exemplary embodiment, the data carried in the test request are: { "id": "123", "testName": "test-cpu" }, where the host name of the server is "123" and the corresponding case set name is "test-cpu", indicating that the test task the server with host name 123 needs to perform is: testing the test cases in the case set named test-cpu. Of course, it should be noted that the above test request is merely an exemplary embodiment; in a specific implementation the test request may be any other message body, and the embodiments of the present application are not limited in this respect.
Specifically, the case set includes test cases of at least one piece of hardware; when setting up a case set on the control platform, the tester needs to add the relevant test cases according to the test requirements.
In step S103, the task template corresponding to the test case is called as follows: the control platform acquires the name of the preset task template, extracts the name as a keyword, and calls an interface to query the task template corresponding to the test case.
In an exemplary embodiment, when designing the query interface, interaction between the task template and the test case may be implemented through a RESTful API, using Python as the development language and Flask as the main framework. Of course, it should be noted that the above method is merely an exemplary embodiment; any other method may be used to implement the query interface, and the embodiments of the present application are not limited in this respect.
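In an exemplary, non-limiting sketch of such a query interface using the Python/Flask stack named above, the route, field names, and in-memory template store are illustrative assumptions rather than the patent's actual implementation:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Assumed in-memory mapping from test-case name to its task template.
TASK_TEMPLATES = {
    "cpu-benchmark": {"tool": "UnixBench", "script": "run_unixbench.sh"},
    "disk-rw": {"tool": "fio", "script": "run_fio.sh"},
}

@app.route("/task-template", methods=["GET"])
def query_task_template():
    # The control platform extracts the preset template name as a keyword
    # and queries the corresponding template through this interface.
    case_name = request.args.get("case")
    template = TASK_TEMPLATES.get(case_name)
    if template is None:
        return jsonify({"error": f"no template for case '{case_name}'"}), 404
    return jsonify(template)

if __name__ == "__main__":
    app.run(port=5000)
```

A GET request such as /task-template?case=disk-rw would then return the fio template as JSON.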
Specifically, the task template comprises a test tool and an execution script, stored respectively on a file distribution node and a script execution node in the task template. The file distribution node and the script execution node are connected to the server to be tested through a network. The test tool may be, for example, the UnixBench test tool or the fio test tool.
After the task template corresponding to the test case is successfully called, the control platform automatically generates an application installation instruction. In response to the installation instruction, the control platform acquires the installation package of the test tool in the task template and the node corresponding to the execution script, the test tool being used to execute the execution script. The control platform uploads the content of the node to the server to be tested and instructs the server to install the installation package from the file distribution node, so that the server has the test tool needed to run the execution script. In this way, the control platform uploads the test tool and the execution script uniformly, and no tester needs to manually configure the test environment on the server, which effectively improves testing efficiency.
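The patent does not specify the transport between the file distribution node, the script execution node, and the server to be tested; the following sketch assumes an SSH/SFTP channel via the paramiko library purely as an illustration, with hypothetical hosts, credentials, and paths:

```python
import os
import paramiko

def deploy_and_run(host, user, password, installer, script):
    """Upload the test-tool installer and execution script, install, run."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        sftp = client.open_sftp()
        remote_installer = "/tmp/" + os.path.basename(installer)
        remote_script = "/tmp/" + os.path.basename(script)
        sftp.put(installer, remote_installer)  # distribute the test tool
        sftp.put(script, remote_script)        # distribute the execution script
        sftp.close()
        _, out, _ = client.exec_command(f"sh {remote_installer}")
        out.channel.recv_exit_status()         # wait for the install to finish
        _, out, _ = client.exec_command(f"sh {remote_script}")
        return out.read().decode()             # raw test result for scoring
    finally:
        client.close()
```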
In the above embodiment, the control platform may send the installation package of the test tool to all servers. In other embodiments, after the servers under test are determined, the control platform may receive a subset of servers selected by the tester to perform the test and send the installation package only to that subset.
In step S104, the execution script is executed by using the test tool in the task template to test the test cases corresponding to the task template until all the test cases in the case set are tested, so as to obtain the performance test scores of all the test cases in the case set.
Further, each time the server to be tested finishes testing one test case, the performance test score of that test case is output on the control platform, until the performance test scores of all the test cases are obtained.
In this embodiment, no script needs to be manually configured on the server to be tested during the whole testing process; the control platform carries out the test steps on its own after responding to the test request, realizing automated testing of server performance, greatly saving labor cost, and effectively improving test efficiency. Obtaining the performance test scores of all the test cases in the case set unifies and standardizes the test results of different hardware into specific scores, which makes the results more direct, simplifies server performance testing and comparison, and reduces the difficulty for customers of evaluating the performance of servers from different manufacturers.
Based on the same inventive concept, this embodiment also provides a server screening method, as shown in fig. 2, comprising the following steps:
Step S201: responding to a screening request for screening out a target server that meets target requirements, and issuing test requests for testing a plurality of servers to be tested;
Step S202: obtaining the performance test scores of all the test cases of each server to be tested based on the above method for obtaining a hardware performance test score of a server;
Step S203: comparing the performance test scores of the test cases of all the servers to be tested with the target test scores contained in the target requirements, thereby determining the server closest to the target requirements and taking it as the target server.
In a specific implementation, the control platform issues test tasks to a plurality of servers to be tested, and while the test scores of all the test cases are being obtained, the hardware performance tests of all the servers run in parallel. Based on the intuitive score representation, it is determined whether a server to be tested meets the preset performance requirements, so the customer can efficiently complete selection testing of the target server.
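As a non-limiting sketch of the comparison step, "closest to the target requirements" is assumed here to mean the smallest total absolute difference between per-case scores and the target scores; the distance metric is an assumption, since the text does not define one:

```python
def pick_target_server(server_scores, target_scores):
    """Return the host whose scores are closest to the target scores."""
    def distance(scores):
        # Sum of absolute per-case differences (assumed metric).
        return sum(abs(scores.get(case, 0.0) - target)
                   for case, target in target_scores.items())
    return min(server_scores, key=lambda host: distance(server_scores[host]))

# Example: srv-a (distance 2) beats srv-b (distance 15).
servers = {
    "srv-a": {"cpu": 48.0, "disk": 30.0},
    "srv-b": {"cpu": 55.0, "disk": 20.0},
}
print(pick_target_server(servers, {"cpu": 50.0, "disk": 30.0}))  # -> srv-a
```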
Example two
Based on the method for obtaining a hardware performance test score of a server of the first embodiment, obtaining the performance test scores of all the test cases in the case set comprises the following steps:
Step S201: obtaining the benchmark score J of the case set corresponding to a test case, the weight ratio Q of the test case within the case set, and the test result produced by the test tool corresponding to the test case;
Step S202: standardizing the test result to obtain a standard test result C;
Step S203: calculating the corresponding performance test score S of the test case using the following formula: S = J × Q × C / 100.
Before step S201, when creating a case set on the control platform according to the performance requirements preset by the customer, the tester sets the benchmark score and the weight ratio corresponding to each test case, keeping the sum of the weight ratios of all the test cases in the case set equal to 1. The test cases comprise a CPU performance test case and a disk read-write performance test case. The benchmark score is set according to actual needs and is determined once those needs are determined.
The method for acquiring the weight ratio comprises the following steps:
Step one: obtaining a performance test result Y_i and a corresponding feature X_i of the test case, where i denotes the i-th test case;
Step two: calculating the weight ratio W_i of the test case from Y_i and X_i, where the calculation is parameterized by a coefficient α.
In step one, the performance test result Y_i and the corresponding feature X_i are both test data of the test case. The feature X_i represents the features of the i-th test case; a feature may be, for example, the load or usage of the CPU, the disk read-write speed, or the like.
In step two, the value of α is critical: a proper value of α enables W_i to reflect the influence of Y_i relative to X_i. Determining the optimal α ensures that the resulting weights accurately reflect the importance relationships between the test cases.
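Since the exact weight formula is not reproduced above, the following sketch assumes one plausible normalized form — weights proportional to (Y_i / X_i)^α — chosen only so that the weights sum to 1 as required:

```python
def case_weights(results, features, alpha):
    """Weight ratios W_i, assumed proportional to (Y_i / X_i) ** alpha."""
    raw = [(y / x) ** alpha for y, x in zip(results, features)]
    total = sum(raw)
    return [r / total for r in raw]  # normalized so the weights sum to 1

# Example: two test cases; the weights always sum to 1 regardless of alpha.
print(case_weights([80.0, 300.0], [1.0, 400.0], alpha=0.5))
```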
The coefficient α may be determined as follows:
Step three: constructing the following loss function L(α):
L(α) = f(α) + λ·α², where f(α) is a data-fitting loss term, λ·α² is a regularization term, and λ is a regularization parameter;
Step four: minimizing the loss function to update the value of α according to the following gradient descent formula:
α ← α − η·∂L(α)/∂α, where η is the learning rate;
iteratively updating α until the loss function converges or a specified number of iterations is reached.
In step three above, f(α) may be, for example, a mean squared error or a cross-entropy loss. The value of λ may be tried between 0.001 and 10, and the best value selected by cross-validation or similar methods. In step four, ∂L(α)/∂α is the descent gradient.
By acquiring the coefficient α according to steps three and four, the optimal α can be determined, ensuring that the resulting weights accurately reflect the importance relationships between the test cases. In step three, the regularization term λ·α² helps control the complexity of the model; by penalizing the parameter, it prevents over-fitting caused by excessive dependence on the training data. Adjusting the magnitude of λ balances data fitting against model complexity, yielding a model with better generalization ability. Because the regularization term penalizes the size of the parameter, the model stays simpler and generalizes better, performing well on unseen data. The gradient descent algorithm helps find the minimum of the loss function and the optimal balance point between model performance and complexity, so that the model fits the data well without becoming overly complex.
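A minimal gradient-descent sketch for α follows, assuming a mean squared error for f(α) (one of the options named above) and a numerical gradient; λ, η, and the synthetic fitting target are illustrative:

```python
def fit_alpha(f, lam=0.01, eta=0.05, max_iters=1000, tol=1e-8):
    """Minimize L(alpha) = f(alpha) + lam * alpha**2 by gradient descent."""
    loss = lambda a: f(a) + lam * a ** 2
    alpha, h = 0.0, 1e-6
    for _ in range(max_iters):
        # Central-difference numerical gradient of the loss at alpha.
        grad = (loss(alpha + h) - loss(alpha - h)) / (2 * h)
        new_alpha = alpha - eta * grad  # alpha <- alpha - eta * dL/dalpha
        if abs(new_alpha - alpha) < tol:  # stop once the update converges
            return new_alpha
        alpha = new_alpha
    return alpha

# Assumed mean-squared-error fitting term around a synthetic optimum of 1.5.
mse = lambda a: (a - 1.5) ** 2
print(fit_alpha(mse))  # converges near 1.5, shrunk slightly by the penalty
```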
Further, when the test case is a CPU performance test case, the standard test result C is the score obtained by the UnixBench test tool.
When the test case is a disk read-write performance test case, the standard test result C is calculated as follows:
obtaining the disk read-write standard B preset by the user;
calculating the standard test result C according to the following formula:
C = (T / B) × 100, where T is the disk read-write performance test result obtained by the fio test tool.
In an exemplary embodiment, the benchmark score is set to 100, the weight corresponding to the CPU performance test case is set to 60%, and the weight corresponding to the disk read-write performance test case is set to 40%. If the result obtained by the UnixBench test tool is 80, the CPU performance test score of the server to be tested is 48 points; if the disk read-write performance standard preset by the user is 400 MB/s and the result obtained by the fio test tool is 300 MB/s, the disk read-write performance test score of the server to be tested is 30 points. Of course, it should be noted that the above is merely an exemplary embodiment; the specific benchmark score and weight ratios may be set to other values in implementation, and the embodiments of the present application are not limited in this respect.
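The example above can be reproduced with a score formula reconstructed from its figures — S = J × Q × C / 100, with C = (T / B) × 100 for the disk case — which yields exactly the 48 and 30 points given:

```python
def performance_score(J, Q, C):
    """Performance test score S = J * Q * C / 100 (reconstructed formula)."""
    return J * Q * C / 100.0

def disk_standard_result(T, B):
    """Standard result C for a disk case: measured T against standard B."""
    return T / B * 100.0

J = 100.0                                        # benchmark score of the case set
cpu_score = performance_score(J, 0.60, 80.0)     # UnixBench result 80 -> 48.0
disk_C = disk_standard_result(300.0, 400.0)      # 300 MB/s vs 400 MB/s -> 75.0
disk_score = performance_score(J, 0.40, disk_C)  # -> 30.0
print(cpu_score, disk_score)                     # 48.0 30.0
```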
According to this scheme, the performance test score corresponding to a test case is calculated based on the benchmark score and the weight ratio occupied by the test case; a given test case is tested according to the customer's preset performance requirements to obtain its test score, which well meets the customer's selection needs when purchasing high-performance servers.
In this embodiment, after the server to be tested completes the performance test task, the test results are returned to the control platform. The tester can check the start time, end time, time consumption, and completion status of the whole task on the control platform, and export the results to a table for visual inspection and comparison.
The embodiment of the present application further provides a server performance testing apparatus, which may specifically refer to a schematic structural diagram of a server performance testing apparatus provided according to an embodiment of the present application shown in fig. 3, where the apparatus may specifically include a first memory 51 and a first processor 52. Wherein the first memory 51 is used for storing a first computer program. The first processor 52, when executing the first computer program, implements the steps of the method for obtaining a hardware performance test score for a server described in any of the embodiments above.
The embodiment of the present application further provides a server screening device, which may specifically refer to a schematic structural diagram of a server screening device provided according to an embodiment of the present application shown in fig. 4, where the device may specifically include a second memory 61 and a second processor 62. Wherein the second memory 61 is for storing a second computer program. The second processor 62, when executing the second computer program, implements the steps of the server screening method described in any of the embodiments above.
In a specific implementation, the first memory and the second memory may be memory devices used for storing information in modern information technology. The memory may comprise multiple levels: in a digital system, any device that can store binary data may be a memory; in an integrated circuit, a circuit with a storage function but without physical form is also called a memory, such as a RAM or a FIFO; in a system, a storage device in physical form is also called a memory, such as a memory bank or a TF card. The first and second processors may be implemented in any suitable manner. For example, they may take the form of a microprocessor or processor with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, application-specific integrated circuits (ASICs), programmable logic controllers, embedded microcontrollers, and the like.
In the foregoing embodiments, the functions and effects of the specific implementation of the server performance testing apparatus and the server screening apparatus may be explained in comparison with other embodiments, which are not described herein in detail.
The embodiment of the application also provides a computer-readable storage medium storing the first computer program and the second computer program. The first computer program, when executed by the first processor, implements the steps of the method for obtaining a hardware performance test score of a server of any of the above embodiments; the second computer program, when executed by the second processor, implements the steps of the server screening method of any of the above embodiments.
In the present embodiment, the storage medium includes, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a cache, a hard disk drive (HDD), or a memory card. The memory may be used to store computer program instructions. The network communication unit may be an interface for network connection communication, set up in accordance with a standard prescribed by a communication protocol.
In this embodiment, the functions and effects of the program instructions stored in the computer storage medium may be explained in comparison with other embodiments, and are not described herein.
It will be apparent to those skilled in the art that the modules or steps of the embodiments of the application described above may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed across a network of computing devices. Alternatively, they may be implemented in program code executable by computing devices, so that they may be stored in a storage device and executed by computing devices; in some cases, the steps shown or described may be performed in a different order than shown or described. They may also be separately fabricated into individual integrated circuit modules, or a plurality of the modules or steps may be fabricated into a single integrated circuit module. Thus, embodiments of the application are not limited to any specific combination of hardware and software.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features of the above embodiments are described; however, as long as there is no contradiction in a combination of technical features, it should be considered within the scope of this description.
The above examples illustrate only a few embodiments of the invention, which are described in detail and are not to be construed as limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.

Claims (7)

1. A method for obtaining a hardware performance test score of a server, applied to a control platform, characterized by comprising the following steps:
responding to a test request of at least one server to be tested, wherein each server to be tested comprises at least one piece of hardware;
acquiring a case set of the server to be tested according to the test request, wherein the case set comprises test cases of the at least one hardware;
invoking a task template corresponding to the test case, wherein the task template comprises a test tool and an execution script;
executing the execution script with the test tool in the task template to test the test cases corresponding to the task template until all the test cases in the case set are tested, and obtaining the performance test scores of all the test cases in the case set;
the case set is provided with a benchmark score;
each test case in the case set has a weight ratio within the case set, and the sum of the weight ratios of all the test cases in the case set is 1;
the method for acquiring the weight ratio of the test case comprises the following steps:
acquiring a performance test result Y_i and a corresponding feature X_i of the test case;
calculating the weight ratio W_i of the test case from Y_i and X_i, where the calculation is parameterized by a coefficient α;
the method for acquiring the coefficient α comprises the following steps:
constructing a loss function L(α):
L(α) = f(α) + λ·α², where f(α) is a data-fitting loss term, λ·α² is a regularization term, and λ is a regularization parameter;
minimizing the loss function to update the value of α according to the gradient descent formula:
α ← α − η·∂L(α)/∂α, where η is the learning rate;
iteratively updating α until the loss function converges or a specified number of iterations is reached.
2. The method for obtaining a hardware performance test score of a server according to claim 1, wherein, in the step of executing the execution script with the test tool in the task template to test the test cases corresponding to the task template until all the test cases in the case set are tested and obtaining the performance test scores of all the test cases in the case set, the performance test score of the corresponding test case is output each time a test case finishes testing, so that the performance test scores of all the test cases in the case set are obtained once every test case in the case set has been tested.
3. The method for obtaining a hardware performance test score of a server according to claim 2, wherein the method for outputting the performance test score of the corresponding test case each time a test case finishes testing comprises the following steps:
obtaining the benchmark score J of the case set corresponding to the test case, the weight ratio Q of the test case within the case set, and the test result produced by the test tool corresponding to the test case;
standardizing the test result to obtain a standard test result C;
calculating the corresponding performance test score S of the test case using the following formula: S = J × Q × C / 100.
4. The method for obtaining a hardware performance test score of a server according to claim 3, wherein the test cases include a disk read-write performance test case;
when the test case is a disk read-write performance test case, standardizing the test result to obtain the standard test result C comprises the following steps:
obtaining a disk read-write standard B;
calculating the standard test result C according to the following formula:
C = (T / B) × 100, where T is the test result produced by the test tool corresponding to the disk read-write performance test case.
5. A server screening method, characterized by comprising the following steps:
responding to a screening request for screening out a target server that meets target requirements, and transmitting test requests for testing a plurality of servers to be tested;
obtaining the performance test scores of all the test cases of each server to be tested based on the method for obtaining a hardware performance test score of a server of any one of claims 1-4;
comparing the performance test scores of all the test cases of all the servers to be tested with the target test scores contained in the target requirements, thereby determining the server closest to the target requirements and taking it as the target server.
6. A server performance testing apparatus, comprising:
A memory for storing a computer program;
a processor for implementing the steps of the method for obtaining a hardware performance test score of a server according to any one of claims 1-4 or the steps of the method for screening a server according to claim 5 when executing the computer program.
7. A computer-readable storage medium, on which the computer program according to claim 6 is stored, wherein the computer program, when executed by a processor, implements the steps of the method for obtaining a hardware performance test score of a server according to any one of claims 1-4, or the steps of the server screening method according to claim 5.
CN202311811485.8A 2023-12-27 Method, device and medium for obtaining hardware performance test score of server Active CN117762760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311811485.8A CN117762760B (en) 2023-12-27 Method, device and medium for obtaining hardware performance test score of server


Publications (2)

Publication Number Publication Date
CN117762760A CN117762760A (en) 2024-03-26
CN117762760B (en) 2024-06-11


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108255656A (en) * 2018-02-28 2018-07-06 Huzhou Normal University A kind of fault detection method applied to batch process
CN115563004A (en) * 2022-10-26 2023-01-03 Agricultural Bank of China Regression test case screening method and device, electronic equipment and storage medium



Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant