CN116893885A - Java virtual machine adjusting method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN116893885A
Authority
CN
China
Prior art keywords
training
configuration parameters
groups
parameters
dimension
Prior art date
Legal status
Granted
Application number
CN202311162089.7A
Other languages
Chinese (zh)
Other versions
CN116893885B (en)
Inventor
王永建
严蔚岚
Current Assignee
China Mobile Communications Group Co Ltd
China Mobile Suzhou Software Technology Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Suzhou Software Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd and China Mobile Suzhou Software Technology Co Ltd
Priority to CN202311162089.7A
Publication of CN116893885A
Application granted
Publication of CN116893885B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147Distances to closest patterns, e.g. nearest neighbour classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44505Configuring for program initiating, e.g. using registry, configuration files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • G06F2009/45562Creating, deleting, cloning virtual machine instances

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses a Java virtual machine adjustment method and apparatus, an electronic device, and a readable storage medium, belonging to the technical field of data processing. The method is executed by a sidecar container and comprises the following steps: collecting multiple groups of configuration parameters of the Java virtual machine per preset unit time within a preset time period; selecting d groups of configuration parameters from the multiple groups of configuration parameters; training influence coefficients according to the d groups of configuration parameters; updating a preset K-nearest-neighbor algorithm according to the influence coefficients to obtain an updated K-nearest-neighbor algorithm; and calculating target configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest-neighbor algorithm. Because the sidecar container performs the configuration parameter calculation and is decoupled from the Java application service, the tuning effort required of developers is reduced and they can focus on product development. Because the K-nearest-neighbor algorithm used to calculate the configuration parameters is updated through the influence coefficients, the algorithm can be adjusted dynamically for different services, so that more accurate parameters can be configured for each service, improving resource utilization and product performance.

Description

Java virtual machine adjusting method and device, electronic equipment and readable storage medium
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a Java virtual machine adjusting method, a Java virtual machine adjusting device, electronic equipment and a readable storage medium.
Background
Java is one of the leading programming languages and platforms. It runs on the Java virtual machine (Java Virtual Machine, JVM), and many software services in enterprises are developed in the Java language. With the popularity of containerization technology, these software services are gradually moving toward containerization and are typically deployed on the container orchestration engine Kubernetes (k8s). When a product faces a large access load and service performance needs to be optimized, one important step is tuning the JVM.
Existing technical solutions fall mainly into two categories:
1. Manually specifying parameters based on developer experience
Container services are tuned for the JVM mainly by manually specifying JVM parameters: when building the service container image, the JVM configuration parameters are specified manually through the CMD or ENTRYPOINT instructions in the Dockerfile; tools are then used to monitor the state of JVM garbage collection, analyze Dump files, and diagnose the causes of memory leaks; and the JVM configuration parameters are adjusted manually according to the developer's experience.
2. Repeatedly calculating a Java performance score through monitoring to obtain optimal parameters, then manually specifying them
Monitoring data of at least one Java service within a monitoring scope is acquired, a performance score of the Java service is calculated to obtain its performance state, and tuning parameters are generated for the Java service using a preset algorithm, realizing automatic JVM tuning and improving JVM tuning efficiency.
Scheme 1 above has the following disadvantages:
1. Parameter accuracy cannot be guaranteed. Values are often rough estimates based on developer experience, and the appropriate parameters differ under different hardware conditions, so hardware resources are wasted.
2. Parameter configuration cannot be adjusted dynamically. As the program runs and machine resources are consumed, the configured parameters need continual adjustment, but the initially set JVM parameter values are fixed, so dynamic adjustment cannot be achieved.
Scheme 2 above has the following disadvantage:
tuning parameters are obtained by monitoring Java performance data over a period of time, but they do not take effect dynamically.
Disclosure of Invention
The embodiments of the application provide a Java virtual machine adjustment method and apparatus, an electronic device, and a readable storage medium, which can solve the problems that existing Java virtual machine adjustment methods are inaccurate and cannot achieve dynamic adjustment.
In a first aspect, there is provided a Java virtual machine adjustment method, the method being performed by a sidecar container, the method comprising:
collecting multiple groups of configuration parameters of the Java virtual machine per preset unit time within a preset time period;
selecting d groups of configuration parameters from the multiple groups of configuration parameters;
training influence coefficients according to the d groups of configuration parameters;
updating a preset K-nearest-neighbor algorithm according to the influence coefficients to obtain an updated K-nearest-neighbor algorithm;
calculating target configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest-neighbor algorithm;
wherein the Java virtual machine and the sidecar container are deployed in the same POD, d is a preset positive integer, and the configuration parameters comprise a plurality of configuration parameter dimensions.
Optionally, training to obtain an influence coefficient according to the d groups of configuration parameters includes:
normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples;
calculating a configuration parameter mean value corresponding to each configuration parameter dimension in the d groups of normalized training tuples;
and training according to the d groups of normalized training tuples and the configuration parameter mean value to obtain the influence coefficient corresponding to each configuration parameter dimension.
Optionally, updating the preset K-nearest-neighbor algorithm according to the influence coefficients to obtain the updated K-nearest-neighbor algorithm includes:
determining the updated K-nearest-neighbor algorithm according to the influence coefficients as follows:

\[ \mathrm{dist}(X_1, X_2) = \sqrt{\sum_{i=1}^{n} w_i \,(x_{1i} - x_{2i})^2} \]

wherein X_1 is a first group of configuration parameters among the plurality of groups of configuration parameters, X_2 is a second group of configuration parameters among the plurality of groups of configuration parameters, n is the number of configuration parameter dimensions, i indexes the configuration parameter dimensions, x_{1i} is the configuration parameter of the first group belonging to the i-th configuration parameter dimension, x_{2i} is the configuration parameter of the second group belonging to the i-th configuration parameter dimension, and w_i is the influence coefficient corresponding to the i-th configuration parameter dimension.
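For illustration only — assuming the updated algorithm weights each dimension's squared difference by its influence coefficient (the function name and the choice of Python are illustrative, not part of the disclosure) — the influence-weighted distance can be sketched as:

```python
import math

def weighted_distance(x1, x2, w):
    """Influence-weighted Euclidean distance between two groups of
    configuration parameters x1 and x2; w[i] is the influence coefficient
    of the i-th configuration parameter dimension."""
    if not (len(x1) == len(x2) == len(w)):
        raise ValueError("x1, x2 and w must have the same number of dimensions")
    return math.sqrt(sum(wi * (a - b) ** 2 for wi, a, b in zip(w, x1, x2)))
```

With all coefficients equal to 1 this reduces to the ordinary Euclidean distance of a plain K-nearest-neighbor algorithm; larger coefficients make their dimensions dominate neighbor selection.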
Optionally, calculating the target configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest-neighbor algorithm includes:
obtaining a plurality of candidate configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest-neighbor algorithm;
calculating the mean of the plurality of candidate configuration parameters;
and determining the mean of the plurality of candidate configuration parameters as the target configuration parameter.
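A sketch of the candidate-selection and averaging steps (the value of k, the sample data, and the equal influence coefficients below are illustrative assumptions, not from the disclosure):

```python
def target_configuration(groups, current, coefficients, k=3):
    """Select the k groups nearest to the current configuration parameters
    under the influence-weighted distance (the candidate configuration
    parameters), then average them dimension-wise to obtain the target
    configuration parameters."""
    def dist(g):
        return sum(w * (a - b) ** 2
                   for w, a, b in zip(coefficients, g, current)) ** 0.5
    candidates = sorted(groups, key=dist)[:k]
    return [sum(g[i] for g in candidates) / len(candidates)
            for i in range(len(current))]

# Illustrative history: (GC count, GC time, full GC cycle time) per unit time.
history = [(10, 0.20, 30.0), (12, 0.25, 28.0), (11, 0.22, 29.0), (40, 1.50, 5.0)]
target = target_configuration(history, (11, 0.21, 29.5), [1.0, 1.0, 1.0])
```

The outlier group (40, 1.50, 5.0) lies far from the current parameters and is excluded, so the target is the dimension-wise mean of the three similar groups.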
Optionally, the configuration parameter dimensions include:
a GC count dimension;
a GC time dimension;
a full GC cycle time dimension.
Optionally, the normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples includes:
Each group of parameters in the d groups of configuration parameters is normalized according to the following formulas to obtain a normalized training tuple T = (t_c, t_g, t_f):

\[ t_c = \frac{c - c_{\min}}{c_{\max} - c_{\min}} \qquad t_g = \frac{g - g_{\min}}{g_{\max} - g_{\min}} \qquad t_f = \frac{f - f_{\min}}{f_{\max} - f_{\min}} \]

wherein T is the normalized training tuple, and t_c, t_g, and t_f are its training parameters corresponding to the GC count, GC time, and full GC cycle time dimensions, respectively;
c is the configuration parameter corresponding to the GC count dimension in the group currently being normalized, and c_min and c_max are the smallest and largest configuration parameters in the GC count dimension across the d groups of configuration parameters;
g is the configuration parameter corresponding to the GC time dimension in the group currently being normalized, and g_min and g_max are the smallest and largest configuration parameters in the GC time dimension across the d groups of configuration parameters;
f is the configuration parameter corresponding to the full GC cycle time dimension in the group currently being normalized, and f_min and f_max are the smallest and largest configuration parameters in the full GC cycle time dimension across the d groups of configuration parameters.
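Min-max normalization of this kind can be sketched as follows (Python for illustration; the tuple layout (GC count, GC time, full GC cycle time) follows the three dimensions named above, and the handling of a constant dimension is an added assumption):

```python
def normalize_groups(groups):
    """Min-max normalize each dimension across the d collected groups,
    mapping every configuration parameter into [0, 1]."""
    dims = len(groups[0])
    lows = [min(g[i] for g in groups) for i in range(dims)]
    highs = [max(g[i] for g in groups) for i in range(dims)]

    def scale(v, lo, hi):
        # Guard against a degenerate dimension where max == min.
        return 0.0 if hi == lo else (v - lo) / (hi - lo)

    return [tuple(scale(g[i], lows[i], highs[i]) for i in range(dims))
            for g in groups]
```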
Optionally, training to obtain an influence coefficient corresponding to each configuration parameter dimension according to the d groups of normalized training tuples and the configuration parameter mean value includes:
Training according to the following formulas to obtain the influence coefficient corresponding to each configuration parameter dimension:

\[ w_c = \frac{(t_{c,1} - \bar{t}_c)^2 + (t_{c,2} - \bar{t}_c)^2 + \dots + (t_{c,d} - \bar{t}_c)^2}{d} \]
\[ w_g = \frac{(t_{g,1} - \bar{t}_g)^2 + (t_{g,2} - \bar{t}_g)^2 + \dots + (t_{g,d} - \bar{t}_g)^2}{d} \]
\[ w_f = \frac{(t_{f,1} - \bar{t}_f)^2 + (t_{f,2} - \bar{t}_f)^2 + \dots + (t_{f,d} - \bar{t}_f)^2}{d} \]

wherein w_c is the influence coefficient corresponding to the GC count dimension; t_{c,1}, t_{c,2}, …, t_{c,d} are the training parameters corresponding to the GC count dimension in the 1st, 2nd, …, d-th normalized training tuples; and \(\bar{t}_c\) is the mean of the configuration parameters corresponding to the GC count dimension across the d normalized training tuples;
w_g is the influence coefficient corresponding to the GC time dimension; t_{g,1}, …, t_{g,d} are the training parameters corresponding to the GC time dimension in the 1st through d-th normalized training tuples; and \(\bar{t}_g\) is the corresponding mean;
w_f is the influence coefficient corresponding to the full GC cycle time dimension; t_{f,1}, …, t_{f,d} are the training parameters corresponding to the full GC cycle time dimension in the 1st through d-th normalized training tuples; and \(\bar{t}_f\) is the corresponding mean.
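One plausible reading of this training step — an assumed reconstruction, since the published text names only the per-tuple training parameters and the per-dimension mean as the formula's inputs — is that each dimension's influence coefficient is the mean squared deviation of its d normalized training parameters from their mean:

```python
def influence_coefficients(training_tuples):
    """Per configuration parameter dimension: mean squared deviation of the
    d normalized training parameters from the dimension's mean, so that
    dimensions varying more across the tuples weigh more in the distance."""
    d = len(training_tuples)
    dims = len(training_tuples[0])
    means = [sum(t[i] for t in training_tuples) / d for i in range(dims)]
    return [sum((t[i] - means[i]) ** 2 for t in training_tuples) / d
            for i in range(dims)]
```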
Optionally, the method further comprises one or more of the following:
modifying the configuration parameters of the Java virtual machine through a first command of the Java virtual machine according to the target configuration parameters;
and generating a Dump file through a second command of the Java virtual machine.
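The disclosure does not name the two commands. On a HotSpot JVM, one hedged possibility is `jinfo -flag` for changing a manageable flag at runtime and `jcmd GC.heap_dump` for producing a Dump file; the sketch below only builds the command lines (the PID, flag, and paths are illustrative):

```python
def set_flag_command(pid, flag, value):
    """Command line that changes a 'manageable' HotSpot flag on a live JVM,
    e.g. jinfo -flag HeapDumpPath=/tmp/dumps 1234. Only manageable flags
    can be changed at runtime; most others require a restart."""
    return ["jinfo", "-flag", f"{flag}={value}", str(pid)]

def heap_dump_command(pid, path):
    """Command line asking the JVM to write a heap dump (Dump file)."""
    return ["jcmd", str(pid), "GC.heap_dump", path]
```

Either list could be handed to `subprocess.run` inside the sidecar container, provided the JDK tools and the JVM process are visible to it.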
In a second aspect, there is provided a Java virtual machine adjustment apparatus, the apparatus being applied to a sidecar container, the apparatus comprising:
the collection module is used for collecting multiple groups of configuration parameters of the Java virtual machine according to a preset unit time in a preset time period;
the selecting module is used for selecting d groups of configuration parameters from the plurality of groups of configuration parameters;
the training module is used for training to obtain an influence coefficient according to the d groups of configuration parameters;
the updating module is used for updating a preset K-nearest-neighbor algorithm according to the influence coefficients to obtain an updated K-nearest-neighbor algorithm;
the calculating module is used for calculating target configuration parameters according to the plurality of groups of configuration parameters and the updated K-nearest-neighbor algorithm;
the Java virtual machine and the sidecar container are deployed in the same POD, d is a preset positive integer, and the configuration parameters comprise a plurality of configuration parameter dimensions.
Optionally, the training module is specifically configured to:
normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples;
calculating a configuration parameter mean value corresponding to each configuration parameter dimension in the d groups of normalized training tuples;
and training according to the d groups of normalized training tuples and the configuration parameter mean value to obtain the influence coefficient corresponding to each configuration parameter dimension.
Optionally, the updating module is specifically configured to:
determining the updated K-nearest-neighbor algorithm according to the influence coefficients as follows:

\[ \mathrm{dist}(X_1, X_2) = \sqrt{\sum_{i=1}^{n} w_i \,(x_{1i} - x_{2i})^2} \]

wherein X_1 is a first group of configuration parameters among the plurality of groups of configuration parameters, X_2 is a second group of configuration parameters among the plurality of groups of configuration parameters, n is the number of configuration parameter dimensions, i indexes the configuration parameter dimensions, x_{1i} is the configuration parameter of the first group belonging to the i-th configuration parameter dimension, x_{2i} is the configuration parameter of the second group belonging to the i-th configuration parameter dimension, and w_i is the influence coefficient corresponding to the i-th configuration parameter dimension.
Optionally, the computing module is configured to:
obtaining a plurality of candidate configuration parameters according to the plurality of groups of configuration parameters and the updated K-nearest-neighbor algorithm;
calculating the mean of the plurality of candidate configuration parameters;
and determining the mean of the plurality of candidate configuration parameters as the target configuration parameter.
Optionally, the configuration parameter dimensions include:
a GC count dimension;
a GC time dimension;
a full GC cycle time dimension.
Optionally, the training module is specifically configured to:
normalizing each group of parameters in the d groups of configuration parameters according to the following formulas to obtain a normalized training tuple T = (t_c, t_g, t_f):

\[ t_c = \frac{c - c_{\min}}{c_{\max} - c_{\min}} \qquad t_g = \frac{g - g_{\min}}{g_{\max} - g_{\min}} \qquad t_f = \frac{f - f_{\min}}{f_{\max} - f_{\min}} \]

wherein T is the normalized training tuple, and t_c, t_g, and t_f are its training parameters corresponding to the GC count, GC time, and full GC cycle time dimensions, respectively;
c is the configuration parameter corresponding to the GC count dimension in the group currently being normalized, and c_min and c_max are the smallest and largest configuration parameters in the GC count dimension across the d groups of configuration parameters;
g is the configuration parameter corresponding to the GC time dimension in the group currently being normalized, and g_min and g_max are the smallest and largest configuration parameters in the GC time dimension across the d groups of configuration parameters;
f is the configuration parameter corresponding to the full GC cycle time dimension in the group currently being normalized, and f_min and f_max are the smallest and largest configuration parameters in the full GC cycle time dimension across the d groups of configuration parameters.
Optionally, the training module is specifically configured to:
training according to the following formulas to obtain the influence coefficient corresponding to each configuration parameter dimension:

\[ w_c = \frac{(t_{c,1} - \bar{t}_c)^2 + (t_{c,2} - \bar{t}_c)^2 + \dots + (t_{c,d} - \bar{t}_c)^2}{d} \]
\[ w_g = \frac{(t_{g,1} - \bar{t}_g)^2 + (t_{g,2} - \bar{t}_g)^2 + \dots + (t_{g,d} - \bar{t}_g)^2}{d} \]
\[ w_f = \frac{(t_{f,1} - \bar{t}_f)^2 + (t_{f,2} - \bar{t}_f)^2 + \dots + (t_{f,d} - \bar{t}_f)^2}{d} \]

wherein w_c is the influence coefficient corresponding to the GC count dimension; t_{c,1}, t_{c,2}, …, t_{c,d} are the training parameters corresponding to the GC count dimension in the 1st, 2nd, …, d-th normalized training tuples; and \(\bar{t}_c\) is the mean of the configuration parameters corresponding to the GC count dimension across the d normalized training tuples;
w_g is the influence coefficient corresponding to the GC time dimension; t_{g,1}, …, t_{g,d} are the training parameters corresponding to the GC time dimension in the 1st through d-th normalized training tuples; and \(\bar{t}_g\) is the corresponding mean;
w_f is the influence coefficient corresponding to the full GC cycle time dimension; t_{f,1}, …, t_{f,d} are the training parameters corresponding to the full GC cycle time dimension in the 1st through d-th normalized training tuples; and \(\bar{t}_f\) is the corresponding mean.
Optionally, the apparatus further comprises:
the modification module is used for modifying the configuration parameters of the Java virtual machine through a first command of the Java virtual machine according to the target configuration parameters;
and the generation module is used for generating the Dump file through a second command of the Java virtual machine.
In a third aspect, there is provided an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, there is provided a readable storage medium having stored thereon a program or instructions which when executed by a processor perform the steps of the method according to the first aspect.
In a fifth aspect, a chip is provided, the chip comprising a processor and a communication interface, the communication interface being coupled to the processor, the processor being configured to execute programs or instructions for implementing the method according to the first aspect.
In a sixth aspect, there is provided a computer program/program product stored in a storage medium, the program/program product being executed by at least one processor to implement the method according to the first aspect.
In the embodiments of the application, a sidecar container deployed in the same POD as the Java virtual machine collects multiple groups of configuration parameters of the Java virtual machine per preset unit time within a preset time period, trains influence coefficients from those configuration parameters, updates a K-nearest-neighbor algorithm with the influence coefficients, and calculates target configuration parameters with the updated algorithm. On the one hand, because the sidecar container performs the configuration parameter calculation and is decoupled from the Java application service, the tuning effort required of developers is reduced and they can focus on product development. On the other hand, because the influence coefficients are trained from the collected configuration parameters of the Java virtual machine and the K-nearest-neighbor algorithm used to calculate the configuration parameters is updated through them, dynamic adjustment is achieved: the algorithm can be adapted to different services, more accurate parameters can be configured for each service, resource utilization is improved, and product performance is improved.
Drawings
FIG. 1 is a schematic flow chart of a Java virtual machine adjustment method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a POD structure according to an embodiment of the present application;
FIG. 3 is a second flowchart of a Java virtual machine adjustment method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a Java virtual machine adjusting device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are derived by a person skilled in the art based on the embodiments of the application, fall within the scope of protection of the application.
The terms "first," "second," and the like herein distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein; moreover, objects distinguished by "first" and "second" are not limited in number — the first object may, for example, be one or more. Furthermore, "and/or" herein denotes at least one of the connected objects. For example, "A and/or B" covers three cases: only A; only B; and both A and B. The character "/" generally indicates that the surrounding objects are in an "or" relationship.
The following describes in detail the Java virtual machine adjustment method provided by the embodiment of the present application through some embodiments and application scenarios thereof with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present application provides a Java virtual machine adjustment method, which is executed by a sidecar container, and the sidecar container is explained as follows:
A sidecar container (SideCar Container) is a container based on the sidecar pattern in Kubernetes. The Java virtual machine is deployed in an application container (App Container) in a POD in Kubernetes; accordingly, the sidecar container and the application container are in the same POD and share the POD's compute, storage, network, and other resources. Existing schemes for monitoring Java performance must collect Java virtual machine data through devices outside the POD, which is difficult; because the sidecar container is deployed in the same POD as the Java virtual machine and shares the POD's compute, storage, and network resources, the relevant data of the Java virtual machine can be collected much more easily.
When the POD is initialized, the sidecar container is started before the application container, which makes it convenient to inject default configuration into the application container. When the POD is terminated, the sidecar container is stopped after the application container, so that data from an abnormal termination of the application container can still be collected and a backup can be stored in time.
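As an illustration (the names and image references are hypothetical, not from the disclosure), a POD of this shape can be declared as:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: java-service
spec:
  containers:
    - name: app          # application container running the Java virtual machine
      image: example.com/java-service:latest
    - name: jvm-tuner    # sidecar container that collects parameters and computes targets
      image: example.com/jvm-tuner:latest
```

Since Kubernetes 1.28, the start-before/stop-after ordering described above can also be obtained natively by declaring the sidecar as an init container with `restartPolicy: Always`.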
The method of the embodiment of the application comprises the following steps:
step 101: collecting multiple groups of configuration parameters of the Java virtual machine according to a preset unit time in a preset time period;
step 102: d groups of configuration parameters are selected from the multiple groups of configuration parameters;
step 103: training to obtain an influence coefficient according to the d groups of configuration parameters;
step 104: updating a preset K adjacent algorithm according to the influence coefficient to obtain an updated K adjacent algorithm;
step 105: calculating target configuration parameters according to a plurality of groups of configuration parameters and an updated K-neighbor algorithm;
the Java virtual machine and the sidecar container are deployed in the same POD; d is a preset positive integer that a developer may set flexibly according to actual service requirements; and the configuration parameters comprise a plurality of configuration parameter dimensions. Using multidimensional parameters can improve the accuracy of the calculated configuration parameters.
It should be noted that the developer may flexibly set the preset time period and the preset unit time according to actual service requirements. For example, the preset time period may be the 24 hours before the method is performed, and the preset unit time may be 10 minutes;
in selecting the d groups of configuration parameters from the multiple groups, the selection may be based on whether the configuration parameters are similar (the specific similarity criterion, for example a difference threshold, may be set flexibly by the developer according to actual service requirements); training the influence coefficients on groups with similar configuration parameters can improve accuracy. It should be understood that the value of d mainly depends on the requirements of data processing. For example, if 10 groups of configuration parameters are to be selected but only 8 groups meet the similarity condition, satisfying the required number of groups takes priority: 2 groups with non-similar configuration parameters are additionally selected to obtain the 10 groups;
The K-neighbor algorithm may also be referred to as the K-nearest neighbor (KNN) algorithm, and the like; the name is not particularly limited in the embodiment of the present application.
The K-nearest neighbor algorithm used in the embodiment of the application differs from the conventional K-nearest neighbor algorithm. The conventional algorithm ignores the influence of individual configuration parameters on the optimal value, while in practice configuration parameters of different dimensions influence the optimal value to different degrees. To calculate more precisely how large the respective influence of each dimension is, the embodiment of the application trains influence coefficients on the selected d groups of configuration parameters and performs the calculation with a K-nearest neighbor algorithm updated by those coefficients, making the calculated configuration parameters more accurate.
The target configuration parameter may be regarded as a configuration parameter related to the Java virtual machine memory, and may be referred to simply as the optimal value M.
In the embodiment of the application, a sidecar container deployed in the same POD as the Java virtual machine collects multiple groups of configuration parameters of the Java virtual machine per preset unit time within a preset time period, trains an influence coefficient from those groups, updates the K-nearest neighbor algorithm with the coefficient, and calculates the target configuration parameter with the updated algorithm. On the one hand, the configuration parameter calculation is executed by the sidecar container, which is decoupled from the Java application service; this reduces the engineering effort spent on tuning and lets development focus on the product. On the other hand, the influence coefficient is trained from the collected configuration parameters of the Java virtual machine, and the K-nearest neighbor algorithm used for calculating the configuration parameters is updated with it, achieving dynamic adjustment: the algorithm can adapt to different services, more accurate parameters can be configured for each of them, resource utilization improves, and product performance improves.
Optionally, training to obtain the influence coefficient according to the d groups of configuration parameters includes:
(1) Normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples;
considering that the configuration parameters of different dimensions take values in different ranges and are therefore difficult to process together, normalization is first performed on the d groups of configuration parameters so that the parameters share the same measurement scale; for example, the parameter values may be mapped to [0,1]. Parameters on the same scale are convenient for subsequent training.
(2) Calculating a configuration parameter mean value corresponding to each configuration parameter dimension in the d groups of normalized training tuples;
(3) And training to obtain the influence coefficient corresponding to each configuration parameter dimension according to the d groups of normalized training tuples and the configuration parameter mean value.
Optionally, updating the preset K-nearest neighbor algorithm according to the influence coefficient to obtain an updated K-nearest neighbor algorithm, including:
the updated K-neighbor distance formula is determined according to the influence coefficients as:

D(X, Y) = √( Σ_{i=1}^{n} λ_i · (x_i − y_i)² )

wherein X is a first group of configuration parameters of the plurality of groups of configuration parameters, Y is a second group of configuration parameters of the plurality of groups of configuration parameters, n is the number of configuration parameter dimensions, i is the i-th of the n configuration parameter dimensions, x_i is the configuration parameter of the first group belonging to the i-th configuration parameter dimension, y_i is the configuration parameter of the second group belonging to the i-th configuration parameter dimension, and λ_i is the influence coefficient corresponding to the i-th configuration parameter dimension.
In the embodiment of the application, different influence coefficients are trained according to configuration parameters of different dimensions, and when a plurality of groups of collected configuration parameters of the Java virtual machine are calculated, the different influence coefficients are correspondingly used according to the dimensions of the different configuration parameters, so that the final calculated result is more accurate.
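The weighted distance computation described above can be sketched as follows (a minimal Python sketch; the patent does not specify an implementation language, and the function name is illustrative):

```python
import math

def weighted_distance(x, y, lam):
    """Influence-weighted K-neighbor distance between two parameter groups.

    x, y: equal-length sequences of (normalized) configuration parameters,
          one entry per configuration parameter dimension.
    lam:  influence coefficients, one per dimension.
    """
    if not (len(x) == len(y) == len(lam)):
        raise ValueError("x, y and lam must cover the same dimensions")
    # D(X, Y) = sqrt( sum_i lambda_i * (x_i - y_i)^2 )
    return math.sqrt(sum(l * (a - b) ** 2 for a, b, l in zip(x, y, lam)))
```

With all λ_i = 1 this reduces to the ordinary Euclidean distance used by the conventional K-nearest neighbor algorithm.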
Optionally, calculating the target configuration parameters according to the multiple sets of configuration parameters and the updated K-nearest neighbor algorithm includes:
(1) Obtaining a plurality of alternative configuration parameters according to a plurality of groups of configuration parameters and an updated K adjacent algorithm;
(2) Calculating the average value of a plurality of alternative configuration parameters;
(3) And determining the average value of the plurality of alternative configuration parameters as a target configuration parameter.
In the embodiment of the application, K optimal value results can be obtained based on the updated K adjacent algorithm, and the average value of the K optimal value results is used as a final target configuration parameter.
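This selection-and-averaging step can be sketched as follows (Python, illustrative; `distance` stands for the updated K-neighbor distance, and the selection details are assumptions):

```python
def knn_average(samples, query, k, distance):
    """Return the mean target value of the k samples nearest to `query`.

    samples:  list of (params, value) pairs, where `params` is a parameter
              group and `value` the associated candidate configuration value.
    query:    parameter group to compare against.
    distance: callable(params_a, params_b) -> float.
    """
    nearest = sorted(samples, key=lambda s: distance(s[0], query))[:k]
    candidates = [value for _, value in nearest]   # k alternative parameters
    return sum(candidates) / len(candidates)       # their mean = target value
```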
Optionally, the configuration parameter dimension includes:
GC number dimension;
GC time dimension;
complete GC cycle time dimension.
In the embodiment of the application, configuration parameters of three dimensions are collected: the GC count, the GC time, and the Full GC cycle time. The most basic principle of JVM performance tuning is to reduce garbage collection (Garbage Collection, GC) as much as possible, mainly by achieving the following 3 goals:
(1) The GC time is short enough;
(2) the number of GCs is sufficiently small;
(3) the period over which Full GC occurs is long enough;
it is known that as the value of the optimal value M increases, the GC count C becomes smaller, the GC time T becomes longer, and the Full GC cycle time FT becomes slightly longer; as the value of M decreases, the GC count C becomes larger, the GC time T becomes shorter, and the Full GC cycle time FT becomes slightly shorter. A relatively larger value of T×C+FT represents better stability of the program. For example, when the value of M is increased from M1 to M2: if the value of T×C+FT at M2 is larger than at M1, increasing the value of M is reasonable; conversely, if the value of T×C+FT at M2 is smaller than at M1, decreasing the value of M is reasonable. From the characteristics of the JVM it is known that the GC count C and the GC time T have a larger influence on M, while the Full GC cycle time FT has a smaller influence on the optimal M.
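The comparison above can be sketched as a small decision rule (Python; the score S = T×C + FT and the larger-is-more-stable reading follow the text, while the function names are assumptions):

```python
def stability(t, c, ft):
    """Stability score S = T*C + FT; per the text, a larger value
    represents better stability of the program."""
    return t * c + ft

def should_keep_increase(t1, c1, ft1, t2, c2, ft2):
    """After increasing the memory value from M1 to M2, keep the increase
    only if the stability score improved, i.e. S(M2) > S(M1)."""
    return stability(t2, c2, ft2) > stability(t1, c1, ft1)
```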
To calculate more precisely how large the respective influences of C, T, and FT are, the present application proposes the concept of a configuration parameter influence coefficient λ, the magnitude of which is determined by the standard Euclidean distance from the target (the optimal M of the JVM).
Optionally, normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples, including:
Each group of parameters (c, t, ft) in the d groups of configuration parameters is normalized according to the following formulas to obtain a normalized training tuple (c′, t′, ft′):

c′ = (c − c_min) / (c_max − c_min)
t′ = (t − t_min) / (t_max − t_min)
ft′ = (ft − ft_min) / (ft_max − ft_min)

wherein (c′, t′, ft′) is the normalized training tuple, c′ being the training parameter corresponding to the GC count dimension, t′ the training parameter corresponding to the GC time dimension, and ft′ the training parameter corresponding to the Full GC cycle time dimension;

c is the configuration parameter corresponding to the GC count dimension among the configuration parameters currently being normalized, and c_min and c_max are respectively the smallest and the largest configuration parameter corresponding to the GC count dimension among the d groups of configuration parameters;

t is the configuration parameter corresponding to the GC time dimension among the configuration parameters currently being normalized, and t_min and t_max are respectively the smallest and the largest configuration parameter corresponding to the GC time dimension among the d groups of configuration parameters;

ft is the configuration parameter corresponding to the Full GC cycle time dimension among the configuration parameters currently being normalized, and ft_min and ft_max are respectively the smallest and the largest configuration parameter corresponding to the Full GC cycle time dimension among the d groups of configuration parameters.
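A minimal Python sketch of this min–max normalization over the d groups (function names are illustrative, and the handling of a constant dimension is an assumption not stated in the text):

```python
def normalize_groups(groups):
    """Min-max normalize d groups of (c, t, ft) configuration parameters.

    groups: list of (c, t, ft) tuples - GC count, GC time, Full GC cycle time.
    Returns the d normalized training tuples, each component mapped to [0, 1].
    """
    def scale(values):
        lo, hi = min(values), max(values)
        span = hi - lo
        # If every group shares the same value, the dimension carries no
        # information; map it to 0.0 (an assumption, not from the text).
        return [(v - lo) / span if span else 0.0 for v in values]

    cs, ts, fts = (scale(col) for col in zip(*groups))
    return list(zip(cs, ts, fts))
```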
Optionally, training to obtain an influence coefficient corresponding to each configuration parameter dimension according to the d groups of normalized training tuples and the configuration parameter mean value, including:
training according to the following formulas to obtain the influence coefficient corresponding to each configuration parameter dimension:

λ_c = √( ((c′_1 − c′_avg)² + (c′_2 − c′_avg)² + … + (c′_d − c′_avg)²) / d )
λ_t = √( ((t′_1 − t′_avg)² + (t′_2 − t′_avg)² + … + (t′_d − t′_avg)²) / d )
λ_ft = √( ((ft′_1 − ft′_avg)² + (ft′_2 − ft′_avg)² + … + (ft′_d − ft′_avg)²) / d )

wherein λ_c is the influence coefficient corresponding to the GC count dimension; c′_1, c′_2, …, c′_d are the training parameters corresponding to the GC count dimension in the 1st, 2nd, …, d-th normalized training tuples of the d groups; and c′_avg is the mean of the configuration parameters corresponding to the GC count dimension over the d groups of normalized training tuples;

λ_t is the influence coefficient corresponding to the GC time dimension; t′_1, t′_2, …, t′_d are the training parameters corresponding to the GC time dimension in the 1st, 2nd, …, d-th normalized training tuples; and t′_avg is the mean of the configuration parameters corresponding to the GC time dimension over the d groups of normalized training tuples;

λ_ft is the influence coefficient corresponding to the Full GC cycle time dimension; ft′_1, ft′_2, …, ft′_d are the training parameters corresponding to the Full GC cycle time dimension in the 1st, 2nd, …, d-th normalized training tuples; and ft′_avg is the mean of the configuration parameters corresponding to the Full GC cycle time dimension over the d groups of normalized training tuples.
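This training step can be sketched as follows (Python; it reads each λ as the standard deviation of that dimension's values from their mean over the d normalized tuples, per the standard-Euclidean-distance description in the text — the exact formulas were lost in extraction, so this reading is itself an assumption):

```python
import math

def influence_coefficients(normalized_groups):
    """Per-dimension influence coefficients over d normalized training tuples.

    Each coefficient is the square root of the mean squared deviation of
    that dimension's values from their mean (standard Euclidean distance).
    """
    d = len(normalized_groups)
    coeffs = []
    for column in zip(*normalized_groups):        # one column per dimension
        mean = sum(column) / d
        coeffs.append(math.sqrt(sum((v - mean) ** 2 for v in column) / d))
    return coeffs  # [lambda_c, lambda_t, lambda_ft]
```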
Optionally, the method further comprises one or more of the following:
(1) Modifying the configuration parameters of the Java virtual machine through a first command of the Java virtual machine according to the target configuration parameters;
in the embodiment of the application, after the target parameter is calculated, the configuration parameters of the Java virtual machine can be dynamically modified through the corresponding command of the Java virtual machine; the first command may specifically be the jinfo command;
(2) The Dump file is generated by a second command of the Java virtual machine.
Considering that when a program crashes due to an out-of-memory (OOM) error or other anomaly the Dump file is often not saved in time, the problem cannot be accurately located afterwards. In the embodiment of the application, because the sidecar container is terminated after the application container, the Dump file can be generated before termination, so a developer can analyze the cause of the anomaly based on the Dump file.
The technical scheme of the application is described below with reference to fig. 2 and 3:
referring to fig. 2, the relationship between the performance tuning apparatus (deployed in the SideCar Container) and the application container in the POD is illustrated. Being in the same POD, the apparatus shares resources such as compute, storage, and network within the POD with the application container.
When the POD is initialized, the sidecar container is started before the application container, which makes it convenient to inject a default configuration into the application container. When the POD is destroyed, the sidecar container is terminated after the application container, so data from the moment the application container fails can be collected and a backup can be saved in time.
The device mainly comprises four modules, namely an acquisition module, a calculation module, an implementation module and a data result module.
Referring to fig. 3, a flow of the technical solution of the present application is shown in the figure, and in combination with the modules in fig. 2, the specific flow is as follows:
(1) And the acquisition module is used for:
the JVM memory data, the configuration information of the POD, and the resource utilization are collected at fixed intervals, and the data are transmitted to the computing module.
(2) The calculation module:
firstly, historical data are collected: taking 10 minutes as the unit time, the JVM historical configuration parameters within each unit time are acquired, including the GC count C, the GC time T, and the Full GC cycle time FT. It is well known that the most basic principle of JVM performance tuning is to reduce garbage collection (GC) as much as possible, mainly by achieving the following 3 goals:
(1) The GC time is short enough;
(2) the number of GCs is sufficiently small;
(3) the period over which Full GC occurs is long enough;
the optimal memory M within each unit time is then calculated from these configuration parameters. Taking the JVM configuration data and the optimal M values of the preceding 24 hours as the sample set, the application calculates the optimal M for the next unit time based on the improved K-neighbor algorithm, thereby realizing dynamic tuning.
The improved K-nearest neighbor algorithm is used in the application to classify the JVM optimal value M.
To improve the accuracy of this classification, the K-nearest neighbor algorithm is optimized. In the present application it is known that as the value of M increases, the GC count C becomes smaller, the GC time T becomes longer, and the Full GC cycle time FT becomes slightly longer; as the value of M decreases, the GC count C becomes larger, the GC time T becomes shorter, and the Full GC cycle time FT becomes slightly shorter. A larger value of T×C+FT represents better stability of the program. For example, when the value of M is increased from M1 to M2: if the value of T×C+FT at M2 is larger than at M1, increasing the value of M is reasonable; conversely, if it is smaller, decreasing the value of M is reasonable. From the characteristics of the JVM it is known that the GC count C and the GC time T have a larger influence on M, while the Full GC cycle time FT has a smaller influence on the optimal M. To calculate more precisely how large the respective influences of C, T, and FT are, the present application proposes the configuration parameter influence coefficient λ (0 < λ < 1), a fuzzy quantity whose magnitude is determined by the standard Euclidean distance of the attribute from the target (the optimal M of the JVM).
The specific implementation is as follows:
(1) The GC count C, the GC time T, and the Full GC cycle time FT within the unit time are obtained; the units and scales of these parameters are inconsistent, which would make the calculation of the optimal M value inaccurate.
(2) Since the values of the C, T, and FT parameters lie in different ranges, the data are first normalized, mapping the attribute values to [0,1] so that the parameters share the same metric. d groups of historical data with similar M values are selected as training tuples, and normalized training tuples are obtained.
(3) Calculating the influence coefficient array (λ_c, λ_t, λ_ft) of the parameters from the normalized training tuples and the parameter means:

λ_c = √( Σ_{i=1}^{d} (c′_i − c′_avg)² / d ), and likewise for λ_t and λ_ft;

updating the K-nearest distance formula with the influence coefficient array:

D(X, Y) = √( λ_c (c′_X − c′_Y)² + λ_t (t′_X − t′_Y)² + λ_ft (ft′_X − ft′_Y)² ).
the improved algorithm is beneficial to improving the classification accuracy of the K-nearest neighbor algorithm.
For the optimal M value of the current unit time, K optimal M values are obtained from the data of the preceding 24 hours using the improved K-nearest distance formula, and their average is used as the optimal value M for the current stage, thereby realizing JVM dynamic tuning.
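Putting the computing-module steps together, a compact end-to-end sketch might look like this (Python; a sketch under the assumptions above — notably the reconstructed λ and distance formulas — not a definitive implementation; k is assumed not to exceed the number of historical samples):

```python
import math

def predict_optimal_m(history, current, k):
    """Predict the optimal M for the next unit time.

    history: list of ((c, t, ft), m) pairs from the preceding window.
    current: the (c, t, ft) tuple of the current unit time.
    k:       number of nearest historical samples to average.
    """
    rows = [params for params, _ in history] + [current]
    # Min-max normalize each dimension over history plus the current tuple.
    cols = []
    for col in zip(*rows):
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0          # constant dimension maps to 0
        cols.append([(v - lo) / span for v in col])
    norm = list(zip(*cols))
    d = len(norm)
    # Influence coefficient per dimension: deviation from the mean.
    lam = []
    for col in cols:
        mean = sum(col) / d
        lam.append(math.sqrt(sum((v - mean) ** 2 for v in col) / d))
    # Weighted K-nearest: average the m values of the k closest samples.
    q = norm[-1]
    def dist(p):
        return math.sqrt(sum(l * (a - b) ** 2 for l, a, b in zip(lam, p, q)))
    nearest = sorted(zip(norm[:-1], (m for _, m in history)),
                     key=lambda pm: dist(pm[0]))[:k]
    return sum(m for _, m in nearest) / k
```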
(3) The implementation module is as follows:
firstly, the process number pid of the running JVM is found through the jps command (e.g., jps -q), and then the parameters of the JVM are dynamically modified through the jinfo command of the JVM (e.g., jinfo -flag name=value pid). Alternatively, a Dump file is generated through the jmap command of the JVM (e.g., jmap -dump:format=b,file=/tmp/map… pid).
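From the sidecar, these commands could be driven roughly as follows (a Python sketch using `subprocess`; the flag name `MaxHeapFreeRatio` and the Dump path are placeholders, and error handling is omitted):

```python
import subprocess

def jinfo_set_flag(pid, name, value):
    """Build the jinfo invocation that dynamically sets a JVM flag."""
    return ["jinfo", "-flag", f"{name}={value}", str(pid)]

def jmap_dump(pid, path):
    """Build the jmap invocation that writes a binary heap Dump file."""
    return ["jmap", f"-dump:format=b,file={path}", str(pid)]

def run(cmd):
    """Execute a command; in the sidecar this reaches the application
    container's JVM via the resources shared within the POD."""
    return subprocess.run(cmd, capture_output=True, text=True)
```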
(4) And a data result module:
on the one hand, the values of JVM heap memory changes are collected and recorded in a log file or other persistent medium. On the other hand, the JVM Dump file generated when an anomaly occurs is saved; a developer analyzes the Dump file with a Dump analysis tool, which makes it convenient to find the root cause of anomalies such as OOM in the application.
According to the Java virtual machine adjusting method provided by the embodiment of the application, the execution main body can be a Java virtual machine adjusting device. In the embodiment of the application, a Java virtual machine adjusting device executes a Java virtual machine adjusting method as an example, and the Java virtual machine adjusting device provided by the embodiment of the application is described.
Referring to fig. 4, an embodiment of the present application provides a Java virtual machine adjustment apparatus, where the apparatus is applied to a sidecar container, and the apparatus includes:
the collection module 401 is configured to collect multiple groups of configuration parameters of the Java virtual machine according to a preset unit time within a preset time period;
a selecting module 402, configured to select d groups of configuration parameters from the multiple groups of configuration parameters;
the training module 403 is configured to train to obtain an influence coefficient according to the d groups of configuration parameters;
the updating module 404 is configured to update a preset K-nearest neighbor algorithm according to the influence coefficient, so as to obtain an updated K-nearest neighbor algorithm;
A calculating module 405, configured to calculate a target configuration parameter according to the plurality of groups of configuration parameters and the updated K-nearest neighbor algorithm;
the Java virtual machine and the sidecar container are deployed in the same POD, d is a preset positive integer, and the configuration parameters comprise a plurality of configuration parameter dimensions.
Optionally, the training module is specifically configured to:
normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples;
calculating a configuration parameter mean value corresponding to each configuration parameter dimension in the d groups of normalized training tuples;
and training to obtain the influence coefficient corresponding to each configuration parameter dimension according to the d groups of normalized training tuples and the configuration parameter mean value.
Optionally, the updating module is specifically configured to:
the updated K-neighbor distance formula is determined according to the influence coefficients as:

D(X, Y) = √( Σ_{i=1}^{n} λ_i · (x_i − y_i)² )

wherein X is a first group of configuration parameters of the plurality of groups of configuration parameters, Y is a second group of configuration parameters of the plurality of groups of configuration parameters, n is the number of configuration parameter dimensions, i is the i-th of the n configuration parameter dimensions, x_i is the configuration parameter of the first group belonging to the i-th configuration parameter dimension, y_i is the configuration parameter of the second group belonging to the i-th configuration parameter dimension, and λ_i is the influence coefficient corresponding to the i-th configuration parameter dimension.
Optionally, the computing module is configured to:
obtaining a plurality of alternative configuration parameters according to a plurality of groups of configuration parameters and an updated K adjacent algorithm;
calculating the average value of a plurality of alternative configuration parameters;
and determining the average value of the plurality of alternative configuration parameters as a target configuration parameter.
Optionally, the configuration parameter dimension includes:
GC number dimension;
GC time dimension;
complete GC cycle time dimension.
Optionally, the training module is specifically configured to:
normalizing each group of parameters (c, t, ft) in the d groups of configuration parameters according to the following formulas to obtain a normalized training tuple (c′, t′, ft′):

c′ = (c − c_min) / (c_max − c_min)
t′ = (t − t_min) / (t_max − t_min)
ft′ = (ft − ft_min) / (ft_max − ft_min)

wherein (c′, t′, ft′) is the normalized training tuple, c′ being the training parameter corresponding to the GC count dimension, t′ the training parameter corresponding to the GC time dimension, and ft′ the training parameter corresponding to the Full GC cycle time dimension;

c is the configuration parameter corresponding to the GC count dimension among the configuration parameters currently being normalized, and c_min and c_max are respectively the smallest and the largest configuration parameter corresponding to the GC count dimension among the d groups of configuration parameters;

t is the configuration parameter corresponding to the GC time dimension among the configuration parameters currently being normalized, and t_min and t_max are respectively the smallest and the largest configuration parameter corresponding to the GC time dimension among the d groups of configuration parameters;

ft is the configuration parameter corresponding to the Full GC cycle time dimension among the configuration parameters currently being normalized, and ft_min and ft_max are respectively the smallest and the largest configuration parameter corresponding to the Full GC cycle time dimension among the d groups of configuration parameters.
Optionally, the training module is specifically configured to:
training according to the following formulas to obtain the influence coefficient corresponding to each configuration parameter dimension:

λ_c = √( ((c′_1 − c′_avg)² + (c′_2 − c′_avg)² + … + (c′_d − c′_avg)²) / d )
λ_t = √( ((t′_1 − t′_avg)² + (t′_2 − t′_avg)² + … + (t′_d − t′_avg)²) / d )
λ_ft = √( ((ft′_1 − ft′_avg)² + (ft′_2 − ft′_avg)² + … + (ft′_d − ft′_avg)²) / d )

wherein λ_c is the influence coefficient corresponding to the GC count dimension; c′_1, c′_2, …, c′_d are the training parameters corresponding to the GC count dimension in the 1st, 2nd, …, d-th normalized training tuples of the d groups; and c′_avg is the mean of the configuration parameters corresponding to the GC count dimension over the d groups of normalized training tuples;

λ_t is the influence coefficient corresponding to the GC time dimension; t′_1, t′_2, …, t′_d are the training parameters corresponding to the GC time dimension in the 1st, 2nd, …, d-th normalized training tuples; and t′_avg is the mean of the configuration parameters corresponding to the GC time dimension over the d groups of normalized training tuples;

λ_ft is the influence coefficient corresponding to the Full GC cycle time dimension; ft′_1, ft′_2, …, ft′_d are the training parameters corresponding to the Full GC cycle time dimension in the 1st, 2nd, …, d-th normalized training tuples; and ft′_avg is the mean of the configuration parameters corresponding to the Full GC cycle time dimension over the d groups of normalized training tuples.
Optionally, the apparatus further comprises:
the modification module is used for modifying the configuration parameters of the Java virtual machine through a first command of the Java virtual machine according to the target configuration parameters;
and the generation module is used for generating the Dump file through a second command of the Java virtual machine.
The Java virtual machine adjusting device in the embodiment of the present application may be an electronic device, for example, an electronic device with an operating system, or may be a component in an electronic device, for example, an integrated circuit or a chip. The electronic device may be a terminal, or may be other devices than a terminal. By way of example, the other device may be a server, network attached storage (Network Attached Storage, NAS), etc., and embodiments of the present application are not limited in detail.
The Java virtual machine adjusting device provided by the embodiment of the present application can realize each process realized by the above method embodiment, and achieve the same technical effects, and for avoiding repetition, the description is omitted here.
Referring to fig. 5, an embodiment of the present application provides an electronic device 500, including: at least one processor 501, memory 502, a user interface 503, and at least one network interface 504. The various components in the electronic device 500 are coupled together by a bus system 505.
It is understood that bus system 505 is used to enable connected communications between these components. The bus system 505 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various buses are labeled as bus system 505 in fig. 5.
The user interface 503 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen, etc.).
It is to be appreciated that the memory 502 in embodiments of the application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 502 described by embodiments of the present application is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 502 stores the following elements, executable modules or data structures, or a subset thereof, or an extended set thereof: an operating system 5021 and application programs 5022.
The operating system 5021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 5022 includes various application programs, such as a media player, a browser, and the like, for implementing various application services. A program for implementing the method according to the embodiment of the present invention may be included in the application 5022.
In an embodiment of the present invention, the electronic device 500 may further include a program stored on the memory 502 and executable on the processor 501; the program, when executed by the processor 501, implements the steps of the method provided by embodiments of the present invention.
The method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the processor 501. The processor 501 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware or by software instructions in the processor 501. The processor 501 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a computer readable storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The computer readable storage medium is located in the memory 502; the processor 501 reads the information in the memory 502 and, in combination with its hardware, performs the steps of the method described above. In particular, the computer readable storage medium has a computer program stored thereon.
It is to be understood that the embodiments of the application described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more ASICs, DSPs, Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), FPGAs, general purpose processors, controllers, microcontrollers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
The embodiment of the application also provides a readable storage medium, on which a program or an instruction is stored; when the program or the instruction is executed by a processor, each process of the above Java virtual machine adjusting method embodiment is implemented and the same technical effects can be achieved; to avoid repetition, details are not described again here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk. In some examples, the readable storage medium may be a non-transitory readable storage medium.
The embodiment of the application further provides a chip, which comprises a processor and a communication interface, wherein the communication interface is coupled with the processor, and the processor is configured to run programs or instructions to implement each process of the above Java virtual machine adjusting method embodiment and achieve the same technical effects; to avoid repetition, details are not described again here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-on-chip, a system chip, a chip system, or the like.
The embodiment of the present application further provides a computer program/program product, where the computer program/program product is stored in a storage medium and is executed by at least one processor to implement each process of the above Java virtual machine adjusting method embodiment and achieve the same technical effects; to avoid repetition, details are not described again here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may also be performed in a substantially simultaneous manner or in a reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the description of the embodiments above, it will be apparent to those skilled in the art that the methods of the above examples may be implemented by means of software plus a necessary general-purpose hardware platform, or by hardware. The computer software product is stored on a storage medium (such as a ROM, a RAM, a magnetic disk, or an optical disk) and includes instructions for causing a terminal or network-side device to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above specific embodiments, which are merely illustrative and not restrictive. Enlightened by the present application, many other forms may be made by those of ordinary skill in the art without departing from the spirit of the application and the scope of the claims, all of which fall within the protection of the present application.

Claims (18)

1. A Java virtual machine adjusting method, wherein the method is performed by a sidecar container, the method comprising:
collecting, within a preset time period, multiple groups of configuration parameters of the Java virtual machine at a preset unit-time interval;
selecting d groups of configuration parameters from the multiple groups of configuration parameters;
training to obtain influence coefficients according to the d groups of configuration parameters;
updating a preset K-nearest neighbor algorithm according to the influence coefficients to obtain an updated K-nearest neighbor algorithm; and
calculating target configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest neighbor algorithm;
wherein the Java virtual machine and the sidecar container are deployed in the same POD, d is a preset positive integer, and the configuration parameters comprise a plurality of configuration parameter dimensions.
2. The method of claim 1, wherein training to obtain the influence coefficients according to the d groups of configuration parameters comprises:
normalizing the d groups of configuration parameters to obtain d groups of normalized training tuples;
calculating a configuration parameter mean value corresponding to each configuration parameter dimension in the d groups of normalized training tuples; and
training according to the d groups of normalized training tuples and the configuration parameter mean values to obtain the influence coefficient corresponding to each configuration parameter dimension.
3. The method according to claim 2, wherein updating the preset K-nearest neighbor algorithm according to the influence coefficients to obtain the updated K-nearest neighbor algorithm comprises:
determining, according to the influence coefficients, the distance function of the updated K-nearest neighbor algorithm as:
dist(X, Y) = sqrt(w_1·(x_1 − y_1)^2 + w_2·(x_2 − y_2)^2 + … + w_n·(x_n − y_n)^2)
wherein X is a first group of configuration parameters of the multiple groups of configuration parameters, Y is a second group of configuration parameters of the multiple groups of configuration parameters, n is the number of configuration parameter dimensions, i is the i-th configuration parameter dimension of the n configuration parameter dimensions, x_i is the configuration parameter of the first group belonging to the i-th configuration parameter dimension, y_i is the configuration parameter of the second group belonging to the i-th configuration parameter dimension, and w_i is the influence coefficient corresponding to the i-th configuration parameter dimension.
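Read as a distance function, the weighted metric of claim 3 can be sketched in a few lines. This is an illustrative reading of the claim, not the patentee's reference implementation; the function and variable names are assumptions:

```python
import math

def weighted_distance(x, y, w):
    """Influence-coefficient-weighted Euclidean distance between two
    configuration-parameter tuples x and y (one value per dimension),
    with weight w_i applied to the i-th dimension as in claim 3."""
    return math.sqrt(sum(wi * (xi - yi) ** 2 for xi, yi, wi in zip(x, y, w)))

# Example: three dimensions (GC count, GC time, full GC cycle time),
# with the GC-time dimension weighted most heavily.
print(weighted_distance((1.0, 2.0, 3.0), (1.0, 0.0, 3.0), (0.2, 0.5, 0.3)))  # ≈ 1.414
```

A weight of zero makes a dimension irrelevant to the neighbor search, which is why the influence coefficients of claim 2 feed directly into this metric.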
4. The method according to claim 3, wherein calculating the target configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest neighbor algorithm comprises:
obtaining a plurality of candidate configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest neighbor algorithm;
calculating a mean value of the plurality of candidate configuration parameters; and
determining the mean value of the plurality of candidate configuration parameters as the target configuration parameters.
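Claims 1 to 4 together describe a pipeline: weight each dimension by its influence coefficient, find the nearest historical parameter groups, and average them into a target. A minimal sketch of that pipeline, with all names and the choice k=3 assumed for illustration:

```python
import math

def nearest_target(history, current, weights, k=3):
    """Pick the k historical configuration tuples closest to `current`
    under the influence-coefficient-weighted distance, then return the
    per-dimension mean of those candidates as the target configuration."""
    def dist(x, y):
        return math.sqrt(sum(w * (a - b) ** 2 for a, b, w in zip(x, y, weights)))
    candidates = sorted(history, key=lambda h: dist(h, current))[:k]
    # Per-dimension mean of the k nearest candidates (claim 4).
    return tuple(sum(col) / len(candidates) for col in zip(*candidates))

history = [(1.0, 1.0), (2.0, 2.0), (9.0, 9.0), (1.5, 1.5)]
print(nearest_target(history, (1.0, 1.0), (0.5, 0.5), k=3))  # (1.5, 1.5)
```

The outlier (9.0, 9.0) is excluded by the neighbor cutoff, so averaging only the k nearest groups keeps the target close to configurations that resembled the current one.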
5. The method of claim 2, wherein the configuration parameter dimensions comprise:
a garbage collection (GC) count dimension;
a GC time dimension; and
a full GC cycle time dimension.
6. The method of claim 5, wherein normalizing the d groups of configuration parameters to obtain the d groups of normalized training tuples comprises:
normalizing each group of parameters in the d groups of configuration parameters according to the following formulas to obtain a normalized training tuple:
T = (t_1, t_2, t_3), with t_1 = (a_1 − a_1_min) / (a_1_max − a_1_min), t_2 = (a_2 − a_2_min) / (a_2_max − a_2_min), t_3 = (a_3 − a_3_min) / (a_3_max − a_3_min)
wherein T is the normalized training tuple, t_1 is the training parameter corresponding to the GC count dimension in the normalized training tuple, t_2 is the training parameter corresponding to the GC time dimension in the normalized training tuple, and t_3 is the training parameter corresponding to the full GC cycle time dimension in the normalized training tuple;
a_1 is the configuration parameter corresponding to the GC count dimension among the configuration parameters currently undergoing normalization, a_1_min is the minimum configuration parameter corresponding to the GC count dimension in the d groups of configuration parameters, and a_1_max is the maximum configuration parameter corresponding to the GC count dimension in the d groups of configuration parameters;
a_2 is the configuration parameter corresponding to the GC time dimension among the configuration parameters currently undergoing normalization, a_2_min is the minimum configuration parameter corresponding to the GC time dimension in the d groups of configuration parameters, and a_2_max is the maximum configuration parameter corresponding to the GC time dimension in the d groups of configuration parameters;
a_3 is the configuration parameter corresponding to the full GC cycle time dimension among the configuration parameters currently undergoing normalization, a_3_min is the minimum configuration parameter corresponding to the full GC cycle time dimension in the d groups of configuration parameters, and a_3_max is the maximum configuration parameter corresponding to the full GC cycle time dimension in the d groups of configuration parameters.
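The min–max normalization of claim 6 is straightforward to sketch. The zero-range guard below is an added assumption, since the claim does not say what happens when a dimension's minimum equals its maximum:

```python
def normalize_groups(groups):
    """Min-max normalize d groups of (gc_count, gc_time, full_gc_cycle_time)
    tuples dimension by dimension, as in claim 6. A dimension whose min and
    max coincide is mapped to 0.0 to avoid division by zero (an added guard;
    the claim itself does not specify this case)."""
    mins = [min(col) for col in zip(*groups)]
    maxs = [max(col) for col in zip(*groups)]
    def scale(v, lo, hi):
        return 0.0 if hi == lo else (v - lo) / (hi - lo)
    return [tuple(scale(v, lo, hi) for v, lo, hi in zip(g, mins, maxs))
            for g in groups]

print(normalize_groups([(10, 0.2, 1.0), (30, 0.6, 3.0), (20, 0.4, 2.0)]))
```

After normalization every dimension lies in [0, 1], so GC counts (large integers) and GC times (fractions of a second) contribute on the same scale to the distance of claim 3.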
7. The method of claim 5, wherein training to obtain the influence coefficient corresponding to each configuration parameter dimension according to the d groups of normalized training tuples and the configuration parameter mean values comprises:
training according to the following formulas to obtain the influence coefficient corresponding to each configuration parameter dimension:
w_1 = ((t_1,1 − m_1)^2 + (t_2,1 − m_1)^2 + … + (t_d,1 − m_1)^2) / d
w_2 = ((t_1,2 − m_2)^2 + (t_2,2 − m_2)^2 + … + (t_d,2 − m_2)^2) / d
w_3 = ((t_1,3 − m_3)^2 + (t_2,3 − m_3)^2 + … + (t_d,3 − m_3)^2) / d
wherein w_1 is the influence coefficient corresponding to the GC count dimension, t_1,1 is the training parameter corresponding to the GC count dimension in the 1st normalized training tuple of the d groups of normalized training tuples, t_2,1 is the training parameter corresponding to the GC count dimension in the 2nd normalized training tuple, t_d,1 is the training parameter corresponding to the GC count dimension in the d-th normalized training tuple, and m_1 is the configuration parameter mean value corresponding to the GC count dimension in the d groups of normalized training tuples;
w_2 is the influence coefficient corresponding to the GC time dimension, t_1,2 is the training parameter corresponding to the GC time dimension in the 1st normalized training tuple, t_2,2 is the training parameter corresponding to the GC time dimension in the 2nd normalized training tuple, t_d,2 is the training parameter corresponding to the GC time dimension in the d-th normalized training tuple, and m_2 is the configuration parameter mean value corresponding to the GC time dimension in the d groups of normalized training tuples;
w_3 is the influence coefficient corresponding to the full GC cycle time dimension, t_1,3 is the training parameter corresponding to the full GC cycle time dimension in the 1st normalized training tuple, t_2,3 is the training parameter corresponding to the full GC cycle time dimension in the 2nd normalized training tuple, t_d,3 is the training parameter corresponding to the full GC cycle time dimension in the d-th normalized training tuple, and m_3 is the configuration parameter mean value corresponding to the full GC cycle time dimension in the d groups of normalized training tuples.
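The published text does not reproduce the formula image of claim 7; given the listed inputs (d per-dimension training parameters and their mean), one consistent variance-style reading is the mean squared deviation per dimension. The sketch below implements that assumed reading only:

```python
def influence_coefficients(tuples):
    """One coefficient per dimension from d normalized training tuples.
    Sketched as the per-dimension mean squared deviation from the dimension
    mean (a variance-style reading; the published text of claim 7 omits the
    formula image, so this exact form is an assumption)."""
    d = len(tuples)
    coeffs = []
    for col in zip(*tuples):           # one column per dimension
        mean = sum(col) / d
        coeffs.append(sum((t - mean) ** 2 for t in col) / d)
    return coeffs

# A dimension that never varies gets coefficient 0 and thus no influence
# on the weighted distance of claim 3.
print(influence_coefficients([(0.0, 0.0), (1.0, 0.0), (0.5, 0.0)]))
```

Under this reading, dimensions whose samples spread widely receive larger weights, so the neighbor search is dominated by the GC metrics that actually changed during the collection window.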
8. The method of claim 1, further comprising one or more of the following:
modifying the configuration parameters of the Java virtual machine, according to the target configuration parameters, through a first command of the Java virtual machine; and
generating a Dump file through a second command of the Java virtual machine.
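Claim 8 does not name its "first command" and "second command". On a stock JDK, one plausible concrete pairing is `jinfo -flag` (which can set writable "manageable" VM flags on a live process) and `jmap -dump` (which writes a heap Dump file); both tool choices are assumptions about the intended commands. The sketch below only assembles such command lines without executing them:

```python
def set_flag_cmd(pid, flag, value):
    """Command line for changing a manageable VM flag on a live JVM,
    e.g. jinfo -flag MaxHeapFreeRatio=40 12345."""
    return ["jinfo", "-flag", f"{flag}={value}", str(pid)]

def heap_dump_cmd(pid, path):
    """Command line for producing a binary heap dump,
    e.g. jmap -dump:format=b,file=/tmp/heap.hprof 12345."""
    return ["jmap", f"-dump:format=b,file={path}", str(pid)]

print(set_flag_cmd(12345, "MaxHeapFreeRatio", 40))
print(heap_dump_cmd(12345, "/tmp/heap.hprof"))
```

In a sidecar deployment these would typically be run via `subprocess.run` against the JVM process in the same POD; only flags marked manageable can be changed without restarting the JVM, which is one reason a restart-free sidecar tuner would favor them.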
9. A Java virtual machine adjusting apparatus, wherein the apparatus is applied to a sidecar container, the apparatus comprising:
a collection module, configured to collect, within a preset time period, multiple groups of configuration parameters of the Java virtual machine at a preset unit-time interval;
a selection module, configured to select d groups of configuration parameters from the multiple groups of configuration parameters;
a training module, configured to train to obtain influence coefficients according to the d groups of configuration parameters;
an updating module, configured to update a preset K-nearest neighbor algorithm according to the influence coefficients to obtain an updated K-nearest neighbor algorithm; and
a calculating module, configured to calculate target configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest neighbor algorithm;
wherein the Java virtual machine and the sidecar container are deployed in the same POD, d is a preset positive integer, and the configuration parameters comprise a plurality of configuration parameter dimensions.
10. The apparatus according to claim 9, wherein the training module is specifically configured to:
normalize the d groups of configuration parameters to obtain d groups of normalized training tuples;
calculate a configuration parameter mean value corresponding to each configuration parameter dimension in the d groups of normalized training tuples; and
train according to the d groups of normalized training tuples and the configuration parameter mean values to obtain the influence coefficient corresponding to each configuration parameter dimension.
11. The apparatus of claim 10, wherein the updating module is specifically configured to:
determine, according to the influence coefficients, the distance function of the updated K-nearest neighbor algorithm as:
dist(X, Y) = sqrt(w_1·(x_1 − y_1)^2 + w_2·(x_2 − y_2)^2 + … + w_n·(x_n − y_n)^2)
wherein X is a first group of configuration parameters of the multiple groups of configuration parameters, Y is a second group of configuration parameters of the multiple groups of configuration parameters, n is the number of configuration parameter dimensions, i is the i-th configuration parameter dimension of the n configuration parameter dimensions, x_i is the configuration parameter of the first group belonging to the i-th configuration parameter dimension, y_i is the configuration parameter of the second group belonging to the i-th configuration parameter dimension, and w_i is the influence coefficient corresponding to the i-th configuration parameter dimension.
12. The apparatus of claim 11, wherein the calculating module is configured to:
obtain a plurality of candidate configuration parameters according to the multiple groups of configuration parameters and the updated K-nearest neighbor algorithm;
calculate a mean value of the plurality of candidate configuration parameters; and
determine the mean value of the plurality of candidate configuration parameters as the target configuration parameters.
13. The apparatus of claim 10, wherein the configuration parameter dimensions comprise:
a GC count dimension;
a GC time dimension; and
a full GC cycle time dimension.
14. The apparatus according to claim 13, wherein the training module is specifically configured to:
normalize each group of parameters in the d groups of configuration parameters according to the following formulas to obtain a normalized training tuple:
T = (t_1, t_2, t_3), with t_1 = (a_1 − a_1_min) / (a_1_max − a_1_min), t_2 = (a_2 − a_2_min) / (a_2_max − a_2_min), t_3 = (a_3 − a_3_min) / (a_3_max − a_3_min)
wherein T is the normalized training tuple, t_1 is the training parameter corresponding to the GC count dimension in the normalized training tuple, t_2 is the training parameter corresponding to the GC time dimension in the normalized training tuple, and t_3 is the training parameter corresponding to the full GC cycle time dimension in the normalized training tuple;
a_1 is the configuration parameter corresponding to the GC count dimension among the configuration parameters currently undergoing normalization, a_1_min is the minimum configuration parameter corresponding to the GC count dimension in the d groups of configuration parameters, and a_1_max is the maximum configuration parameter corresponding to the GC count dimension in the d groups of configuration parameters;
a_2 is the configuration parameter corresponding to the GC time dimension among the configuration parameters currently undergoing normalization, a_2_min is the minimum configuration parameter corresponding to the GC time dimension in the d groups of configuration parameters, and a_2_max is the maximum configuration parameter corresponding to the GC time dimension in the d groups of configuration parameters;
a_3 is the configuration parameter corresponding to the full GC cycle time dimension among the configuration parameters currently undergoing normalization, a_3_min is the minimum configuration parameter corresponding to the full GC cycle time dimension in the d groups of configuration parameters, and a_3_max is the maximum configuration parameter corresponding to the full GC cycle time dimension in the d groups of configuration parameters.
15. The apparatus according to claim 13, wherein the training module is specifically configured to:
train according to the following formulas to obtain the influence coefficient corresponding to each configuration parameter dimension:
w_1 = ((t_1,1 − m_1)^2 + (t_2,1 − m_1)^2 + … + (t_d,1 − m_1)^2) / d
w_2 = ((t_1,2 − m_2)^2 + (t_2,2 − m_2)^2 + … + (t_d,2 − m_2)^2) / d
w_3 = ((t_1,3 − m_3)^2 + (t_2,3 − m_3)^2 + … + (t_d,3 − m_3)^2) / d
wherein w_1 is the influence coefficient corresponding to the GC count dimension, t_1,1 is the training parameter corresponding to the GC count dimension in the 1st normalized training tuple of the d groups of normalized training tuples, t_2,1 is the training parameter corresponding to the GC count dimension in the 2nd normalized training tuple, t_d,1 is the training parameter corresponding to the GC count dimension in the d-th normalized training tuple, and m_1 is the configuration parameter mean value corresponding to the GC count dimension in the d groups of normalized training tuples;
w_2 is the influence coefficient corresponding to the GC time dimension, t_1,2 is the training parameter corresponding to the GC time dimension in the 1st normalized training tuple, t_2,2 is the training parameter corresponding to the GC time dimension in the 2nd normalized training tuple, t_d,2 is the training parameter corresponding to the GC time dimension in the d-th normalized training tuple, and m_2 is the configuration parameter mean value corresponding to the GC time dimension in the d groups of normalized training tuples;
w_3 is the influence coefficient corresponding to the full GC cycle time dimension, t_1,3 is the training parameter corresponding to the full GC cycle time dimension in the 1st normalized training tuple, t_2,3 is the training parameter corresponding to the full GC cycle time dimension in the 2nd normalized training tuple, t_d,3 is the training parameter corresponding to the full GC cycle time dimension in the d-th normalized training tuple, and m_3 is the configuration parameter mean value corresponding to the full GC cycle time dimension in the d groups of normalized training tuples.
16. The apparatus of claim 9, wherein the apparatus further comprises:
a modification module, configured to modify the configuration parameters of the Java virtual machine, according to the target configuration parameters, through a first command of the Java virtual machine; and
a generation module, configured to generate a Dump file through a second command of the Java virtual machine.
17. An electronic device, comprising a processor and a memory storing a program or instructions executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the Java virtual machine adjusting method of any one of claims 1 to 8.
18. A readable storage medium, wherein a program or instructions are stored on the readable storage medium, and the program or instructions, when executed by a processor, implement the steps of the Java virtual machine adjusting method according to any one of claims 1 to 8.
CN202311162089.7A 2023-09-11 2023-09-11 Java virtual machine adjusting method and device, electronic equipment and readable storage medium Active CN116893885B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311162089.7A CN116893885B (en) 2023-09-11 2023-09-11 Java virtual machine adjusting method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311162089.7A CN116893885B (en) 2023-09-11 2023-09-11 Java virtual machine adjusting method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN116893885A true CN116893885A (en) 2023-10-17
CN116893885B CN116893885B (en) 2024-01-26

Family

ID=88315300

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311162089.7A Active CN116893885B (en) 2023-09-11 2023-09-11 Java virtual machine adjusting method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116893885B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117369954A (en) * 2023-12-08 2024-01-09 成都乐超人科技有限公司 JVM optimization method and device of risk processing framework for big data construction

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050262394A1 (en) * 2004-04-21 2005-11-24 Fuji Xerox Co., Ltd. Failure diagnosis method, failure diagnosis apparatus, conveyance device, image forming apparatus, program, and storage medium
CN102662836A (en) * 2012-03-28 2012-09-12 易云捷讯科技(北京)有限公司 Evaluation system and method for virtual machine
US20210266227A1 (en) * 2018-07-31 2021-08-26 Nippon Telegraph And Telephone Corporation Service chain accomodation apparatus and service chain accommodation method
CN116431498A (en) * 2023-04-13 2023-07-14 中国工商银行股份有限公司 Performance test method and device, electronic equipment and computer readable storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050262394A1 (en) * 2004-04-21 2005-11-24 Fuji Xerox Co., Ltd. Failure diagnosis method, failure diagnosis apparatus, conveyance device, image forming apparatus, program, and storage medium
CN102662836A (en) * 2012-03-28 2012-09-12 易云捷讯科技(北京)有限公司 Evaluation system and method for virtual machine
US20210266227A1 (en) * 2018-07-31 2021-08-26 Nippon Telegraph And Telephone Corporation Service chain accomodation apparatus and service chain accommodation method
CN116431498A (en) * 2023-04-13 2023-07-14 中国工商银行股份有限公司 Performance test method and device, electronic equipment and computer readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117369954A (en) * 2023-12-08 2024-01-09 成都乐超人科技有限公司 JVM optimization method and device of risk processing framework for big data construction
CN117369954B (en) * 2023-12-08 2024-03-05 成都乐超人科技有限公司 JVM optimization method and device of risk processing framework for big data construction

Also Published As

Publication number Publication date
CN116893885B (en) 2024-01-26

Similar Documents

Publication Publication Date Title
CN116893885B (en) Java virtual machine adjusting method and device, electronic equipment and readable storage medium
WO2021169473A1 (en) Model performance optimization method, apparatus and device, and storage medium
US8370359B2 (en) Method to perform mappings across multiple models or ontologies
US8832143B2 (en) Client-side statement cache
US20220253856A1 (en) System and method for machine learning based detection of fraud
WO2015058578A1 (en) Method, apparatus and system for optimizing distributed computation framework parameters
WO2021004324A1 (en) Resource data processing method and apparatus, and computer device and storage medium
CN106293891B (en) Multidimensional investment index monitoring method
CN110399268B (en) Abnormal data detection method, device and equipment
WO2020233360A1 (en) Method and device for generating product evaluation model
CN116561542B (en) Model optimization training system, method and related device
CN113434482A (en) Data migration method and device, computer equipment and storage medium
US10782942B1 (en) Rapid onboarding of data from diverse data sources into standardized objects with parser and unit test generation
CN106919501A (en) Static Analysis Method and instrument based on defect mode
CN110992198A (en) Crop disease control scheme recommendation method, device, system, equipment and medium
CN110928941B (en) Data fragment extraction method and device
CN117172093A (en) Method and device for optimizing strategy of Linux system kernel configuration based on machine learning
Shan et al. Software defect prediction model based on improved LLE-SVM
CN109344050B (en) Interface parameter analysis method and device based on structure tree
CN115187191A (en) Scientific research project progress monitoring method and system based on teaching centralized control management
CN113986860A (en) Log classification method, system, device and medium based on convolutional neural network
US20210286809A1 (en) System for generating predicate-weighted histograms
CN115203500A (en) Method and device for enriching user tags, computer equipment and storage medium
TWI764474B (en) Data compression system and method thereof
CN116739545B (en) Method and device for improving intelligent message touch rate

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant