CN112861964B - Network function extension detection method and device, storage medium and electronic equipment

Network function extension detection method and device, storage medium and electronic equipment

Info

Publication number: CN112861964B
Application number: CN202110163214.0A
Authority: CN (China)
Prior art keywords: network function, chain, function, network, representation vector
Other languages: Chinese (zh)
Other versions: CN112861964A
Legal status: Active (granted)
Inventors: 刘莹 (Liu Ying), 何林 (He Lin), 李丽姗 (Li Lishan)
Original and current assignee: Tsinghua University
Application filed by Tsinghua University
Priority to CN202110163214.0A
Publication of application CN112861964A
Publication of granted patent CN112861964B

Classifications

    • G06F18/2415: Pattern recognition; classification techniques relating to the classification model, based on parametric or probabilistic models (e.g. likelihood ratio or false acceptance rate versus false rejection rate)
    • G06N3/048: Computing arrangements based on biological models; neural networks; architecture; activation functions
    • G06N3/08: Computing arrangements based on biological models; neural networks; learning methods

Abstract

The invention provides a network function extension detection method and apparatus, a storage medium, and an electronic device. The method comprises: collecting state feature information of network functions and service function chain information; converting the state feature information of the network functions and the service function chain information into chain-aware network function representation vectors of a preset length; and inputting the chain-aware network function representation vectors over a continuous time interval into a preset decision model for an extension decision, and deciding an extension action. The invention improves the accuracy and timeliness of extension detection, which helps improve system performance and reduce system operation and maintenance overhead.

Description

Network function extension detection method and device, storage medium and electronic equipment
Technical Field
The present invention relates to the field of internet technologies, and in particular to a network function extension detection method and apparatus, a storage medium, and an electronic device.
Background
In real networks, a large number of network functions (also called middleboxes or middleware) are deployed to perform data-processing tasks beyond routing and forwarding; typical examples include DHCP (Dynamic Host Configuration Protocol), NAT (Network Address Translation), IDS (Intrusion Detection System), and IPS (Intrusion Prevention System). Conventional network functions are implemented on proprietary hardware devices. In recent years, network function virtualization has been proposed and extensively studied with the aim of replacing these proprietary hardware middleboxes with software-implemented network functions. Most NFV (Network Functions Virtualization) systems adopt the design concept of "SDN (Software-Defined Networking) + NF (Network Function)": the control plane is decoupled from the data plane, and centralized control is used to simplify and standardize operation and maintenance management. The data plane is responsible for the underlying physical implementation of virtual network functions on general-purpose servers, while the control plane is responsible for network function orchestration and centralized operation and maintenance management.
One of the core advantages of an NFV system is the flexibility of online resource scheduling. Facing dynamically changing traffic load, the NFV system scales elastically in units of instances. Each instance (e.g., a NAT, firewall, or intrusion detection system) corresponds to a virtual machine that can schedule a fixed amount of underlying resources. Using virtualization technology, the NFV system can mirror, copy, and start new instances of a network function, and a unified controller distributes ingress traffic across multiple instances for cooperative processing. The ultimate goal of elastic scaling is twofold: to meet the system performance requirement while using as few instances as possible, that is, to guarantee system performance with the least resources.
In an NFV system, a centralized controller monitors multidimensional state information of each network function, including ingress traffic rate, egress traffic rate, CPU utilization, memory utilization, and packet processing delay. On this basis, facing dynamically changing traffic demand, the centralized controller dynamically starts or destroys instances to achieve elastic scaling of network functions and dynamic scheduling of underlying resources, ensuring stable online operation of the system. The purpose of extension detection is to capture bottleneck network functions accurately and in time and to allocate more resources in units of instances; the accuracy and timeliness of extension detection therefore directly affect system performance and operation and maintenance overhead.
Disclosure of Invention
To improve the accuracy and timeliness of extension detection, the invention provides a network function extension detection method and apparatus, a storage medium, and an electronic device, which help improve system performance and reduce system operation and maintenance overhead.
In a first aspect, an embodiment of the present invention provides a method for detecting network function extension, including:
collecting state feature information of network functions and service function chain information;
converting the state feature information of the network functions and the service function chain information into a chain-aware network function representation vector of a preset length;
and inputting the chain-aware network function representation vectors over a continuous time interval into a preset decision model for an extension decision, and deciding an extension action.
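The three steps above can be pictured as a small pipeline. The sketch below is illustrative only: the window length M, the feature ordering, and the function names are assumptions of this example, and the placeholder bodies stand in for the representation and decision models described later.

```python
# Illustrative end-to-end flow of the three steps with stand-in data.
import numpy as np

M = 8                 # assumed length of the continuous time interval [1, M]
FEATURES = 5          # ingress rate, egress rate, delay, CPU, memory

def collect_state(num_nfs: int) -> np.ndarray:
    """Step 1: per-network-function state features at one time step."""
    return np.random.rand(num_nfs, FEATURES)          # stand-in for monitored values

def build_representation(states: np.ndarray) -> np.ndarray:
    """Step 2: convert raw features into fixed-length chain-aware vectors.
    A real implementation also aggregates downstream and chain-level information."""
    return states                                     # placeholder: identity mapping

def decide_action(window: np.ndarray) -> np.ndarray:
    """Step 3: decide one of {add instance, no extension, remove instance} per NF.
    Stands in for the GRU-based decision model described below."""
    return np.random.randint(0, 3, size=window.shape[1])

window = np.stack([build_representation(collect_state(4)) for _ in range(M)])
print(decide_action(window))                          # e.g. [0 2 1 1]
```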
In some embodiments, converting the state feature information of the network functions and the service function chain information into a chain-aware network function representation vector of a preset length includes:
generating a network function level representation vector for characterizing the state feature information of each network function;
generating a global service function chain level representation vector for characterizing the service function chain information, wherein the service function chain information comprises the connection relationships between network functions;
and generating a chain-aware network function representation vector for characterizing each network function, which comprises the network function level representation vector of that network function and the global service function chain level representation vector.
In some embodiments, the extension action is one of the following (a minimal encoding of these actions is sketched after this list):
adding a new instance;
no extension is made; and
one instance is removed.
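A minimal sketch of encoding these three actions for use in the later examples; the integer values are an assumption of this illustration, not part of the patent.

```python
from enum import IntEnum

class ExtensionAction(IntEnum):
    ADD_INSTANCE = 0      # add a new instance (scale out)
    NO_EXTENSION = 1      # keep the current number of instances
    REMOVE_INSTANCE = 2   # remove one instance (scale in)
```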
In some embodiments, the state feature information of the network function includes at least one of the following (one possible container for these features is sketched after this list):
ingress traffic rate;
egress traffic rate;
packet processing delay;
CPU utilization; and
memory utilization.
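One possible container for these per-network-function features; the field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class NFState:
    ingress_rate: float      # ingress traffic rate of the NF
    egress_rate: float       # egress traffic rate of the NF
    processing_delay: float  # packet processing delay
    cpu_util: float          # CPU utilization of the instance
    mem_util: float          # memory utilization of the instance

    def as_vector(self) -> list:
        """Initial state feature vector of the NF at one time step."""
        return [self.ingress_rate, self.egress_rate, self.processing_delay,
                self.cpu_util, self.mem_util]
```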
In some embodiments, generating the network function level representation vector comprises:
generating the network function level representation vector $v_i^t$ of network function i at time t from the corresponding equations (given as images in the original and not reproduced here), wherein network function j is a downstream node of network function i; $h_i^t$ denotes the state feature information of network function i at time t mapped into an implicit state space; $h_j^t$ denotes the state feature information of network function j at time t mapped into the implicit state space; $W_i$ and $b_i$ denote a weight matrix and a bias of network function i, respectively; and d denotes a hyper-parameter.
In some embodiments, generating the global service function chain level representation vector comprises:
generating the global service function chain level representation vector $v_C^t$ from the corresponding equation (given as an image in the original and not reproduced here), wherein C is a virtual service function chain node, all network functions are downstream nodes of the virtual node C, $h_j^t$ denotes the state feature information of network function j at time t mapped into the implicit state space, and d denotes a hyper-parameter.
In some embodiments, inputting the chain-aware network function representation vectors over the continuous time interval into a preset decision model for an extension decision, and deciding an extension action, comprises:
taking the chain-aware network function representation vectors within the continuous time interval [1, M] as the input sequence $\{x_i^1, x_i^2, \ldots, x_i^M\}$ and feeding them into a preset decision model; the preset decision model is a gated recurrent unit:
$$z_i^k = \sigma\left(W_i^z x_i^k + U_i^z o_i^{k-1}\right)$$
$$r_i^k = \sigma\left(W_i^r x_i^k + U_i^r o_i^{k-1}\right)$$
$$\tilde{o}_i^k = \tanh\left(W_i^o x_i^k + r_i^k \odot U_i^o o_i^{k-1}\right)$$
$$o_i^k = \left(1 - z_i^k\right) \odot o_i^{k-1} + z_i^k \odot \tilde{o}_i^k$$
wherein the current time k ∈ [1, M]; $x_i^k$ is the chain-aware network function representation vector at the current time k; σ is the sigmoid function; $z_i^k$ denotes the update gate of network function i; $r_i^k$ denotes the reset gate of network function i; $\tilde{o}_i^k$ denotes the memory content of network function i at the current time k; $o_i^k$ denotes the output of the memory content of network function i at the current time k; ⊙ denotes the Hadamard product; $W_i^z, W_i^r, W_i^o$ and $U_i^z, U_i^r, U_i^o$ denote weight parameters of network function i; and d denotes a hyper-parameter.
For the output $o_i^M$ of the memory content of network function i at time M, the probability $p_i$ of each extension action of network function i is calculated with the softmax function:
$$p_i = \mathrm{softmax}\left(W_i^p o_i^M + b_i^p\right)$$
wherein $W_i^p$ and $b_i^p$ denote a weight matrix and a bias of network function i, respectively;
the extension action with the largest probability $p_i$ is taken as the decided extension action of network function i.
In a second aspect, an embodiment of the present invention provides a network function extension detection apparatus, including:
a collection module, used to collect state feature information of the network functions and service function chain information;
a conversion module, used to convert the state feature information of the network functions and the service function chain information into a chain-aware network function representation vector of a preset length;
and a decision module, used to input the chain-aware network function representation vectors over a continuous time interval into a preset decision model for an extension decision and to decide an extension action.
In a third aspect, an embodiment of the present invention provides a storage medium, on which a computer program is stored, and when the computer program is executed by one or more processors, the method according to the first aspect is implemented.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor, where the memory stores a computer program, and the computer program implements the method according to the first aspect when executed by the processor.
One or more embodiments of the invention have at least the following beneficial effects:
the invention provides a network function extension detection method, a device, a storage medium and electronic equipment, which are used for collecting state characteristic information and service function chain information of a network function and converting the state characteristic information and the service function chain information of the network function into a chain sensing network function expression vector with a preset length; and the chain sensing network function expression vector in the continuous time interval is input into a preset decision model to carry out an extension decision, and an extension action is decided, so that the accuracy and timeliness of extension detection are improved, and the improvement of the system performance and the reduction of the system operation and maintenance overhead are facilitated.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be regarded as limiting its scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
Fig. 1 shows the effect of the network function connection order on the timeliness of the extension detection, where (1) NFV overload, (2) network monitoring becomes NFV system bottleneck, (3) virtual private network becomes NFV system bottleneck, and (4) NFV system completes resource scheduling;
fig. 2 is a flowchart of a network function expansion detection method according to an embodiment of the present invention;
fig. 3 is a flowchart of a network function expansion detection method according to an embodiment of the present invention;
FIG. 4 is an example of a chain-aware network function representation vector of a preset length provided by an embodiment of the present invention;
fig. 5 is a block diagram of a network function expansion detection apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. The components of the embodiments of the present invention, as generally described and illustrated in the figures, may be arranged and designed in a wide variety of configurations. The following detailed description of the embodiments is therefore not intended to limit the scope of the claimed invention but merely represents selected embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort fall within the protection scope of the present invention.
The applicant has analyzed the impact of the network function connection order on the timeliness of extension detection as follows:
In an NFV system, the resource scheduling policy of each network function is affected by its neighboring network function nodes. As shown in fig. 1(1), the upstream low-throughput network monitoring function first detects an extension requirement because it detects NFV system overload, and performs the related elastic extension operations, including instance start-up and instance migration. Then, as shown in fig. 1(2), network monitoring becomes the bottleneck of the NFV system; as shown in fig. 1(3), network monitoring is extended and the virtual private network becomes the bottleneck; and as shown in fig. 1(4), the virtual private network is extended.
It can be seen that, while the network monitoring instances are being started, the downstream high-throughput virtual private network cannot yet sense the increased traffic; it can sniff its own extension demand only after network monitoring has completely finished its elastic extension and the throughput of the overall NFV system has increased. Typically, the total time for the elastic extension of a single network function, including instance start-up and instance migration, is on the order of minutes. As a result, the overall resource scheduling of the system is delayed by several minutes and system performance cannot be effectively guaranteed. How to avoid the influence of the network function connection order on the timeliness of network function resource scheduling has therefore become an important research problem.
The embodiments of the invention provide a network function extension detection method and apparatus, a storage medium, and an electronic device, so that extension detection can be performed accurately and in time, avoiding the influence of the network function connection order on the timeliness of network function resource scheduling.
Example one
Fig. 2 shows a flowchart of a network function extension detection method. As shown in fig. 2, this embodiment provides a network function extension detection method that includes steps S210 to S230:
step S210, collecting status feature information of the network function and service function chain information.
In some embodiments, the state feature information of the network function includes at least one of:
ingress traffic rate;
egress traffic rate;
packet processing delay;
CPU utilization; and
memory utilization.
In practical applications, for any network function i, the initial state feature information of network function i at time t is the vector $s_i^t$ whose components are, in order: the ingress traffic rate of network function i at time t, the egress traffic rate of network function i at time t, the packet processing delay of network function i at time t, the CPU utilization of network function i at time t, and the memory utilization of network function i at time t.
A service function chain includes any number of network functions connected to one another in any manner. Regarding the service function chain as a graph, the service function chain information is represented by a directed graph C = (V, E), where V is the set of all network functions in the service function chain and E is the set of directed edges (connections) between network functions in the service function chain.
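As a concrete illustration of C = (V, E), the sketch below builds a small service function chain as a directed graph; the use of networkx and the VNF names are assumptions of this example, not part of the patent.

```python
import networkx as nx

# V: network functions; E: directed connections between them
sfc = nx.DiGraph()
sfc.add_nodes_from(["VNF1", "VNF2", "VNF3", "VNF4"])
sfc.add_edges_from([("VNF1", "VNF2"), ("VNF2", "VNF3"), ("VNF2", "VNF4")])

# Downstream nodes of a network function, as used by the aggregation step below
print(list(sfc.successors("VNF2")))   # ['VNF3', 'VNF4']
```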
Step S220, converting the state feature information of the network function and the service function chain information into a chain-aware network function representation vector of a preset length.
In some embodiments, as shown in fig. 3, converting the state feature information of the network function and the service function chain information into a chain-aware network function representation vector of a preset length, further includes steps S310 to S330:
step S310, generating a network function level expression vector for describing the state characteristic information of the network function.
In some embodiments, assuming that network function j is a downstream node of network function i, i.e. the edge i → j exists, generating the network function level representation vector comprises:
generating the network function level representation vector $v_i^t$ of network function i at time t from the corresponding equations (given as images in the original and not reproduced here), wherein network function j is a downstream node of network function i; $h_i^t$ denotes the state feature information $s_i^t$ of network function i at time t mapped into an implicit state space; $h_j^t$ denotes the state feature information of network function j at time t mapped into the implicit state space; $W_i$ and $b_i$ denote a weight matrix and a bias of network function i, respectively, which can be randomly initialized and updated during training of the decision model; and d is a hyper-parameter denoting the dimension of the vectors.
The generated network function level representation vector can effectively characterize each network function and the connection relation between the network function node and the downstream network function node.
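A hedged sketch of one way to realize this step: each network function's raw features are mapped into a d-dimensional implicit state, and that state is aggregated with the implicit states of its downstream network functions through a learned layer. The concrete choices here (sum of downstream states, concatenation, ReLU) are assumptions for illustration; the patent defines the exact equations.

```python
import torch
import torch.nn as nn

class NFLevelEncoder(nn.Module):
    """Maps s_i^t to h_i^t and aggregates h_i^t with downstream h_j^t into v_i^t."""
    def __init__(self, num_features: int = 5, d: int = 16):
        super().__init__()
        self.to_hidden = nn.Linear(num_features, d)   # s -> implicit state h
        self.aggregate = nn.Linear(2 * d, d)          # [h_i ; sum_j h_j] -> v_i

    def forward(self, s_i: torch.Tensor, s_downstream: torch.Tensor) -> torch.Tensor:
        # s_i: (num_features,); s_downstream: (num_downstream, num_features)
        h_i = torch.relu(self.to_hidden(s_i))
        h_down = torch.relu(self.to_hidden(s_downstream)).sum(dim=0)  # zeros if none
        return torch.relu(self.aggregate(torch.cat([h_i, h_down])))

encoder = NFLevelEncoder()
v_i = encoder(torch.rand(5), torch.rand(2, 5))   # an NF with two downstream NFs
print(v_i.shape)                                 # torch.Size([16])
```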
Step S320, generating a global service function chain level representation vector for characterizing the service function chain information, which includes the connection relationships between network functions.
In some embodiments, assuming that there is a global service function chain virtual node C and that all network functions are downstream nodes of the virtual node C, generating the global service function chain level representation vector comprises:
generating the global service function chain level representation vector $v_C^t$ from the corresponding equation (given as an image in the original and not reproduced here), wherein C is the virtual service function chain node, all network functions are downstream nodes of the virtual node C, and $h_j^t$ denotes the state feature information of network function j at time t mapped into the implicit state space.
Step S330, generating a chain-aware network function representation vector for characterizing each network function, which comprises the network function level representation vector of that network function and the global service function chain level representation vector.
For network function i, the network function level representation vector $v_i^t$ and the global service function chain level representation vector $v_C^t$ are aggregated to obtain the chain-aware network function representation vector $x_i^t$ of network function i at time t. Through $x_i^t$, network function i can be accurately characterized.
An example of converting the state feature information of the network functions and the service function chain information into chain-aware network function representation vectors of a preset length is shown in fig. 4: network functions VNF1~VNF7 form a service function chain; for each network function its network function level representation vector v1~v7 is generated, and a global service function chain level representation vector vc is generated; then the network function level representation vectors v1~v7 are aggregated with the global service function chain level representation vector vc to obtain the chain-aware network function representation vector of each network function.
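A sketch of this assembly step for the chain of fig. 4: a global chain-level vector is pooled from the implicit states of all network functions and aggregated with each network function level vector. Mean pooling and concatenation are illustrative assumptions; the patent defines the exact aggregation.

```python
import torch
import torch.nn as nn

d, num_nfs = 16, 7                              # VNF1 ... VNF7 as in fig. 4
v_nf = torch.randn(num_nfs, d)                  # NF-level vectors v1 ... v7
h = torch.randn(num_nfs, d)                     # implicit states of all NFs

chain_level = nn.Linear(d, d)
v_c = torch.relu(chain_level(h.mean(dim=0)))    # global chain-level vector vc

# Chain-aware vector of each NF: aggregate (here: concatenate) v_i with vc
chain_aware = torch.cat([v_nf, v_c.expand(num_nfs, d)], dim=1)
print(chain_aware.shape)                        # torch.Size([7, 32])
```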
Considering that the chain-aware network function representation vector of a single time point cannot predict an extension requirement in advance and is easily disturbed by noise, this embodiment uses the feature sequence over a continuous time interval as the model input, so as to construct the complex correspondence between the sequence of chain-aware network function representation vectors and the extension action.
Step S230, inputting the chain-aware network function representation vectors over the continuous time interval into a preset decision model for an extension decision, and deciding an extension action.
The method not only takes the state feature information of the network functions as input but also takes the global service function chain information as input to the neural network model, thereby adding service function chain information to the extension detection strategy of the NFV system. Considering the complexity and variability of the input information, a neural network model is used to make the extension decision and to establish the relationship between the high-dimensional feature sequence and the extension action. Because the input of a neural network is a vector of a specific length, the initially collected network function state feature information and service function chain information are converted into fixed-length chain-aware network function representation vectors that describe the state of each network function and the connection relationships between them; these vectors are used as the input of a GRU (Gated Recurrent Unit) neural network model, which realizes extension detection of the network functions and outputs the extension operation that the NFV system should execute.
In some embodiments, the extension action is one of the following:
adding a new instance;
no extension is made; and
one instance is removed.
In some embodiments, the decision model may employ a Gated Recurrent Unit (GRU); in practice, the GRU can be trained under various neural network frameworks (for example, deep learning frameworks such as TensorFlow or PyTorch).
Inputting the chain-aware network function representation vectors over the continuous time interval into the preset decision model for an extension decision, and deciding an extension action, further comprises the following steps:
taking the chain-aware network function representation vectors within the continuous time interval [1, M] as the input sequence $\{x_i^1, x_i^2, \ldots, x_i^M\}$ and feeding them into the preset decision model; the preset decision model is a GRU:
$$z_i^k = \sigma\left(W_i^z x_i^k + U_i^z o_i^{k-1}\right)$$
$$r_i^k = \sigma\left(W_i^r x_i^k + U_i^r o_i^{k-1}\right)$$
$$\tilde{o}_i^k = \tanh\left(W_i^o x_i^k + r_i^k \odot U_i^o o_i^{k-1}\right)$$
$$o_i^k = \left(1 - z_i^k\right) \odot o_i^{k-1} + z_i^k \odot \tilde{o}_i^k$$
wherein the current time k ∈ [1, M]; $x_i^k$ is the chain-aware network function representation vector at the current time k; σ is the sigmoid function; $z_i^k$ denotes the update gate of network function i (one of the two gates of the GRU), which determines how much past information needs to be passed on to future steps; $r_i^k$ denotes the reset gate of network function i (the other gate of the GRU), which determines how much past information needs to be forgotten; $\tilde{o}_i^k$ denotes the memory content of network function i at the current time k; $o_i^k$ denotes the output of the memory content of network function i at the current time k; ⊙ denotes the Hadamard product; $W_i^z, W_i^r, W_i^o$ and $U_i^z, U_i^r, U_i^o$ denote weight parameters of network function i; and d denotes a hyper-parameter.
The output $o_i^M$ of the memory content of network function i at time M carries the information of the chain-aware network function representation vectors in the current continuous time interval [1, M], as determined by the update gate.
For the output $o_i^M$ of the memory content of network function i at time M, the probability $p_i$ of the different extension actions of network function i is calculated with the softmax function as the extension action decision function:
$$p_i = \mathrm{softmax}\left(W_i^p o_i^M + b_i^p\right)$$
wherein $W_i^p$ and $b_i^p$ denote a weight matrix and a bias of network function i, respectively; it is understood that their output dimension is 3, corresponding to the three different extension actions, i.e. adding a new instance, making no extension, and removing one instance.
The extension action with the largest probability $p_i$ is taken as the decided extension action of network function i.
By taking the chain-aware network function representation vectors of each network function as input, the probability $p_i$ of each extension action can be calculated for every network function; the extension action corresponding to the maximum probability is the decided extension action of network function i, and it is then executed. The extension actions of multiple network functions can be carried out synchronously, and an extension demand can be sniffed without waiting for the network monitoring process to finish, which avoids delaying system resource scheduling and avoids the influence of the network function connection order on the timeliness of network function resource scheduling.
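A hedged sketch of the decision step using a standard GRU layer followed by a softmax head, as outlined above; the layer sizes, the window length, and the use of torch.nn.GRU (which implements the update/reset-gate recurrence) are assumptions of this illustration rather than the patent's exact parameterization.

```python
import torch
import torch.nn as nn

class ExtensionDecisionModel(nn.Module):
    """GRU over the window [1, M], softmax over the three extension actions."""
    def __init__(self, input_dim: int = 32, d: int = 16, num_actions: int = 3):
        super().__init__()
        self.gru = nn.GRU(input_dim, d, batch_first=True)  # update/reset gates inside
        self.head = nn.Linear(d, num_actions)               # maps o_i^M to action scores

    def forward(self, seq: torch.Tensor):
        # seq: (num_nfs, M, input_dim) -- one vector sequence per network function
        _, o_M = self.gru(seq)                               # o_M: (1, num_nfs, d)
        probs = torch.softmax(self.head(o_M.squeeze(0)), dim=-1)
        return probs, probs.argmax(dim=-1)                   # decided action per NF

model = ExtensionDecisionModel()
window = torch.randn(7, 8, 32)       # 7 NFs, M = 8 time steps, chain-aware dim 32
probs, actions = model(window)
print(actions)                       # e.g. tensor([0, 1, 1, 2, 1, 0, 1])
```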
In this embodiment, during network monitoring and in the face of dynamically changing traffic load, the network function extension detection method performs real-time extension detection for every network function in the service function chain and calculates the extension action corresponding to each network function, so that elastic extension can be carried out in time, the influence of the network function connection order on the timeliness of network function resource scheduling is avoided, and system performance is guaranteed.
It should be understood that, although the steps in the flowchart of fig. 2 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Example two
Fig. 5 is a block diagram of a network function extension detection apparatus. As shown in fig. 5, this embodiment provides a network function extension detection apparatus, including:
a collecting module 510, configured to collect status feature information of the network function and service function chain information;
a conversion module 520, configured to convert the state feature information of the network function and the service function chain information into a chain-aware network function representation vector with a preset length;
the decision module 530 is configured to input the chain-aware network function representation vector in the continuous time interval into a preset decision model to perform an extended decision, and decide an extended action.
It is understood that the collecting module 510 may be configured to execute step S210 in the first embodiment, the conversion module 520 may be configured to execute step S220 in the first embodiment, and the decision module 530 may be configured to execute step S230 in the first embodiment.
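A hedged skeleton of how the three modules could be wired together; the class and method names are illustrative and simply mirror steps S210 to S230.

```python
class NetworkFunctionExtensionDetector:
    """Apparatus sketch: collection, conversion, and decision modules."""
    def __init__(self, collector, converter, decision_model):
        self.collector = collector            # collection module (step S210)
        self.converter = converter            # conversion module (step S220)
        self.decision_model = decision_model  # decision module (step S230)

    def detect(self, window_length: int):
        # Gather a window of chain-aware vectors, then decide one action per NF
        window = [self.converter(self.collector()) for _ in range(window_length)]
        return self.decision_model(window)
```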
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device. They may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented as program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, or they may be fabricated as separate integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. The present invention is not limited to any specific combination of hardware and software.
Example three
The present embodiment provides a storage medium having a computer program stored thereon, which when executed by one or more processors implements the method in the above embodiments.
The storage medium may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
Example four
The present embodiment provides an electronic device, which includes a memory and a processor, wherein the memory stores a computer program, and the computer program implements the method in the foregoing embodiments when executed by the processor.
The processor may be an Application-Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field-Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or another electronic component; for details of the method, reference is made to the foregoing embodiments, which are not repeated here.
In practical applications, the processor may be the controller of the NFV system, which calculates the corresponding extension action from the network function state feature information and the service function chain information collected from the data plane.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method can be implemented in other ways. The above-described apparatus and method embodiments are merely illustrative.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Although the embodiments of the present invention have been described above, the above description is provided only to facilitate understanding of the present invention and is not intended to limit it. It will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (9)

1. A network function extension detection method, characterized by comprising the following steps:
collecting state feature information of network functions and service function chain information;
converting the state feature information of the network functions and the service function chain information into a chain-aware network function representation vector of a preset length;
inputting the chain-aware network function representation vectors over a continuous time interval into a preset decision model for an extension decision, and deciding an extension action;
the method for inputting the chain perception network function representation vector in the continuous time interval into a preset decision model to carry out an extension decision to decide out an extension action comprises the following steps:
will continue for a time interval [1, M]Inner chain-aware network function representation vector as input sequence
Figure FDA0003754000430000011
Inputting a preset decision model; the preset decision model is as follows:
Figure FDA0003754000430000012
Figure FDA0003754000430000013
Figure FDA0003754000430000014
Figure FDA0003754000430000015
wherein, the current time k belongs to [1, M ]]The chain-aware network function representation vector at the current time k is
Figure FDA0003754000430000016
σ is a sigmoid function;
Figure FDA0003754000430000017
an update gate representing a network function i;
Figure FDA0003754000430000018
a reset gate representing a network function i;
Figure FDA0003754000430000019
representing the memory content of the network function i at the current time k;
Figure FDA00037540004300000110
the output of the memory content of the network function i at the current time k is shown; an indication of Hadamard multiplication operation;
Figure FDA00037540004300000111
Figure FDA00037540004300000112
respectively representing the weight parameter of the network function i, and d represents the hyper-parameter;
is in time for network function iOutputting of memory content of engraving M
Figure FDA00037540004300000113
Calculating probability p of expansion action of network function i by using softmax function i
Figure FDA00037540004300000114
Wherein the content of the first and second substances,
Figure FDA00037540004300000115
and
Figure FDA00037540004300000116
respectively representing a weight matrix and a deviation of the network function i;
probability p i The largest extended action is taken as the extended action of the decided network function i.
2. The method according to claim 1, wherein converting the state feature information of the network functions and the service function chain information into a chain-aware network function representation vector of a preset length comprises:
generating a network function level representation vector for characterizing the state feature information of each network function;
generating a global service function chain level representation vector for characterizing the service function chain information, the service function chain information comprising the connection relationships between network functions;
and generating a chain-aware network function representation vector for characterizing each network function, which comprises the network function level representation vector of that network function and the global service function chain level representation vector.
3. The network function extension detection method according to claim 1, wherein the extension action comprises one of:
adding a new instance;
no extension is made; and
one instance is removed.
4. The method according to claim 1, wherein the state feature information of the network function comprises at least one of:
ingress traffic rate;
egress traffic rate;
packet processing latency;
CPU utilization; and
memory utilization.
5. The method according to claim 2, wherein generating the network function level representation vector comprises:
generating the network function level representation vector $v_i^t$ of network function i at time t from the corresponding equations (given as images in the original and not reproduced here), wherein network function j is a downstream node of network function i; $h_i^t$ denotes the state feature information of network function i at time t mapped into an implicit state space; $h_j^t$ denotes the state feature information of network function j at time t mapped into the implicit state space; $W_i$ and $b_i$ denote a weight matrix and a bias of network function i, respectively; and d denotes a hyper-parameter.
6. The method according to claim 2, wherein generating the global service function chain level representation vector comprises:
generating the global service function chain level representation vector $v_C^t$ from the corresponding equation (given as an image in the original and not reproduced here), wherein C is a virtual service function chain node, all network functions are downstream nodes of the virtual node C, $h_j^t$ denotes the state feature information of network function j at time t mapped into the implicit state space, and d denotes a hyper-parameter.
7. A network function extension detection apparatus, comprising:
a collection module, used to collect state feature information of the network functions and service function chain information;
a conversion module, used to convert the state feature information of the network functions and the service function chain information into a chain-aware network function representation vector of a preset length;
and a decision module, used to input the chain-aware network function representation vectors over a continuous time interval into a preset decision model for an extension decision and to decide an extension action; wherein inputting the chain-aware network function representation vectors over the continuous time interval into a preset decision model for an extension decision, and deciding an extension action, comprises:
taking the chain-aware network function representation vectors within the continuous time interval [1, M] as the input sequence $\{x_i^1, x_i^2, \ldots, x_i^M\}$ and feeding them into a preset decision model; the preset decision model is:
$$z_i^k = \sigma\left(W_i^z x_i^k + U_i^z o_i^{k-1}\right)$$
$$r_i^k = \sigma\left(W_i^r x_i^k + U_i^r o_i^{k-1}\right)$$
$$\tilde{o}_i^k = \tanh\left(W_i^o x_i^k + r_i^k \odot U_i^o o_i^{k-1}\right)$$
$$o_i^k = \left(1 - z_i^k\right) \odot o_i^{k-1} + z_i^k \odot \tilde{o}_i^k$$
wherein the current time k ∈ [1, M]; $x_i^k$ is the chain-aware network function representation vector at the current time k; σ is the sigmoid function; $z_i^k$ denotes the update gate of network function i; $r_i^k$ denotes the reset gate of network function i; $\tilde{o}_i^k$ denotes the memory content of network function i at the current time k; $o_i^k$ denotes the output of the memory content of network function i at the current time k; ⊙ denotes the Hadamard product; $W_i^z, W_i^r, W_i^o$ and $U_i^z, U_i^r, U_i^o$ denote weight parameters of network function i; and d denotes a hyper-parameter;
for the output $o_i^M$ of the memory content of network function i at time M, the probability $p_i$ of each extension action of network function i is calculated with the softmax function:
$$p_i = \mathrm{softmax}\left(W_i^p o_i^M + b_i^p\right)$$
wherein $W_i^p$ and $b_i^p$ denote a weight matrix and a bias of network function i, respectively;
and the extension action with the largest probability $p_i$ is taken as the decided extension action of network function i.
8. A storage medium having stored thereon a computer program which, when executed by one or more processors, implements the method of any one of claims 1 to 6.
9. An electronic device, comprising a memory and a processor, the memory having stored thereon a computer program that, when executed by the processor, implements the method of any of claims 1-6.
CN202110163214.0A 2021-02-05 2021-02-05 Network function extension detection method and device, storage medium and electronic equipment Active CN112861964B (en)

Priority Applications (1)

Application Number: CN202110163214.0A; Priority Date: 2021-02-05; Filing Date: 2021-02-05; Title: Network function extension detection method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number: CN202110163214.0A; Priority Date: 2021-02-05; Filing Date: 2021-02-05; Title: Network function extension detection method and device, storage medium and electronic equipment

Publications (2)

Publication Number: CN112861964A; Publication Date: 2021-05-28
Publication Number: CN112861964B; Publication Date: 2022-09-16

Family

ID: 75988680

Family Applications (1)

Application Number: CN202110163214.0A; Status: Active; Publication: CN112861964B; Title: Network function extension detection method and device, storage medium and electronic equipment

Country Status (1)

Country: CN; Link: CN112861964B

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9628380B2 (en) * 2015-03-06 2017-04-18 Telefonaktiebolaget L M Ericsson (Publ) Method and system for routing a network function chain
CN109450667B (en) * 2018-10-12 2020-10-13 北京邮电大学 Mobility management method and device based on network function virtualization
CN109412900B (en) * 2018-12-04 2020-08-21 腾讯科技(深圳)有限公司 Network state recognition method, model training method and model training device
US11159408B2 (en) * 2019-06-25 2021-10-26 Intel Corporation Link performance prediction technologies

Also Published As

Publication Number: CN112861964A; Publication Date: 2021-05-28

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant