CN116170384A - Edge computing service perception method and device and edge computing equipment - Google Patents

Edge computing service perception method and device and edge computing equipment Download PDF

Info

Publication number
CN116170384A
Authority
CN
China
Prior art keywords
service
services
long-term memory
memory network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310445152.1A
Other languages
Chinese (zh)
Inventor
白晖峰
甄岩
陈文彬
霍超
闫波
顾仁涛
陈晨
耿俊成
陈克伟
张颉
郑利斌
张港红
尹志斌
高健
苑佳楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Smartchip Microelectronics Technology Co Ltd
Original Assignee
Beijing Smartchip Microelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Smartchip Microelectronics Technology Co Ltd filed Critical Beijing Smartchip Microelectronics Technology Co Ltd
Priority to CN202310445152.1A priority Critical patent/CN116170384A/en
Publication of CN116170384A publication Critical patent/CN116170384A/en
Pending legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 47/00 - Traffic control in data switching networks
    • H04L 47/10 - Flow control; Congestion control
    • H04L 47/24 - Traffic characterised by specific attributes, e.g. priority or QoS
    • H04L 47/2441 - Traffic characterised by specific attributes, e.g. priority or QoS, relying on flow classification, e.g. using integrated services [IntServ]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 47/00 - Traffic control in data switching networks
    • H04L 47/10 - Flow control; Congestion control
    • H04L 47/24 - Traffic characterised by specific attributes, e.g. priority or QoS
    • H04L 47/2483 - Traffic characterised by specific attributes, e.g. priority or QoS, involving identification of individual flows
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 47/00 - Traffic control in data switching networks
    • H04L 47/50 - Queue scheduling
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 47/00 - Traffic control in data switching networks
    • H04L 47/70 - Admission control; Resource allocation
    • H04L 47/80 - Actions related to the user profile or the type of traffic
    • H04L 47/808 - User-type aware
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The invention relates to the field of the electric power Internet of Things, and provides an edge computing service perception method and device, and edge computing equipment. The method comprises the following steps: acquiring local access data of multiple types of services at the edge side, extracting service flow features from the local access data, and discretizing the extracted service flow features of the multiple types of services; and inputting the discretized service flow features of the multiple types of services into a service perception model in parallel to obtain a service perception result. The service perception model comprises multiple groups of long short-term memory network units; each group corresponds to one type of service, and the multiple groups correspond to the multiple types of services respectively. By computing the discretized service flow features in parallel with the service perception model based on the long short-term memory network and identifying multiple types of services at the same time, the invention increases the speed of service identification and improves the processing efficiency of edge computing multi-service access.

Description

Edge computing service perception method and device and edge computing equipment
Technical Field
The invention relates to the field of electric power Internet of things, in particular to an edge computing service perception method, an edge computing device for multi-service perception, edge computing equipment and a computer readable storage medium.
Background
With the construction of new-type power systems, novel power distribution services and edge computing technology are becoming ever more tightly integrated, which poses new challenges to the edge computing capability of the power distribution Internet of Things. Edge computing in the power distribution Internet of Things must meet the high real-time, high-efficiency and high-accuracy requirements of massive access by diversified and differentiated power services. The services faced by the power distribution Internet of Things are increasingly complex and diverse; to obtain better quality of service (QoS), the ability to identify accessed services and to apply optimized, differentiated processing to them is a prerequisite for improving the QoS of the multiple services of the power distribution Internet of Things.
Service awareness and identification techniques have evolved from early port-based identification and deep packet/flow inspection to machine learning-based methods. Port-based identification tends to have poor accuracy because port numbers change. To overcome this limitation, newer methods identify service flows through deep packet/flow inspection of the application layer, the payload, and so on. Deep packet/flow inspection classifies and identifies traffic by inspecting packet contents and processes it according to the known feature patterns of service flows. It offers high identification accuracy and fine service granularity, but as the service protocols of the electric power Internet of Things diversify, real-time classification becomes computationally complex, and some service data packets are even encrypted, so the identification delay grows. Deep packet/flow inspection is therefore significantly limited for heterogeneous, multi-source service access and processing on edge computing devices. Traditional neural network algorithms, for their part, involve a large amount of computation to a certain extent and can hardly achieve fast classification and identification under the limited computing power of edge computing.
Disclosure of Invention
In order to solve the technical problems, the invention provides an edge computing service sensing method and an edge computing service sensing device so as to improve the processing efficiency of edge computing multi-service access.
In a first aspect of the embodiment of the present invention, there is provided an edge computing service awareness method, including:
acquiring local access data of multiple types of services at the edge side, extracting service flow characteristics of the local access data, and discretizing the extracted service flow characteristics of the multiple types of services;
the service flow characteristics of the discretized multiple types of services are input into a service perception model in parallel to obtain a service perception result, wherein the service perception model comprises multiple groups of long-short-term memory network element groups, each group of long-short-term memory network element groups corresponds to one type of service, and multiple groups of long-short-term memory network element groups respectively correspond to multiple types of services.
In the embodiment of the invention, each group of long-short-term memory network units in the service perception model comprises a plurality of long-short-term memory network units based on time discretization improvement, and the long-short-term memory network units are connected in series.
In the embodiment of the invention, the long short-term memory network unit based on time discretization improvement comprises an input gate, an output gate and a state unit, wherein the input gate is used for summing a first input gate signal activated by a sigmoid function and a second input gate signal activated by a tanh function and carrying forgetting degree information, so as to obtain a state update signal of the state unit.
In the embodiment of the invention, the structural formula of the long short-term memory network unit based on time discretization improvement is as follows:
[Structural formula of the time-discretization-improved long short-term memory network unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
In the embodiment of the present invention, the above-mentioned edge computing service awareness method further includes: and scheduling the service flow queues of the services of the multiple types according to the service sensing result, and distributing computing resources and communication resources for the services of the different types.
In the embodiment of the present invention, the above-mentioned edge computing service awareness method further includes: acquiring the service flow characteristics and the service perception results, and updating sample data of the service perception model in real time; and training the business perception model by using the updated sample data.
In the embodiment of the present invention, the training the service awareness model by using the updated sample data includes: and counting the service perception results output by the service perception model in real time, and training the service perception model by using updated sample data when the identification accuracy of the service perception results is lower than a preset threshold value.
In a second aspect of the embodiment of the present invention, there is provided a multi-service aware edge computing device, including:
the service characteristic extraction module is deployed at the edge side of the power distribution network and is used for extracting service flow characteristics from local access data of various types of services and performing discretization processing on the extracted service flow characteristics of the various types of services;
the service identification module is provided with a service perception model and is used for inputting the service flow characteristics of the discretized multiple types of services into the service perception model in parallel to obtain a service perception result;
the service perception model comprises a plurality of long-short-term memory network element groups, each long-short-term memory network element group corresponds to one type of service, and a plurality of long-short-term memory network element groups respectively correspond to a plurality of types of services.
In the embodiment of the invention, each group of long-short-term memory network units in the service perception model comprises a plurality of long-short-term memory network units based on time discretization improvement, and the long-short-term memory network units are connected in series.
In the embodiment of the invention, the long short-term memory network unit based on time discretization improvement comprises an input gate, an output gate and a state unit, wherein the input gate is used for summing a first input gate signal activated by a sigmoid function and a second input gate signal activated by a tanh function and carrying forgetting degree information, so as to obtain a state update signal of the state unit.
In the embodiment of the invention, the structural formula of the long short-term memory network unit based on time discretization improvement is as follows:
[Structural formula of the time-discretization-improved long short-term memory network unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
In an embodiment of the present invention, the edge computing device for multi-service awareness further includes:
the sample database module is used for acquiring the service flow characteristics output by the service characteristic extraction module and the service perception results output by the service identification module and updating the sample data of the service perception model in real time;
the business perception training module is provided with a business perception model for training, and is used for acquiring sample data in the sample database module, training the business perception model to obtain model parameters, and transmitting the model parameters to the business identification module.
In the embodiment of the invention, the service perception training module counts the service perception result output by the service perception model in real time, and when the identification accuracy of the service perception result is determined to be lower than a preset threshold value, sample data in the sample database module is acquired to train the service perception model.
In a third aspect of an embodiment of the present invention, there is provided an edge computing device including:
the communication interface module is used for acquiring local access data of various types of services;
the service characteristic extraction module is used for extracting service flow characteristics from the local access data of the multiple types of services and performing discretization processing on the extracted service flow characteristics of the multiple types of services;
the service identification module is provided with a service perception model and is used for inputting the service flow characteristics of the discretized multiple types of services into the service perception model in parallel to obtain a service perception result;
the service flow scheduling module is used for scheduling the service flow queues of the various types of services according to the service sensing result obtained by the service identification module, and distributing computing resources and communication resources for the different types of services;
the service perception model comprises a plurality of long-short-term memory network element groups, each long-short-term memory network element group corresponds to one type of service, and a plurality of long-short-term memory network element groups respectively correspond to a plurality of types of services.
In the embodiment of the invention, each group of long-short-term memory network units in the service perception model comprises a plurality of long-short-term memory network units based on time discretization improvement, and the long-short-term memory network units are connected in series.
In an embodiment of the present invention, the edge computing device further includes:
the sample database module is used for acquiring the service flow characteristics output by the service characteristic extraction module and the service perception results output by the service identification module and updating the sample data of the service perception model in real time;
the business perception training module is provided with a business perception model for training, and is used for acquiring sample data in the sample database module, training the business perception model to obtain model parameters, and transmitting the model parameters to the business identification module.
The embodiment of the present invention also provides a computer readable storage medium having stored thereon a computer program that is executed by a processor to implement the edge computing service awareness method provided in the first aspect.
According to the invention, the discretized service flow characteristics are calculated in parallel based on the service perception model of the long-short-term memory network, and meanwhile, various types of services are identified, so that the service identification operation speed is improved, and the processing speed, the resource utilization rate and the intelligent level of edge computing multi-service access are improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flowchart of an edge computing service awareness method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a service awareness model according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a long-short-term memory network unit according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a long short-term memory network unit based on time discretization improvement according to an embodiment of the present invention;
FIG. 5 is a block diagram of a multi-service aware edge computing device according to an embodiment of the present invention;
fig. 6 is a block diagram of an edge computing device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions and advantages of the embodiments of the present invention clearer, exemplary embodiments of the present invention are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. It should be noted that, provided there is no conflict, the embodiments of the present invention and the features in the embodiments may be combined with each other.
Term interpretation:
a Long Short-Term Memory (LSTM) is a time-loop neural network, and is designed to solve the Long-Term dependency problem of a general RNN (loop neural network). The core of the LSTM neural network is a cell state and a gate structure, the structure of the LSTM neural network comprises an input gate, a forgetting gate and an output gate, and meanwhile, compared with RNNs, the LSTM is added with cell state information, and whether the cell state information can be forgotten or not can be determined through the forgetting gate. The cell state corresponds to the path of information transmission, allowing information to be transferred in serial links, which can be regarded as "memory" of the network. In theory, the cell state can always convey relevant information during sequence processing. Thus, even the information of the earlier time step can be carried into the cells of the later time step, which overcomes the influence of the short-term memory. The addition and removal of information in LSTM neural networks may be accomplished through a "gate" structure that learns which information should be saved or forgotten during the training process.
As described in the background art, traditional neural network algorithms involve a large amount of computation to a certain extent and can hardly achieve fast classification and identification under limited computing power. The embodiment of the invention provides an edge computing service perception method, which comprises: acquiring local access data of multiple types of services at the edge side, extracting service flow features from the local access data, and discretizing the extracted service flow features of the multiple types of services; and inputting the discretized service flow features of the multiple types of services into a service perception model in parallel to obtain a service perception result. The service perception model comprises multiple groups of long short-term memory network units; each group corresponds to one type of service, and the multiple groups correspond to the multiple types of services respectively. By computing the discretized service flow features in parallel with the service perception model based on the long short-term memory network and identifying multiple types of services at the same time, the invention increases the speed of service identification and improves the processing speed, resource utilization and intelligence level of edge computing multi-service access. The above scheme is explained in detail below.
Fig. 1 is a flowchart of an edge computing service awareness method according to an embodiment of the present invention. As shown in fig. 1, the edge computing service awareness method provided in this embodiment includes the following steps:
s101, acquiring local access data of various types of services at the edge side, extracting service flow characteristics of the local access data, and performing discretization processing on the extracted service flow characteristics of the various types of services;
s102, the service flow characteristics of the discretized multiple types of services are input into the service perception model in parallel to obtain a service perception result.
In the above step S101, the edge-side device receives the local access data of the multiple types of services and extracts the characteristic parameters of the multiple types of service flows, for example: maximum packet size, minimum packet size, average arrival time, average inter-arrival time, service flow size, service flow duration, flag bits, and so on. The service flow characteristic parameters are then discretized in time; for example, the parameter values at a given instant within a given period are taken as one input of the service perception model.
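As a non-authoritative illustration of this step, the sketch below shows one way the per-flow feature records of each service type could be sampled over time into fixed-length sequences for the perception model; the feature names, window length and sampling rule are assumptions made for the example, not taken from the patent.

```python
# Illustrative sketch only: turn per-flow feature records of each service type
# into time-discretized feature sequences for the perception model.
# Feature names and the number of sampling steps are assumptions.
from typing import Dict, List

FEATURES = ["max_pkt_size", "min_pkt_size", "avg_arrival", "avg_interval",
            "flow_size", "flow_duration", "flags"]

def discretize(flow_stats: List[Dict[str, float]], steps: int) -> List[List[float]]:
    """Sample the feature vector at `steps` evenly spaced instants of the window."""
    if not flow_stats:
        return []
    stride = max(1, len(flow_stats) // steps)
    sampled = flow_stats[::stride][:steps]
    return [[rec[f] for f in FEATURES] for rec in sampled]

def build_model_inputs(access_data: Dict[str, List[Dict[str, float]]], steps: int = 5):
    """Return one discretized feature sequence per service type, ready for parallel input."""
    return {svc: discretize(records, steps) for svc, records in access_data.items()}
```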
The structure of the service perception model in the above step S102 is shown in fig. 2 and includes an input layer, an LSTM layer and an output layer. The input layer inputs multiple service flow characteristic parameters (i.e., the service flow features of the multiple types of services) in parallel; n types of service flow characteristic parameters are assumed, i.e., X_1, X_2, …, X_n. The LSTM layer is composed of n LSTM unit groups; each LSTM unit group corresponds to one type of service flow characteristic parameter, and the n LSTM unit groups correspond to the n types of service flow characteristic parameters respectively. Each LSTM unit group includes a plurality of LSTM units connected in series (only 5 LSTM units per group are shown in fig. 2). The output layer passes the data output by the n LSTM unit groups through a classification function (such as a SoftMax classification function) to obtain the service perception result S.
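The following PyTorch sketch is an illustrative rendering of this structure, not the patented implementation: one LSTM stack per service type processes its own feature sequence in parallel, and the group outputs are combined through a SoftMax classifier to produce the perception result S. The hidden size, the number of output classes and the use of the last time step of each group are assumptions chosen for the example.

```python
# Illustrative sketch only: parallel per-service LSTM groups followed by a
# SoftMax classifier. Layer sizes are assumptions chosen for the example.
import torch
import torch.nn as nn

class ServicePerceptionModel(nn.Module):
    def __init__(self, n_types: int, feat_dim: int, hidden: int = 32, n_classes: int = 8):
        super().__init__()
        # One LSTM unit group per service type, applied in parallel.
        self.groups = nn.ModuleList(nn.LSTM(feat_dim, hidden, batch_first=True)
                                    for _ in range(n_types))
        self.classifier = nn.Linear(n_types * hidden, n_classes)

    def forward(self, xs):  # xs: list of n_types tensors, each (batch, steps, feat_dim)
        lasts = []
        for lstm, x in zip(self.groups, xs):
            out, _ = lstm(x)             # out: (batch, steps, hidden)
            lasts.append(out[:, -1, :])  # keep the last time step of each group
        s = torch.softmax(self.classifier(torch.cat(lasts, dim=-1)), dim=-1)
        return s                         # service perception result S
```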
In one embodiment, as shown in fig. 3, an LSTM unit comprises an input gate, a forget gate, an output gate, and a state unit for storing cell state information. Whether the cell state information is forgotten is determined by the forget gate, whose specific formula is:
f_t = σ(W_hf·h_{t-1} + W_xh·x_t + b_f)    (1)
where f_t indicates the degree to which information is forgotten; f_t lies in the range [0,1] and indicates whether the previous information is forgotten, and is determined mainly by the output of the previous moment and the weight coefficients; W_hf is the weight of h_{t-1}, W_xh is the weight of x_t, b_f is the bias vector of the forget gate, and σ() is the activation function.
The activation function σ() is specifically:
σ(x) = 1 / (1 + e^(−x))    (2)
from the current inputx t And hidden state h t-1 An update vector of the cell state is calculated. Wherein, cell state C t The specific formula of (2) is:
Figure SMS_5
;(3)
the specific formula of tan h is:
Figure SMS_6
;(4)
where the elements of the vector C̃_t lie in the range (−1, 1).
The input gate of the LSTM determines which information in the current time step is used to update the cell state. The specific formula of the input gate is:
i_t = σ(W_hi·h_{t-1} + W_xi·x_t + b_i)    (5)
where the elements of the vector i_t lie in the range (0, 1).
The update formula of the cell state C_t is:
C_t = f_t ⊙ C_{t-1} + i_t ⊙ C̃_t    (6)
where ⊙ denotes element-wise multiplication; the elements of the vectors f_t and i_t lie in (0, 1), so equation (6) determines which information in C_{t-1} is forgotten, while i_t determines which information is added to the cell state.
The output gate of the LSTM is controlled by the current input, the cell state and the previous output. The specific formula of the output gate is:
o_t = σ(W_ho·h_{t-1} + W_xo·x_t + b_o)    (7)
The elements of the vector o_t lie in (0, 1), and the new hidden state h_t is calculated from o_t:
h_t = o_t ⊙ tanh(C_t)    (8)
The cell state C_t allows long-term dependencies to be learned effectively, so the LSTM can retain information over longer time steps.
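As a compact, non-authoritative restatement of equations (1)-(8), the sketch below implements one step of a standard LSTM unit in NumPy; the dictionary-based layout of the weight matrices and biases is an assumption made for readability.

```python
# Sketch of a single standard LSTM step following equations (1)-(8) above.
# The dictionary keys for the weight matrices and biases are an assumed layout.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))                              # equation (2)

def lstm_step(x_t, h_prev, c_prev, W, b):
    f_t = sigmoid(W["hf"] @ h_prev + W["xh"] @ x_t + b["f"])     # (1) forget gate
    c_hat = np.tanh(W["hc"] @ h_prev + W["xc"] @ x_t + b["c"])   # (3) candidate state
    i_t = sigmoid(W["hi"] @ h_prev + W["xi"] @ x_t + b["i"])     # (5) input gate
    c_t = f_t * c_prev + i_t * c_hat                             # (6) cell state update
    o_t = sigmoid(W["ho"] @ h_prev + W["xo"] @ x_t + b["o"])     # (7) output gate
    h_t = o_t * np.tanh(c_t)                                     # (8) new hidden state
    return h_t, c_t
```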
In another embodiment, the LSTM unit structure provided in the foregoing embodiment is improved and simplified on the basis of time discretization, and a Naive-LSTM (naive LSTM) structure is provided to reduce the computational complexity. As shown in fig. 4, the Naive-LSTM unit includes an input gate, an output gate and a state unit. Unlike the LSTM unit provided in the above embodiment, the Naive-LSTM unit is formed by merging the forget gate into the input gate. The merged input gate sums a first input gate signal activated by the sigmoid function and a second input gate signal activated by the tanh function and carrying forgetting-degree information to obtain the state update signal of the state unit, and inputs the state update signal into the state unit, which reduces the number of activation operations of the LSTM and simplifies the LSTM structure.
The structural formula of the Naive-LSTM unit is as follows:
[Structural formula of the Naive-LSTM unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
In this embodiment, the Naive-LSTM-based service perception model computes the time-discretized service flow features in parallel and identifies multiple types of services at the same time, which reduces computational complexity and makes the service perception model suitable for deployment on edge computing equipment; the service identification speed is increased, and the processing efficiency of edge computing multi-service access is improved.
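Because the structural formula itself appears only as an image, the sketch below is one plausible reading of the textual description (forget gate merged into the input gate; a sigmoid-activated signal i_k and a tanh-activated signal z_k combined into the state update c_k), not the patented formula; the exact way i_k, z_k and c_{k-1} are combined, and the inputs fed to i_k and o_k, are assumptions.

```python
# Heavily hedged sketch of a time-discretized Naive-LSTM step. This is ONE
# plausible reading of the description above, NOT the formula in the patent:
# how i_k, z_k and the previous state c_{k-1} are combined is assumed here.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def naive_lstm_step(x_k, y_prev, c_prev, W_z, U_z, b_z, U_i, U_o):
    z_k = np.tanh(W_z @ x_k + U_z @ y_prev + b_z)  # second input gate signal (tanh, g)
    i_k = sigmoid(U_i @ y_prev)                    # first input gate signal (sigmoid)
    c_k = i_k * c_prev + z_k                       # summed state update (assumed form)
    o_k = sigmoid(U_o @ y_prev)                    # output gate signal
    y_k = o_k * np.tanh(c_k)                       # unit output (assumed form)
    return y_k, c_k
```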
In an embodiment, the edge computing service perception method further includes: scheduling the service flow queues of the multiple types of services according to the service perception result obtained in step S102, and allocating computing resources and communication resources to the different types of services.
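A minimal sketch of such scheduling is given below, assuming a fixed priority order and fixed resource budgets per perceived service type; the service type names, priorities and budget figures are invented placeholders, not values from the patent.

```python
# Illustrative sketch only: queue flows by the priority of their perceived
# service type and hand out compute/bandwidth budgets. The priority table and
# budget numbers are invented placeholders.
import heapq
from typing import Dict, List, Tuple

PRIORITY: Dict[str, int] = {"protection": 0, "control": 1, "metering": 2, "video": 3}
BUDGET: Dict[str, Tuple[float, float]] = {   # (CPU share, bandwidth share)
    "protection": (0.4, 0.3), "control": (0.3, 0.3),
    "metering": (0.2, 0.2), "video": (0.1, 0.2),
}

def schedule(flows: List[Tuple[str, str]]):
    """flows: list of (flow_id, perceived_service_type)."""
    queue = []
    for order, (flow_id, svc) in enumerate(flows):
        heapq.heappush(queue, (PRIORITY.get(svc, 9), order, flow_id, svc))
    while queue:
        _, _, flow_id, svc = heapq.heappop(queue)
        cpu, bw = BUDGET.get(svc, (0.05, 0.05))
        yield flow_id, svc, cpu, bw   # dispatch the flow with allocated resources
```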
In the edge computing service perception method described above, the sample data of the service perception model can be updated in real time from the acquired service flow features and service perception results, and the service perception model can be trained with the updated sample data. Specifically, the service perception results output by the service perception model are counted in real time, and when the identification accuracy of the service perception results falls below a preset threshold, the updated sample data is used to train the service perception model. In a specific embodiment, the training of the service perception model has two working modes: a periodic training mode and a triggered training mode. In the periodic training mode, model training is carried out once every period T; in the triggered training mode, the output results of the service perception model are counted in real time to check whether the identification accuracy is below the threshold, and if it is, training is triggered. During training, all sample data in the sample database are fed into the service perception model until the recognition accuracy exceeds 98%.
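The sketch below illustrates, under assumed interfaces, how the two training modes could be combined: retrain when the period T has elapsed or when the measured accuracy drops below the threshold, and keep fitting until the 98% target from the text is reached. The function names, the fit_epoch callback and the default values are assumptions for the example.

```python
# Illustrative sketch only: periodic mode (every period_s seconds) combined
# with triggered mode (accuracy below threshold). Defaults are placeholders.
import time

def should_retrain(last_train_ts: float, recent_accuracy: float,
                   period_s: float = 3600.0, threshold: float = 0.95) -> bool:
    periodic = time.monotonic() - last_train_ts >= period_s   # periodic training mode
    triggered = recent_accuracy < threshold                   # triggered training mode
    return periodic or triggered

def retrain(fit_epoch, samples, target: float = 0.98) -> float:
    """Feed all samples in the database to the model until accuracy exceeds `target`."""
    accuracy = 0.0
    while accuracy <= target:
        accuracy = fit_epoch(samples)   # fit_epoch trains one pass and returns accuracy
    return accuracy
```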
Fig. 5 is a block diagram of a multi-service aware edge computing device according to an embodiment of the present invention. As shown in fig. 5, the multi-service aware edge computing device provided in this embodiment includes: the system comprises a service feature extraction module, a service identification module, a sample database module and a service perception training module. The service characteristic extraction module is deployed at the edge side of the power distribution network and is used for extracting service flow characteristics from local access data of multiple types of services and performing discretization processing on the extracted service flow characteristics of the multiple types of services. The service recognition module is internally provided with a service perception model which is used for inputting the service flow characteristics of the discretized multiple types of services into the service perception model in parallel to obtain a service perception result. The sample database module is used for acquiring the service flow characteristics output by the service characteristic extraction module and the service perception results output by the service identification module and updating the sample data of the service perception model in real time. The service perception training module is internally provided with a service perception model for training, the service perception training module acquires sample data in the sample database module to train the service perception model to obtain model parameters, the trained model parameters are sent to the service identification module, and the service perception model in the service identification module carries out service identification operation according to the parameters.
Referring to fig. 2, the service perception model includes an input layer, an LSTM layer and an output layer. The input layer inputs multiple service flow characteristic parameters (i.e., the service flow features of the multiple types of services) in parallel; n types of service flow characteristic parameters are assumed, i.e., X_1, X_2, …, X_n. The LSTM layer is composed of n LSTM unit groups; each LSTM unit group corresponds to one type of service flow characteristic parameter, and the n LSTM unit groups correspond to the n types of service flow characteristic parameters respectively. Each LSTM unit group includes a plurality of LSTM units connected in series (only 5 LSTM units per group are shown in fig. 2). The output layer passes the data output by the n LSTM unit groups through a classification function (such as a SoftMax classification function) to obtain the service perception result S.
In an embodiment, the conventional LSTM unit structure is improved and simplified on the basis of time discretization, and the forget gate and the input gate are merged to give a Naive-LSTM (naive LSTM) structure, so as to reduce the computational complexity. Referring to fig. 4, a Naive-LSTM unit includes an input gate, an output gate and a state unit; the input gate sums a first input gate signal activated by the sigmoid function and a second input gate signal activated by the tanh function and carrying forgetting-degree information to obtain the state update signal of the state unit, and inputs the state update signal into the state unit, which reduces the number of activation operations of the LSTM and simplifies the LSTM unit structure.
The structural formula of the Naive-LSTM unit is as follows:
[Structural formula of the Naive-LSTM unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
In an embodiment, the service perception training module counts service perception results output by the service perception model in real time, and when it is determined that the recognition accuracy of the service perception results is lower than a preset threshold, sample data in the sample database module is obtained to train the service perception model.
According to the multi-service-aware edge computing device provided by the embodiment of the invention, through the service awareness model based on the Naive-LSTM, the time-discrete service flow characteristics are computed in parallel, and meanwhile, various types of services are identified, so that the operation complexity is reduced, the service identification operation speed is improved, the device can be applied to edge computing equipment, and the processing efficiency of multi-service access of edge computing is improved.
Fig. 6 is a block diagram of an edge computing device according to an embodiment of the present invention. As shown in fig. 6, the edge computing device provided in this embodiment includes: the system comprises a communication interface module, a service feature extraction module, a service identification module, a service flow scheduling module, a sample database module and a service perception training module. The communication interface module is used for acquiring the local access data of various types of services. The service feature extraction module is used for extracting service flow features from the local access data of the multiple types of services and performing discretization processing on the extracted service flow features of the multiple types of services. The service recognition module is internally provided with a service perception model which is used for inputting the service flow characteristics of the discretized multiple types of services into the service perception model in parallel to obtain a service perception result. The service flow scheduling module is used for scheduling the service flow queues of the various types of services according to the service sensing result obtained by the service identification module, and distributing computing resources and communication resources for the different types of services. The sample database module is used for acquiring the service flow characteristics output by the service characteristic extraction module and the service perception results output by the service identification module and updating the sample data of the service perception model in real time. The service perception training module is internally provided with a service perception model for training, the service perception training module acquires sample data in the sample database module to train the service perception model to obtain model parameters, the trained model parameters are sent to the service identification module, and the service perception model in the service identification module carries out service identification operation according to the parameters.
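As an illustration of how these modules could interact at run time, the sketch below wires them into a simple processing round; the class name, module interfaces and method names are assumptions for the example, not part of the original disclosure.

```python
# Illustrative sketch only: a possible data flow between the modules of the
# edge computing device described above. Module interfaces are assumed.
class EdgeDevicePipeline:
    def __init__(self, comm, extractor, identifier, scheduler, sample_db, trainer):
        self.comm, self.extractor, self.identifier = comm, extractor, identifier
        self.scheduler, self.sample_db, self.trainer = scheduler, sample_db, trainer

    def process_round(self):
        access_data = self.comm.read()                  # communication interface module
        features = self.extractor.extract(access_data)  # service feature extraction module
        results = self.identifier.perceive(features)    # service identification module
        self.scheduler.dispatch(results)                # service flow scheduling module
        self.sample_db.update(features, results)        # sample database module
        if self.trainer.should_train(self.identifier.accuracy()):
            params = self.trainer.train(self.sample_db.samples())
            self.identifier.load_params(params)         # push trained parameters back
        return results
```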
In one embodiment, referring to fig. 2, the traffic awareness model includes: an input layer, an LSTM layer, and an output layer. The input layer is used for inputting a plurality of service flow characteristic parameters in parallel, and n types of service flow characteristic parameters are set. The LSTM layer is composed of n LSTM unit groups, each LSTM unit group corresponds to one type of service flow characteristic parameter, and the n LSTM unit groups respectively correspond to n types of service flow characteristic parameters. Each LSTM cell group comprises a plurality of LSTM cells, and the plurality of LSTM cells are connected in series. The output layer is used for obtaining service perception results through classification functions according to data output by the n LSTM unit groups.
In an embodiment, the conventional LSTM unit structure is improved and simplified on the basis of time discretization, and the forget gate and the input gate are merged to give a Naive-LSTM (naive LSTM) structure, so as to reduce the computational complexity. Referring to fig. 4, a Naive-LSTM unit includes an input gate, an output gate and a state unit; the input gate sums a first input gate signal activated by the sigmoid function and a second input gate signal activated by the tanh function and carrying forgetting-degree information to obtain the state update signal of the state unit, and inputs the state update signal into the state unit, which reduces the number of activation operations of the LSTM and simplifies the LSTM unit structure.
The structural formula of the Naive-LSTM unit is as follows:
[Structural formula of the Naive-LSTM unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
The embodiment of the invention also provides a computer readable storage medium, on which a computer program is stored, the computer program being executed by a processor to implement the edge computing service awareness method.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. The solutions in the embodiments of the present invention may be implemented in various computer languages, for example the object-oriented programming language Java, the scripting language JavaScript, and the like.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (17)

1. An edge computing service awareness method, comprising:
acquiring local access data of multiple types of services at the edge side, extracting service flow characteristics of the local access data, and discretizing the extracted service flow characteristics of the multiple types of services;
the service flow characteristics of the discretized multiple types of services are input into a service perception model in parallel to obtain a service perception result, wherein the service perception model comprises multiple groups of long-short-term memory network element groups, each group of long-short-term memory network element groups corresponds to one type of service, and multiple groups of long-short-term memory network element groups respectively correspond to multiple types of services.
2. The edge computing service awareness method of claim 1 wherein each of the long-short term memory network element groups in the service awareness model comprises a plurality of long-short term memory network elements based on time discretization improvement, the plurality of long-short term memory network elements being connected in series.
3. The edge computing service awareness method according to claim 2, wherein the long short-term memory network unit based on time discretization improvement comprises an input gate, an output gate and a state unit, wherein the input gate is used for summing a first input gate signal activated by a sigmoid function and a second input gate signal activated by a tanh function and carrying forgetting degree information, so as to obtain a state update signal of the state unit.
4. The edge computing service awareness method according to claim 3, wherein the structural formula of the long short-term memory network unit based on time discretization improvement is as follows:
[Structural formula of the time-discretization-improved long short-term memory network unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
5. The edge computing traffic awareness method of claim 1, further comprising:
and scheduling the service flow queues of the services of the multiple types according to the service sensing result, and distributing computing resources and communication resources for the services of the different types.
6. The edge computing traffic awareness method of claim 1, further comprising:
acquiring the service flow characteristics and the service perception results, and updating sample data of the service perception model in real time;
and training the business perception model by using the updated sample data.
7. The edge computing business awareness method of claim 6 wherein the training the business awareness model with updated sample data comprises:
and counting the service perception results output by the service perception model in real time, and training the service perception model by using updated sample data when the identification accuracy of the service perception results is lower than a preset threshold value.
8. A multi-service aware edge computing device, comprising:
the service characteristic extraction module is deployed at the edge side of the power distribution network and is used for extracting service flow characteristics from local access data of various types of services and performing discretization processing on the extracted service flow characteristics of the various types of services;
the service identification module is provided with a service perception model and is used for inputting the service flow characteristics of the discretized multiple types of services into the service perception model in parallel to obtain a service perception result;
the service perception model comprises a plurality of long-short-term memory network element groups, each long-short-term memory network element group corresponds to one type of service, and a plurality of long-short-term memory network element groups respectively correspond to a plurality of types of services.
9. The multi-service aware edge computing device of claim 8, wherein each set of long-short term memory network elements in the service aware model comprises a plurality of long-short term memory network elements based on time discretization improvement, the plurality of long-short term memory network elements connected in series.
10. The multi-service aware edge computing device of claim 9, wherein the time discretization improvement based long short-term memory network element comprises an input gate, an output gate, and a state element, wherein the input gate is configured to sum a first input gate signal activated by a sigmoid function with a second input gate signal activated by a tanh function and carrying forgetting degree information, so as to obtain a state update signal of the state element.
11. The multi-service aware edge computing device of claim 10, wherein the time discretization improvement based long-short term memory network element has a structural formula as follows:
[Structural formula of the time-discretization-improved long short-term memory network unit, given as an image in the original publication]
where k is the time-step number, x_k is the k-th input signal, y_k is the k-th output signal, y_{k-1} is the output signal of step k-1, i_k is the first input gate signal, z_k is the second input gate signal, c_k is the k-th state update signal, c_{k-1} is the state update signal of step k-1, o_k is the output gate signal, W_z is the input weight matrix of z_k, U_z is the recursive weight matrix of z_k, U_i is the recursive weight matrix of i_k, U_o is the recursive weight matrix of o_k, b_z is the bias of z_k, σ() is the sigmoid function, g() is the tanh function, and ⊙ denotes point (element-wise) multiplication.
12. The multi-service aware edge computing device of claim 8, further comprising:
the sample database module is used for acquiring the service flow characteristics output by the service characteristic extraction module and the service perception results output by the service identification module and updating the sample data of the service perception model in real time;
the business perception training module is provided with a business perception model for training, and is used for acquiring sample data in the sample database module, training the business perception model to obtain model parameters, and transmitting the model parameters to the business identification module.
13. The multi-service aware edge computing device according to claim 12, wherein the service awareness training module counts the service awareness results output by the service awareness model in real time, and when determining that the identification accuracy of the service awareness results is lower than a preset threshold, obtains sample data in the sample database module to train the service awareness model.
14. An edge computing device, comprising:
the communication interface module is used for acquiring local access data of various types of services;
the service characteristic extraction module is used for extracting service flow characteristics from the local access data of the multiple types of services and performing discretization processing on the extracted service flow characteristics of the multiple types of services;
the service identification module is provided with a service perception model and is used for inputting the service flow characteristics of the discretized multiple types of services into the service perception model in parallel to obtain a service perception result;
the service flow scheduling module is used for scheduling the service flow queues of the various types of services according to the service sensing result obtained by the service identification module, and distributing computing resources and communication resources for the different types of services;
the service perception model comprises a plurality of long-short-term memory network element groups, each long-short-term memory network element group corresponds to one type of service, and a plurality of long-short-term memory network element groups respectively correspond to a plurality of types of services.
15. The edge computing device of claim 14, wherein each set of long-short term memory network elements in the traffic awareness model comprises a plurality of long-short term memory network elements modified based on time discretization, the plurality of long-short term memory network elements connected in series.
16. The edge computing device of claim 14, further comprising:
the sample database module is used for acquiring the service flow characteristics output by the service characteristic extraction module and the service perception results output by the service identification module and updating the sample data of the service perception model in real time;
the business perception training module is provided with a business perception model for training, and is used for acquiring sample data in the sample database module, training the business perception model to obtain model parameters, and transmitting the model parameters to the business identification module.
17. A computer readable storage medium having stored thereon a computer program, wherein the computer program is executed by a processor to implement the edge computing traffic awareness method of any of claims 1 to 7.
CN202310445152.1A 2023-04-24 2023-04-24 Edge computing service perception method and device and edge computing equipment Pending CN116170384A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310445152.1A CN116170384A (en) 2023-04-24 2023-04-24 Edge computing service perception method and device and edge computing equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310445152.1A CN116170384A (en) 2023-04-24 2023-04-24 Edge computing service perception method and device and edge computing equipment

Publications (1)

Publication Number Publication Date
CN116170384A true CN116170384A (en) 2023-05-26

Family

ID=86418500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310445152.1A Pending CN116170384A (en) 2023-04-24 2023-04-24 Edge computing service perception method and device and edge computing equipment

Country Status (1)

Country Link
CN (1) CN116170384A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117725489A (en) * 2024-02-07 2024-03-19 北京智芯微电子科技有限公司 Edge computing service flow sensing method and device and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110896381A (en) * 2019-11-25 2020-03-20 中国科学院深圳先进技术研究院 Deep neural network-based traffic classification method and system and electronic equipment
CN111324990A (en) * 2020-03-19 2020-06-23 长江大学 Porosity prediction method based on multilayer long-short term memory neural network model
CN113783716A (en) * 2021-07-27 2021-12-10 国网冀北电力有限公司信息通信分公司 Flow prediction method and device based on cloud edge collaborative framework
CN114445143A (en) * 2022-01-29 2022-05-06 中国农业银行股份有限公司 Service data prediction method, device, equipment and medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110896381A (en) * 2019-11-25 2020-03-20 中国科学院深圳先进技术研究院 Deep neural network-based traffic classification method and system and electronic equipment
CN111324990A (en) * 2020-03-19 2020-06-23 长江大学 Porosity prediction method based on multilayer long-short term memory neural network model
CN113783716A (en) * 2021-07-27 2021-12-10 国网冀北电力有限公司信息通信分公司 Flow prediction method and device based on cloud edge collaborative framework
CN114445143A (en) * 2022-01-29 2022-05-06 中国农业银行股份有限公司 Service data prediction method, device, equipment and medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李文静 (LI Wenjing) et al.: "基于简化型LSTM神经网络的时间序列预测方法" [Time series prediction method based on a simplified LSTM neural network], 《北京工业大学学报》 [Journal of Beijing University of Technology], vol. 47, no. 5, page 483 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117725489A (en) * 2024-02-07 2024-03-19 北京智芯微电子科技有限公司 Edge computing service flow sensing method and device and electronic equipment

Similar Documents

Publication Publication Date Title
US20220255817A1 (en) Machine learning-based vnf anomaly detection system and method for virtual network management
CN112118143B (en) Traffic prediction model training method, traffic prediction method, device, equipment and medium
CN107786388B (en) Anomaly detection system based on large-scale network flow data
CN110263824B (en) Model training method, device, computing equipment and computer readable storage medium
CN114219097B (en) Federal learning training and predicting method and system based on heterogeneous resources
CN112435469B (en) Vehicle early warning control method and device, computer readable medium and electronic equipment
CN109039727B (en) Deep learning-based message queue monitoring method and device
CN111460294A (en) Message pushing method and device, computer equipment and storage medium
CN116170384A (en) Edge computing service perception method and device and edge computing equipment
CN113852432A (en) RCS-GRU model-based spectrum prediction sensing method
CN103617146B (en) A kind of machine learning method and device based on hardware resource consumption
CN114036051A (en) Test method, device, equipment and storage medium
CN114861875A (en) Internet of things intrusion detection method based on self-supervision learning and self-knowledge distillation
CN111901134B (en) Method and device for predicting network quality based on recurrent neural network model (RNN)
CN115034596A (en) Risk conduction prediction method, device, equipment and medium
CN113298121B (en) Message sending method and device based on multi-data source modeling and electronic equipment
CN113887748A (en) Online federal learning task allocation method and device, and federal learning method and system
Nakıp et al. Dynamic automatic forecaster selection via artificial neural network based emulation to enable massive access for the Internet of Things
WO2019062404A1 (en) Application program processing method and apparatus, storage medium, and electronic device
CN117580046A (en) Deep learning-based 5G network dynamic security capability scheduling method
CN116125279A (en) Method, device, equipment and storage medium for determining battery health state
CN115348190A (en) Internet of things equipment detection method, system and equipment
CN111814051B (en) Resource type determining method and device
CN115757002A (en) Energy consumption determination method, device and equipment and computer readable storage medium
CN111709786B (en) Method, apparatus, device and medium for generating user retention time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20230526)