CN116886725A - Intelligent scene execution processing method, triggering device and server - Google Patents

Intelligent scene execution processing method, triggering device and server

Info

Publication number
CN116886725A
Authority
CN
China
Prior art keywords
compensation
trigger
execution
triggering
event
Prior art date
Legal status
Pending
Application number
CN202310909388.6A
Other languages
Chinese (zh)
Inventor
章俊华
方堃
Current Assignee
Hangzhou Huacheng Software Technology Co Ltd
Original Assignee
Hangzhou Huacheng Software Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Huacheng Software Technology Co Ltd
Priority to CN202310909388.6A
Publication of CN116886725A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/14 Session management
    • H04L 67/143 Termination or inactivation of sessions, e.g. event-controlled end of session
    • H04L 67/145 Termination or inactivation of sessions, e.g. event-controlled end of session avoiding end of session, e.g. keep-alive, heartbeats, resumption message or wake-up for inactive or interrupted session

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Cardiology (AREA)
  • Automation & Control Theory (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The application discloses a processing method, a triggering device and a server for intelligent scene execution, which are used to re-report to the server, according to a trigger condition compensation strategy, trigger events that the triggering device failed to report while offline, thereby safeguarding effective execution of the intelligent scene. The processing method for intelligent scene execution provided by the application comprises the following steps: when a triggering device serving as a trigger condition of an intelligent scene comes back online, acquiring a locally stored trigger event compensation message generated from a trigger event that failed to be reported to the server during the device's last offline period; matching a locally stored trigger condition compensation strategy against the trigger event compensation message and, when it is determined that trigger condition compensation is needed, reporting the trigger event identifier recorded in the message to the server again, so that the server matches stored intelligent scene information according to the trigger event identifier and determines, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action.

Description

Intelligent scene execution processing method, triggering device and server
Technical Field
The present application relates to the field of the Internet of Things, and in particular to a processing method, a triggering device and a server for intelligent scene execution.
Background
With the development of the smart home, a user can actively control Internet of Things smart devices through a mobile client, and can also create intelligent scenes by setting trigger conditions such as timed tasks or device alarm information, so as to control smart devices passively. A smart device can trigger and execute an intelligent scene when its network is stable; when the device has no network connection, execution of the intelligent scene may be blocked.
At present, only the case where the smart terminal device that finally executes the intelligent scene goes offline is handled; the case where the smart terminal device serving as the trigger condition of the intelligent scene goes offline, which also occurs in practical application scenarios, is not considered, so the intelligent scene may fail to be executed effectively.
Disclosure of Invention
The embodiments of the application provide a processing method, a triggering device and a server for intelligent scene execution, which are used to re-report to the server, according to a trigger condition compensation strategy, trigger events that the triggering device failed to report while offline, thereby safeguarding effective execution of the intelligent scene.
The embodiment of the application provides a processing method for intelligent scene execution, which comprises the following steps:
when the triggering device serving as the trigger condition of the intelligent scene comes back online, acquiring a locally stored trigger event compensation message generated from a trigger event that failed to be reported to the server during the device's last offline period;
matching a locally stored trigger condition compensation strategy against the trigger event compensation message and, when it is determined that trigger condition compensation is needed, reporting the trigger event identifier recorded in the trigger event compensation message to the server again, so that the server matches stored intelligent scene information according to the received trigger event identifier and determines, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action.
According to the method, when the triggering device serving as the trigger condition of the intelligent scene comes back online, a locally stored trigger event compensation message, generated from a trigger event that failed to be reported to the server during the device's last offline period, is acquired; the locally stored trigger condition compensation strategy is matched against this message and, when trigger condition compensation is determined to be needed, the trigger event identifier recorded in the message is reported to the server again, so that the server matches stored intelligent scene information according to the received identifier and determines the execution action configured for the trigger event and the execution object that performs the action, thereby safeguarding effective execution of the intelligent scene.
In some embodiments, determining whether trigger condition compensation is needed includes:
matching a compensation rule in a locally stored trigger condition compensation strategy according to the event level corresponding to the trigger event recorded in the trigger event compensation message;
and comparing the device's own offline-recovery processing duration with the compensation rule to judge whether trigger condition compensation is needed.
In this way, whether trigger condition compensation is needed is determined according to the trigger event.
In some embodiments, determining whether the generated trigger event compensation message needs to be stored specifically includes:
when a trigger event belonging to a trigger event type covered by the trigger condition compensation strategy is triggered and fails to be reported, generating a trigger event compensation message in which the trigger event identifier is recorded;
judging, according to the trigger event identifiers that the server has associated with intelligent scene information, whether intelligent scene information matching the trigger event exists;
if so, judging, according to the event level corresponding to the trigger event, whether the compensation rule in the trigger condition compensation strategy is satisfied, and if so, storing the trigger event compensation message.
In this way, only trigger event compensation messages that need to be stored are stored, ensuring that the storage space of the triggering device is used reasonably.
In some embodiments, the generated trigger event compensation message is cleared when one of the following conditions is met:
no intelligent scene information matching the trigger event exists;
or it is determined, according to the event level corresponding to the trigger event, that the trigger event does not satisfy the compensation rule in the trigger condition compensation strategy.
In this way, trigger event compensation messages that do not need to be stored are cleared, releasing the storage space of the triggering device in time.
The embodiment of the application provides a processing method for intelligent scene execution, which comprises the following steps:
receiving a trigger event identifier reported by a triggering device;
matching locally stored intelligent scene information acquired from the user terminal according to the trigger event identifier, and determining, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action;
when the execution object is an execution device, generating an execution command and issuing it to the execution device; if the execution device is offline, generating and storing an execution command compensation message according to the execution command.
In this way, an execution command is generated according to the trigger event reported by the triggering device and issued to the execution device; when the execution device is offline and cannot execute the command, an execution command compensation message is generated from the execution command, safeguarding effective execution of the intelligent scene.
In some embodiments, after the intelligent scene information is acquired from the user terminal and before the trigger event identifier reported by the triggering device is received, the method includes:
determining, according to the acquired intelligent scene information created by the user terminal, the triggering device serving as the trigger condition of the intelligent scene information;
matching a preset trigger condition compensation strategy according to the event levels corresponding to the trigger events preset for the triggering device, to generate a trigger condition compensation strategy corresponding to the triggering device;
and sending the intelligent scene information and the trigger condition compensation strategy corresponding to the triggering device to the triggering device.
In this way, a trigger condition compensation strategy corresponding to the triggering device is generated.
In some embodiments, after the execution device comes back online, the method includes:
acquiring the locally stored execution command compensation message, matching a compensation rule in the execution command compensation strategy according to the command type recorded in the execution command compensation message, and comparing the execution device's offline-recovery processing duration with the compensation rule to judge whether the execution command needs to be issued to the execution device again according to the execution command compensation message;
if so, issuing the execution command to the execution device again, so that the execution device completes the execution action of the corresponding intelligent scene according to the execution command;
and if not, clearing the execution command compensation message.
In this way, whether the execution command needs to be issued to the execution device again after the execution device comes back online is determined, useless execution command compensation messages are cleared, and the storage space of the server is released in time.
In some embodiments, when the execution object is the server itself (the local machine), the action is performed directly.
Another embodiment of the present application provides an intelligent scene triggering device and a server, each of which includes a memory and a processor, where the memory is configured to store program instructions, and the processor is configured to call the program instructions stored in the memory and execute any one of the above methods according to the obtained program.
Furthermore, according to an embodiment, a computer program product is provided, comprising software code portions for performing the steps of any of the methods defined above when the product is run on a computer. The computer program product may include a computer-readable medium on which the software code portions are stored. Furthermore, the computer program product may be directly loaded into the internal memory of the computer and/or transmitted via a network by at least one of an upload procedure, a download procedure and a push procedure.
Another embodiment of the present application provides a computer-readable storage medium storing computer-executable instructions for causing the computer to perform any of the methods described above.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is an interaction schematic diagram of an intelligent scene execution system according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a server according to an embodiment of the present application;
FIG. 3 is a schematic distribution diagram of different execution objects according to an embodiment of the present application;
FIG. 4 is an overall flow schematic diagram of a processing method for intelligent scene execution (triggering device side) according to an embodiment of the present application;
FIG. 5 is an overall flow schematic diagram of a processing method for intelligent scene execution (server side) according to an embodiment of the present application;
FIG. 6 is a specific flow schematic diagram of a processing method for intelligent scene execution (triggering device offline, execution device online) according to an embodiment of the present application;
FIG. 7 is a specific flow schematic diagram of a processing method for intelligent scene execution (triggering device online, execution device offline) according to an embodiment of the present application;
FIG. 8 is a specific flow schematic diagram of a processing method for intelligent scene execution (triggering device and execution device both offline) according to an embodiment of the present application;
FIG. 9 is a flowchart of a specific method for storing and clearing a trigger event compensation message according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an intelligent scene triggering device according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The embodiments of the application provide a processing method, a triggering device and a server for intelligent scene execution, which are used to re-report to the server, according to a trigger condition compensation strategy, trigger events that the triggering device failed to report while offline, thereby safeguarding effective execution of the intelligent scene.
The method and the devices are based on the same application concept; because the principles by which the method and the devices solve the problem are similar, the implementations of the devices and the method may refer to each other, and repeated description is omitted.
The terms first, second and the like in the description and in the claims of embodiments of the application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The following examples and embodiments are to be construed as illustrative only. Although the specification may refer to "an", "one", or "some" example or embodiment(s) at several points, this does not mean that each such reference is related to the same example or embodiment, nor that the feature is applicable to only a single example or embodiment. Individual features of different embodiments may also be combined to provide further embodiments. Furthermore, terms such as "comprising" and "including" should be understood not to limit the described embodiments to consist of only those features already mentioned; such examples and embodiments may also include features, structures, units, modules, etc. that are not specifically mentioned.
Various embodiments of the application are described in detail below with reference to the drawings attached to the specification. It should be noted that, the display sequence of the embodiments of the present application only represents the sequence of the embodiments, and does not represent the advantages or disadvantages of the technical solutions provided by the embodiments.
It should be noted that, in the technical solution provided by the embodiments of the present application, the server obtains the intelligent scene information created by a user terminal, determines the triggering device according to the intelligent scene information, generates a trigger condition compensation strategy corresponding to the triggering device, and sends it to the triggering device; according to the trigger condition compensation strategy, the triggering device can judge whether to generate a trigger event compensation message from a trigger event that failed to be reported while the device was offline, so that after it later comes back online it can use the trigger event compensation message to judge whether to report the trigger event to the server again; the server determines the corresponding execution object according to the trigger event reported by the triggering device and, when the execution object is an execution device, generates an execution command and issues it to the execution device; if the execution device is offline and cannot execute the command, the server generates an execution command compensation message, so that after the execution device comes back online it can be judged whether the command needs to be issued again.
Referring to FIG. 1, an intelligent scene execution system provided by an embodiment of the present application includes, for example, a user terminal, a server, a triggering device and an execution device. Specifically:
the user terminal is used for submitting the intelligent scene created by the user on the user interface to the server;
the server (which may be an ordinary server or a cloud server, for example an Internet of Things platform) includes, for example, an intelligent scene rule issuing module, an intelligent scene execution judging module and an intelligent scene command execution issuing module, as shown in FIG. 2, wherein:
the intelligent scene rule issuing module is used for storing the intelligent scene information, determining the triggering device according to the trigger condition configured in the intelligent scene information, then matching a locally stored preset trigger condition compensation strategy according to the event level of each trigger event preset for the triggering device to generate a trigger condition compensation strategy corresponding to the triggering device, and finally issuing the intelligent scene information and the trigger condition compensation strategy corresponding to the triggering device;
the intelligent scene execution judging module is used for matching intelligent scene information according to the trigger event reported by the triggering device, determining the execution object, generating an execution command when the execution object is an execution device, and sending the execution command to the intelligent scene command execution issuing module; when the execution device is offline and cannot execute the command, generating an execution command compensation message (in which the command type of the execution command is recorded) according to the execution command, and, after the execution device comes back online, judging whether the execution command needs to be issued to the execution device again by matching the execution command compensation message against a preset execution command compensation strategy;
the intelligent scene command execution issuing module, as shown in FIG. 3, is configured to receive the intelligent scene execution command issued by the intelligent scene execution judging module and judge whether the execution object of the received execution command is the server or an execution device; if the execution object is the server, the server executes the command directly; if the execution object is an execution device, the execution command is issued to the execution device;
the triggering device is used for storing the trigger condition compensation strategy and intelligent scene information issued by the server locally; when the trigger condition of the intelligent scene is triggered, reporting the trigger event identifier to the server; generating a trigger event compensation message from a trigger event that failed to be reported to the server while the device was offline; and then, after coming back online, judging whether the trigger event needs to be reported to the server again by matching the trigger event compensation message against the locally stored trigger condition compensation strategy;
and the execution device is used for receiving the execution command issued by the server and completing the execution action of the intelligent scene according to the execution command.
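Purely as an illustrative aid (not part of the claimed embodiments), the intelligent scene information exchanged between the components above might be represented by a record such as the following minimal Python sketch; all field names and example values are assumptions of this description.

```python
from dataclasses import dataclass

@dataclass
class SmartScene:
    """Hypothetical, minimal representation of intelligent scene information."""
    scene_id: str            # identifier of the intelligent scene
    trigger_device_id: str   # device serving as the trigger condition
    trigger_event_id: str    # trigger event identifier reported to the server
    execution_action: str    # execution action configured for the trigger event
    execution_object: str    # "server", or the identifier of an execution device

# Example corresponding to the park night-mode scene described below.
night_scene = SmartScene(
    scene_id="scene-001",
    trigger_device_id="camera-01",
    trigger_event_id="person_detected",
    execution_action="turn_on_lamps",
    execution_object="controller-01",
)
```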
Referring to FIG. 4, a processing method for intelligent scene execution provided by an embodiment of the present application is executed, for example, by the above triggering device, and includes the following steps:
Step S101, when the triggering device serving as the trigger condition of the intelligent scene comes back online, acquiring a locally stored trigger event compensation message generated from a trigger event that failed to be reported to the server during the device's last offline period;
An example of an intelligent scene: in the night mode of a park, when a monitoring device in the park detects a person, the lamps around the monitoring device are turned on. Here, the monitoring device detecting a person is the trigger condition of the intelligent scene, and turning on the lamps around the monitoring device is the execution action of the intelligent scene; the monitoring device is the triggering device, and the controller that controls the state of the lamps is the execution device.
A heartbeat keep-alive is maintained between the server and the triggering device: the triggering device sends a heartbeat packet to the server at a preset interval, and when the triggering device has not received a heartbeat reply from the server for longer than a preset time (for example, 3 minutes), the triggering device is considered unable to communicate with the server and is in the offline state; when the triggering device receives a heartbeat reply from the server again, it returns to the online state.
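A minimal sketch of such a heartbeat keep-alive on the triggering device side is shown below; the 30-second interval, the 3-minute timeout and all names are illustrative assumptions rather than a prescribed implementation.

```python
import time

HEARTBEAT_INTERVAL_S = 30     # assumed interval between heartbeat packets
OFFLINE_TIMEOUT_S = 3 * 60    # no reply for this long => considered offline

class HeartbeatState:
    """Tracks whether the triggering device is online, based on heartbeat replies."""

    def __init__(self):
        self.last_reply = time.monotonic()
        self.offline_since = None          # set when the device goes offline

    def on_server_reply(self):
        """Call when a heartbeat reply from the server is received.
        Returns the offline-recovery processing duration if the device was offline."""
        now = time.monotonic()
        offline_duration = None
        if self.offline_since is not None:
            offline_duration = now - self.offline_since
            self.offline_since = None      # back online
        self.last_reply = now
        return offline_duration

    def tick(self):
        """Call periodically (e.g. each heartbeat interval) to detect going offline."""
        if self.offline_since is None and \
                time.monotonic() - self.last_reply > OFFLINE_TIMEOUT_S:
            self.offline_since = time.monotonic()
```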
When the triggering device is in the offline state, reporting a trigger event to the server will fail. If the trigger event that failed to be reported belongs to a trigger event type covered by the locally stored trigger condition compensation strategy, a trigger event compensation message is generated, and the message is then acquired when the device comes back online.
Step S102, matching a locally stored trigger condition compensation strategy against the trigger event compensation message and, when it is determined that trigger condition compensation is needed, reporting the trigger event identifier recorded in the trigger event compensation message to the server again, so that the server matches stored intelligent scene information according to the received trigger event identifier and determines, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action.
The trigger condition compensation strategy, as shown in Table 1, is preset in the server for the trigger events of the triggering device, where L1 represents the lowest level of a trigger event and L4 the highest (or L1 the highest and L4 the lowest; no limitation is imposed here). Trigger event types include, for example, device events, alarms and device attribute changes. A device event indicates that some action has occurred. An alarm indicates certain kinds of device events; for example, for a smart door lock, the door being opened is a device event, and if someone tries to pick or smash the lock, the device is triggered to report an anti-tamper alarm event. A device attribute change indicates a change in the state of the device itself, for example, a lamp changing from the on state to the off state.
The number of trigger events generated while the triggering device is offline that can be stored is also limited. Assuming that the trigger event "the monitoring device in the park detects a person" has event level L2, and that C trigger events of this level are generated during a certain offline period, then if C is greater than C1, only the first C1 trigger events are stored and trigger events generated later are discarded.
Assuming that a certain trigger event has level L3 and the offline-recovery duration of the triggering device (i.e., the time from going offline to coming back online) is T: if T is smaller than T3, the server calls the user as a reminder and trigger condition compensation needs to be executed after the triggering device comes back online; if T is greater than T3, the server only calls the user as a reminder and no trigger condition compensation needs to be executed. A message reminder means that a notification message is sent to the user terminal through the server.
A1-A4 represent compensation action types. A1 indicates that the trigger event level is low and no trigger condition compensation needs to be executed after the device comes back online. A2 indicates that if the offline duration of the triggering device is less than T2, the server sends a message to remind the user and trigger condition compensation is executed after the device comes back online, whereas if the offline duration is greater than T2, the server only sends a message to remind the user and no trigger condition compensation is executed. A3 indicates that if the offline duration of the triggering device is less than T3, the server calls the user as a reminder and trigger condition compensation is executed after the device comes back online, whereas if the offline duration is greater than T3, the server only calls the user and no trigger condition compensation is executed. A4 indicates that, no matter how long the triggering device has been offline, trigger condition compensation needs to be executed after it comes back online.
Different levels of trigger events correspond to different compensation rules. When a trigger event is of the highest level, trigger events generated during the offline period need to be stored, and no matter how long the triggering device has been offline, trigger condition compensation needs to be performed after it comes back online. For example, the triggering device may be a door sensor device with three kinds of events: a door-open event, a door-close event and an anti-tamper event. Intelligent scenes that use door sensor devices as trigger conditions accordingly fall into three kinds, with the door-open event, the door-close event or the anti-tamper event as the trigger condition. Different compensation strategies are defined for these three kinds of events when the trigger condition is triggered; for example, the door-open and door-close events are not compensated while the anti-tamper event needs to be compensated.
Table 1 trigger condition compensation strategy for trigger devices
It should be noted that the rules included in the trigger condition compensation strategy are not limited to those in Table 1; other rules may be used according to actual needs, and the embodiments of the present application impose no limitation on this.
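Purely as an illustration of how a strategy of the kind sketched in Table 1 could be encoded on the triggering device, the mapping below assumes a one-to-one correspondence between levels L1-L4 and compensation actions A1-A4; the reminder types, duration placeholders and field names are assumptions of this description, not values fixed by the embodiments.

```python
# Hypothetical encoding of a trigger condition compensation strategy (cf. Table 1).
# valid_duration is the longest offline duration for which compensation is still
# performed after the device comes back online (None = always compensate).
TRIGGER_COMPENSATION_POLICY = {
    "L1": {"reminder": None,      "valid_duration": 0,       "max_stored": 0},   # A1
    "L2": {"reminder": "message", "valid_duration": 30 * 60, "max_stored": 10},  # A2 (T2, C1)
    "L3": {"reminder": "phone",   "valid_duration": 60 * 60, "max_stored": 10},  # A3 (T3)
    "L4": {"reminder": "phone",   "valid_duration": None,    "max_stored": 10},  # A4
}
```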
The trigger event identifier may be, for example, the trigger event ID, or the trigger event ID plus the triggering device ID; intelligent scene information associated with the trigger event is matched according to the trigger event identifier, so that the associated execution device and the execution action configured for the trigger condition are determined according to the intelligent scene information.
Through step S102, a trigger event that failed to be reported during the offline period is reported to the server again as required, safeguarding effective execution of the intelligent scene.
In some embodiments, determining whether trigger condition compensation is needed includes:
matching a compensation rule (for example, one of the compensation rules corresponding to each level shown in Table 1) in the locally stored trigger condition compensation strategy according to the event level (for example, L1-L4 shown in Table 1) corresponding to the trigger event recorded in the trigger event compensation message;
and comparing the device's own offline-recovery processing duration (i.e., the time from when the triggering device went offline to when it came back online) with the compensation rule to judge whether trigger condition compensation is needed (an illustrative sketch of this comparison follows this list).
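The comparison described above could be sketched as the following helper, where `rule` is the compensation rule matched by event level (for example an entry of the policy sketch above); the field names are assumptions of this description.

```python
def needs_trigger_compensation(rule, offline_duration_s):
    """Return True if the trigger event recorded in the compensation message should
    be reported to the server again after the triggering device comes back online."""
    if rule is None or rule["valid_duration"] == 0:
        return False                        # e.g. A1: low level, never compensated
    if rule["valid_duration"] is None:
        return True                         # e.g. A4: always compensated
    # e.g. A2 / A3: compensate only if the device recovered within the valid duration.
    return offline_duration_s < rule["valid_duration"]
```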
In order to ensure that the storage space of the triggering device is used reasonably, in some embodiments, determining whether the generated trigger event compensation message needs to be stored specifically includes:
when a trigger event belonging to a trigger event type covered by the trigger condition compensation strategy (such as a device event, an alarm or a device attribute change) is triggered and fails to be reported to the server, generating a trigger event compensation message in which the trigger event identifier is recorded;
judging, according to the trigger event identifiers that the server has associated with intelligent scene information, whether intelligent scene information matching the trigger event exists;
if so, judging, according to the event level corresponding to the trigger event, whether the compensation rule in the trigger condition compensation strategy is satisfied, and if so, storing the trigger event compensation message.
To release the storage space of the triggering device in time, in some embodiments, the generated trigger event compensation message is cleared when one of the following conditions is met (a sketch of this store-or-clear decision follows this list):
no intelligent scene information matching the trigger event exists;
or it is determined, according to the event level corresponding to the trigger event, that the trigger event does not satisfy the compensation rule in the trigger condition compensation strategy.
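The store-or-clear decision of the two embodiments above might be sketched as follows; the message structure, the set of scene-associated event identifiers and the helper name are assumptions used only for illustration.

```python
from typing import Optional, Set

def keep_compensation_message(event_id: str, event_level: str,
                              scene_event_ids: Set[str],
                              policy: dict) -> Optional[dict]:
    """Decide whether the trigger event compensation message generated for a
    failed report should be stored (returned) or cleared (None).

    scene_event_ids: trigger event identifiers associated with locally stored
                     intelligent scene information (as sent by the server).
    policy:          per-level compensation rules (cf. the Table 1 sketch).
    """
    message = {"event_id": event_id, "event_level": event_level}
    # Condition 1: no intelligent scene information matches this trigger event.
    if event_id not in scene_event_ids:
        return None
    # Condition 2: the event level does not satisfy any compensation rule.
    rule = policy.get(event_level)
    if rule is None or rule.get("valid_duration") == 0:
        return None
    return message      # store the compensation message locally
```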
Referring to FIG. 5, a processing method for intelligent scene execution provided by an embodiment of the present application is executed, for example, by the above server, and includes the following steps:
Step S201, receiving a trigger event identifier reported by the triggering device;
Step S202, matching locally stored intelligent scene information acquired from the user terminal according to the trigger event identifier, and determining, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action;
When the execution object configured for the trigger event in the intelligent scene information is the server itself, the server directly executes the corresponding action, such as sending a short message, sending an e-mail or calling the user.
Step S203, when the execution object is an execution device (the device that completes the execution action in the intelligent scene), generating an execution command and issuing it to the execution device; if the execution device is offline, generating an execution command compensation message according to the execution command and storing it.
A heartbeat keep-alive is also maintained between the server and the execution device: the execution device sends a heartbeat packet to the server at a preset interval, and when the server has not received a heartbeat packet from the execution device for longer than a preset time (for example, 3 minutes), the execution device is considered unable to communicate with the server and is in the offline state; when the server receives a heartbeat packet from the execution device again, the execution device returns to the online state.
Through step S203, when the execution device is offline and cannot execute the command, an execution command compensation message is generated from the execution command, ensuring that the intelligent scene is executed effectively.
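A hypothetical server-side handler corresponding to steps S201-S203 might look like the sketch below; the scene lookup structure, the command fields and the online-state map are assumptions of this description rather than the claimed implementation.

```python
def handle_reported_event(event_id, scenes, device_online, pending_commands):
    """scenes: intelligent scene information keyed by trigger event identifier;
    device_online: current online state per execution device identifier;
    pending_commands: storage for execution command compensation messages."""
    scene = scenes.get(event_id)
    if scene is None:
        return                                  # no matching intelligent scene
    target = scene["execution_object"]
    if target == "server":
        # Execution object is the server itself: e.g. send an SMS or e-mail,
        # or call the user directly.
        print("server executes:", scene["execution_action"])
        return
    command = {"device_id": target,
               "action": scene["execution_action"],
               "command_type": scene.get("command_type", "B1")}
    if device_online.get(target, False):
        print("issue command to", target, ":", command)
    else:
        # Execution device is offline: store an execution command compensation message.
        pending_commands.append(command)
```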
To generate the trigger condition compensation strategy corresponding to the triggering device, in some embodiments, after the intelligent scene information is acquired from the user terminal and before the trigger event identifier reported by the triggering device is received, the method includes the following steps (a sketch of this policy generation follows this list):
determining, according to the acquired intelligent scene information created by the user terminal, the triggering device serving as the trigger condition of the intelligent scene information;
matching a preset trigger condition compensation strategy (a general trigger condition compensation strategy preset on the server side) according to the event levels corresponding to the trigger events preset for the triggering device (the server side presets a corresponding event level for each trigger event of each type of triggering device), to generate a trigger condition compensation strategy corresponding to the triggering device;
and sending the intelligent scene information and the trigger condition compensation strategy corresponding to the triggering device to the triggering device.
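One possible sketch of this device-specific policy generation is given below; the event-level assignments and the generic per-level policy are assumptions used for illustration.

```python
def build_device_policy(device_event_levels: dict, generic_policy: dict) -> dict:
    """device_event_levels: preset event level for each trigger event of the device,
    e.g. {"person_detected": "L2", "tamper_alarm": "L4"}.
    generic_policy: the server's preset per-level rules (cf. the Table 1 sketch).
    Returns a trigger condition compensation strategy specific to this device."""
    return {event: generic_policy[level]
            for event, level in device_event_levels.items()
            if level in generic_policy}
```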
To ensure that the intelligent scene is executed effectively and that the storage space is used reasonably, in some embodiments, after the execution device comes back online, the method includes:
acquiring the locally stored execution command compensation message, matching a compensation rule in the execution command compensation strategy according to the command type recorded in the execution command compensation message, and comparing the execution device's offline-recovery processing duration with the compensation rule to judge whether the execution command needs to be issued to the execution device again according to the execution command compensation message;
if so, issuing the execution command to the execution device again, so that the execution device completes the execution action of the corresponding intelligent scene according to the execution command;
and if not, clearing the execution command compensation message.
The execution command compensation strategy is preset by the server for each type of execution device (or for each execution device, or for different series of the same type of execution device; no limitation is imposed here), for example as shown in Table 2. For instance, if the command type of the execution command recorded in the acquired execution command compensation message is B1 and the offline-recovery processing duration of the execution device is T, then when T is greater than T1 the server does not need to resend the execution command to the execution device after it comes back online, and when T is less than or equal to T1 the server needs to issue the execution command to the execution device again after it comes back online.
Command type        Compensation valid duration
B1                  T1
B2                  T2
B3                  T3
Table 2 Execution command compensation strategy for an execution device
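A sketch of how the server might apply a strategy of the kind shown in Table 2 after the execution device comes back online is given below; the duration values and names are placeholders assumed for illustration.

```python
# Hypothetical encoding of Table 2: compensation valid duration per command type.
EXEC_COMPENSATION_POLICY = {
    "B1": 10 * 60,     # T1 (placeholder, in seconds)
    "B2": 30 * 60,     # T2
    "B3": 60 * 60,     # T3
}

def should_reissue(command_type, offline_duration_s,
                   policy=EXEC_COMPENSATION_POLICY):
    """Return True if the stored execution command should be issued to the
    execution device again; otherwise the compensation message is cleared."""
    valid = policy.get(command_type)
    if valid is None:
        return False
    return offline_duration_s <= valid
```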
In some embodiments, when the execution object is the server itself (the local machine), the action is executed directly (for example, a switch controlled by the server is turned on or off, or the server sends a short message, sends an e-mail or calls the user).
Examples of several specific process flows are given below.
A user creates the following intelligent scene in the mobile phone APP: in night mode, when the monitoring device in the living room detects a person, the light in the living room is turned on. The trigger condition of the intelligent scene is that the monitoring device detects a person (assume the corresponding event level is L2); the triggering device is the monitoring device; the execution action is turning on the light; the execution object is a controller; and the monitoring device and the controller each communicate with the server. The processing method for intelligent scene execution is described below using this intelligent scene as an example.
Embodiment one:
Referring to FIG. 6, in the processing method for intelligent scene execution provided by this embodiment of the present application, it is assumed that the monitoring device is in the offline state and the controller is in the online state. The specific steps include:
Step S301, a user creates an intelligent scene in the mobile phone APP, the mobile phone APP submits the intelligent scene data to the server, and the server stores the received intelligent scene data;
Step S302, the server determines the monitoring device serving as the trigger condition according to the configuration information of the intelligent scene;
Step S303, the server matches the preset trigger condition compensation strategy according to the levels of the trigger events preset for the monitoring device, generates a trigger condition compensation strategy corresponding to the monitoring device, and sends the trigger condition compensation strategy corresponding to the monitoring device and the intelligent scene data to the monitoring device;
Step S304, the monitoring device stores the received trigger condition compensation strategy and intelligent scene data;
Step S305, at a certain moment at night the monitoring device is offline; during the offline period the monitoring device captures a person and generates a trigger event compensation message;
Step S306, the monitoring device determines that intelligent scene data associated with the trigger event exists, matches the level of the trigger event against the trigger condition compensation strategy, determines that the compensation rule is met, and stores the generated trigger event compensation message;
Step S307, after a duration T (assuming T < T2), the monitoring device comes back online, acquires the stored trigger event compensation message, matches it against the stored trigger condition compensation strategy, determines that the trigger event needs to be reported to the server again, and reports the trigger event to the server;
Step S308, the server determines that intelligent scene data associated with the trigger event reported by the monitoring device exists, and determines from the intelligent scene data that the execution action is turning on the light and the execution object is the controller;
Step S309, the server generates an execution command and issues it to the controller;
Step S310, the controller turns on the light in the living room according to the received execution command.
Embodiment two:
Referring to FIG. 7, in the processing method for intelligent scene execution provided by this embodiment of the present application, it is assumed that the monitoring device is in the online state and the controller is in the offline state. The specific steps include:
Step S401, a user creates an intelligent scene in the mobile phone APP, the mobile phone APP submits the intelligent scene data to the server, and the server stores the received intelligent scene data;
Step S402, the server determines the monitoring device serving as the trigger condition according to the configuration information of the intelligent scene;
Step S403, the server matches the preset trigger condition compensation strategy according to the levels of the trigger events preset for the monitoring device, generates a trigger condition compensation strategy corresponding to the monitoring device, and sends the trigger condition compensation strategy corresponding to the monitoring device and the intelligent scene data to the monitoring device;
Step S404, the monitoring device stores the received trigger condition compensation strategy and intelligent scene data;
Step S405, at a certain moment at night, the monitoring device captures a person and reports the trigger event to the server;
Step S406, the server determines that intelligent scene data associated with the trigger event reported by the monitoring device exists, and determines from the intelligent scene data that the execution action is turning on the light and the execution object is the controller;
Step S407, the server generates an execution command and issues it to the controller; because the controller is offline, the server generates an execution command compensation message (assume the command type is B1);
Step S408, after a duration T (assuming T > T1), the controller comes back online; the server acquires the execution command compensation message, matches it against the execution command compensation strategy preset for the controller, determines that the execution command does not need to be issued to the controller again, and clears the execution command compensation message.
Embodiment III:
Referring to FIG. 8, in the processing method for intelligent scene execution provided by this embodiment of the present application, it is assumed that the monitoring device and the controller are both in the offline state. The specific steps include:
Step S501, a user creates an intelligent scene in the mobile phone APP, the mobile phone APP submits the intelligent scene data to the server, and the server stores the received intelligent scene data;
Step S502, the server determines the monitoring device serving as the trigger condition according to the configuration information of the intelligent scene;
Step S503, the server matches the preset trigger condition compensation strategy according to the level of each trigger event preset for the monitoring device, generates a trigger condition compensation strategy corresponding to the monitoring device, and sends the trigger condition compensation strategy corresponding to the monitoring device and the intelligent scene data to the monitoring device;
Step S504, the monitoring device stores the received trigger condition compensation strategy and intelligent scene data;
Step S505, at a certain moment at night the monitoring device is offline; during the offline period the monitoring device captures a person and generates a trigger event compensation message;
Step S506, the monitoring device determines that intelligent scene data associated with the trigger event exists, matches the level of the trigger event against the trigger condition compensation strategy, determines that the compensation rule is met, and stores the generated trigger event compensation message;
Step S507, after a duration T (assuming T < T3), the monitoring device comes back online, acquires the trigger event compensation message, matches it against the stored trigger condition compensation strategy, determines that the trigger event needs to be reported to the server again, and reports the trigger event to the server;
Step S508, the server determines that intelligent scene data associated with the trigger event reported by the monitoring device exists, and determines from the intelligent scene data that the execution action is turning on the light and the execution object is the controller;
Step S509, the server generates an execution command and issues it to the controller; because the controller is offline, the server generates an execution command compensation message (assume the command type is B1);
Step S510, after a duration T (assuming T < T1), the controller comes back online; the server acquires the execution command compensation message, matches it against the preset execution command compensation strategy, determines that the execution command needs to be issued to the controller again, and issues the execution command to the controller;
Step S511, the controller turns on the light in the living room according to the received execution command.
Embodiment four:
Referring to FIG. 9, a specific method for storing and clearing a trigger event compensation message provided by an embodiment of the present application includes:
Step S601, during an offline period, the triggering device triggers a trigger event covered by the locally stored trigger condition compensation strategy;
Step S602, the triggering device reports the trigger event to the server;
Step S603, judging whether the report failed; if so, proceed to Step S604; if not, end the flow;
Step S604, generating a trigger event compensation message and judging whether intelligent scene data associated with the trigger event exists; if so, proceed to Step S605; if not, end the flow;
Step S605, judging whether the trigger event meets the compensation rule in the trigger condition compensation strategy; if so, proceed to Step S606; if not, go to Step S608;
Step S606, when the triggering device comes back online, determining that a trigger event compensation message exists and judging whether trigger condition compensation is needed; if so, proceed to Step S607; if not, go to Step S608;
Step S607, reporting the trigger event to the server again;
Step S608, clearing the trigger event compensation message.
The following describes the devices and apparatuses provided by embodiments of the present application; explanations of technical features that are the same as or correspond to those described in the above methods are not repeated.
Referring to FIG. 10, on the intelligent terminal device side, an intelligent scene triggering device provided by an embodiment of the present application includes:
a processor 700, configured to read a program in a memory 720 and execute the following processes:
when the triggering device serving as the trigger condition of the intelligent scene comes back online, acquiring a locally stored trigger event compensation message generated from a trigger event that failed to be reported to the server during the device's last offline period;
matching a locally stored trigger condition compensation strategy against the trigger event compensation message and, when it is determined that trigger condition compensation is needed, reporting the trigger event identifier recorded in the trigger event compensation message to the server again, so that the server matches stored intelligent scene information according to the received trigger event identifier and determines, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action.
In some embodiments, determining whether trigger condition compensation is needed includes:
matching a compensation rule in a locally stored trigger condition compensation strategy according to the event level corresponding to the trigger event recorded in the trigger event compensation message;
and comparing the device's own offline-recovery processing duration with the compensation rule to judge whether trigger condition compensation is needed.
In some embodiments, determining whether the generated trigger event compensation message needs to be stored specifically includes:
when a trigger event belonging to a trigger event type covered by the trigger condition compensation strategy is triggered and fails to be reported, generating a trigger event compensation message in which the trigger event identifier is recorded;
judging, according to the trigger event identifiers that the server has associated with intelligent scene information, whether intelligent scene information matching the trigger event exists;
if so, judging, according to the event level corresponding to the trigger event, whether the compensation rule in the trigger condition compensation strategy is satisfied, and if so, storing the trigger event compensation message.
In some embodiments, the generated trigger event compensation message is cleared when one of the following conditions is met:
no intelligent scene information matching the trigger event exists;
or it is determined, according to the event level corresponding to the trigger event, that the trigger event does not satisfy the compensation rule in the trigger condition compensation strategy.
In some embodiments, the intelligent scene triggering device provided by the embodiments of the present application further includes a transceiver 710, configured to receive data sent by the server and to send data to the server under the control of the processor 700.
In FIG. 10, the bus architecture may comprise any number of interconnected buses and bridges, linking together one or more processors represented by the processor 700 and various memory circuits represented by the memory 720. The bus architecture may also link together various other circuits such as peripheral devices, voltage regulators and power management circuits, which are well known in the art and are therefore not described further herein. The bus interface provides an interface. The transceiver 710 may be a plurality of elements, i.e., it includes a transmitter and a receiver, providing a means for communicating with various other apparatus over a transmission medium.
The processor 700 is responsible for managing the bus architecture and general processing, and the memory 720 may store data used by the processor 700 in performing operations.
In some embodiments, the processor 700 may be a CPU (Central Processing Unit), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
Accordingly, referring to FIG. 11, on the network side, a server provided by an embodiment of the present application includes:
a processor 800, configured to read a program in a memory 820 and execute the following processes:
receiving a trigger event identifier reported by a triggering device;
matching locally stored intelligent scene information acquired from the user terminal according to the trigger event identifier, and determining, according to the matched intelligent scene information, the execution action configured for the trigger event and the execution object that performs the action;
when the execution object is an execution device, generating an execution command and issuing it to the execution device; if the execution device is offline, generating and storing an execution command compensation message according to the execution command.
In some embodiments, after the intelligent scene information is acquired from the user terminal and before the trigger event identifier reported by the triggering device is received, the processes include:
determining, according to the acquired intelligent scene information created by the user terminal, the triggering device serving as the trigger condition of the intelligent scene information;
matching a preset trigger condition compensation strategy according to the event levels corresponding to the trigger events preset for the triggering device, to generate a trigger condition compensation strategy corresponding to the triggering device;
and sending the intelligent scene information and the trigger condition compensation strategy corresponding to the triggering device to the triggering device.
In some embodiments, after the execution device comes back online, the method includes:
acquiring a locally stored execution command compensation message, matching a compensation rule in the execution command compensation strategy according to a command type recorded in the execution command compensation message, and comparing the execution device offline recovery processing time period with the compensation rule to judge whether an execution command needs to be issued to the execution device again according to the execution command compensation message;
if yes, issuing the execution command to the execution device again, so that the execution device completes the execution action of the corresponding intelligent scene according to the execution command;
and if not, clearing the execution command compensation message.
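A minimal sketch of this replay decision, assuming the same time-window rule shape as in the earlier sketch (field names and semantics are illustrative assumptions): the compensation rule is looked up by command type, the time the execution device spent offline before recovering is compared with the rule, and the command is either reissued or the compensation message is cleared.

def replay_or_discard(compensation_msg, command_compensation_strategy, offline_duration_s, send):
    # compensation_msg: {"command_type": ..., "device": ..., "command": ...}
    rule = command_compensation_strategy.get(compensation_msg["command_type"])
    if rule is not None and offline_duration_s <= rule["max_compensation_window_s"]:
        send(compensation_msg["device"], compensation_msg["command"])  # reissue the execution command
        return True   # command replayed
    return False      # stale or non-compensable: caller clears the compensation message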
In some embodiments, the action is performed directly when the execution object is a home device.
In some embodiments, the server provided by the embodiments of the present application further includes a transceiver 810, configured to, under the control of the processor 800, receive data sent by the user terminal and the intelligent scene trigger device and send data to the user terminal and the intelligent scene trigger device.
In fig. 11, the bus architecture may include any number of interconnected buses and bridges, linking together one or more processors represented by processor 800 and various memory circuits represented by memory 820. The bus architecture may also link together various other circuits, such as peripheral devices, voltage regulators, and power management circuits, which are well known in the art and therefore are not described further herein. The bus interface provides an interface. The transceiver 810 may be a plurality of elements, i.e., include a transmitter and a receiver, providing a means for communicating with various other apparatuses over a transmission medium.
In some embodiments, the server provided by embodiments of the present application further includes a user interface 830. The user interface 830 may be an interface for connecting devices externally or internally as required, and the connected devices include, but are not limited to, a keypad, a display, a speaker, a microphone, a joystick, and the like.
The processor 800 is responsible for managing the bus architecture and general processing, and the memory 820 may store data used by the processor 800 in performing operations.
In some embodiments, the processor 800 may be a CPU (Central Processing Unit), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
Embodiments of the present application provide a computing device, which may specifically be a desktop computer, a portable computer, a smart phone, a tablet computer, a personal digital assistant (PDA), or the like. The computing device may include a central processing unit (CPU), a memory, and input/output devices. The input devices may include a keyboard, a mouse, a touch screen, and the like, and the output devices may include a display device, such as a liquid crystal display (LCD) or a cathode ray tube (CRT).
The memory may include Read Only Memory (ROM) and Random Access Memory (RAM) and provides the processor with program instructions and data stored in the memory. In the embodiment of the present application, the memory may be used to store a program of any of the methods provided in the embodiment of the present application.
The processor is configured to call the program instructions stored in the memory and execute any of the methods provided by the embodiments of the present application according to the obtained program instructions.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the method of any of the above embodiments. The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
An embodiment of the present application provides a computer readable storage medium storing computer program instructions for use in an apparatus provided in the embodiment of the present application, where the computer program instructions include a program for executing any one of the methods provided in the embodiment of the present application. The computer readable storage medium may be a non-transitory computer readable medium.
The computer-readable storage medium can be any available medium or data storage device that can be accessed by a computer, including, but not limited to, magnetic storage (e.g., floppy disks, hard disks, magnetic tape, magneto-optical disks (MO), etc.), optical storage (e.g., CD, DVD, BD, HVD, etc.), and semiconductor storage (e.g., ROM, EPROM, EEPROM, non-volatile memory (NAND FLASH), Solid State Disk (SSD), etc.).
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. A method of processing intelligent scene execution, the method comprising:
when the triggering equipment serving as the triggering condition of the intelligent scene is recovered to be online, acquiring a locally stored triggering event compensation message generated from a triggering event that failed to be reported to the server when the triggering equipment was offline last time;
according to the trigger event compensation message, matching a locally stored trigger condition compensation strategy, and when trigger condition compensation is determined to be needed, reporting the trigger event identification recorded in the trigger event compensation message to the server again, so that the server matches stored intelligent scene information according to the received trigger event identification and determines, according to the matched intelligent scene information, an execution action configured for the trigger event and an execution object for executing the action.
2. The method of claim 1, wherein determining whether trigger condition compensation is required comprises:
matching compensation rules in a locally stored trigger condition compensation strategy according to the event level corresponding to the trigger event recorded in the trigger event compensation message;
and comparing the self offline recovery processing time period with the compensation rule to judge whether the triggering condition compensation is needed.
3. The method according to claim 1, wherein determining whether the generated trigger event compensation message needs to be stored comprises:
when a trigger event contained in the trigger event types of the trigger condition compensation strategy is triggered and reporting of the trigger event fails, generating a trigger event compensation message in which the trigger event identifier is recorded;
judging whether intelligent scene information matched with the triggering event exists, by matching the triggering event identification against the intelligent scene information sent by the server;
if yes, judging whether the compensation rule in the trigger condition compensation strategy is met or not according to the event level corresponding to the trigger event, and if yes, storing the trigger event compensation message.
4. A method according to claim 3, wherein the generated trigger event compensation message is cleared when one of the following conditions is met:
no intelligent scene information matched with the triggering event exists;
and determining that the triggering event does not meet the compensation rule in the triggering condition compensation strategy according to the event level corresponding to the triggering event.
5. A method of processing intelligent scene execution, the method comprising:
receiving a triggering event identifier reported by triggering equipment;
matching the locally stored intelligent scene information acquired from the user terminal according to the triggering event identification, and determining an execution action configured for the triggering event and an execution object for executing the action according to the matched intelligent scene information;
when the execution object is an execution device, generating and issuing an execution command to the execution device, and if the execution device is offline, generating and storing an execution command compensation message according to the execution command.
6. The method of claim 5, wherein after the intelligent scene information is obtained from the user terminal and before receiving the trigger event identifier reported by the trigger device, the method comprises:
according to the acquired intelligent scene information created by the user terminal, determining the triggering equipment serving as the triggering condition of the intelligent scene information;
according to event levels corresponding to all trigger events preset for the trigger equipment, matching a preset trigger condition compensation strategy to generate a trigger condition compensation strategy corresponding to the trigger equipment;
and sending, to the triggering equipment, the intelligent scene information and the triggering condition compensation strategy corresponding to the triggering equipment.
7. The method of claim 5, wherein after the execution device comes back online, the method comprises:
acquiring a locally stored execution command compensation message, matching a compensation rule in the execution command compensation strategy according to the command type recorded in the execution command compensation message, and comparing the offline recovery processing time period of the execution device with the compensation rule, so as to judge whether the execution command needs to be issued to the execution device again according to the execution command compensation message;
if yes, issuing the execution command to the execution device again, so that the execution device completes the execution action of the corresponding intelligent scene according to the execution command;
and if not, clearing the execution command compensation message.
8. The method of claim 5, wherein the action is performed directly when the execution object is a home device.
9. An intelligent scene triggering device, comprising:
a memory for storing program instructions;
a processor for invoking program instructions stored in said memory to perform the method of any of claims 1 to 4 in accordance with the obtained program.
10. A server, comprising:
a memory for storing program instructions;
a processor for invoking program instructions stored in said memory to perform the method of any of claims 5 to 8 in accordance with the obtained program.
CN202310909388.6A 2023-07-21 2023-07-21 Intelligent scene execution processing method, triggering device and server Pending CN116886725A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310909388.6A CN116886725A (en) 2023-07-21 2023-07-21 Intelligent scene execution processing method, triggering device and server

Publications (1)

Publication Number Publication Date
CN116886725A true CN116886725A (en) 2023-10-13

Family

ID=88266038

Country Status (1)

Country Link
CN (1) CN116886725A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination