CN115982160A - Data processing method, server, electronic device, and computer storage medium - Google Patents


Info

Publication number
CN115982160A
Authority
CN
China
Prior art keywords
data
container
target
processed
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211676300.2A
Other languages
Chinese (zh)
Inventor
刘一星
喻波
王志海
韩振国
安鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wondersoft Technology Co Ltd
Original Assignee
Beijing Wondersoft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wondersoft Technology Co Ltd filed Critical Beijing Wondersoft Technology Co Ltd
Priority to CN202211676300.2A priority Critical patent/CN115982160A/en
Publication of CN115982160A publication Critical patent/CN115982160A/en
Pending legal-status Critical Current

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiment of the invention provides a data processing method, a server, an electronic device and a computer-readable storage medium, and relates to the technical field of data processing. The server is provided with a business service container, a data cache container, a data collection engine container and a database container. After the server acquires a data processing instruction, the business service container encapsulates original data to be processed to obtain target data and sends the target data to the data cache container; the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container; and the database container stores the processed data. The embodiment of the invention ensures that data is not lost even when the server reaches a performance bottleneck, and improves the stability of data processing.

Description

Data processing method, server, electronic device, and computer storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a data processing method, a data processing server, an electronic device, and a computer-readable storage medium.
Background
In daily business scenarios, it is often necessary to transmit or synchronize business data to non-relational databases in real time. However, in the related art, when a non-relational database reaches its performance bottleneck, incoming data is queued for processing, and data may be lost while waiting in the queue, so the stability is poor.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a data processing method, a data processing server, an electronic device, and a computer-readable storage medium that overcome or at least partially solve the above problems.
In order to solve the above problem, an embodiment of the present invention discloses a data processing method, which is characterized in that the method is applied to a server, and the server includes a business service container, a data cache container, a data collection engine container and a database container; the method comprises the following steps:
in response to a data processing instruction, the service container encapsulates original data to be processed to obtain target data, and sends the target data to the data cache container;
the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container;
and the database container stores the processed data.
In one or more embodiments, the encapsulating, by the business service container, original data to be processed to obtain target data, and sending the target data to the data cache container includes:
packaging the original data by adopting a json format to obtain the target data;
inserting the target data into the created local data queue;
and sending the target data in the local data queue to the data cache container by adopting a preset asynchronous thread.
In one or more embodiments, the sending the target data to the data cache container includes:
the business service container determines a target address port of the data cache container and a target topic in the target address port based on the business attribute of the target data; the data cache container comprises at least one address port, and any address port comprises at least one preset original topic;
the business service container sends the target data to the data caching container based on the target address port and the target topic.
In one or more embodiments, the acquiring, by the data collection engine container, the target data from the data cache container, processing the target data to obtain processed data, and sending the processed data to the database container includes:
the data collection engine container acquires the target data from the data cache container and filters the target data to obtain filtered data; the filtering comprises keyword filtering;
and sending the filtered data to the database container.
In one or more embodiments, the database container stores the processed data, including:
the database container generates an index database based on the current time node and a preset index template;
and storing the processed data to the index library.
In one or more embodiments, the data buffer container includes at least one data buffer queue, and the at least one data buffer queue has a corresponding address port respectively; the database container includes a plurality of databases, each of which is of a non-relational type.
Correspondingly, the embodiment of the invention discloses a data processing server, which is characterized by comprising a business service container, a data cache container, a data collection engine container and a database container; the server includes:
the business service container is used for responding to a data processing instruction, packaging original data to be processed to obtain target data, and sending the target data to the data cache container;
the data collection engine container is used for acquiring the target data from the data cache container, processing the target data to obtain processed data and sending the processed data to the database container;
and the database container is used for storing the processed data.
In one or more embodiments, the business service container is specifically configured to:
packaging the original data by adopting a json format to obtain the target data;
inserting the target data into the created local data queue;
and sending the target data in the local data queue to the data cache container by adopting a preset asynchronous thread.
In one or more embodiments, the service container is further specifically configured to:
determining a target address port of the data cache container and a target topic in the target address port based on the business attribute of the target data; the data cache container comprises at least one address port, and any address port comprises at least one preset original topic;
transmitting the target data to the data caching container based on the destination address port and the target topic.
In one or more embodiments, the data collection engine container is specifically configured to:
acquiring the target data from the data cache container, and filtering the target data to obtain filtered data; the filtering comprises keyword filtering;
and sending the filtered data to the database container.
In one or more embodiments, the database container is specifically configured to:
generating an index base based on the current time node and a preset index template;
and storing the processed data to the index library.
In one or more embodiments, the data cache container includes at least one data cache queue, each of the at least one data cache queue having a corresponding address port; the database container includes a plurality of databases, each of which is of a non-relational type.
Correspondingly, the embodiment of the invention discloses an electronic device, which comprises: a processor, a memory and a computer program stored on the memory and capable of running on the processor, which computer program, when executed by the processor, performs the steps of the above-described data processing method embodiments.
Accordingly, the embodiment of the present invention discloses a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps of the data processing method embodiment.
The embodiment of the invention has the following advantages:
the server is provided with a business service container, a data cache container, a data collection engine container and a database container. After the server acquires a data processing instruction, the service container encapsulates original data to be processed to obtain target data, and the target data is sent to the data cache container; the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container; and the database container stores the processed data. Therefore, when the data volume is large and the server reaches the performance bottleneck, the data cache container can cache the current target data which is not processed in time, after the current data processing is finished, the target data in the cache can be processed to obtain the processed data, and the processed data is stored in the non-relational database container, so that the data cannot be lost even if the server reaches the performance bottleneck, and the stability of data processing is improved.
Drawings
FIG. 1 is a flow chart of the steps of one data processing method embodiment of the present invention;
FIG. 2 is a logical framework diagram of one data processing method embodiment of the present invention;
fig. 3 is a block diagram of a data processing server according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention more comprehensible, the present invention is described in detail with reference to the accompanying drawings and the detailed description thereof.
One of the core concepts of the embodiments of the present invention is that a service container, a data cache container, a data collection engine container, and a database container are deployed in a server. After the server acquires a data processing instruction, the service container encapsulates original data to be processed to obtain target data, and the target data is sent to the data cache container; the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container; and the database container stores the processed data. Therefore, when the data volume is large and the server reaches the performance bottleneck, the data cache container can cache the current target data which cannot be processed in time, after the current data processing is finished, the target data in the cache can be processed to obtain the processed data, and the processed data is stored in the non-relational database container, so that the data cannot be lost even when the server reaches the performance bottleneck, and the stability of data processing is improved.
Referring to fig. 1, a flow chart of steps of an embodiment of a data processing method of the present invention is shown, the method is applied to a server, and the server comprises a business service container, a data cache container, a data collection engine container and a database container. The data buffer container comprises at least one data buffer queue, and the at least one data buffer queue is respectively provided with a corresponding address port; the database container includes a plurality of databases, each of which is of a non-relational type.
In particular, containerization is a virtualization technique, also known as operating-system-level virtualization, that virtualizes the operating system kernel so that user-space software instances can be partitioned into several independent units running on a shared kernel, rather than running as a single instance. Such a software instance is known as a container. To each instance's owner and users, the server programs appear to be exclusively their own. Compared with traditional virtual-machine-based virtualization, containerization occupies less server resource space and starts very quickly, typically booting within seconds.
In the embodiment of the invention, the business service container in the server is used for performing business service processing on data; the data cache container is used for caching data; the data collection engine container is used for processing data, including but not limited to input, output and filtering; and the database container is used for storing the processed data.
The type of the data cache container may be Kafka. Kafka is a distributed message queue (Message Queue) based on the publish/subscribe model, and is mainly applied in the field of real-time big-data processing. In this embodiment of the present invention, the data cache container may include at least one data cache queue, each data cache queue may be a Kafka instance, and each Kafka instance has a corresponding address port. For example, if the data cache container includes three Kafka instances, their address ports may be: 192.168.0.1, 192.168.0.2 and 192.168.0.3.
The type of the data collection engine container may be Logstash. Logstash is an open-source data collection engine with real-time pipelining capability that can dynamically unify data from different sources and normalize it to a destination of your choice. It provides a number of plug-ins that help users parse, enrich, convert, and buffer any type of data.
The database container can comprise a plurality of databases, each of which is of a non-relational type, such as an Elasticsearch database, so that the distributed computing advantages of the database cluster can be fully used and storage efficiency is improved.
It should be noted that, specific types of the data cache container, the data collection engine container and the database container may be other types having the same function besides the types described above, and may be adjusted according to actual needs in actual applications, which is not limited in this embodiment of the present invention.
Further, the business service container, the data cache container, the data collection engine container and the database container can be deployed in one server, or in two or more servers. For example, the 4 containers may be deployed in 4 separate servers, so that the hardware resources required by each container are independent of one another, which can improve performance. Therefore, in practical applications, the specific deployment mode of the containers may be adjusted according to actual requirements, which is not limited in the embodiment of the present invention.
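As an illustrative sketch only (the image names, service names and ports below are assumptions, not part of the embodiment), a single-server deployment of the four containers of this kind could be described with a Docker Compose file:

```yaml
# Hypothetical single-server deployment of the four containers.
services:
  business-service:        # business service container (placeholder custom image)
    image: example/business-service:latest
    depends_on: [kafka]
  kafka:                   # data cache container
    image: apache/kafka:3.7.0
    ports: ["9092:9092"]
  logstash:                # data collection engine container
    image: docker.elastic.co/logstash/logstash:8.12.0
    depends_on: [kafka, elasticsearch]
  elasticsearch:           # database container (non-relational)
    image: docker.elastic.co/elasticsearch/elasticsearch:8.12.0
    environment:
      - discovery.type=single-node
    ports: ["9200:9200"]
```

Running each service on a separate host instead would correspond to the multi-server deployment variant described above.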
The method specifically comprises the following steps:
Step 101, in response to a data processing instruction, the business service container encapsulates original data to be processed to obtain target data, and sends the target data to the data cache container.
After the server acquires a processing instruction for original data to be processed, the business service container in the server can encapsulate the original data to obtain encapsulated target data, and then the business service container sends the target data to the data cache container so that the data cache container caches the target data.
The processing instruction may be initiated by a user actively, may be initiated automatically at a certain time, or may be initiated in other manners, and in practical application, the initiating manner of the processing instruction may be set according to practical requirements, which is not limited in this embodiment of the present invention.
In the embodiment of the present invention, the encapsulating, by the business service container, original data to be processed to obtain target data, and sending the target data to the data cache container includes:
packaging the original data by adopting a json format to obtain the target data;
inserting the target data into the created local data queue;
and sending the target data in the local data queue to the data cache container by adopting a preset asynchronous thread.
Specifically, the business service container may encapsulate the original data in the json data format to obtain encapsulated target data, insert the target data into a created local data queue, such as an ArrayBlockingQueue, and send the target data in the ArrayBlockingQueue to the data cache container by using a preset asynchronous thread, such as a DataAsyncEsThread.
It should be noted that, in practical applications, the specific types of the local data queue and the asynchronous thread may be adjusted according to actual requirements, and the embodiment of the present invention is not limited to this.
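The queue-plus-asynchronous-thread mechanism described above can be sketched in Java as follows. This is a minimal sketch, not the embodiment's implementation: the class name, the JSON envelope format, and the in-memory stand-in for the Kafka cache container are all illustrative assumptions; only ArrayBlockingQueue is named by the text.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class AsyncQueueSketch {
    // Bounded local data queue holding the encapsulated (json) target data.
    private final BlockingQueue<String> localQueue = new ArrayBlockingQueue<>(1024);
    // Stand-in for the data cache container (would be a Kafka producer in practice).
    private final List<String> sentToCache = new ArrayList<>();

    // Encapsulate raw data into a minimal json envelope (illustrative format).
    public static String encapsulate(String rawData) {
        return "{\"payload\":\"" + rawData + "\"}";
    }

    // Insert target data into the created local data queue; blocks if the queue is full.
    public void submit(String rawData) throws InterruptedException {
        localQueue.put(encapsulate(rawData));
    }

    // Stand-in for the preset asynchronous thread (the text calls it DataAsyncEsThread):
    // drains the local queue and forwards each item to the cache.
    public Thread startSenderThread(int expectedCount) {
        Thread sender = new Thread(() -> {
            try {
                for (int i = 0; i < expectedCount; i++) {
                    sentToCache.add(localQueue.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        sender.start();
        return sender;
    }

    public List<String> sent() {
        return sentToCache;
    }

    public static void main(String[] args) throws Exception {
        AsyncQueueSketch sketch = new AsyncQueueSketch();
        Thread sender = sketch.startSenderThread(2);
        sketch.submit("record-1");
        sketch.submit("record-2");
        sender.join(); // join() gives a happens-before edge, so sent() is safe to read
        System.out.println(sketch.sent());
    }
}
```

The bounded queue is what decouples the business service from the downstream cache: when the cache is slow, `put` applies backpressure instead of dropping data.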
In this embodiment of the present invention, the sending the target data to the data cache container includes:
the business service container determines a target address port of the data cache container and a target topic in the target address port based on the business attribute of the target data; the data cache container comprises at least one address port, and any address port comprises at least one preset original topic;
the business service container sends the target data to the data caching container based on the target address port and the target topic.
Specifically, as described above, the data cache container may include more than one data cache queue; each data cache queue corresponds to one address port and contains multiple topics (topic), and the business attributes of the data stored under different topics are not necessarily the same. For example, suppose the data cache container includes three data cache queues A, B and C, where A contains topics 1, 2 and 3, B contains topic 4, and C contains topics 5 and 6. After the target data is obtained, if the business attribute of the target data is the same as that of topic 4, the target data may be sent to topic 4 of data queue B in the data cache container for storage. For another example, if the business attribute of the target data is the same as that of topics 1 and 3, the target data may be sent to topics 1 and 3 of data queue A in the data cache container for storage, and so on.
Therefore, when the data volume is large and the server reaches the performance bottleneck, the data cache container can cache the current target data which cannot be processed in time, and the target data in the cache can be processed after the current data is processed, so that the data cannot be lost even if the server reaches the performance bottleneck, and the stability in the data processing process is improved.
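The matching of a business attribute to a target address port and target topic described above can be sketched as follows. The port addresses, topic names and attribute keys are illustrative assumptions; the embodiment leaves the concrete mapping open.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class TopicRoutingSketch {
    // Illustrative layout: each address port (one per data cache queue) hosts
    // preset topics, keyed here by the business attribute of the data they store.
    static final Map<String, Map<String, String>> PORT_TOPICS = Map.of(
            "192.168.0.1:9092", Map.of("login", "topic-1", "audit", "topic-3"),
            "192.168.0.2:9092", Map.of("alarm", "topic-4"));

    // Return every "port/topic" destination whose business attribute matches.
    public static List<String> route(String businessAttribute) {
        List<String> destinations = new ArrayList<>();
        for (var portEntry : PORT_TOPICS.entrySet()) {
            String topic = portEntry.getValue().get(businessAttribute);
            if (topic != null) {
                destinations.add(portEntry.getKey() + "/" + topic);
            }
        }
        return destinations;
    }

    public static void main(String[] args) {
        // Data with the "alarm" attribute is routed to the single matching topic.
        System.out.println(route("alarm"));
    }
}
```

As in the worked example above, an attribute matching several topics (e.g. topics 1 and 3 on one port) would simply yield several destinations.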
Step 102, the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container.
The data collection engine container can acquire target data from the data cache container, then perform data processing on the target data to obtain processed data, and then send the processed data to the database container for storage.
In this embodiment of the present invention, the acquiring, by the data collection engine container, the target data from the data cache container, processing the target data to obtain processed data, and sending the processed data to the database container includes:
the data collection engine container acquires the target data from the data cache container and filters the target data to obtain filtered data; the filtering comprises keyword filtering;
and sending the filtered data to the database container.
Specifically, a data processing program, such as a filter, may be set in the data collection engine container, and after the data collection engine container acquires the target data, the data collection engine container may perform rough processing and filtering on the target data to obtain filtered data, and then send the filtered data to the database container for storage. Wherein filtering includes, but is not limited to, keyword filtering.
It should be noted that the data processing program may be a filter or other program for processing data, and in addition, the filtering may be filtering in other manners besides the keyword filtering, and in practical application, the data processing program and the filtering manner may be set according to actual requirements, which is not limited in this embodiment of the present invention.
Further, in order to ensure data security, an account and password may be set for the database container; the data collection engine container first logs in to the database container with this account and password, and sends the filtered data to the database container only after logging in successfully. The account and password may be pre-stored in the data collection engine container, or stored in the database container and obtained by the data collection engine container on request; they may also be set in other manners.
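As a hedged illustration of the pipeline described above (the topic name, blocked keyword, hosts and credentials are placeholders; the embodiment's actual filter rules are not specified), a Logstash pipeline performing keyword filtering between the Kafka cache and the Elasticsearch store might look like:

```conf
# Illustrative Logstash pipeline; all names and credentials are placeholders.
input {
  kafka {
    bootstrap_servers => "192.168.0.1:9092"
    topics => ["topic-1"]
    codec => "json"
  }
}
filter {
  # Keyword filtering: drop events whose message contains a blocked keyword.
  if [message] =~ /blocked_keyword/ {
    drop { }
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    user => "logstash_writer"    # account set on the database container
    password => "changeme"
    index => "business-data-%{+YYYY.MM.dd}"
  }
}
```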
Step 103, the database container stores the processed data.
After the database container obtains the filtered data, the database container may store the processed data, for example, store the filtered data.
In an embodiment of the present invention, the storing the processed data by the database container includes:
the database container generates an index database based on the current time node and a preset index template;
and storing the processed data to the index library.
Specifically, a plurality of index templates may be set in advance, each corresponding one-to-one to a time period and containing attribute information of the index library. When the database container stores data, it can determine the time period to which the current time node belongs, thereby determining the index template for that time period, and create the index library according to the attribute information of the index library in that index template; after the creation is completed, the processed data can be stored in the index library.
It should be noted that, because the database container may include a plurality of databases, when creating the index library, the index library may be created according to a preset rule, for example, according to a numbering sequence of the databases, or according to a load balancing manner, or according to another manner, and a specific rule may be set according to an actual requirement, which is not limited in this embodiment of the present invention.
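Deriving an index-library name from the current time node and a preset template might look like the following sketch. The template prefix and the daily period granularity are assumptions for illustration; the embodiment leaves the template contents and period length open.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexNameSketch {
    // Illustrative index template: a fixed name prefix plus a time-period suffix.
    public static String indexFor(String templatePrefix, LocalDate timeNode) {
        // One index library per day; coarser periods (month, week) are equally possible.
        return templatePrefix + "-" + timeNode.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        // Data arriving on this time node lands in the day's index library.
        System.out.println(indexFor("business-data", LocalDate.of(2022, 12, 26)));
    }
}
```

Time-bucketed index names like this are a common way to spread writes across a non-relational database cluster and to expire old data by dropping whole indices.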
For ease of understanding, fig. 2 shows a logical framework diagram of an embodiment of the present invention, illustrated with all containers deployed in one server. Specifically, the server comprises a business service container, a data cache container, a data collection engine container and a database container. After the business service container obtains a data processing instruction, it encapsulates the original data into the json format to obtain target data, inserts the target data into a local data queue ArrayBlockingQueue, and then starts an asynchronous thread DataAsyncEsThread to send the target data in the ArrayBlockingQueue to the data cache container. The data collection engine container obtains the target data from the data cache container, filters the target data through a filter to obtain processed data, and sends the processed data to the database container.
In the embodiment of the invention, a business service container, a data cache container, a data collection engine container and a database container are deployed in the server. After the server acquires a data processing instruction, the service container encapsulates original data to be processed to obtain target data, and the target data is sent to the data cache container; the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container; and the database container stores the processed data. Therefore, when the data volume is large and the server reaches the performance bottleneck, the data cache container can cache the current target data which is not processed in time, after the current data processing is finished, the target data in the cache can be processed to obtain the processed data, and the processed data is stored in the non-relational database container, so that the data cannot be lost even if the server reaches the performance bottleneck, and the stability of data processing is improved.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 3, there is shown a block diagram of a data processing server according to an embodiment of the present invention, where the server includes a business service container 301, a data cache container 302, a data collection engine container 303 and a database container 304; the server includes:
the service container 301 is configured to respond to a data processing instruction, encapsulate original data to be processed to obtain target data, and send the target data to the data cache container 302;
the data collection engine container 303 is configured to obtain the target data from the data cache container, process the target data to obtain processed data, and send the processed data to the database container;
the database container 304 is configured to store the processed data.
In this embodiment of the present invention, the service container is specifically configured to:
packaging the original data by adopting a json format to obtain the target data;
inserting the target data into the created local data queue;
and sending the target data in the local data queue to the data cache container by adopting a preset asynchronous thread.
In this embodiment of the present invention, the service container is further specifically configured to:
determining a target address port of the data cache container and a target topic in the target address port based on the business attribute of the target data; the data cache container comprises at least one address port, and any address port comprises at least one preset original topic;
transmitting the target data to the data caching container based on the destination address port and the target topic.
In an embodiment of the present invention, the data collection engine container is specifically configured to:
acquiring the target data from the data cache container, and filtering the target data to obtain filtered data; the filtering comprises keyword filtering;
and sending the filtered data to the database container.
In an embodiment of the present invention, the database container is specifically configured to:
generating an index base based on the current time node and a preset index template;
and storing the processed data to the index library.
In the embodiment of the present invention, the data buffer container includes at least one data buffer queue, and the at least one data buffer queue has a corresponding address port; the database container includes a plurality of databases, each of which is of a non-relational type.
For the server embodiment, since it is basically similar to the method embodiment, the description is simple, and for relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, including:
the data processing method comprises a processor, a memory and a computer program which is stored on the memory and can run on the processor, wherein when the computer program is executed by the processor, each process of the data processing method embodiment is realized, the same technical effect can be achieved, and the details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements each process of the data processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in a process, method, article, or terminal device that comprises the element.
The data processing method and the data processing apparatus provided by the present invention are described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (14)

1. A data processing method, applied to a server, wherein the server comprises a business service container, a data cache container, a data collection engine container and a database container; the method comprises the following steps:
in response to a data processing instruction, the service container encapsulates original data to be processed to obtain target data, and sends the target data to the data cache container;
the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container;
and the database container stores the processed data.
2. The data processing method of claim 1, wherein the business service container encapsulates the original data to be processed to obtain the target data and sends the target data to the data cache container, comprising:
packaging the original data by adopting a json format to obtain the target data;
inserting the target data into the created local data queue;
and sending the target data in the local data queue to the data cache container by adopting a preset asynchronous thread.
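As an illustrative sketch only (not part of the claimed subject matter), the three steps of claim 2 — JSON encapsulation, insertion into a local data queue, and dispatch by a preset asynchronous thread — might be realized as follows; all names and data values here are hypothetical.

```python
import json
import queue
import threading

# Hypothetical sketch of claim 2: encapsulate raw data as JSON,
# insert it into a local queue, and drain the queue on a separate
# (asynchronous) thread toward the data cache container.
local_queue = queue.Queue()
sent = []  # stands in for the data cache container in this sketch

def encapsulate(raw: dict) -> str:
    # Step 1: package the original data in JSON format -> target data
    return json.dumps(raw)

def dispatch_worker():
    # Step 3: a preset asynchronous thread sends queued target data
    while True:
        item = local_queue.get()
        if item is None:  # sentinel to stop the worker
            break
        sent.append(item)  # in practice: send to the cache container
        local_queue.task_done()

worker = threading.Thread(target=dispatch_worker, daemon=True)
worker.start()

# Step 2: insert the target data into the created local data queue
local_queue.put(encapsulate({"event": "login", "user": "alice"}))
local_queue.put(None)  # signal the worker to finish
worker.join()
```

Because the sending thread is decoupled from the producing code by the queue, the business service is not blocked while target data is transmitted to the cache container.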
3. The data processing method according to claim 1 or 2, wherein the sending the target data to the data cache container comprises:
the business service container determines a target address port of the data cache container and a target topic in the target address port based on the business attribute of the target data; the data cache container comprises at least one address port, and any address port comprises at least one preset original topic;
the business service container sends the target data to the data cache container based on the target address port and the target topic.
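A minimal sketch of the routing step in claim 3, assuming a static mapping from business attribute to (address port, topic); the ports, topics, and attribute names below are invented for illustration.

```python
# Hypothetical routing table for claim 3: the business attribute of the
# target data selects an address port of the data cache container and a
# preset original topic within that port.
ROUTING = {
    "audit":  ("10.0.0.5:9092", "audit-log"),
    "alarm":  ("10.0.0.5:9092", "alarm-event"),
    "backup": ("10.0.0.6:9092", "backup-task"),
}

def route(business_attribute: str) -> tuple:
    """Return (target_address_port, target_topic) for a business attribute."""
    return ROUTING[business_attribute]

port, topic = route("alarm")
```

Keying the route on a business attribute lets each class of target data land on its own queue and topic without the sender hard-coding destinations.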
4. The data processing method of claim 1, wherein the data collection engine container acquires the target data from the data cache container, processes the target data to obtain processed data, and sends the processed data to the database container, comprising:
the data collection engine container acquires the target data from the data cache container and filters the target data to obtain filtered data; the filtering comprises keyword filtering;
and sending the filtered data to the database container.
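The keyword filtering of claim 4 might look like the following sketch; the blocked keywords and the record format are assumptions, not part of the claim.

```python
# Hypothetical keyword filter for claim 4: the data collection engine
# drops records whose payload contains a blocked keyword before
# forwarding the remainder to the database container.
BLOCKED_KEYWORDS = {"debug", "heartbeat"}  # assumed filter terms

def keyword_filter(records: list) -> list:
    """Keep only records that contain none of the blocked keywords."""
    return [r for r in records
            if not any(k in r for k in BLOCKED_KEYWORDS)]

filtered = keyword_filter(["user login ok", "heartbeat ping", "file saved"])
```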
5. The data processing method of claim 1, wherein the storing of the processed data by the database container comprises:
the database container generates an index database based on the current time node and a preset index template;
and storing the processed data to the index library.
6. The data processing method according to claim 1, wherein the data buffer container comprises at least one data buffer queue, and the at least one data buffer queue has a corresponding address port; the database container includes a plurality of databases, each of which is of a non-relational type.
7. A data processing server, wherein the server comprises a business service container, a data cache container, a data collection engine container and a database container, wherein:
the business service container is used for responding to a data processing instruction, packaging original data to be processed to obtain target data, and sending the target data to the data cache container;
the data collection engine container is used for acquiring the target data from the data cache container, processing the target data to obtain processed data and sending the processed data to the database container;
and the database container is used for storing the processed data.
8. The data processing server of claim 7, wherein the business service container is specifically configured to:
packaging the original data by adopting a json format to obtain the target data;
inserting the target data into the created local data queue;
and sending the target data in the local data queue to the data cache container by adopting a preset asynchronous thread.
9. The data processing server according to claim 7 or 8, wherein the business service container is further configured to:
determining a target address port of the data cache container and a target topic in the target address port based on the business attribute of the target data; the data cache container comprises at least one address port, and any address port comprises at least one preset original topic;
sending the target data to the data cache container based on the target address port and the target topic.
10. The data processing server of claim 7, wherein the data collection engine container is specifically configured to:
acquiring the target data from the data cache container, and filtering the target data to obtain filtered data; the filtering comprises keyword filtering;
and sending the filtered data to the database container.
11. The data processing server of claim 7, wherein the database container is specifically configured to:
generating an index base based on the current time node and a preset index template;
and storing the processed data to the index library.
12. The data processing server of claim 7, wherein the data buffer container comprises at least one data buffer queue, each of the at least one data buffer queue having a corresponding address port; the database container includes a plurality of databases, each of which is of a non-relational type.
13. An electronic device, comprising: processor, memory and computer program stored on said memory and capable of running on said processor, said computer program when executed by said processor implementing the steps of the data processing method according to any one of claims 1 to 6.
14. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the data processing method according to any one of claims 1 to 6.
CN202211676300.2A 2022-12-26 2022-12-26 Data processing method, server, electronic device, and computer storage medium Pending CN115982160A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211676300.2A CN115982160A (en) 2022-12-26 2022-12-26 Data processing method, server, electronic device, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211676300.2A CN115982160A (en) 2022-12-26 2022-12-26 Data processing method, server, electronic device, and computer storage medium

Publications (1)

Publication Number Publication Date
CN115982160A true CN115982160A (en) 2023-04-18

Family

ID=85957529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211676300.2A Pending CN115982160A (en) 2022-12-26 2022-12-26 Data processing method, server, electronic device, and computer storage medium

Country Status (1)

Country Link
CN (1) CN115982160A (en)

Similar Documents

Publication Publication Date Title
CN110943961B (en) Data processing method, device and storage medium
CN107920094B (en) Data acquisition method and device, server and network equipment
CN112449750A (en) Log data collection method, log data collection device, storage medium, and log data collection system
CN109726004B (en) Data processing method and device
CN110661829B (en) File downloading method and device, client and computer readable storage medium
CN110297944B (en) Distributed XML data processing method and system
CN111105006A (en) Deep learning network training system and method
CN109254854A (en) Asynchronous invoking method, computer installation and storage medium
CN107315972A (en) A kind of dynamic desensitization method of big data unstructured document and system
US20210109932A1 (en) Selecting an optimal combination of systems for query processing
US10606655B2 (en) Non-directional transmissible task
US11960578B2 (en) Correspondence of external operations to containers and mutation events
US10671686B2 (en) Processing webpage data
CN106168963A (en) Real-time streaming data processing method and device and server
CN110798490A (en) Method and device for accessing third-party system based on data center and data center
CN111008254A (en) Object creating method and device, computer equipment and storage medium
CN107395663B (en) Data acquisition method and device
US11093477B1 (en) Multiple source database system consolidation
US11861386B1 (en) Application gateways in an on-demand network code execution system
US10887267B2 (en) Intelligent notification routing and delivery
CN110019497B (en) Data reading method and device
US11388210B1 (en) Streaming analytics using a serverless compute system
CN115982160A (en) Data processing method, server, electronic device, and computer storage medium
CN109302446B (en) Cross-platform access method and device, electronic equipment and storage medium
CN109600403B (en) Method and device for sending information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination