CN116226251A - Data export method and device, electronic equipment and storage medium


Info

Publication number
CN116226251A
Authority
CN
China
Prior art keywords
data
paging
memory
thread
exported
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310191570.2A
Other languages
Chinese (zh)
Inventor
王红阳
张晋锋
吕灼恒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongke Shuguang International Information Industry Co ltd
Shuguang Zhisuan Information Technology Co ltd
Original Assignee
Zhongke Shuguang International Information Industry Co ltd
Shuguang Zhisuan Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongke Shuguang International Information Industry Co ltd, Shuguang Zhisuan Information Technology Co ltd filed Critical Zhongke Shuguang International Information Industry Co ltd
Priority to CN202310191570.2A
Publication of CN116226251A
Legal status: Pending (current)

Classifications

    • G06F16/258 Data format conversion from or to a database
    • G06F16/2228 Indexing structures
    • G06F16/2457 Query processing with adaptation to user needs
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The invention discloses a data export method and device, an electronic device and a storage medium, and relates to the field of data transmission. The method comprises the following steps: storing pre-positioned data obtained from a database into a memory; setting the number of pages according to the total number of items to be exported, and sequentially acquiring paging data from the database through a main thread according to retrieval-associated data, so as to store the paging data into the memory; and acquiring the paging data from the memory through at least one slave thread and performing data processing to obtain processed paging data files. With this technical scheme, the data to be exported can be exported completely while only the memory needed for the current paging data is occupied, which reduces memory usage and avoids memory overflow; at the same time, the main thread can obtain paging data directly on every database access, which reduces the number of read and write operations on the database's disk interface and improves data export efficiency.

Description

Data export method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of data transmission, and in particular, to a data export method, apparatus, electronic device, and storage medium.
Background
As service demands continue to grow, a large amount of service data is stored in the database of a service platform, and the ability to export that service data from the platform quickly has become an important condition for the platform's development.
Existing online real-time data export techniques mainly perform the export through concurrent threads that load the entire set of data to be exported into memory. This approach occupies a large amount of memory, carries a serious risk of memory overflow, and complicates the business processing flow.
In addition, when data are exported from the service platform according to retrieval conditions, every access to the database requires multiple read and write operations on the database's disk I/O (input/output) interface, which greatly reduces data export efficiency.
Disclosure of Invention
The invention provides a data export method, a data export device, an electronic device and a storage medium to address the risk of memory overflow when exporting business data from a database.
According to an aspect of the present invention, there is provided a data export method comprising:
acquiring, according to an acquired data export task, matched pre-positioned data from a database, and storing the pre-positioned data into a memory; wherein the pre-positioned data comprise the total number of items to be exported and retrieval-associated data;
setting the number of pages according to the total number of items to be exported, and sequentially acquiring paging data from the database through a main thread according to the retrieval-associated data, so as to store the paging data into the memory;
and acquiring the paging data from the memory through at least one slave thread, performing data processing to obtain a processed paging data file, and storing the paging data file to a shared disk.
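For orientation, the three steps might be sketched as the following service interface; every class and method name here is invented for illustration and does not come from the patent.

// Hypothetical shape of the three-step export flow; all names are invented for illustration.
public interface DataExportService {

    final class ExportTask {
        public String taskId;
        public String searchCondition;      // the user-configured retrieval condition
    }

    final class PrePositionedData {
        public long totalToExport;                               // total number of items to be exported
        public java.util.List<String> retrievalAssociatedData;  // e.g. intermediate account identifiers
    }

    // Step 1: fetch the matched pre-positioned data and cache it in memory.
    PrePositionedData loadPrePositionedData(ExportTask task);

    // Step 2: set the page count and let the main thread stage pages into the in-memory cache.
    void stagePages(ExportTask task, PrePositionedData preData);

    // Step 3: slave threads drain the cache, process each page and write the
    // finished page files to the shared disk.
    void processPages(ExportTask task);
}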
Setting the number of pages according to the total number of items to be exported specifically includes: determining a standard single-page threshold according to the service type and the data type of the data export task, and setting the number of pages according to the total number of items to be exported and the standard single-page threshold. The standard single-page threshold ensures that each piece of paging data occupies the same amount of memory, so that memory resources can be called in a fixed quantity. It also prevents a page from carrying so much data that users wait too long for it to load, and from carrying so little data that an excessive number of data files is generated when each page is turned into a data file and displayed to the user, which improves the convenience of storing and browsing the data files.
Determining the standard single-page threshold according to the service type and the data type of the data export task specifically includes: determining a standard single-page threshold and a maximum single-page threshold according to the service type and the data type of the data export task, wherein the maximum single-page threshold is greater than the standard single-page threshold; and if the total number of items to be exported is smaller than or equal to the maximum single-page threshold, setting the number of pages to one. Setting the number of pages according to the total number of items to be exported and the standard single-page threshold specifically includes: if the total number of items to be exported is greater than the maximum single-page threshold, setting the number of pages according to the total number of items to be exported and the standard single-page threshold. Compared with the standard single-page threshold, the maximum single-page threshold only slightly increases the memory occupied by a page and the loading time of its data file, but it avoids unnecessary paging: if the first page were generated according to the standard single-page threshold, a second page consisting of the few remaining records might be mistaken for an empty or invalid file. Therefore, when the total number of items to be exported is smaller than or equal to the maximum single-page threshold, the exported data are gathered into a single complete data file and this phenomenon is avoided.
After setting the number of pages according to the total number of items to be exported and the standard single-page threshold, the method further comprises: obtaining an average single-page threshold according to the total number of items to be exported and the number of pages, and using the average single-page threshold as the threshold on the number of items in each page. Using the average single-page threshold as the per-page threshold improves the processing efficiency of the data export task, ensures that every data file contains the same number of items, and, when the paged data files are displayed, allows each page to provide the user with a sufficient amount of preceding and following associated data.
After storing the pre-positioned data into the memory, the method further comprises: obtaining a thread response coefficient according to the total number of items to be exported and the service type of the data export task, wherein the thread response coefficient is the ratio of the thread waiting time to the thread computing time; and setting the number of threads in the thread pool according to the number of cores of the central processing unit, the utilization of the central processing unit and the thread response coefficient. In this way the number of threads can be adjusted dynamically without adding new functional components to monitor thread states or memory occupancy, so that the thread pool meets the data processing requirements of the export task while redundant threads are avoided.
Sequentially acquiring the paging data from the database through the main thread according to the retrieval-associated data, so as to store the paging data into the memory, specifically includes: obtaining a first quantity threshold according to the number of idle slave threads and/or the data processing time of the slave threads; and acquiring, through the main thread and according to the retrieval-associated data, a number of pages equal to the first quantity threshold from the database, so as to store them into the memory. Because the first quantity threshold is derived from the number of idle slave threads and/or their data processing time, the amount of paging data the main thread fetches each time matches the processing capacity of the slave threads, which reduces idle threads and avoids excessive occupation of memory resources.
After sequentially acquiring the paging data from the database through the main thread so as to store the paging data into the memory, the method further comprises: setting a first expiration time for the pre-positioned data and starting a timer; when the first expiration time elapses, judging whether all paging data of the data export task have been stored into the memory by the main thread; if not, resetting the first expiration time and restarting the timer; and if so, deleting the pre-positioned data. Setting and resetting the first expiration time ensures that cached data in the memory are cleaned up in time, avoids memory overflow caused by long-term memory occupation, and at the same time guarantees that the data to be exported are exported completely without missing records.
According to another aspect of the present invention, there is provided a data deriving apparatus comprising:
a pre-positioned data acquisition module, configured to acquire, according to an acquired data export task, matched pre-positioned data from a database and store the pre-positioned data into a memory; wherein the pre-positioned data comprise the total number of items to be exported and retrieval-associated data;
a paging data storage module, configured to set the number of pages according to the total number of items to be exported, and to sequentially acquire paging data from the database through a main thread according to the retrieval-associated data, so as to store the paging data into the memory;
and a paging data processing module, configured to acquire the paging data from the memory through at least one slave thread and perform data processing to obtain processed paging data files, and to store the paging data files to a shared disk.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program enabling the at least one processor to perform the data export method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to execute the data export method according to any embodiment of the present invention.
According to the technical scheme, the main thread acquires the matched pre-positioned data from the database, the number of pages is set, and the main thread then sequentially acquires paging data from the database according to the retrieval-associated data and stores them into the memory, after which the slave threads extract the paging data from the memory, process them and store the resulting paging data files to the shared disk. In this way the data to be exported can be exported completely while only the memory needed for the current paging data is occupied, which reduces memory usage and avoids memory overflow. At the same time, because the retrieval-associated data are kept in the memory, the main thread can obtain paging data directly on every database access, which reduces the number of read and write operations on the database's disk interface and improves data export efficiency.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present invention, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a data export method according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a data export method according to a second embodiment of the present invention;
FIG. 3A is a flow chart of a data export method according to a third embodiment of the present invention;
FIG. 3B is a flowchart of a data export method provided in accordance with a specific application scenario of the present invention;
fig. 4 is a schematic structural diagram of a data export device according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device implementing the data export method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a data export method according to a first embodiment of the present invention. The method is applicable to exporting service data from the service database of a service platform, and may be performed by a data export device that is implemented in hardware and/or software and configured in an electronic device such as a terminal device; typically, the electronic device exports service data from a service database deployed in a server cluster. As shown in fig. 1, the method includes:
S101, acquiring, according to an acquired data export task, matched pre-positioned data from a database, and storing the pre-positioned data into a memory; wherein the pre-positioned data comprise the total number of items to be exported and retrieval-associated data.
The data export task is the process of obtaining required data from the database according to retrieval conditions configured by the user; for example, a data export task may be to obtain the transaction flow of the service platform for January, that is, to export the platform's January transaction flow records from the database of the service server. The total number of items to be exported is the number of records to be exported, reflecting how many data rows in the data table are involved; for example, 2 million records may be obtained as the total to be exported. The retrieval-associated data are the intermediate data produced by the database's retrieval logic when it retrieves data according to the retrieval conditions of the export task; the database obtains the final retrieval result on the basis of these intermediate data. Obviously, different data export tasks correspond to different retrieval-associated data.
Taking the above data export task as an example, when the database is searched according to the retrieval conditions, the retrieval logic may first obtain the account information involved in January's transaction flow, for example 500 account identifiers, and then obtain the specific 2 million transaction flow records based on those 500 accounts. The retrieval-associated data are therefore the 500 account identifiers, the intermediate data produced while obtaining the 2 million transaction records according to the retrieval conditions. After the matched pre-positioned data are obtained from the database for the export task, the total number of items to be exported and the retrieval-associated data are stored into the memory; a Redis database may be used as the in-memory store for the pre-positioned data and the paging data.
In particular, if the retrieval conditions are complex, the database's retrieval logic may produce several pieces of intermediate data (i.e. several pieces of retrieval-associated data), all of which need to be recorded in the memory at the same time. In addition, in the embodiments of the present invention, the statistics and acquisition of users' transaction flow information are performed only after each user's authorization has been obtained in advance, and the type of database used by the service platform is not particularly limited.
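As a concrete illustration, a minimal sketch of caching the pre-positioned data in Redis via the Jedis client follows; the key layout, the Jedis dependency and the choice of a Redis list for the identifiers are assumptions made for the example, not details given by the patent.

import redis.clients.jedis.Jedis;

import java.util.List;

public class PrePositionedDataCache {
    // Hypothetical key layout; the patent does not prescribe one.
    private static final String TOTAL_KEY_FMT = "export:%s:total";
    private static final String ASSOC_KEY_FMT = "export:%s:assoc";

    private final Jedis jedis = new Jedis("localhost", 6379);

    // Stores the total number of items to be exported and the
    // retrieval-associated data (e.g. the 500 account identifiers).
    public void store(String taskId, long totalToExport, List<String> associatedIds) {
        jedis.set(String.format(TOTAL_KEY_FMT, taskId), Long.toString(totalToExport));
        jedis.rpush(String.format(ASSOC_KEY_FMT, taskId),
                    associatedIds.toArray(new String[0]));
    }
}

A hash or a serialized object would serve equally well; a list is used here only because the retrieval-associated data in the example are a set of account identifiers.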
S102, setting the number of pages according to the total number of items to be exported, and sequentially acquiring paging data from the database through a main thread according to the retrieval-associated data, so as to store the paging data into the memory.
Because the volume of data managed by the service platform is large, each data export task may involve a large number of records; if all of them were exported into the memory at once, excessive memory resources would be occupied. The data to be exported therefore generally need to be paged, that is, acquired completely through several export operations. A fixed per-page threshold, i.e. the maximum number of records a page may contain, can be set, and the number of pages is obtained from the ratio of the total number of items to be exported to this single-page threshold.
Meanwhile, after the main thread in the thread pool stores the pre-positioned data obtained from the database into the memory, the retrieval-associated data held in the memory are passed along with each paging data request to the service database as already-known information. In the conventional technical scheme, by contrast, every access to the service database for paging data must first re-obtain the retrieval-associated data through the database's disk I/O interface and only then complete the acquisition of the paging data.
Taking the above technical scheme as an example, in the conventional scheme each access to the disk I/O interface of the service database must first obtain the 500 account identifiers and then obtain the paging data for that access, so one access is essentially equivalent to two read operations on the database's disk I/O interface. For an export task with complex retrieval conditions that produces N (N > 2) pieces of retrieval-associated data, one access is equivalent to N + 1 read operations. In the embodiment of the present invention, by contrast, each access requires only one interface read operation and the current paging data are obtained directly.
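To make the single-read access pattern concrete, the JDBC sketch below binds the cached account identifiers straight into the paging query, so each database access performs only one read; the table name, column names and the MySQL-style LIMIT/OFFSET paging are invented for illustration.

import java.sql.*;
import java.util.ArrayList;
import java.util.List;
import java.util.StringJoiner;

public class PageFetcher {
    // Fetches one page of transaction rows using the cached account ids,
    // so the database is not asked to re-derive them on every access.
    public List<String> fetchPage(Connection conn, List<String> accountIds,
                                  int pageIndex, int pageSize) throws SQLException {
        StringJoiner placeholders = new StringJoiner(",");
        for (int i = 0; i < accountIds.size(); i++) placeholders.add("?");
        String sql = "SELECT flow_no FROM transaction_flow WHERE account_id IN ("
                   + placeholders + ") ORDER BY flow_no LIMIT ? OFFSET ?";
        List<String> rows = new ArrayList<>();
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int p = 1;
            for (String id : accountIds) ps.setString(p++, id);
            ps.setInt(p++, pageSize);
            ps.setInt(p, pageIndex * pageSize);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) rows.add(rs.getString("flow_no"));
            }
        }
        return rows;
    }
}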
Optionally, in an embodiment of the present invention, sequentially obtaining the paging data from the database through the main thread according to the retrieval-associated data, so as to store the paging data into the memory, specifically includes: obtaining a first quantity threshold according to the number of idle slave threads and/or the data processing time of the slave threads; and acquiring, through the main thread and according to the retrieval-associated data, a number of pages equal to the first quantity threshold from the database, so as to store them into the memory.
Specifically, each time the main thread acquires paging data it may acquire a single page or several pages, as long as the memory resources can accommodate that number of pages. However, if the main thread stages too many pages at once, there may be no idle slave thread to take them out of the memory in time, and memory resources are wasted. The number of pages to acquire (i.e. the first quantity threshold) can therefore be estimated from the number of idle slave threads: the more idle slave threads there are, the larger the first quantity threshold; the fewer there are, the smaller the first quantity threshold.
Meanwhile, the data processing time of the slave threads reflects how long each slave thread needs to process one page for the current task; the average time of the slave threads that have already finished processing a page can be used as this value. The shorter the slave threads' processing time, the faster they consume pages and the larger the first quantity threshold; the longer their processing time, the slower they consume pages and the smaller the first quantity threshold. Because the first quantity threshold is derived from the number of idle slave threads and/or their processing time, the amount of paging data fetched by the main thread each time matches the processing capacity of the slave threads, which reduces idle threads and avoids excessive occupation of memory resources.
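One plausible way to combine the two signals is sketched below; the patent only says that more idle slave threads and shorter processing times should raise the threshold, so the concrete weighting used here is an assumption.

public class BatchSizer {
    // Sketch: how many pages the main thread stages per database access.
    // More idle slave threads raise the threshold; longer per-page
    // processing time lowers it. The cap and the reference time are arbitrary.
    public static int firstQuantityThreshold(int idleSlaveThreads,
                                             double avgSlaveSecondsPerPage,
                                             double referenceSecondsPerPage) {
        if (idleSlaveThreads <= 0) return 1;
        double speedFactor = referenceSecondsPerPage / Math.max(avgSlaveSecondsPerPage, 1e-6);
        int threshold = (int) Math.round(idleSlaveThreads * Math.min(speedFactor, 2.0));
        return Math.max(1, threshold);
    }
}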
S103, obtaining the paging data from the memory through at least one slave thread and performing data processing to obtain processed paging data files, and storing the paging data files to a shared disk.
The main thread is responsible for storing the pre-positioned data and the paging data from the service database into the memory; the slave threads extract the paging data stored in the memory, perform data processing such as filtering, format conversion and calculation, generate data files in a specified format, for example xlsx files or text files, and finally store the generated data files to the shared disk. Since a slave thread generally spends more time on a page than the main thread does, the writing of paging data into the memory by one main thread can satisfy the processing needs of several slave threads; several main threads may also be configured to further improve the efficiency of exporting paging data from the database.
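A compact producer/consumer sketch of this division of labour follows: one main thread stages pages into an in-memory queue (standing in for the memory cache) and a small pool of slave threads drains it, processes each page and writes one file per page to the shared disk. The queue type, pool size and CSV output are placeholders rather than choices made by the patent.

import java.nio.file.*;
import java.util.List;
import java.util.concurrent.*;

public class PagedExport {
    private final BlockingQueue<List<String>> pageCache = new LinkedBlockingQueue<>(8);
    private final ExecutorService slaves = Executors.newFixedThreadPool(4);

    public void run(List<List<String>>pages, Path sharedDisk) throws Exception {
        // Slave threads: drain pages from the cache, process them and
        // write one file per page to the shared disk.
        for (int i = 0; i < 4; i++) {
            slaves.submit(() -> {
                try {
                    List<String> page;
                    while ((page = pageCache.poll(5, TimeUnit.SECONDS)) != null) {
                        Path file = sharedDisk.resolve("page-" + System.nanoTime() + ".csv");
                        Files.write(file, page);           // stand-in for real formatting
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                return null;
            });
        }
        // Main thread: stage each page into the cache in order.
        for (List<String> page : pages) pageCache.put(page);
        slaves.shutdown();
        slaves.awaitTermination(1, TimeUnit.HOURS);
    }
}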
Optionally, in an embodiment of the present invention, after sequentially acquiring the paging data from the database through the main thread so as to store the paging data into the memory, the method further includes: setting a first expiration time for the pre-positioned data and starting a timer; when the first expiration time elapses, judging whether all paging data of the data export task have been stored into the memory by the main thread; if not, resetting the first expiration time and restarting the timer; and if so, deleting the pre-positioned data.
Specifically, once the main thread has written the last page of the data export task into the memory, all data of the task have been exported, so when the first expiration time elapses the pre-positioned data can be deleted directly, without waiting for all slave threads to finish their processing. If the first expiration time elapses but the main thread has not yet written all paging data into the memory, the first expiration time is reset and the timer restarted, which guarantees that the complete data of the export task are written into the memory. Setting and resetting the first expiration time ensures that cached data in the memory are cleaned up in time, avoids memory overflow caused by long-term occupation, and guarantees that the data to be exported are exported completely without missing records.
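Assuming the pre-positioned data live in Redis as sketched earlier, the expiry bookkeeping might look as follows; the key name and the 10-minute expiration value are arbitrary.

import redis.clients.jedis.Jedis;

public class PreDataExpiry {
    private static final int FIRST_EXPIRATION_SECONDS = 600;   // assumed value
    private final Jedis jedis = new Jedis("localhost", 6379);

    // Called when the first expiration time elapses.
    public void onExpirationTick(String taskId, boolean allPagesStaged) {
        String key = "export:" + taskId + ":assoc";
        if (allPagesStaged) {
            jedis.del(key);                                   // pre-positioned data no longer needed
        } else {
            jedis.expire(key, FIRST_EXPIRATION_SECONDS);      // restart the timer
        }
    }
}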
According to the technical scheme, the main thread acquires the matched pre-positioned data from the database, the number of pages is set, and the main thread then sequentially acquires paging data from the database according to the retrieval-associated data and stores them into the memory, after which the slave threads extract the paging data from the memory, process them and store the resulting paging data files to the shared disk. In this way the data to be exported can be exported completely while only the memory needed for the current paging data is occupied, which reduces memory usage and avoids memory overflow. At the same time, because the retrieval-associated data are kept in the memory, the main thread can obtain paging data directly on every database access, which reduces the number of read and write operations on the database's disk interface and improves data export efficiency.
Example two
Fig. 2 is a flowchart of a data export method according to a second embodiment of the present invention. This embodiment refines the above embodiment in that different standard single-page thresholds are set for different data export tasks according to their service types and data types. As shown in fig. 2, the method includes:
S201, acquiring, according to an acquired data export task, matched pre-positioned data from a database, and storing the pre-positioned data into a memory; wherein the pre-positioned data comprise the total number of items to be exported and retrieval-associated data.
S202, determining a standard single-page threshold according to the service type and the data type of the data export task, and setting the number of pages according to the total number of items to be exported and the standard single-page threshold.
Different service types export different sets of parameter items, so the service type determines which parameter items make up the exported data, while the data type determines how many bytes the information in each parameter item occupies. Data types may include picture data, text data, list data and hyperlink data, as well as types such as integer and floating-point data. Based on the service type and data type of the current export task, a standard single-page threshold, i.e. the number of records each page can carry, can be determined.
With the number of pages set in this way, the standard single-page threshold ensures that each page occupies the same amount of memory, so that memory resources can be called in a fixed quantity; and when each page is turned into a data file and displayed to the user, it prevents a page from loading so much data that the user waits too long, and from loading so little data that an excessive number of data files is generated, which improves the convenience of storing and browsing the data files.
Optionally, in an embodiment of the present invention, determining the standard single-page threshold according to the service type and the data type of the data export task specifically includes: determining a standard single-page threshold and a maximum single-page threshold according to the service type and the data type of the data export task, wherein the maximum single-page threshold is greater than the standard single-page threshold; and if the total number of items to be exported is smaller than or equal to the maximum single-page threshold, setting the number of pages to one. Setting the number of pages according to the total number of items to be exported and the standard single-page threshold specifically includes: if the total number of items to be exported is greater than the maximum single-page threshold, setting the number of pages according to the total number of items to be exported and the standard single-page threshold.
Specifically, the maximum single-page threshold is slightly larger than the standard single-page threshold but much smaller than twice the standard threshold; for example, it may be 1.1 times the standard threshold. Taking the above technical scheme as an example, the standard single-page threshold may be set to 200,000 records and the maximum single-page threshold to 220,000 records. Compared with the standard threshold, the maximum threshold only slightly increases the memory occupied by a page and the loading time of its data file, but it avoids unnecessary paging: if the first page were generated according to the standard threshold, a second page consisting of the few remaining records might be mistaken for an empty or invalid file. Therefore, when the total number of items to be exported is smaller than or equal to the maximum single-page threshold, the exported data are gathered into a single complete data file and this phenomenon is avoided.
S203, obtaining an average single-page threshold according to the total number of items to be exported and the number of pages, and using the average single-page threshold as the threshold on the number of items in each page.
The number of records in each page is related to the processing time of the slave threads. Taking the above technical scheme as an example, suppose the total number of items to be exported is 450,000 and the standard single-page threshold is 200,000. In the conventional scheme, slave thread A would process 200,000 records, slave thread B would process 200,000 records, and slave thread C would process the remaining 50,000 records. The main thread performs no data processing and imports paging data into the memory quickly, so threads A, B and C obtain their pages at almost the same time, and the completion time of the export task is mainly determined by the slave thread with the most records; that is, the task is complete only after thread A or thread B has processed its 200,000 records.
In the embodiment of the present invention, the average single-page threshold is used instead, so slave threads A, B and C each process a page of 150,000 records. Since, as described above, the three threads obtain their pages at almost the same time, the completion time of the export task is again determined by the slave thread with the most records, i.e. the task is complete once any of threads A, B or C has processed its 150,000 records. Using the average single-page threshold as the per-page threshold thus improves the processing efficiency of the export task, ensures that every data file contains the same number of records, and, when the paged data files are displayed, allows each page to provide the user with a sufficient amount of preceding and following associated data.
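The arithmetic above can be reproduced with a short sketch: 450,000 records, a standard single-page threshold of 200,000 and a maximum single-page threshold of 220,000 yield three pages of 150,000 records each. The rounding choices are assumptions; the patent states the rule, not the exact arithmetic.

public class PagePlanner {
    // Returns {pageCount, perPageThreshold} for one export task.
    public static long[] plan(long total, long standardThreshold, long maxThreshold) {
        if (total <= maxThreshold) {
            return new long[] {1, total};                    // single complete data file
        }
        long pages = (total + standardThreshold - 1) / standardThreshold;   // ceiling division
        long perPage = (total + pages - 1) / pages;          // average single-page threshold
        return new long[] {pages, perPage};
    }

    public static void main(String[] args) {
        long[] p = plan(450_000, 200_000, 220_000);
        System.out.println(p[0] + " pages of up to " + p[1] + " rows");   // 3 pages of up to 150000 rows
    }
}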
S204, sequentially acquiring the paging data from the database through the main thread according to the retrieval-associated data, so as to store the paging data into the memory.
S205, obtaining the paging data from the memory through at least one slave thread and performing data processing to obtain processed paging data files, and storing the paging data files to a shared disk.
According to the technical scheme of this embodiment, a standard single-page threshold is determined according to the service type and the data type of the data export task, and the number of pages is then set according to the total number of items to be exported and the standard single-page threshold. This allows memory resources to be called in a fixed quantity and, when each page is turned into a data file and displayed to the user, prevents overly long waits caused by loading too much data as well as an excessive number of data files caused by loading too little. In addition, the average single-page threshold obtained from the total number of items to be exported and the number of pages is used as the per-page threshold, which improves the processing efficiency of the export task, ensures that every data file contains the same number of records, and allows each displayed page to provide the user with a sufficient amount of preceding and following associated data.
Example III
Fig. 3A is a flowchart of a data export method according to a third embodiment of the present invention. This embodiment further specifies that the number of threads in the thread pool is related to the total number of items to be exported and to the service type of the data export task. As shown in fig. 3A, the method includes:
S301, acquiring, according to an acquired data export task, matched pre-positioned data from a database, and storing the pre-positioned data into a memory; wherein the pre-positioned data comprise the total number of items to be exported and retrieval-associated data.
S302, obtaining a thread response coefficient according to the total number of items to be exported and the service type of the data export task; the thread response coefficient is the ratio of the thread waiting time to the thread computing time.
If the operation executed by the current thread can only continue after other threads have produced their results, the time the current thread spends waiting for those threads is the thread waiting time; the time the current thread needs to read the data to be processed, perform the data processing operation and carry out other processing is the thread computing time.
For example, for an export of the platform's transaction flow for the month, after a page has been acquired the slave thread must label the transaction amount bracket of each transaction, the time bracket of each transaction time, and so on, as preprocessing for subsequent statistics; for other kinds of business data the slave thread may perform other processing or none at all. Different service types therefore imply different processing operations on the paging data and different processing times: the more complicated the processing each slave thread has to perform for a given service type, the longer the thread computing time and the smaller the resulting thread response coefficient; the simpler the processing, the shorter the computing time and the larger the coefficient.
Meanwhile, the larger the total number of items to be exported, the more pages there are, and the longer each slave thread must wait for the main thread to finish writing paging data into the memory before it can obtain the page it needs. Therefore a larger total to be exported means a longer thread waiting time and a larger thread response coefficient, while a smaller total means a shorter waiting time and a smaller coefficient.
S303, setting the number of threads in a thread pool according to the number of cores of the central processing unit, the utilization rate of the central processing unit and the thread response coefficient.
The number of threads in the thread pool can be obtained by the following formula:
N = N_cores × U_cpu × (1 + W/C)
where N is the number of threads in the thread pool; N_cores is the number of cores of the central processing unit of the current electronic device; U_cpu is the expected utilization of the central processing unit, set in the interval (0, 1); W/C is the thread response coefficient, W being the thread waiting time and C the thread computing time.
In the conventional technical scheme, the thread response coefficient is estimated by installing a plug-in that monitors the running state of existing threads, or a fixed W/C value is set based on experience. In the embodiment of the present invention, the matched thread waiting time is determined from the total number of items to be exported and the matched thread computing time is determined from the service type of the export task, so the number of threads can be adjusted dynamically without adding new functional components to monitor thread states or memory occupancy, ensuring that the thread pool meets the data processing requirements of the export task while redundant threads are avoided.
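The formula translates directly into code, with the core count read from the runtime; the target utilization and the W/C inputs remain task-dependent, and this is only a sketch of the sizing rule the passage describes.

public class ThreadPoolSizer {
    // N = cores * targetCpuUtilization * (1 + W/C)
    // waitTime / computeTime is the thread response coefficient derived from
    // the total number of items to export and the service type of the export task.
    public static int poolSize(double targetCpuUtilization, double waitTime, double computeTime) {
        int cores = Runtime.getRuntime().availableProcessors();
        double n = cores * targetCpuUtilization * (1 + waitTime / computeTime);
        return Math.max(1, (int) Math.round(n));
    }

    public static void main(String[] args) {
        // e.g. with 8 cores, 75% target utilization and W/C = 1.5 this prints 15
        System.out.println(poolSize(0.75, 1.5, 1.0));
    }
}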
S304, setting the number of pages according to the total number of items to be exported, and sequentially acquiring the paging data from the database through the main thread according to the retrieval-associated data, so as to store the paging data into the memory.
S305, obtaining the paging data from the memory through at least one slave thread and performing data processing to obtain processed paging data files, and storing the paging data files to a shared disk.
According to the technical scheme of this embodiment, the thread response coefficient is obtained from the total number of items to be exported and the service type of the data export task, and the number of threads in the thread pool is then set according to the number of cores of the central processing unit, the utilization of the central processing unit and the thread response coefficient. The number of threads can thus be adjusted dynamically without adding new functional components to monitor thread states or memory occupancy, ensuring that the thread pool meets the data processing requirements of the export task while redundant threads are avoided.
Specific application scenario 1
Fig. 3B is a flowchart of a data export method provided in a specific application scenario of the present invention, in which a user exports required data from a service database through a client matched with a service platform. As shown in fig. 3B, the method includes:
s401, in response to the acquired data export task, sending a search condition to the database so that the database executes a search operation based on the search condition.
S402, acquiring matched preamble data through a database, and storing the preamble data into a memory; wherein the preamble data comprises the total number of data to be exported and retrieval associated data.
S403, setting the paging number according to the total number of the data to be exported.
S404, initializing a door bolt.
A keeper is an indication of the completion of a data export task.
S405, according to the retrieval related data, sequentially acquiring the paging data from the database through the main thread so as to store the paging data into a memory, and setting a second expiration time for the paging data.
S406, obtaining the paging data in the memory through at least one slave thread, performing data processing to obtain a paging data file with the data processing completed, and storing the paging data file into a shared disk.
S407, releasing the door bolt.
S408, compressing the paging files and outputting the paging files in a binary stream mode.
S409, deleting the temporary file.
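The latch steps S404 to S407 can be illustrated with Java's CountDownLatch, a natural fit for a completion indicator; the pool size, file naming and CSV output below are placeholders, and steps S408/S409 are only noted in comments.

import java.nio.file.*;
import java.util.List;
import java.util.concurrent.*;

public class LatchedExport {
    // Sketch of steps S404-S407: the latch is initialised to the page count
    // and counted down by the slave threads; when it reaches zero the page
    // files can be compressed and streamed back to the client.
    public static void export(List<List<String>> pages, Path sharedDisk) throws Exception {
        CountDownLatch latch = new CountDownLatch(pages.size());   // S404: initialise the latch
        ExecutorService slaves = Executors.newFixedThreadPool(4);
        int index = 0;
        for (List<String> page : pages) {                          // S405: main thread stages pages
            final int pageNo = index++;
            slaves.submit(() -> {                                  // S406: slave threads process pages
                try {
                    Files.write(sharedDisk.resolve("page-" + pageNo + ".csv"), page);
                } catch (Exception e) {
                    e.printStackTrace();
                } finally {
                    latch.countDown();                             // S407: release the latch
                }
            });
        }
        latch.await();                                             // all page files written
        slaves.shutdown();
        // S408/S409 (compressing the page files, streaming them out as binary
        // and deleting the temporary files) are omitted from this sketch.
    }
}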
According to the technical scheme of this embodiment, a standard single-page threshold is determined according to the service type and the data type of the data export task, and the number of pages is then set according to the total number of items to be exported and the standard single-page threshold. This allows memory resources to be called in a fixed quantity and, when each page is turned into a data file and displayed to the user, prevents overly long waits caused by loading too much data as well as an excessive number of data files caused by loading too little. In addition, the average single-page threshold obtained from the total number of items to be exported and the number of pages is used as the per-page threshold, which improves the processing efficiency of the export task, ensures that every data file contains the same number of records, and allows each displayed page to provide the user with a sufficient amount of preceding and following associated data.
Example IV
Fig. 4 is a block diagram of a data export device according to a fourth embodiment of the present invention, where the device specifically includes:
the preamble data acquisition module 401 is configured to derive a task according to the acquired data, acquire matched preamble data from a database, and store the preamble data into a memory; wherein the preamble data comprises the total number of data to be exported and retrieval associated data;
the paging data storage module 402 is configured to set a paging number according to the total number of the data to be exported, and sequentially obtain paging data from the database through a main thread according to the retrieval related data, so as to store the paging data into a memory;
the paging data processing module 403 is configured to obtain, through at least one slave thread, paging data in the memory and perform data processing, so as to obtain a paging data file after the data processing is completed, and store the paging data file in the shared disk.
According to the technical scheme, the main thread acquires the matched pre-positioned data from the database, the number of pages is set, and the main thread then sequentially acquires paging data from the database according to the retrieval-associated data and stores them into the memory, after which the slave threads extract the paging data from the memory, process them and store the resulting paging data files to the shared disk. In this way the data to be exported can be exported completely while only the memory needed for the current paging data is occupied, which reduces memory usage and avoids memory overflow. At the same time, because the retrieval-associated data are kept in the memory, the main thread can obtain paging data directly on every database access, which reduces the number of read and write operations on the database's disk interface and improves data export efficiency.
Optionally, the paging data storage module 402 is specifically configured to determine a standard single page threshold according to the service type and the data type of the data export task, and set the number of pages according to the total number of data to be exported and the standard single page threshold.
Optionally, the paging data storage module 402 is specifically further configured to determine a standard single page threshold and a maximum single page threshold according to a service type and a data type of the data export task; wherein the maximum single page threshold is greater than the standard single page threshold; if the total number of the data to be exported is smaller than or equal to the maximum single page threshold value, setting the paging number as one; and if the total number of the data to be exported is larger than the maximum single page threshold value, setting the paging number according to the total number of the data to be exported and the standard single page threshold value.
Optionally, the paging data storage module 402 is further specifically configured to obtain an average single page threshold according to the total number of data to be exported and the number of pages, and use the average single page threshold as a data number threshold of each page.
Optionally, the data export device further includes:
the thread configuration module is used for acquiring thread response coefficients according to the total number of the data to be exported and the service type of the data export task; the thread response coefficient is the ratio of the thread waiting time to the thread calculating time; and setting the number of threads in a thread pool according to the number of cores of the central processing unit, the utilization rate of the central processing unit and the thread response coefficient.
Optionally, the paging data storage module 402 is further specifically configured to obtain a first quantity threshold according to the number of idle slave threads and/or the data processing time of the slave threads, and to acquire, through the main thread and according to the retrieval-associated data, a number of pages equal to the first quantity threshold from the database, so as to store them into the memory.
Optionally, the paging data storage module 402 is further specifically configured to set a first expiration time for the pre-positioned data and start a timer; when the first expiration time elapses, to judge whether all paging data of the data export task have been stored into the memory by the main thread; if not, to reset the first expiration time and restart the timer; and if so, to delete the pre-positioned data.
The device can execute the data export method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method. Technical details not described in detail in this embodiment may be referred to the data export method provided in any embodiment of the present invention.
Example five
Fig. 5 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the data derivation method.
In some embodiments, the data export method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the data export method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the data export method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described herein may be realized in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer-readable storage medium may be a machine-readable signal medium. More specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of high management difficulty and weak service scalability found in traditional physical hosts and VPS (virtual private server) services.
It should be appreciated that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention can be achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A data export method, comprising:
acquiring, according to an acquired data export task, matched preposed data from a database, and storing the preposed data into a memory; wherein the preposed data comprises the total number of data to be exported and retrieval-associated data;
setting a paging number according to the total number of the data to be exported, and sequentially acquiring paging data from the database through a main thread according to the retrieval-associated data, so as to store the paging data into the memory;
and acquiring the paging data in the memory through at least one slave thread, performing data processing to obtain a processed paging data file, and storing the paging data file into a shared disk.
2. The method of claim 1, wherein the setting the paging number according to the total number of the data to be exported specifically comprises:
and determining a standard single page threshold according to the service type and the data type of the data export task, and setting the paging number according to the total number of the data to be exported and the standard single page threshold.
3. The method according to claim 2, wherein the determining a standard single page threshold according to the service type and the data type of the data export task specifically comprises:
determining a standard single page threshold value and a maximum single page threshold value according to the service type and the data type of the data export task; wherein the maximum single page threshold is greater than the standard single page threshold;
if the total number of the data to be exported is less than or equal to the maximum single page threshold, setting the paging number to one;
setting the paging number according to the total number of the data to be exported and the standard single page threshold, wherein the method specifically comprises the following steps:
and if the total number of the data to be exported is greater than the maximum single page threshold, setting the paging number according to the total number of the data to be exported and the standard single page threshold.
4. The method according to claim 2 or 3, further comprising, after setting the paging number according to the total number of the data to be exported and the standard single page threshold:
acquiring an average single page threshold according to the total number of the data to be exported and the paging number, and taking the average single page threshold as a data quantity threshold of each page.
5. The method of claim 1, further comprising, after storing the preposed data into the memory:
acquiring a thread response coefficient according to the total number of the data to be exported and the service type of the data export task, wherein the thread response coefficient is the ratio of the thread waiting time to the thread computing time;
and setting the number of threads in a thread pool according to the number of cores of the central processing unit, the utilization rate of the central processing unit and the thread response coefficient.
6. The method according to claim 1, wherein the sequentially acquiring the paging data from the database through the main thread according to the retrieval-associated data, so as to store the paging data into the memory, specifically comprises:
acquiring a first quantity threshold according to the quantity of idle slave threads and/or the time consumed by slave-thread data processing;
and acquiring, according to the retrieval-associated data, paging data up to the first quantity threshold from the database through the main thread, so as to store the acquired paging data into the memory.
7. The method of claim 1, further comprising, after sequentially acquiring the paging data from the database through the main thread according to the retrieval-associated data to store the paging data into the memory:
setting a first expiration time for the preposed data and starting timing;
when the first expiration time expires, judging whether all the paging data of the data export task have been stored into the memory through the main thread;
if not all the paging data of the data export task have been stored into the memory through the main thread, resetting the first expiration time and restarting the timing; and
if all the paging data of the data export task have been stored into the memory through the main thread, deleting the preposed data.
8. A data export apparatus, comprising:
a preposed data acquisition module, configured to acquire, according to an acquired data export task, matched preposed data from a database, and store the preposed data into a memory; wherein the preposed data comprises the total number of data to be exported and retrieval-associated data;
a paging data storage module, configured to set a paging number according to the total number of the data to be exported, and sequentially acquire paging data from the database through a main thread according to the retrieval-associated data, so as to store the paging data into the memory;
and the paging data processing module is used for acquiring the paging data in the memory through at least one slave thread and performing data processing to acquire a paging data file subjected to data processing, and storing the paging data file into the shared disk.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the data export method of any of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the data export method according to any one of claims 1-7.
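Purely as an illustration of the arithmetic in claims 2 to 5, the following Python sketch shows one way the paging number, the average single page threshold and the thread-pool size could be computed; the concrete thresholds, row count, utilization and wait/compute times are assumed example values, not figures specified by the claims, and the sizing rule used for the thread pool is a common heuristic rather than a formula prescribed by the patent:

import math
import os

def paging_number(total, standard_threshold, max_threshold):
    # Claims 2-3: a single page suffices when the total number of data to be
    # exported fits within the maximum single page threshold; otherwise divide
    # the total by the standard single page threshold.
    if total <= max_threshold:
        return 1
    return math.ceil(total / standard_threshold)

def average_single_page_threshold(total, pages):
    # Claim 4: spread the data to be exported evenly over the pages.
    return math.ceil(total / pages)

def thread_pool_size(cpu_cores, target_utilization, wait_time, compute_time):
    # Claim 5: the thread response coefficient is the ratio of thread waiting
    # time to thread computing time; a common sizing heuristic is
    # cores * utilization * (1 + wait/compute).
    response_coefficient = wait_time / compute_time
    return max(1, round(cpu_cores * target_utilization * (1 + response_coefficient)))

if __name__ == "__main__":
    total = 1_250_000  # assumed number of rows in the export task
    pages = paging_number(total, standard_threshold=50_000, max_threshold=80_000)
    per_page = average_single_page_threshold(total, pages)
    threads = thread_pool_size(os.cpu_count() or 4, 0.8, wait_time=4.0, compute_time=1.0)
    print(pages, per_page, threads)

With these example inputs the sketch yields 25 pages of 50,000 rows each, and a thread-pool size derived from the machine's core count.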
CN202310191570.2A 2023-02-24 2023-02-24 Data export method and device, electronic equipment and storage medium Pending CN116226251A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310191570.2A CN116226251A (en) 2023-02-24 2023-02-24 Data export method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310191570.2A CN116226251A (en) 2023-02-24 2023-02-24 Data export method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116226251A true CN116226251A (en) 2023-06-06

Family

ID=86572791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310191570.2A Pending CN116226251A (en) 2023-02-24 2023-02-24 Data export method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116226251A (en)

Similar Documents

Publication Publication Date Title
CN114564149B (en) Data storage method, device, equipment and storage medium
CN116226251A (en) Data export method and device, electronic equipment and storage medium
CN115438007A (en) File merging method and device, electronic equipment and medium
CN115904240A (en) Data processing method and device, electronic equipment and storage medium
CN115718732A (en) Disk file management method, device, equipment and storage medium
CN114186123A (en) Processing method, device and equipment for hotspot event and storage medium
CN111090633A (en) Small file aggregation method, device and equipment of distributed file system
CN116431561B (en) Data synchronization method, device, equipment and medium based on heterogeneous many-core accelerator card
CN115599838B (en) Data processing method, device, equipment and storage medium based on artificial intelligence
CN116107763B (en) Data transmission method, device, equipment and storage medium
CN116991781B (en) Request processing device, method, chip, storage medium and electronic equipment
CN116579914B (en) Execution method and device of graphic processor engine, electronic equipment and storage medium
CN115442432B (en) Control method, device, equipment and storage medium
CN117271840B (en) Data query method and device of graph database and electronic equipment
CN117827949A (en) Method and device for batch writing of data into database based on memory queue
CN115408115A (en) Transaction starting time processing method, device, equipment and storage medium
CN117520403A (en) Data query method based on synchronous pipeline Gremlin language
CN117331944A (en) Database table generation method, device, equipment and storage medium
CN117609245A (en) Paging query method, device, equipment and medium
CN116126249A (en) Data reading method and device, electronic equipment and storage medium
CN117591249A (en) Transaction processing method, device, electronic equipment and storage medium
CN116860826A (en) Data processing method, device, equipment and medium based on time sequence database
CN116801001A (en) Video stream processing method and device, electronic equipment and storage medium
CN117193726A (en) Parallel design method and device of software, electronic equipment and medium
CN115454660A (en) Task management method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination