Data set caching method and related device

Info

Publication number
CN110825705A
CN110825705A
Authority
CN
China
Prior art keywords
data set
caching
cache
container
script
Prior art date
Legal status
Pending
Application number
CN201911157888.9A
Other languages
Chinese (zh)
Inventor
郑玉会
Current Assignee
Guangdong Inspur Smart Computing Technology Co Ltd
Original Assignee
Guangdong Inspur Big Data Research Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Inspur Big Data Research Co Ltd filed Critical Guangdong Inspur Big Data Research Co Ltd
Priority to CN201911157888.9A
Publication of CN110825705A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 File systems; File servers
    • G06F16/17 Details of further file system functions
    • G06F16/172 Caching, prefetching or hoarding of files
    • G06F16/176 Support for shared access to files; File sharing support
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/455 Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533 Hypervisors; Virtual machine monitors
    • G06F9/45558 Hypervisor-specific management and integration aspects
    • G06F2009/45562 Creating, deleting, cloning virtual machine instances
    • G06F2009/45583 Memory management, e.g. access or allocation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application discloses a data set caching method, which comprises: receiving configuration parameters passed in by a user when a training task is created; generating a YAML configuration file based on the configuration parameters; creating a data set cache container according to the YAML configuration file by using a container orchestration system; and starting the data set cache container, then calling and executing a preset data set cache script in the data set cache container to cache the data set. The data set caching method can simplify the business process of data set caching and improve the submission efficiency of training tasks. The application also discloses a data set caching apparatus, a device and a computer readable storage medium, all of which have the above technical effects.

Description

Data set caching method and related device
Technical Field
The application relates to the technical field of data storage, in particular to a data set caching method; it also relates to a data set caching apparatus, a device and a computer readable storage medium.
Background
Machine learning and deep learning models are widely used in different fields and need to be trained before they can be used. The data set is the basis of machine learning and deep learning training, and the data set required by a training task needs to be cached before the training task is started, so that the training task starts only after caching of the data set is completed. The business process of data set caching therefore directly affects the efficiency of submitting training tasks. How to simplify the business process of data set caching and improve the submission efficiency of training tasks has thus become a technical problem to be urgently solved by those skilled in the art.
Disclosure of Invention
An object of the present application is to provide a data set caching method, which can simplify the business process of data set caching and improve the submission efficiency of training tasks; another object of the present application is to provide a data set caching apparatus, a device and a computer readable storage medium, all of which have the above technical effects.
In order to solve the above technical problem, the present application provides a data set caching method, including:
receiving configuration parameters transmitted by a user when a training task is created;
generating a YAML configuration file based on the configuration parameters;
creating a data set cache container according to the YAML configuration file by using a container orchestration system;
and calling and executing a preset data set cache script in the data set cache container to cache the data set.
Optionally, the executing the preset data set caching script to cache the data set includes:
and executing the preset data set caching script to cache the target data set to a shared storage system or a local node.
Optionally, the method further includes:
and configuring a data set caching mode so that, when the data set caching script is executed, the target data set is cached to the storage location corresponding to the data set caching mode.
Optionally, the method further includes:
judging whether the data set caching is finished or not;
and if the data set caching is completed, starting the training task.
Optionally, before generating the YAML configuration file based on the configuration parameters, the method further includes:
and detecting whether the configuration parameters passed in by the user when creating the training task are valid, and issuing an exception prompt when invalid configuration parameters exist.
In order to solve the above technical problem, the present application further provides a data set caching apparatus, including:
the configuration parameter receiving module is used for receiving configuration parameters transmitted by a user when a training task is created;
the configuration file generation module is used for generating a YAML configuration file based on the configuration parameters;
the container creating module is used for creating a data set cache container according to the YAML configuration file by using a container orchestration system;
and the data set caching module is used for calling and executing a preset data set caching script in the data set caching container to cache the data set.
Optionally, the data set caching module includes:
the first data set caching unit is used for executing a preset data set caching script to cache a target data set to the shared storage system;
and the second data set caching unit is used for executing the preset data set caching script to cache the target data set to the local node.
Optionally, the apparatus further includes:
and the cache mode configuration module is used for configuring a data set caching mode so that, when the data set caching script is executed, the target data set is cached to the storage location corresponding to the data set caching mode.
In order to solve the above technical problem, the present application further provides a data set caching device, including:
a memory for storing a computer program;
a processor for implementing the steps of the data set caching method as described above when executing the computer program.
To solve the above technical problem, the present application further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the data set caching method as described above.
The data set caching method comprises the steps of receiving configuration parameters passed in by a user when a training task is created; generating a YAML configuration file based on the configuration parameters; creating a data set cache container according to the YAML configuration file by using a container orchestration system; and calling and executing a preset data set cache script in the data set cache container to cache the data set.
Therefore, with the data set caching method provided by the application, after the user creates a training task, only basic configuration parameters need to be passed in; the system automatically generates the YAML configuration file after receiving the configuration parameters, then uses the container orchestration system to create the data set cache container based on the YAML configuration file, and finally executes the data set caching script in the data set cache container to cache the data set. The data set caching method can effectively simplify the otherwise complex business process of data set caching and improve the submission efficiency of training tasks.
The data set caching apparatus, device and computer readable storage medium provided by the application all have the above technical effects.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments and the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of a data set caching method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a data set caching apparatus according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a data set caching device according to an embodiment of the present application.
Detailed Description
The core of the present application is to provide a data set caching method, which can simplify the business process of data set caching and improve the submission efficiency of training tasks; further cores of the present application are to provide a data set caching apparatus, a device and a computer readable storage medium, all of which have the above technical effects.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to Fig. 1, which is a schematic flowchart of a data set caching method according to an embodiment of the present application, the data set caching method includes:
S101: receiving configuration parameters passed in by a user when a training task is created;
S102: generating a YAML configuration file based on the configuration parameters;
specifically, the data set caching method provided by the application is applied to a deep learning platform, and a user can create a training task on the deep learning platform. When the user creates the training, he/she will enter the relevant configuration parameters, and the system receives the configuration parameters and further generates YAML (a format for expressing data serialization) configuration files based on the configuration parameters, so as to make the container arrangement system call the YAML configuration files to create the data set cache container. The configuration parameters include parameters related to the training task, such as a mirror image, a data set required by the training task, and the like, and specific contents of the configuration parameters are not described herein again.
In addition, in order to guarantee the validity of the YAML configuration file, in a specific embodiment, before the YAML configuration file is generated based on the configuration parameters, the method may further detect whether the configuration parameters passed in when the user creates the training task are valid, and issue an exception prompt when invalid configuration parameters exist.
Specifically, after receiving the configuration parameters entered by the user when creating the training task, the system first detects whether the received configuration parameters are valid, for example whether any of them contains a format error. When the configuration parameters are valid, the YAML configuration file is then generated based on them. Conversely, if invalid configuration parameters exist among the received configuration parameters, an exception prompt is issued so that the user can re-enter valid configuration parameters. The specific form of the exception prompt is not uniquely limited by the present application and can be set according to actual needs.
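A minimal sketch of such a validity check, assuming a hypothetical set of required fields and a simple textual exception prompt (none of which are prescribed by the present application):

```python
# Hypothetical sketch: check user-supplied configuration parameters before the
# YAML file is generated, and surface an exception prompt when any are invalid.
REQUIRED_FIELDS = ("task_name", "image", "dataset_path")


def validate_params(params: dict) -> list[str]:
    """Return human-readable problems; an empty list means the parameters are valid."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not params.get(field):
            problems.append(f"missing or empty configuration parameter: {field}")
    if params.get("cache_mode") not in (None, "shared", "node"):
        problems.append("cache_mode must be 'shared' or 'node'")
    return problems


params = {"task_name": "demo", "image": "", "dataset_path": "/data/imagenet"}
issues = validate_params(params)
if issues:
    # Exception prompt: tell the user what is wrong so valid parameters can be re-entered.
    print("Invalid configuration parameters:", "; ".join(issues))
```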
S103: creating a data set cache container according to the YAML configuration file by using a container arrangement system;
specifically, on the basis of generating the YAML configuration file, a data set cache container is further created according to the YAML configuration file by using kubernets, namely a container arrangement system. In particular, Kubernetes is an open source container orchestration engine that supports automated deployment, application containerization management. After the YAML configuration file is generated, the system can access an Application Programming Interface (API) of Kubernetes, and further create a data set cache container according to the YAML configuration file by using a function inside the Kubernetes. For a specific process of creating a data set cache container by a function inside Kubernetes, details are not repeated in the present application, and reference may be made to the prior art. In addition, when a user creates a training task, Kubernets can also randomly schedule the training task to a certain node, and the node is the local node of the training task.
S104: and starting the data set cache container, calling and executing a preset data set cache script in the data set cache container to cache the data set.
Specifically, in the data set caching method provided by the application, the data set caching script is written and put in place in advance, so that after the data set cache container is created, the data set cache container is started and the preset data set caching script is called and executed in the data set cache container to cache the data set.
In a specific embodiment, the executing the preset data set caching script for data set caching may include executing the preset data set caching script to cache the target data set to the shared storage system or the local node.
Specifically, this embodiment provides two data set caching modes: a shared storage system cache and a node cache. The shared storage system cache caches the data set to a BeeGFS (a parallel file system) shared storage system, so that parallel tasks can share the data set in the storage system, which improves training efficiency. The node cache caches the data set on an SSD (Solid State Drive) of the local node, and the data sets cached by different nodes are independent of each other.
Importantly, the two caching modes cannot coexist: if the user selects the shared storage system cache, the preset data set caching script, when executed, caches the target data set to the shared storage system; if the user selects the node cache, the preset data set caching script, when executed, caches the target data set to the local node.
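Purely as an illustration of what such a preset data set caching script might do (the real script is not specified by the present application, and the mount points, environment variable names and marker file used below are assumptions), a Python sketch:

```python
# Hypothetical sketch of the preset data set caching script executed inside the
# cache container: copy the target data set either to the shared storage system
# (e.g. a BeeGFS mount) or to the local node's SSD, depending on CACHE_MODE.
import os
import shutil
from pathlib import Path

SOURCE = Path(os.environ.get("DATASET_PATH", "/remote/datasets/demo"))
MODE = os.environ.get("CACHE_MODE", "shared")  # "shared" or "node"; the modes are mutually exclusive

# Assumed mount points: shared storage vs. the local SSD of the node the task was scheduled to.
TARGET_ROOT = Path("/mnt/beegfs/cache") if MODE == "shared" else Path("/mnt/local-ssd/cache")


def cache_dataset() -> Path:
    target = TARGET_ROOT / SOURCE.name
    if not target.exists():                   # skip the copy if this data set is already cached
        shutil.copytree(SOURCE, target)
    (target / ".cache_complete").touch()      # completion marker (see the check described below)
    return target


if __name__ == "__main__":
    print("data set cached to", cache_dataset())
```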
Further, in the above embodiment in which the data set caching modes include a shared storage system cache and a node cache, the data set caching method further includes configuring the data set caching mode so that, when the data set caching script is executed, the target data set is cached to the storage location corresponding to the configured mode.
Specifically, the user may configure the data set caching mode according to the application requirements, thereby selecting the caching mode he or she needs. Based on this configuration, the system caches the data set to the corresponding storage location when the data set caching script is executed within the data set cache container: if the configured caching mode is the shared storage system cache, the data set is cached in the shared storage system; if the configured caching mode is the node cache, the data set is cached on the local node.
Further, on the basis of the above embodiments, the data set caching method further includes determining whether caching of the data set is completed, and starting the training task if it is. Conversely, if caching of the data set is not completed, the training task remains in a waiting state until caching is completed and the training task can be started. For example, during execution of the data set caching script, a corresponding completion identifier may be generated once caching of the data set finishes, so that completion can be determined by detecting this identifier; if the identifier is not detected, caching of the data set has not yet completed.
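A possible realization of this completion check, sketched in Python under the assumption that the caching script drops a marker file when it finishes (the marker path and the start_training_task call are hypothetical):

```python
# Hypothetical sketch: the cache script writes a marker file when caching
# finishes; the platform polls for that marker before starting the training task.
import time
from pathlib import Path

MARKER = Path("/mnt/beegfs/cache/demo/.cache_complete")   # assumed marker location


def wait_for_cache(poll_seconds: int = 10, timeout_seconds: int = 3600) -> bool:
    """Return True once the data set cache is complete, False on timeout."""
    waited = 0
    while waited < timeout_seconds:
        if MARKER.exists():
            return True                    # caching finished: the training task can start
        time.sleep(poll_seconds)           # otherwise keep the training task waiting
        waited += poll_seconds
    return False


# Example usage:
# if wait_for_cache():
#     start_training_task()   # hypothetical platform call
```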
In summary, the data set caching method provided by the application includes receiving configuration parameters passed in by a user when a training task is created; generating a YAML configuration file based on the configuration parameters; creating a data set cache container according to the YAML configuration file by using a container orchestration system; and calling and executing a preset data set cache script in the data set cache container to cache the data set. With this method, after the user creates a training task, only basic configuration parameters need to be entered; the system automatically generates the YAML configuration file after receiving the configuration parameters, then creates the data set cache container with the container orchestration system based on the YAML configuration file, and finally executes the data set caching script in the data set cache container to cache the data set. The data set caching method can effectively simplify the otherwise complex business process of data set caching and improve the submission efficiency of training tasks.
The present application further provides a data set caching apparatus, which may be referred to in correspondence with the above-described method. Referring to fig. 2, fig. 2 is a schematic diagram of a data set caching apparatus according to an embodiment of the present disclosure; as can be seen in fig. 2, the apparatus includes:
a configuration parameter receiving module 10, configured to receive configuration parameters transmitted by a user when creating a training task;
a configuration file generation module 20, configured to generate a YAML configuration file based on the configuration parameters;
a container creation module 30, configured to create a data set cache container according to the YAML configuration file by using a container orchestration system;
and the data set caching module 40 is used for calling and executing a preset data set caching script in the data set caching container to cache the data set.
On the basis of the foregoing embodiment, as a specific implementation manner, the data set caching module 40 includes:
the first data set caching unit is used for executing a preset data set caching script to cache the target data set to the shared storage system;
and the second data set caching unit is used for executing a preset data set caching script to cache the target data set to the local node.
On the basis of the above embodiment, as a specific implementation manner, the apparatus further includes:
and the cache mode configuration module is used for configuring the data set caching mode so that the data set caching script caches the target data set to the storage location corresponding to the data set caching mode.
On the basis of the above embodiment, as a specific implementation manner, the apparatus further includes:
the judging module is used for judging whether the data set caching is finished or not;
and the starting module is used for starting the training task if caching of the data set is completed.
Please refer to Fig. 3, which is a schematic diagram of a data set caching device according to an embodiment of the present application. As shown in Fig. 3, the device comprises a memory 1 for storing a computer program, and a processor 2 configured to implement the following steps when executing the computer program:
receiving configuration parameters passed in by a user when a training task is created; generating a YAML configuration file based on the configuration parameters; creating a data set cache container according to the YAML configuration file by using a container orchestration system; and starting the data set cache container, calling and executing a preset data set cache script in the data set cache container to cache the data set.
For the introduction of the data set caching device provided in the present application, please refer to the above embodiments of the data set caching method, which are not described herein again.
The present application further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
receiving configuration parameters passed in by a user when a training task is created; generating a YAML configuration file based on the configuration parameters; creating a data set cache container according to the YAML configuration file by using a container orchestration system; and starting the data set cache container, calling and executing a preset data set cache script in the data set cache container to cache the data set.
The computer-readable storage medium may include various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
For the introduction of the computer-readable storage medium provided in the present application, please refer to the above method embodiments, which are not described herein again.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device, the apparatus and the computer-readable storage medium disclosed by the embodiments correspond to the method disclosed by the embodiments, so that the description is simple, and the relevant points can be referred to the description of the method.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The data set caching method, apparatus, device and computer-readable storage medium provided by the present application are described in detail above. The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (10)

1. A method for caching a data set, comprising:
receiving configuration parameters transmitted by a user when a training task is created;
generating a YAML configuration file based on the configuration parameters;
creating a data set cache container according to the YAML configuration file by using a container orchestration system;
and starting the data set cache container, calling and executing a preset data set cache script in the data set cache container to cache the data set.
2. The data set caching method according to claim 1, wherein the executing the preset data set caching script for data set caching comprises:
and executing the preset data set caching script to cache the target data set to a shared storage system or a local node.
3. The data set caching method as recited in claim 1, further comprising:
and configuring a data set caching mode so that, when the data set caching script is executed, the target data set is cached to the storage location corresponding to the data set caching mode.
4. The data set caching method as recited in claim 1, further comprising:
judging whether the data set caching is finished or not;
and if the data set caching is completed, starting the training task.
5. The method for caching data sets according to claim 4, wherein before generating the YAML configuration file based on the configuration parameters, the method further comprises:
and detecting whether the configuration parameters transmitted by the user when creating the training task are valid, and issuing an exception prompt when invalid configuration parameters exist.
6. A data set caching apparatus, comprising:
the configuration parameter receiving module is used for receiving configuration parameters transmitted by a user when a training task is created;
the configuration file generation module is used for generating a YAML configuration file based on the configuration parameters;
the container creating module is used for creating a data set cache container according to the YAML configuration file by using a container orchestration system;
and the data set caching module is used for calling and executing a preset data set caching script in the data set caching container to cache the data set.
7. The data set caching apparatus according to claim 6, wherein said data set caching module comprises:
the first data set caching unit is used for executing a preset data set caching script to cache a target data set to the shared storage system;
and the second data set caching unit is used for executing the preset data set caching script to cache the target data set to the local node.
8. The data set caching apparatus according to claim 6, further comprising:
and the cache mode configuration module is used for configuring a data set caching mode so that, when the data set caching script is executed, the target data set is cached to the storage location corresponding to the data set caching mode.
9. A data set caching apparatus, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the data set caching method as claimed in any one of claims 1 to 5 when executing said computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the data set caching method according to any one of claims 1 to 5.
CN201911157888.9A 2019-11-22 2019-11-22 Data set caching method and related device Pending CN110825705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911157888.9A CN110825705A (en) 2019-11-22 2019-11-22 Data set caching method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911157888.9A CN110825705A (en) 2019-11-22 2019-11-22 Data set caching method and related device

Publications (1)

Publication Number Publication Date
CN110825705A (en) 2020-02-21

Family

ID=69558328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911157888.9A Pending CN110825705A (en) 2019-11-22 2019-11-22 Data set caching method and related device

Country Status (1)

Country Link
CN (1) CN110825705A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100153415A1 (en) * 2008-12-16 2010-06-17 Netapp, Inc. Method and Apparatus to Implement a Hierarchical Cache System with pNFS
CN106156255A (en) * 2015-04-28 2016-11-23 天脉聚源(北京)科技有限公司 A kind of data buffer storage layer realization method and system
CN107977165A (en) * 2017-11-22 2018-05-01 用友金融信息技术股份有限公司 Data buffer storage optimization method, device and computer equipment
CN108920136A (en) * 2018-06-29 2018-11-30 郑州云海信息技术有限公司 A kind of operating system creation method, system and relevant apparatus based on container
CN109508238A (en) * 2019-01-05 2019-03-22 咪付(广西)网络技术有限公司 A kind of resource management system and method for deep learning
CN109933312A (en) * 2019-03-25 2019-06-25 南京邮电大学 A method of containerization relevant database I/O consumption is effectively reduced

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11487555B2 (en) * 2020-06-09 2022-11-01 Tencent America LLC Running PBS jobs in kubernetes
CN112241368A (en) * 2020-09-30 2021-01-19 北京影谱科技股份有限公司 Kubernetes-based automatic model training method and device
CN112905325A (en) * 2021-02-10 2021-06-04 山东英信计算机技术有限公司 Method, system and medium for distributed data cache accelerated training
CN115022405A (en) * 2022-08-10 2022-09-06 合肥中科类脑智能技术有限公司 Intelligent cache acceleration system and method of deep learning cloud platform
CN115022405B (en) * 2022-08-10 2022-10-25 合肥中科类脑智能技术有限公司 Intelligent cache acceleration system and method of deep learning cloud platform

Similar Documents

Publication Publication Date Title
CN110825705A (en) Data set caching method and related device
CN109376088B (en) Automatic test system and automatic test method
CN107104923B (en) Account binding and service processing method and device
CN107807841B (en) Server simulation method, device, equipment and readable storage medium
CN110599341A (en) Transaction calling method and system
CN109359045A (en) A kind of test method, device, equipment and storage medium
CN110609755A (en) Message processing method, device, equipment and medium for cross-block chain node
CN109033466A (en) Page sharing method calculates equipment and computer storage medium
CN110825428A (en) State machine configuration method, device, equipment and readable storage medium
CN112306471A (en) Task scheduling method and device
CN111625294A (en) Server project execution method, device and related equipment
CN111294377B (en) Dependency network request sending method, terminal device and storage medium
CN113377669A (en) Automatic testing method and device, computer equipment and storage medium
CN107220818B (en) Online payment method and device
CN109245941B (en) Service compensation method and device
CN111752601A (en) Data configuration method, device and system, electronic equipment and storage medium thereof
CN106951236B (en) Plug-in development method and device
CN112631931B (en) Version testing method and device, storage medium and electronic equipment
CN110727416B (en) Development framework generation method and related device
CN110874713A (en) Service state management method and device
CN114327673A (en) Task starting method and device, electronic equipment and storage medium
CN111324368B (en) Data sharing method and server
CN113377385A (en) Client automatic deployment method and device
CN112508524A (en) Electronic approval method, system, device and storage medium
CN112395194A (en) Method and device for accessing test platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200221