CN114970761A - Model training method, device and system

Model training method, device and system

Info

Publication number
CN114970761A
Authority
CN
China
Prior art keywords
target
model
server
data
trained
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210711503.4A
Other languages
Chinese (zh)
Inventor
胡赟
严兴俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Telecom Corp Ltd
Original Assignee
China Telecom Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Telecom Corp Ltd filed Critical China Telecom Corp Ltd
Priority to CN202210711503.4A
Publication of CN114970761A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Bioethics (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses a model training method, device and system. The method includes: a first server performs feature extraction on acquired target data using a first preset model to obtain a plurality of feature data of the target data, and stores the feature data in a target database; the first server receives a model to be trained sent by a target terminal, determines target feature data matched with the model to be trained from the target database, and sends the target feature data to a second server; the first server receives a target model trained by the second server and forwards it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data. The method and device solve the technical problem of low training-data security caused by the lack of a data protection method.

Description

Model training method, device and system
Technical Field
The present application relates to the field of artificial intelligence, and in particular to a model training method, device and system.
Background
As leaks of private data become more common, the requirements for protecting and properly using users' private data keep rising. Parties that need data to train models lack it, while data owners must protect their users' privacy; this contradiction is increasingly apparent.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present application provide a model training method, device and system, so as to at least solve the technical problem of low training-data security caused by the lack of a data protection method.
According to an aspect of an embodiment of the present application, there is provided a model training method, including:
a first server performs feature extraction on acquired target data using a first preset model to obtain a plurality of feature data of the target data, and stores the plurality of feature data in a target database; the first server receives a model to be trained sent by a target terminal, determines target feature data matched with the model to be trained from the target database according to the model to be trained, and sends the target feature data to a second server; the first server receives a target model trained by the second server and forwards it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data.
Optionally, storing the plurality of feature data in the target database includes: the first server performs inference on the plurality of feature data using a second preset model to obtain feature data tags corresponding to the plurality of feature data; the first server stores the plurality of feature data and the feature data tags in the target database.
Optionally, the first server receiving the model to be trained sent by the target terminal and determining the target feature data matched with the model to be trained from the target database includes: the first server receives a first selection instruction sent by the target terminal and determines the first preset model; the first server receives the model to be trained and a required data tag sent by the target terminal, and determines, according to the model to be trained, the first preset model and the required data tag, the target feature data matched with the model to be trained and a target feature data tag corresponding to the target feature data.
Optionally, the first server receiving the target model trained by the second server includes: the first server sends the model to be trained and a target data list to the second server, where the target data list indicates the target feature data and the target feature data tag; the first server sends a first control instruction to the second server, controlling the second server to extract the target feature data and the target feature data tag from the target database as indicated by the target data list; and the first server sends a second control instruction to the second server, controlling the second server to train the model to be trained with the target feature data and the target feature data tag to obtain the target model.
Optionally, after the first server receives the model to be trained sent by the target terminal, the method further includes: the first server sends receipt information and first information to the target terminal, where the receipt information indicates that the first server has received the model to be trained, and the first information is used to display the training progress of the model to be trained on a display interface of the target terminal.
Optionally, after the first server receives the target model trained by the second server and forwards it to the target terminal, the method further includes: the first server sends second information to the target terminal, where the second information prompts that the target model has been sent to the target terminal.
Optionally, the first server performs feature extraction on the acquired target data using the first preset model, where the first preset model includes: a residual network (ResNet) model and a VGG model.
According to another aspect of the embodiments of the present application, a model training system is further provided, including: a data source server, a distributed database, an intermediary server, a training server and a data demand terminal. The intermediary server is configured to acquire a first preset model from the data source server and input target data into the first preset model to obtain a plurality of feature data; the distributed database is configured to store the plurality of feature data and the feature data tags corresponding to them; the data demand terminal is configured to send a model to be trained; the training server is configured to train the model to be trained with target feature data and a target feature data tag to obtain a target model; and the intermediary server is further configured to forward the target model to the data demand terminal.
According to still another aspect of the embodiments of the present application, a model training device is provided, including: an extraction module configured to perform feature extraction on acquired target data using a first preset model to obtain a plurality of feature data of the target data and store them in a target database; a determining module configured to receive a model to be trained sent by a target terminal and determine target feature data matched with the model to be trained from the target database; and a sending module configured to receive a target model trained by a second server and forward it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data.
According to another aspect of the embodiments of the present application, a non-volatile storage medium is further provided, which includes a stored program, where the device in which the non-volatile storage medium is located is controlled to execute the above model training method when the program runs.
According to still another aspect of the embodiments of the present application, there is provided a processor, configured to run a program, where the program executes the above model training method.
In the embodiments of the present application, a first server performs feature extraction on acquired target data to obtain a plurality of feature data of the target data and stores them in a target database; the first server receives a model to be trained sent by a target terminal, determines target feature data matched with the model to be trained from the target database, and sends the target feature data to a second server; the first server then receives the target model trained by the second server and forwards it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained. Because only the feature data obtained after feature extraction, and never the raw target data, is used for model training, the training data is effectively encrypted. This achieves the technical effect of protecting the privacy of the training data and solves the technical problem of low training-data security caused by the lack of a data protection method.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and constitute a part of it, illustrate embodiments of the application and, together with the description, serve to explain rather than limit the application. In the drawings:
Fig. 1 is a block diagram of the hardware structure of a computer terminal (or mobile device) for a model training method according to an embodiment of the present application;
Fig. 2 is a flowchart of a model training method according to an embodiment of the present application;
Fig. 3 is a schematic diagram of an alternative model training system according to an embodiment of the present application;
Fig. 4 is a schematic diagram of an alternative model training apparatus according to an embodiment of the present application.
Detailed Description
To make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings. The described embodiments are only some embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The method provided by the embodiments of the present application can be executed on a mobile terminal, a computer terminal, a cloud server or a similar computing device. Fig. 1 shows a block diagram of the hardware structure of a computer terminal (or mobile device) for implementing the model training method. As shown in Fig. 1, the computer terminal 10 (or mobile device 10) may include one or more processors 102 (shown as 102a, 102b, ..., 102n; the processors 102 may include, but are not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 104 for storing data, and a transmission module 106 for communication functions. In addition, it may further include: a display, an input/output interface (I/O interface), a Universal Serial Bus (USB) port (which may be one of the ports of the I/O interface), a network interface, a power supply, and/or a camera. Those skilled in the art will understand that the structure shown in Fig. 1 is only an illustration and does not limit the structure of the electronic device; for example, the computer terminal 10 may include more or fewer components than shown in Fig. 1, or have a different configuration.
It should be noted that the one or more processors 102 and/or other data processing circuitry described above may be referred to generally herein as "data processing circuitry". The data processing circuitry may be embodied in whole or in part in software, hardware, firmware, or any combination thereof. Furthermore, the data processing circuitry may be a single stand-alone processing module, or incorporated, in whole or in part, into any of the other elements in the computer terminal 10 (or mobile device). As referred to in the embodiments of the present application, the data processing circuitry serves as a kind of processor control (for example, the selection of a variable-resistance termination path connected to an interface).
The memory 104 may be used to store the software programs and modules of application software, such as the program instructions/modules corresponding to the model training method in the embodiments of the present application; the processor 102 executes various functional applications and data processing by running the software programs and modules stored in the memory 104, thereby implementing the above model training method. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the computer terminal 10 via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission module 106 is used to receive or send data via a network. Specific examples of the network may include a wireless network provided by a communication provider of the computer terminal 10. In one example, the transmission module 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In one example, the transmission module 106 may be a Radio Frequency (RF) module, which is used to communicate with the Internet wirelessly.
The display may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of the computer terminal 10 (or mobile device).
In accordance with an embodiment of the present application, an embodiment of a model training method is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in a different order.
Fig. 2 is a flowchart of a model training method according to an embodiment of the present application. As shown in Fig. 2, the method includes the following steps:
Step S202: the first server performs feature extraction on the acquired target data to obtain a plurality of feature data of the target data, and stores the plurality of feature data in a target database;
Step S204: the first server receives a model to be trained sent by the target terminal, determines target feature data matched with the model to be trained from the target database, and sends the target feature data to the second server;
Step S206: the first server receives the target model trained by the second server and forwards it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data.
Through these steps, features are extracted from the acquired target data and only the resulting feature data is used for model training. This effectively encrypts the training data, achieves the technical effect of protecting its privacy, and solves the technical problem of low training-data security caused by the lack of a data protection method.
In step S202, feature extraction may be performed on the target data with a preset model to obtain the plurality of feature data. It should be noted that the original data cannot be recovered from the feature data, so the target data is effectively encrypted.
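For illustration only, the following is a minimal sketch of this step, assuming a PyTorch/torchvision environment and a pretrained ResNet as the first preset model; all function and variable names are hypothetical and are not taken from the patent:

# Hypothetical sketch of step S202: extract feature vectors from the target
# data with a pretrained backbone and keep only the features. The mapping
# from pixels to features is many-to-one, so the stored feature data does
# not allow the original data to be restored.
import torch
import torchvision.models as models
import torchvision.transforms as T

backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classification head; output 2048-d features
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(images):
    """Map a list of PIL images to an (N, 2048) tensor of feature data."""
    batch = torch.stack([preprocess(img) for img in images])
    return backbone(batch)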
In step S204, the first server receives the model to be trained sent by the target terminal, determines the target feature data matched with the model to be trained from the target database, and sends the target feature data to the second server. The first server also needs to send the model to be trained to the second server so that the second server can train it; the model to be trained may be an initial model.
Steps S202 to S206 are explained in detail below through specific examples.
In step S202, before the plurality of feature data are stored in the target database, it is determined whether label information corresponding to them, such as classification information or detection boxes, already exists. If the feature data are unlabeled, the first server performs inference on them using a second preset model to obtain the corresponding feature data tags; in other words, matching relationships between the feature data and the feature data tags are established. The second preset model may be one or more trained neural network models used to assign these tags. The first server then stores the plurality of feature data and the feature data tags in the target database.
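Under the same assumptions (PyTorch; the classifier architecture and all names below are illustrative guesses, not specified by the patent), the second preset model can be sketched as a trained classification head that runs inference over the stored feature vectors to produce the feature data tags:

# Hypothetical sketch: infer a tag for each unlabeled feature vector with a
# second preset model before the (feature, tag) pairs enter the target database.
import torch

class TagModel(torch.nn.Module):
    """An assumed trained classifier over 2048-d feature vectors."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.head = torch.nn.Linear(2048, num_classes)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.head(feats)

@torch.no_grad()
def label_features(tag_model: TagModel, features: torch.Tensor, class_names: list):
    tag_model.eval()
    preds = tag_model(features).argmax(dim=1)
    # Only (feature, tag) pairs are stored; the raw data never enters the database.
    return [(feat, class_names[int(p)]) for feat, p in zip(features, preds)]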
In step S204, the first server receives a first selection instruction sent by the target terminal and determines the first preset model; the first server then receives the model to be trained and a required data tag sent by the target terminal, and determines, according to the model to be trained, the first preset model and the required data tag, the target feature data matched with the model to be trained and the target feature data tag corresponding to the target feature data.
The first preset model may be any model with a feature extraction capability, for example a residual network (ResNet) model or a VGG model.
Specifically, the target terminal selects the first preset model by sending the first selection instruction, and provides the model parameters of the model to be trained together with a required data tag. The first server locates the target feature data tag in the target database through the required tag, and determines the target feature data among the plurality of feature data according to the matching relationships established in step S202. The model to be trained is then trained with the target feature data and the corresponding target feature data tag. The first preset model can thus be regarded as the front model of the training system, used to extract the feature data, while the model to be trained serves as the rear model, trained on the feature data output by the first preset model.
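A minimal sketch of this matching step follows, assuming the target database is an SQL store; the table and column names are illustrative assumptions, since the patent does not specify a schema:

# Hypothetical sketch: build the target data list by selecting from the target
# database the feature records whose tag matches the required data tag.
import sqlite3

def build_target_data_list(db_path: str, required_tag: str) -> list:
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT feature_id, tag FROM feature_store WHERE tag = ?",
            (required_tag,),
        ).fetchall()
    finally:
        conn.close()
    # The list only references stored feature records; it carries no raw data.
    return [{"feature_id": fid, "tag": tag} for fid, tag in rows]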
In step S206, the first server receives the target model trained by the second server; the target model is the trained version of the model to be trained. Concretely, the first server sends the model to be trained and a target data list to the second server, where the target data list indicates the target feature data and the target feature data tag; the second server extracts the target feature data and the target feature data tag from the target database as indicated by the target data list, and trains the model to be trained with them to obtain the target model.
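As a sketch of the second server's training step under the same assumptions (PyTorch; the function and its arguments are hypothetical):

# Hypothetical sketch: the second server trains the rear model (the model to
# be trained) on the feature/tag pairs indicated by the target data list.
import torch

def train_target_model(model: torch.nn.Module, batches: list,
                       epochs: int = 5, lr: float = 1e-3) -> torch.nn.Module:
    """batches is a list of (features, tags) tensor pairs already fetched
    from the target database according to the target data list."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for feats, tags in batches:
            optimizer.zero_grad()
            loss = loss_fn(model(feats), tags)
            loss.backward()
            optimizer.step()
    return model  # the target model, returned to the first server for forwarding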
Confidentiality is further improved because the second server obtains the training data itself, so the path for acquiring data is isolated from the first server.
In an optional manner, after the first server receives the model to be trained sent by the target terminal, it sends receipt information and first information to the target terminal, where the receipt information indicates that the first server has received the model to be trained, and the first information is used to show the training progress of the model to be trained on a display interface of the target terminal, which improves the user experience. The training progress may also be displayed from the moment the target terminal sends the model training request, for example with an interface flow of: request sent, request received, data acquired, model in training, model training completed.
In some optional embodiments, after the first server receives the target model trained by the second server and forwards it to the target terminal, the first server sends second information to the target terminal, where the second information prompts that the target model has been sent to the target terminal.
An embodiment of the present application further provides a model training system, as shown in Fig. 3, including: a data source server 30, a distributed database 32 (the target database), a first server 34, a second server 36 and a target terminal 38. The first server 34 is configured to obtain a first preset model from the data source server 30 and input the target data into the first preset model to obtain a plurality of feature data; the distributed database 32 is configured to store the plurality of feature data and the corresponding feature data tags; the target terminal 38 is configured to send the model to be trained; the second server 36 is configured to train the model to be trained with the target feature data and the target feature data tag to obtain the target model; and the first server 34 is further configured to forward the target model to the target terminal 38.
The target database may be a database located in a server.
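For illustration, the overall flow of Fig. 3 can be sketched with stub components; every class and method name below is an assumption made for the sketch, not an interface defined by the patent:

# Hypothetical sketch of the Fig. 3 flow: the intermediary (first) server
# brokers between the data demand terminal, the distributed target database,
# and the training (second) server.
class DistributedDatabase:
    def __init__(self, records):  # records: list of (feature, tag) pairs
        self.records = records

    def query(self, tag):  # returns the target data list for one required tag
        return [r for r in self.records if r[1] == tag]

class TrainingServer:
    def train(self, model_to_train, data_list):
        # Placeholder for the actual training of steps S202-S206.
        return f"trained({model_to_train}, n={len(data_list)})"

class IntermediaryServer:
    def __init__(self, database, trainer):
        self.database, self.trainer = database, trainer

    def handle_request(self, model_to_train, required_tag):
        data_list = self.database.query(required_tag)
        return self.trainer.train(model_to_train, data_list)  # target model

# Usage: the data demand terminal sends a model to be trained and a required tag.
db = DistributedDatabase([("f1", "cat"), ("f2", "dog")])
server = IntermediaryServer(db, TrainingServer())
print(server.handle_request("rear_model", "cat"))  # -> trained(rear_model, n=1)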
An embodiment of the present application further provides a model training device, as shown in Fig. 4, including: an extraction module 40 configured to perform feature extraction on the acquired target data to obtain a plurality of feature data of the target data, and to store the plurality of feature data in a target database; a determining module 42 configured to receive a model to be trained sent by a target terminal and determine target feature data matched with the model to be trained from the target database; and a sending module 44 configured to receive the target model trained by the second server and forward it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data.
The extraction module 40 includes a storage submodule configured to perform inference on the plurality of feature data using a second preset model to obtain the feature data tags corresponding to the plurality of feature data, and to store the plurality of feature data and the feature data tags in the target database.
The determining module 42 includes a receiving submodule configured to receive a first selection instruction sent by the target terminal and determine the first preset model, and to receive the model to be trained and a required data tag sent by the target terminal and determine, according to the model to be trained, the first preset model and the required data tag, the target feature data matched with the model to be trained and the target feature data tag corresponding to the target feature data.
The sending module 44 includes a training submodule configured to send the model to be trained and a target data list to the second server, where the target data list indicates the target feature data and the target feature data tag; to send a first control instruction to the second server, controlling the second server to extract the target feature data and the target feature data tag from the target database as indicated by the target data list; and to send a second control instruction to the second server, controlling the second server to train the model to be trained with the target feature data and the target feature data tag to obtain the target model.
A sending submodule is configured to send receipt information and first information to the target terminal after the first server receives the model to be trained sent by the target terminal, where the receipt information indicates that the first server has received the model to be trained and the first information is used to display the training progress of the model to be trained on a display interface of the target terminal; and, after the first server receives the target model trained by the second server and forwards it to the target terminal, to send second information to the target terminal prompting that the target model has been sent.
According to another aspect of the embodiments of the present application, there is also provided a non-volatile storage medium, including a stored program, where the apparatus in which the non-volatile storage medium is located is controlled to execute the model training method when the program is running.
According to another aspect of the embodiments of the present application, there is also provided a processor, configured to run a program, where the program performs the above model training method when running.
The processor is used to run a program that performs the following functions: the first server performs feature extraction on the acquired target data to obtain a plurality of feature data of the target data, and stores the plurality of feature data in a target database; the first server receives a model to be trained sent by a target terminal, determines target feature data matched with the model to be trained from the target database, and sends the target feature data to a second server; the first server receives the target model trained by the second server and forwards it to the target terminal, where the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained.
By executing the above model training method, the processor performs feature extraction on the acquired target data and uses only the resulting feature data for model training, thereby encrypting the training data, achieving the technical effect of protecting its privacy, and solving the technical problem of low training-data security caused by the lack of a data protection method.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division into units may be a division by logical function, and an actual implementation may use another division: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, units or modules, and may be electrical or take another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present application. It should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also fall within the protection scope of the present application.

Claims (10)

1. A method of model training, comprising:
a first server performs feature extraction on acquired target data using a first preset model to obtain a plurality of feature data of the target data, and stores the plurality of feature data in a target database;
the first server receives a model to be trained sent by a target terminal, determines target feature data matched with the model to be trained from the target database according to the model to be trained, and sends the target feature data to a second server;
and the first server receives a target model trained by the second server and forwards the target model to the target terminal, wherein the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data.
2. The method of claim 1, wherein storing the plurality of feature data in a target database comprises:
the first server performs inference on the plurality of feature data using a second preset model to obtain feature data tags corresponding to the plurality of feature data;
the first server stores the plurality of feature data and the feature data tags in the target database.
3. The method according to claim 2, wherein the first server receiving a model to be trained sent by a target terminal and determining the target feature data matched with the model to be trained from the target database according to the model to be trained comprises:
the first server receives a first selection instruction sent by the target terminal and determines the first preset model;
the first server receives the model to be trained and a required data tag sent by the target terminal, and determines, according to the model to be trained, the first preset model and the required data tag, the target feature data matched with the model to be trained and a target feature data tag corresponding to the target feature data.
4. The method of claim 1, wherein the first server receiving the target model trained by the second server comprises:
the first server sends the model to be trained and a target data list to the second server, wherein the target data list indicates the target feature data and the target feature data tag;
the first server sends a first control instruction to the second server, controlling the second server to extract the target feature data and the target feature data tag from the target database as indicated by the target data list;
and the first server sends a second control instruction to the second server, controlling the second server to train the model to be trained with the target feature data and the target feature data tag to obtain the target model.
5. The method of claim 1, wherein after the first server receives the model to be trained sent by the target terminal, the method further comprises:
the first server sends receipt information and first information to the target terminal, wherein the receipt information indicates that the first server has received the model to be trained, and the first information is used to display the training progress of the model to be trained on a display interface of the target terminal.
6. The method of claim 5, wherein after the first server receives the target model trained by the second server and forwards the target model to the target terminal, the method further comprises:
and the first server sends second information to the target terminal, wherein the second information is used to prompt that the target model has been sent to the target terminal.
7. The method of claim 1, wherein the feature extraction performed by the first server on the acquired target data comprises:
the first server performs feature extraction on the acquired target data using the first preset model, wherein the first preset model comprises: a residual network (ResNet) model and a VGG model.
8. A model training system, comprising:
the system comprises a data source server, a distributed database, an intermediary server, a training server and a data demand terminal;
the intermediary server is configured to acquire a first preset model from the data source server and input target data into the first preset model to obtain a plurality of feature data;
the distributed database is configured to store the plurality of feature data and feature data tags corresponding to the plurality of feature data;
the data demand terminal is configured to send a model to be trained;
the training server is configured to train the model to be trained with target feature data and a target feature data tag to obtain a target model;
and the intermediary server is further configured to forward the target model to the data demand terminal.
9. A model training apparatus, comprising:
the extraction module is configured to perform feature extraction on acquired target data using a first preset model to obtain a plurality of feature data of the target data, and to store the plurality of feature data in a target database;
the determining module is configured to receive a model to be trained sent by a target terminal, determine target feature data matched with the model to be trained from the target database according to the model to be trained, and send the target feature data to a second server;
and the sending module is configured to receive a target model trained by the second server and forward the target model to the target terminal, wherein the target model is obtained by the second server extracting the target feature data from the target database and training the model to be trained with the target feature data.
10. A non-volatile storage medium, comprising a stored program, wherein the program, when executed, controls a device in which the non-volatile storage medium is located to perform the model training method of any one of claims 1 to 7.
CN202210711503.4A 2022-06-22 2022-06-22 Model training method, device and system Pending CN114970761A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210711503.4A CN114970761A (en) 2022-06-22 2022-06-22 Model training method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210711503.4A CN114970761A (en) 2022-06-22 2022-06-22 Model training method, device and system

Publications (1)

Publication Number Publication Date
CN114970761A 2022-08-30

Family

ID=82965591

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210711503.4A Pending CN114970761A (en) 2022-06-22 2022-06-22 Model training method, device and system

Country Status (1)

Country Link
CN (1) CN114970761A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115766489A * 2022-12-23 2023-03-07 China United Network Communications Group Co., Ltd. Data processing apparatus, method and storage medium


Similar Documents

Publication Title
CN110798703A (en) Method and device for detecting illegal video content and storage medium
CN113395200B (en) Message pushing method and system, client, storage medium and processor
CN108319888B (en) Video type identification method and device and computer terminal
CN109067883B (en) Information pushing method and device
CN114970761A (en) Model training method, device and system
CN104636460A (en) Article information pushing method and device
CN113934299B (en) Equipment interaction method and device, intelligent household equipment and processor
CN110505260B (en) Processing method and system of push information, display device and mobile terminal
CN111476598A (en) Information prompting method and device, storage medium and electronic device
CN112306592A (en) Message processing method and device, storage medium and electronic device
CN107786528B (en) Application login method and device and communication system
CN111678519B (en) Intelligent navigation method, device and storage medium
CN113011182B (en) Method, device and storage medium for labeling target object
CN110929866B (en) Training method, device and system of neural network model
CN113194045B (en) Data traffic analysis method, device, storage medium and processor
CN110909755B (en) Object feature processing method and device
CN112417164A (en) Information recommendation method and device, storage medium and electronic device
CN112182394A (en) Product recommendation method and device, electronic equipment and storage medium
CN110826582A (en) Image feature training method, device and system
CN112560555A (en) Method, device and storage medium for expanding key points
CN112801296A (en) Data processing method, device and system
CN114943868B (en) Image processing method, device, storage medium and processor
CN106355426B (en) Display method and device for service platform in application
CN113449795A (en) Power utilization data processing method and device and electronic equipment
CN110675459A (en) Font generation method, device and system

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination