CN117271338A - Scheme generation processing method, device, computer equipment and storage medium - Google Patents

Scheme generation processing method, device, computer equipment and storage medium Download PDF

Info

Publication number
CN117271338A
CN117271338A (application CN202311235685.3A)
Authority
CN
China
Prior art keywords
model application
test
application scheme
scheme
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311235685.3A
Other languages
Chinese (zh)
Inventor
龚渝钧
曹晋其
周东滨
余劭嶔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority to CN202311235685.3A priority Critical patent/CN117271338A/en
Publication of CN117271338A publication Critical patent/CN117271338A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/903Querying
    • G06F16/9032Query formulation
    • G06F16/90332Natural language query formulation or dialogue systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/10Requirements analysis; Specification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Stored Programmes (AREA)

Abstract

The disclosure provides a scheme generation processing method and apparatus, a computer device, and a storage medium, relating to the field of computer technology. The method can not only construct and generate a model application scheme, but also test the generated scheme, thereby integrating construction, generation, and test verification into one workflow and greatly improving the overall development efficiency of the model application scheme. In addition, in the test stage, a test data set and test indexes for the model application scheme can be determined according to the specific model application requirements, and the test flow can be executed with one click based on the test data set, greatly improving the efficiency of the test stage. From the output results obtained in the test, a test evaluation of the model application scheme under the required test indexes can be completed, providing the user with reference information for judging the usability of the scheme, so that the scheme that is formally put into use better meets the user's needs.

Description

Scheme generation processing method, device, computer equipment and storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular, to a scheme generation processing method, a scheme generation processing device, computer equipment and a storage medium.
Background
With advances in artificial intelligence, neural network models are applied ever more widely; they can automatically complete tasks such as intelligent creation and intelligent question answering, saving labor cost.
For some models, such as natural language processing models, limitations of the model's own capability and the uncertainty of its output often make it necessary to build a model application scheme around the model to help it produce satisfactory results. How to make the formally deployed model application scheme better match the user's needs is therefore a problem worth solving.
Disclosure of Invention
The embodiment of the disclosure at least provides a scheme generation processing method, a scheme generation processing device, computer equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a solution generating and processing method, including:
generating a model application scheme matched with the model application requirement aiming at the acquired model application requirement; the model application scheme is provided with a task link comprising a plurality of task nodes; the plurality of task nodes comprise at least one model interaction node;
in response to receiving a test requirement for the model application scheme, determining at least one test dataset corresponding to the model application scheme and at least one test indicator of the model application scheme; the test data set and the test index are associated with the model application requirements;
Performing model application test on the model application scheme by using the at least one test data set to obtain an output result of each test data set under the model application scheme;
based on the output results corresponding to the test data sets, obtaining test evaluation results of the model application scheme under the at least one test index; the test evaluation result is used as reference information for judging whether the model application scheme can be used.
In an optional implementation manner, the performing, by using the at least one test dataset, a model application test on the model application scheme to obtain an output result of each test dataset under the model application scheme includes:
creating at least one round of test tasks for the test data set and the test index; wherein the number of rounds of the test task is related to the model application requirements and/or the test dataset;
and executing the at least one round of test tasks on the model application scheme by using the test data set to obtain an output result of the test data set under the model application scheme.
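The round-based testing just described — creating one or more rounds of test tasks over a data set and executing them against the scheme — can be sketched as follows (the function names and the fixed `rounds` parameter are assumptions for illustration; the patent ties the round count to the model application requirements and/or the test data set):

```python
def create_test_tasks(test_dataset, rounds):
    """Create `rounds` rounds of test tasks over one test data set."""
    return [list(test_dataset) for _ in range(rounds)]

def run_test_tasks(scheme_fn, test_dataset, rounds=1):
    """Execute each round against the scheme and collect per-round outputs."""
    results = []
    for round_samples in create_test_tasks(test_dataset, rounds):
        results.append([scheme_fn(sample) for sample in round_samples])
    return results
```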
In an optional implementation manner, after the output result of the test dataset under the model application scheme is obtained, the method further includes:
After the at least one round of test tasks are executed, acquiring user inquiry information;
executing the model application scheme based on the user inquiry information to obtain an output result of the user inquiry information under the model application scheme;
the obtaining, based on the output results corresponding to the test data sets, a test evaluation result of the model application scheme under the at least one test index includes:
and acquiring a test evaluation result of the model application scheme under the at least one test index based on the test data set and an output result corresponding to the user inquiry information.
In an optional implementation manner, the generating, for the obtained model application requirement, a model application scheme matched with the model application requirement includes:
aiming at the acquired model application requirements, generating an initial model application scheme matched with the model application requirements;
in response to a debugging requirement, determining an answer result provided by the initial model application scheme for user inquiry information;
and responding to the received feedback information provided by the user for the answer result, and adjusting the initial model application scheme based on the feedback information to obtain an adjusted model application scheme.
In an alternative embodiment, the responding to receiving the test requirement for the model application scheme includes:
responding to a scheme test triggering operation aiming at the model application requirements, and if a plurality of model application schemes matched with the model application requirements are determined to be generated and stored, displaying version information of the plurality of model application schemes; different model application schemes have different version information;
and responding to the selection operation aiming at any version information, and taking the selected model application scheme corresponding to the version information as the model application scheme corresponding to the test requirement.
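Storing multiple generated schemes for one requirement under distinct version information, and letting the user select the version to test, can be sketched as follows (the store structure is an assumption for illustration):

```python
class SchemeVersionStore:
    """Keeps every generated scheme for a requirement under distinct version info."""
    def __init__(self):
        self._versions = {}  # requirement -> {version: scheme}

    def save(self, requirement, version, scheme):
        self._versions.setdefault(requirement, {})[version] = scheme

    def list_versions(self, requirement):
        # Displayed to the user when a scheme test is triggered.
        return sorted(self._versions.get(requirement, {}))

    def select(self, requirement, version):
        # The scheme whose version the user selects becomes the one under test.
        return self._versions[requirement][version]
```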
In an optional implementation manner, the generating, for the obtained model application requirement, a model application scheme matched with the model application requirement includes:
displaying a model application scheme construction page;
and responding to the requirement information input operation in the model application scheme construction page, and generating and displaying a model application scheme matched with the model application requirement in the model application scheme construction page.
In an alternative embodiment, the method further comprises:
and responding to the release operation of the model application scheme aiming at the debugging completion, recording the version information of the model application scheme displayed in the model application scheme construction page, and storing the model application scheme and the corresponding version information.
In an optional implementation manner, the model application scheme construction page comprises a node selection area, a node configuration area, a node output area and a task link display area;
the node selection area is provided with a plurality of task node identifiers; the task node identifier is used for adding a task node corresponding to the task node identifier in the task link display area after being triggered; different task node identifiers correspond to different types of task nodes;
the node configuration area is used for adding configuration information for task nodes in the task link display area;
the node output area is used for displaying the output information of the task nodes in the task link display area.
In a second aspect, an embodiment of the present disclosure further provides a solution generating and processing apparatus, including:
the generation module is used for generating a model application scheme matched with the model application requirement aiming at the acquired model application requirement; the model application scheme is provided with a task link comprising a plurality of task nodes; the plurality of task nodes comprise at least one model interaction node;
a determining module, configured to determine at least one test data set corresponding to the model application scheme and at least one test index of the model application scheme in response to receiving a test requirement for the model application scheme; the test data set and the test index are associated with the model application requirements;
The test module is used for carrying out model application test on the model application scheme by utilizing the at least one test data set to obtain an output result of each test data set under the model application scheme;
the evaluation module is used for acquiring a test evaluation result of the model application scheme under the at least one test index based on the output result corresponding to each test data set; the test evaluation result is used as reference information for judging whether the model application scheme can be used.
In an alternative embodiment, the test module is specifically configured to:
creating at least one round of test tasks for the test data set and the test index; wherein the number of rounds of the test task is related to the model application requirements and/or the test dataset;
and executing the at least one round of test tasks on the model application scheme by using the test data set to obtain an output result of the test data set under the model application scheme.
In an alternative embodiment, the test module is further configured to, after obtaining an output result of the test dataset under the model application scheme:
after the at least one round of test tasks are executed, acquiring user inquiry information;
Executing the model application scheme based on the user inquiry information to obtain an output result of the user inquiry information under the model application scheme;
the evaluation module is specifically used for:
and acquiring a test evaluation result of the model application scheme under the at least one test index based on the test data set and an output result corresponding to the user inquiry information.
In an alternative embodiment, the generating module is specifically configured to:
aiming at the acquired model application requirements, generating an initial model application scheme matched with the model application requirements;
in response to a debugging requirement, determining an answer result provided by the initial model application scheme for user inquiry information;
and responding to the received feedback information provided by the user for the answer result, and adjusting the initial model application scheme based on the feedback information to obtain an adjusted model application scheme.
In an alternative embodiment, the determining module, in response to receiving a test requirement for the model application scenario, is configured to:
responding to a scheme test triggering operation aiming at the model application requirements, and if a plurality of model application schemes matched with the model application requirements are determined to be generated and stored, displaying version information of the plurality of model application schemes; different model application schemes have different version information;
And responding to the selection operation aiming at any version information, and taking the selected model application scheme corresponding to the version information as the model application scheme corresponding to the test requirement.
In an optional implementation manner, when generating, for the acquired model application requirement, a model application scheme matched with the model application requirement, the generating module is specifically configured to:
display a model application scheme construction page;
and responding to the requirement information input operation in the model application scheme construction page, and generating and displaying a model application scheme matched with the model application requirement in the model application scheme construction page.
In an alternative embodiment, the apparatus further comprises a storage module for:
and responding to the release operation of the model application scheme aiming at the debugging completion, recording the version information of the model application scheme displayed in the model application scheme construction page, and storing the model application scheme and the corresponding version information.
In an optional implementation manner, the model application scheme construction page comprises a node selection area, a node configuration area, a node output area and a task link display area;
The node selection area is provided with a plurality of task node identifiers; the task node identifier is used for adding a task node corresponding to the task node identifier in the task link display area after being triggered; different task node identifiers correspond to different types of task nodes;
the node configuration area is used for adding configuration information for task nodes in the task link display area;
the node output area is used for displaying the output information of the task nodes in the task link display area.
In a third aspect, an optional implementation of the disclosure further provides a computer device, including a processor and a memory, where the memory stores machine-readable instructions executable by the processor, and the processor is configured to execute the machine-readable instructions stored in the memory; when the machine-readable instructions are executed by the processor, the steps of the first aspect, or of any possible implementation of the first aspect, are performed.
In a fourth aspect, an alternative implementation of the present disclosure further provides a computer readable storage medium having stored thereon a computer program which when executed performs the steps of the first aspect, or any of the possible implementation manners of the first aspect.
For a description of the effects of the above scheme generation processing apparatus, computer device, and computer-readable storage medium, refer to the description of the scheme generation processing method above; it is not repeated here.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the aspects of the disclosure.
The scheme generation processing method, apparatus, computer device, and storage medium provided by the embodiments of the disclosure can not only complete the construction and generation of a model application scheme, but also test the generated scheme, thereby integrating construction, generation, and test verification into one workflow and greatly improving the overall development efficiency of the model application scheme. In addition, in the test stage, a test data set and test indexes for the model application scheme can be determined according to the specific model application requirements, and the test flow can be executed with one click based on the test data set, greatly improving the efficiency of the test stage. From the output results obtained in the test, a test evaluation of the model application scheme under the required test indexes can be completed, providing the user with reference information for judging the usability of the scheme, so that the scheme that is formally put into use better meets the user's needs.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. The drawings, which are incorporated in and constitute a part of the specification, show embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It is to be understood that the following drawings illustrate only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may derive other related drawings from them without inventive effort.
FIG. 1 illustrates a flow chart of a solution generation processing method provided by some embodiments of the present disclosure;
FIG. 2 illustrates one of the schematics of a model application schema build page provided by some embodiments of the present disclosure;
FIG. 3 illustrates a second schematic diagram of a model application schema build page provided by some embodiments of the present disclosure;
FIG. 4 illustrates a flow chart of another scenario generation processing method provided by some embodiments of the present disclosure;
FIG. 5 illustrates a schematic diagram of a scenario generation processing apparatus provided by some embodiments of the present disclosure;
fig. 6 illustrates a schematic diagram of a computer device provided by some embodiments of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the disclosed embodiments generally described and illustrated herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
The model application scheme refers to a problem solution built on the basis of a neural network model. When a user builds a model application scheme, the quality of the designed model application scheme is difficult to control due to uncertainty of using the model and fuzzy capability boundary of the model, and the model application scheme is generally difficult to directly put into use, so that development of the model application scheme is not facilitated.
Based on the above research, the present disclosure provides a solution generating and processing method, device, computer equipment and storage medium, which not only can complete the construction and generation of a model application solution, but also can test the generated model application solution, thereby realizing the integrated execution of construction, generation and test verification, and greatly improving the overall development efficiency of the model application solution. In addition, a test data set and test indexes of a model application scheme can be determined according to specific model application requirements, and a test flow can be executed by one key based on the test data set, so that the efficiency of a test link is greatly improved; aiming at the output result obtained by the test, the test evaluation of the model application scheme under the test index meeting the requirement can be completed, so that reference information is provided for a user to judge the usability of the model application scheme, and the model application scheme meets the requirement of the user more formally.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
For the sake of understanding the present embodiment, first, a solution generating and processing method disclosed in the embodiments of the present disclosure will be described in detail, where an execution body of the solution generating and processing method provided in the embodiments of the present disclosure is generally a computer device with a certain computing capability. In some possible implementations, the solution generating processing method may be implemented by a manner in which a processor calls computer readable instructions stored in a memory.
The scheme generating processing method provided by the embodiment of the present disclosure is described below by taking an execution body as a terminal device as an example.
Referring to fig. 1, a flowchart of a solution generating processing method according to an embodiment of the present disclosure is shown, where the method includes steps S101 to S104, where:
s101, generating a model application scheme matched with the model application requirement aiming at the acquired model application requirement; the model application scheme is provided with a task link comprising a plurality of task nodes; the plurality of task nodes comprise at least one model interaction node.
The model application requirement may indicate a problem that the model application scheme needs to solve, such as introducing a particular object, solving a mathematical problem, scheduling a plan, or generating a video encyclopedia entry based on image information.
The model application scheme can be provided with a task link, one or more model interaction nodes can be arranged in the task link, the model interaction nodes can use a deployed model (such as a natural language processing model) to process, and an answer result of a to-be-solved problem is output. The model interaction node is used for taking the output information of the previous task node in the task link as model input information, and outputting a matching result through model processing.
In practical applications, the model needs sufficient context information in order to output a satisfactory answer; therefore, the model interaction node may be deployed with preset context information, combine that context (such as a content knowledge base) with the input information (the question at hand), and output a specific solution.
In order to overcome the limitation of the model (such as processing text content), other task nodes can be arranged in the task link, and the task nodes can realize functions which are not possessed by the model to make up for the defects of the model. In the task link, other task nodes can be arranged before or after the model interaction node, namely, other task nodes can provide input information for the model interaction node and can also further process the output result of the model interaction node.
In short, a task node in the task link can take the output of the previous task node, perform its own processing, and output new information for the next task node to use. The first task node in the task link may take information entered by the user as its input, and the output of the last task node may serve as the output of the model application scheme.
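The chaining just described — other task nodes feeding the model interaction node and post-processing its output — can be sketched as follows. The model call is stubbed out and all node names are hypothetical:

```python
def preprocess_node(user_input):
    # A non-model node placed before the model interaction node:
    # e.g. normalize the user's text into a well-formed question.
    return user_input.strip().rstrip("?") + "?"

def model_interaction_node(prompt, model=lambda p: f"Answer to: {p}"):
    # Takes the previous node's output as model input; `model` is a stub here.
    return model(prompt)

def postprocess_node(model_output):
    # A non-model node placed after the model interaction node, compensating
    # for abilities the model lacks (here, a trivial formatting step).
    return model_output.upper()

def run_task_link(user_input):
    data = user_input
    for node in (preprocess_node, model_interaction_node, postprocess_node):
        data = node(data)  # each node consumes the previous node's output
    return data
```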
In the process of generating the model application scheme, the terminal equipment can select a plurality of task nodes based on the model application requirement, splice the task nodes and pre-configure parameters of the task nodes.
For example, a requirement label may be associated with a task node, and a model application requirement may be represented by requirement labels; the task nodes corresponding to the model application requirement can then be determined through the association between the requirement labels and the task nodes. Alternatively, semantic recognition may be performed directly on the model application requirement to determine the matching requirement labels.
In one possible implementation manner, a plurality of preset model application scheme templates may be provided, the model application scheme templates may correspond to preset model application requirements, the corresponding preset model application requirements may be determined according to the obtained model application requirements of the user, and a model application scheme required by the user may be generated according to the model application scheme templates corresponding to the preset model application requirements.
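The label-and-template matching described above can be sketched as follows. The label vocabulary, keyword heuristic, and templates are invented for illustration; the patent only requires that requirements map to preset templates:

```python
# Hypothetical requirement labels mapped to preset scheme templates.
TEMPLATES = {
    "math_solving": ["parse_problem", "model_interaction", "format_answer"],
    "qa":           ["retrieve_context", "model_interaction"],
}

LABEL_KEYWORDS = {
    "math_solving": ("solve", "math", "equation"),
    "qa":           ("question", "answer", "ask"),
}

def match_requirement_label(requirement_text):
    """Crude keyword stand-in for semantic recognition of the requirement."""
    text = requirement_text.lower()
    for label, keywords in LABEL_KEYWORDS.items():
        if any(k in text for k in keywords):
            return label
    return "qa"  # fallback label

def generate_scheme(requirement_text):
    # Instantiate the template associated with the matched label.
    return list(TEMPLATES[match_requirement_label(requirement_text)])
```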
In a specific implementation, the model application scheme generated by the terminal device may not fully meet the user's needs; therefore, the terminal device may allow the model application scheme to be modified.
For example, an initial model application scheme (such as the model application scheme generated based on the template) matched with the model application requirement can be generated according to the acquired model application requirement, then, the initial model application scheme can be displayed through a model application scheme construction page, and a user can debug the initial model application scheme.
When the model is a natural language processing model, the user can input questioning information, the initial model application scheme can provide answer results for the user questioning information, the user can provide feedback information according to the answer results, and the initial model application scheme can be adjusted based on the feedback information to obtain an adjusted model application scheme.
In a specific implementation, the user may trigger a debug button in the model application project build page, thereby triggering a debug requirement.
The answer result can be information output by the model application scheme aiming at the user question information; the feedback information may include user supplemental conditions to the model, task node adjustment information, and the like.
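The debugging round just described — user question, answer from the initial scheme, user feedback, scheme adjustment — can be sketched as follows. The adjustment logic is a placeholder assumption; the patent only states that the scheme is adjusted based on the feedback:

```python
def debug_scheme(scheme, question, answer_fn, feedback_fn):
    """One debugging round: answer a user question, collect feedback, adjust.

    answer_fn(scheme, question) -> answer; feedback_fn(answer) -> feedback,
    where None means the user accepts the answer and no adjustment is needed.
    """
    answer = answer_fn(scheme, question)   # answer from the initial scheme
    feedback = feedback_fn(answer)         # user's feedback on the answer
    if feedback is not None:
        # Placeholder adjustment: record supplementary conditions or
        # task-node adjustment information carried by the feedback.
        scheme = {**scheme,
                  "adjustments": scheme.get("adjustments", []) + [feedback]}
    return scheme
```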
In one possible implementation, the terminal device may directly display a model application scheme construction page, the user may perform a requirement information input operation, and the terminal device may generate and display, in the construction page, a model application scheme matching the model application requirement according to the requirement information input operation.
The requirement information input operation may be used to input a model application requirement, which may indicate information such as the type, number, and configuration information of the required task nodes, and the configuration of the task link.
For example, the model application scheme building page may include a plurality of regions, such as a node selection region, a node configuration region, a node output region, a task link presentation region, and the like. The user can operate on these areas to input model application requirements.
The node selection area can display a plurality of task node identifiers; the task node identifier may be triggered, and after triggered, the terminal device may add a task node corresponding to the task node identifier in the task link display area. Different task node identities correspond to different types of task nodes.
The node configuration area may be used to add configuration information, such as pre-input information for a model, for the task nodes in the task link presentation area.
The node output area may be used to expose output information of the task nodes in the task link exposure area.
Referring to fig. 2 and fig. 3, they are respectively a first schematic diagram and a second schematic diagram of a model application scheme construction page provided in an embodiment of the present disclosure. As shown in fig. 2, the model application scheme construction page includes a node selection area 210, a node configuration area 220, a node output area 230, and a task link display area 240.
In the node selection area 210, task node identifiers 211 corresponding to various task nodes may be displayed, and a user may trigger the task node identifiers 211, so as to add corresponding task nodes in the task link display area 240.
In the task link display area 240, a task link that is currently configured may be displayed, and a user may configure task nodes that are displayed in the task link display area 240, such as moving positions of task nodes, changing connection relationships between task nodes, deleting task nodes, and so on.
In the node configuration area 220, the user may input configuration information for a task node. For example, the user may first select a target task node in the task link display area 240 and then configure it in the node configuration area, such as a task node corresponding to a natural language processing model. Pre-input information may also be configured; for example, when the model application requirement is solving problems, the pre-input information may be "You are a professional high-school teacher; please answer the following student questions". The pre-input information may serve as context information of the model application scheme, and the model may answer questions according to the pre-input information and the output of the previous task node.
The node output area 230 may show the output of the task nodes; for example, when the user inputs question information during debugging, the information output by the model interaction node may be the answer result for that question information.
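The behavior described above — the model answering according to the pre-input information and the previous node's output — can be sketched as follows. This is a simplified illustration, not the patent's implementation; `call_model` is a stand-in stub for a real natural language processing model.

```python
# Illustrative sketch: a model interaction node prepends its configured
# pre-input to the previous task node's output before calling the model,
# so the pre-input acts as context for the answer. `call_model` is a stub.
def call_model(prompt):          # stand-in for a real NLP model call
    return f"[answer to: {prompt}]"

def run_model_interaction_node(pre_input, previous_output):
    prompt = f"{pre_input}\n{previous_output}"
    return call_model(prompt)

out = run_model_interaction_node(
    "You are a professional high-school teacher; please answer the following student question.",
    "What is photosynthesis?",
)
```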
In fig. 3, the configuration information in the node configuration area 220 has been updated through debugging to: "You are a professional high-school teacher; please answer the following student questions. In the answering process, please observe the following requirements: 1. the answer must not exceed 100 words; 2. do not reply to content that is not relevant to learning."
After the model application scheme is generated, the user may release it: the user may perform a release operation for the debugged model application scheme, such as triggering a release button in the model application scheme construction page. When the terminal device detects the release operation, it can record the version information of the model application scheme displayed in the model application scheme construction page and store the model application scheme together with the corresponding version information, so that the model application scheme is released and enters a testable state.
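The release step — record the displayed version, then store the scheme with that version — can be sketched minimally as below. The storage layout is a hypothetical assumption for illustration only.

```python
# Hedged sketch of the release operation: record the version shown in the
# build page and store the scheme snapshot under it. Layout is hypothetical.
published = {}  # version -> stored scheme snapshot

def publish_scheme(scheme, current_version):
    published[current_version] = dict(scheme)   # store scheme with its version
    return current_version

v = publish_scheme({"nodes": ["model_interaction"], "links": []}, "v1.0")
```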
S102, determining at least one test data set corresponding to the model application scheme and at least one test index of the model application scheme in response to receiving test requirements for the model application scheme; the test data set and the test index are associated with the model application requirements.
After the model application scheme is generated, it may be tested. When a user needs to test the model application scheme, a test requirement for the model application scheme can be received; at that point, at least one test data set corresponding to the model application scheme and at least one test index of the model application scheme can be determined.
The test data set and the test index can be determined according to the model application requirements, the test data can be used as the input of the model application scheme, and the test index can be used for evaluating the output of the model application scheme. For example, for a model application requirement of solving a problem, the test data set may include a problem to be solved, and the test index may include whether the problem solving process is correct, whether the problem solving result is correct, whether the problem solving explanation is correct, and the like.
After the test data set and the test indicators are determined, the model application scheme may be tested.
In addition to testing newly generated model application schemes, embodiments of the present disclosure may also test pre-built and stored model application schemes.
For example, receiving a test requirement for the model application scheme may refer to detecting a scheme test trigger operation for the model application requirement. When the scheme test trigger operation is detected, if it is determined that a plurality of model application schemes matched with the model application requirement have been generated and stored, version information of the plurality of model application schemes can be displayed (different model application schemes have different version information); the user can select among the version information, and when a selection operation for any version information is detected, the model application scheme corresponding to the selected version information can be taken as the model application scheme corresponding to the test requirement.
There may be multiple test data sets; for example, under the model application requirement of solving problems, test data sets may be set up per discipline or subject, with different disciplines or subjects using different test data sets. A test data set may include multiple test items, and one test item may correspond to one test task. For example, under the model application requirement of solving problems, one test item can be one problem; under a model application requirement of writing, one test item can be one set of writing requirements (such as theme, word count requirement, article structure requirement, and the like).
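The one-item-per-task relationship described above can be sketched as follows; the grouping by discipline and all names are illustrative assumptions, not a schema claimed by the disclosure.

```python
# Illustrative only: test data sets grouped per discipline, each holding
# multiple test items; one test item maps to one test task.
datasets = {
    "math":    {"name": "math-problems",    "items": ["Solve x^2 = 4", "Integrate x dx"]},
    "physics": {"name": "physics-problems", "items": ["State Newton's second law"]},
}

def make_test_tasks(dataset):
    # one test task per test item
    return [{"item": item, "status": "pending"} for item in dataset["items"]]

tasks = make_test_tasks(datasets["math"])
```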
S103, performing model application test on the model application scheme by using the at least one test data set to obtain an output result of each test data set under the model application scheme.
In the process of performing the model application test, a test data set can be input into the model application scheme to obtain the output result of that test data set under the tested model application scheme. The output result may include the output of the model application scheme, which may be the output of the model application scheme as a whole, or the output of some or all task nodes in the model application scheme.
Since model application schemes can typically perform multiple rounds of input and output, one or more rounds of test tasks can be generated during the test. The number of rounds of test tasks may be related to the model application requirements and/or the test dataset.
For example, for the model application requirement of creating articles, the creation requirement can be input directly to obtain a corresponding output article. For model application requirements such as solving problems, multiple rounds of tasks can be set to progressively meet the user requirement: after the model application scheme outputs an answer to a problem, follow-up questions can be asked, such as "I am not very clear; please explain the underlying principle". In scenarios where multiple rounds of questions and answers can take place, multiple rounds of test tasks can be generated.
After a test task is executed, the user can choose to make an inquiry; the terminal device can obtain the user inquiry information and execute the model application scheme based on it, so as to obtain an output result of the user inquiry information under the model application scheme.
The user inquiry information can be used in a model application scheme after the test task is completed, so that the model application scheme can take the input and output in the test task as the context of the user inquiry information, and output a corresponding output result to realize further test.
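The multi-round behavior above — earlier rounds' inputs and outputs serving as context for a later inquiry — can be sketched minimally as below. The model call is a stand-in stub, and the context-joining scheme is an illustrative assumption.

```python
# Minimal multi-round sketch: each round's input and output is retained as
# history, so a later user inquiry is answered with the earlier rounds as
# context. `call_model` is a stub standing in for the model application scheme.
def call_model(prompt):
    return f"[answer given context: {prompt!r}]"

def run_rounds(scheme_context, round_inputs):
    history = list(scheme_context)
    outputs = []
    for text in round_inputs:
        history.append(text)
        out = call_model(" | ".join(history))   # prior rounds act as context
        history.append(out)
        outputs.append(out)
    return outputs, history

outs, hist = run_rounds([], ["Solve x+1=2", "I am not very clear, please explain the principle"])
```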
S104, based on the output results corresponding to the test data sets, obtaining test evaluation results of the model application scheme under the at least one test index; the test evaluation result is used as reference information for judging whether the model application scheme can be used.
In this step, whether the model application scheme meets the test index can be determined according to the output result. The terminal equipment can identify the result information corresponding to different test indexes from the output result, and then compare the result information with the standard value corresponding to the test index, so as to determine the test evaluation result of the model application scheme under at least one test index.
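The comparison step described above — extracting per-index result information from the output and comparing it with the index's standard value — can be sketched as follows. The index names and the equality-based comparison rule are illustrative assumptions.

```python
# Sketch of the evaluation step: compare result information identified from
# the output against each test index's standard value. Names are illustrative.
def evaluate(output_info, indices):
    """indices: {index_name: standard_value}; output_info: {index_name: observed}."""
    return {name: output_info.get(name) == standard for name, standard in indices.items()}

result = evaluate(
    {"solution_correct": True, "process_complete": False},
    {"solution_correct": True, "process_complete": True},
)
```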
For example, in model application requirements for solving problems, the test index may include whether the result of solving the problem is correct, and whether the process of solving the problem is complete.
For multiple rounds of test tasks, the result information corresponding to the test indexes can be determined from the output results of each round, and the test evaluation result of each round can then be determined.
Under the condition that the user inquiry information is obtained, a test evaluation result of the model application scheme under the test index can be obtained based on the test data set and an output result corresponding to the user inquiry information.
The test evaluation result can reflect whether the model application scheme meets the test index, or the degree to which the test index is met, and can be used as reference information for the user to judge whether the model application scheme can be used.
This embodiment can be applied to a development scenario of a model application scheme. Such a development scenario can include stages such as requirement construction, scheme design, scheme evaluation, and scheme release. In the requirement construction stage, topic ideation and requirement analysis can be performed before entering the scheme design stage. In the scheme design stage, a prototype of the model application scheme can first be designed; during the design process, task nodes (proprietary tools such as image recognition tools) can be introduced to wrap the model, the task link can be arranged as a workflow, and debugging and parameter tuning can be performed until the prototype of the model application scheme is completed and delivered. The model application scheme can then be evaluated, covering effect evaluation and/or quality evaluation; if the evaluated model application scheme does not meet expectations, it can be debugged or re-tuned. Once the model application scheme meets expectations, the scheme that passed evaluation is delivered to the release link for distribution.
When the test is performed, the test data set, the test indexes and the related information of the test task can be displayed, and after the test is completed, the test evaluation result obtained by the test can be displayed.
The embodiments of the present disclosure can provide a visual management page for model application requirements and the corresponding versions of their model application schemes. Drag-and-drop arrangement in a visual graphical interface can be provided in the model application scheme construction page: configured task nodes interact with the model, required information is obtained via the task nodes, relevant templates and optimizations for the task nodes are provided, and task nodes can also be glued together and arranged in script form; debugging and running of the model application scheme are supported. When evaluating a model application scheme, versions of the scheme can be distinguished.
In a specific implementation process, a plurality of entity identifiers may be set for the present processing method. For example, the identifier of a model application requirement may be a motif, and the entity may be provided with motif_id and title fields, where motif_id can be used to mark a unique requirement and title can be the requirement name.
A graph can be a model application scheme, wip_graph can be a model application scheme being edited, and release_graph can represent a released model application scheme that can be tested. The graph may be provided with fields such as activity_id, version/update_version, and graph_structure, where activity_id is used to identify the model application requirement corresponding to the model application scheme, version/update_version is used to identify the version or update version of the model application scheme, and graph_structure is used to record the data structure of the model application scheme, such as the task nodes used in the scheme and its overall execution logic.
The entity identifier corresponding to a test data set may be a dataset, and motif_id, name, and schemas fields may be set, where motif_id can be used to associate the model application requirement, name represents the name of the test data set, and schemas can record the fields contained in the test data set.
A test index may be represented as a metric, with motif_id, name, type, and mode fields, where motif_id can be used to represent the associated model application requirement, name is the name of the test index, type represents the type of the test index, and mode represents the evaluation mode of the test index (such as scoring or correctness evaluation).
A test task may be represented as a task, with motif_id, version, dataset_id, and task_name fields, where motif_id can be used to represent the associated model application requirement, version represents the version of the model application scheme, dataset_id represents the associated test data set, and task_name is the name of the test task.
The model application scheme being edited can be distinguished from the released model application scheme by the wip_graph entity: the model application scheme being edited (a draft version) can be stored in the mia_graph_structure field, and the latest draft version can be maintained via update_version.
After the construction and editing work on the model application scheme is completed, the user can release the corresponding content as a formal version. The formally released scheme is stored in the mia_release_graph field, the specific content of the model application scheme is recorded through the graph_structure field, and the version number is maintained through the version field. In the subsequent evaluation link, a complete evaluation can be performed on the released version.
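The entity fields named above can be sketched as plain dataclasses. This mirrors only the fields the text lists and is an illustrative reading, not a schema claimed by the disclosure.

```python
# Illustrative dataclass sketch of the entities and fields named in the text.
from dataclasses import dataclass, field

@dataclass
class Motif:                 # a model application requirement
    motif_id: str
    title: str

@dataclass
class Graph:                 # a model application scheme
    activity_id: str         # associated model application requirement
    version: str
    graph_structure: dict = field(default_factory=dict)

@dataclass
class Dataset:               # a test data set
    motif_id: str
    name: str
    schemas: list = field(default_factory=list)

@dataclass
class Metric:                # a test index
    motif_id: str
    name: str
    type: str
    mode: str                # evaluation mode, e.g. "scoring" or "correctness"

@dataclass
class Task:                  # a test task
    motif_id: str
    version: str
    dataset_id: str
    task_name: str

t = Task("m1", "v1", "d1", "round-1")
```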
The scheme generation processing method provided by the embodiments of the present disclosure enables integrated execution of the construction/generation and test evaluation of a model application scheme. The test data set and test index of the model application scheme can be determined according to the specific model application requirement, and a test evaluation result under a suitable test index can be obtained by testing the output results based on the test data set. This realizes the evaluation of the model application scheme, provides reference information for users to judge its usability, and facilitates the development of model application schemes.
Referring to fig. 4, a schematic diagram of another scheme generation processing method according to an embodiment of the disclosure is shown. After the model application scheme to be tested is obtained, the matched test indexes and data sets can be selected, a data execution workflow of single-round or multi-round test tasks is then created, and each test task is processed asynchronously. During execution, if a round of tasks fails to execute or the user needs to make an inquiry, the test task can be retried or the inquiry can be handled; after a task executes successfully, the next round of tasks can be executed. After all test tasks have been executed, the test result can be determined using the output results of the model application scheme.
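The retry-then-next-round control flow of the fig. 4 workflow can be sketched as below. This is a deliberately simplified, synchronous sketch: the text describes asynchronous processing, and the executor here is a toy stand-in.

```python
# Simplified, synchronous sketch of the workflow: run test tasks round by
# round, retrying a failed round up to a limit before the next round runs.
def run_workflow(rounds, execute, max_retries=2):
    results = []
    for task in rounds:
        for attempt in range(max_retries + 1):
            ok, output = execute(task, attempt)
            if ok:
                results.append(output)
                break
        else:
            raise RuntimeError(f"task {task!r} failed after retries")
    return results

# toy executor: the second round succeeds only on its first retry
def flaky(task, attempt):
    if task == "round-2" and attempt == 0:
        return False, None
    return True, f"output of {task}"

res = run_workflow(["round-1", "round-2"], flaky)
```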
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
Based on the same inventive concept, the embodiment of the disclosure further provides a solution generating and processing device corresponding to the solution generating and processing method, and since the principle of solving the problem by the device in the embodiment of the disclosure is similar to that of the solution generating and processing method in the embodiment of the disclosure, the implementation of the device can refer to the implementation of the method, and the repetition is omitted.
Referring to fig. 5, a schematic diagram of a solution generating and processing apparatus according to an embodiment of the present disclosure is shown, where the apparatus includes:
a generating module 510, configured to generate, for an obtained model application requirement, a model application scheme that matches the model application requirement; the model application scheme is provided with a task link comprising a plurality of task nodes; the task nodes comprise at least one model interaction node, and the model interaction node is used for taking first output information of a previous task node in the task link as model input information and outputting second output information matched with the first output information through model processing;
A determining module 520, configured to determine at least one test data set corresponding to the model application scheme and at least one test index of the model application scheme in response to receiving a test requirement for the model application scheme; the test data set and the test index are associated with the model application requirements;
the test module 530 is configured to perform a model application test on the model application solution by using the at least one test data set, so as to obtain an output result of each test data set under the model application solution;
an evaluation module 540, configured to acquire a test evaluation result of the model application scheme under the at least one test index based on the output result corresponding to each test data set; the test evaluation result is used as reference information for judging whether the model application scheme can be used.
In an alternative embodiment, the test module 530 is specifically configured to:
creating at least one round of test tasks for the test data set and the test index; wherein the number of rounds of the test task is related to the model application requirements and/or the test dataset;
and executing the at least one round of test tasks on the model application scheme by using the test data set to obtain an output result of the test data set under the model application scheme.
In an alternative embodiment, the test module 530 is further configured to, after obtaining the output result of the test dataset under the model application scheme:
after the at least one round of test tasks are executed, acquiring user inquiry information;
executing the model application scheme based on the user inquiry information to obtain an output result of the user inquiry information under the model application scheme;
the evaluation module 540 is specifically configured to:
and acquiring a test evaluation result of the model application scheme under the at least one test index based on the test data set and an output result corresponding to the user inquiry information.
In an alternative embodiment, the generating module 510 is specifically configured to:
aiming at the acquired model application requirements, generating an initial model application scheme matched with the model application requirements;
in response to a debugging requirement, determining an answer result provided by the initial model application scheme for user inquiry information;
and responding to the received feedback information provided by the user for the answer result, and adjusting the initial model application scheme based on the feedback information to obtain an adjusted model application scheme.
In an alternative embodiment, the determining module 520, in response to receiving a test requirement for the model application scheme, is configured to:
responding to a scheme test triggering operation aiming at the model application requirements, and if a plurality of model application schemes matched with the model application requirements are determined to be generated and stored, displaying version information of the plurality of model application schemes; different model application schemes have different version information;
and responding to the selection operation aiming at any version information, and taking the selected model application scheme corresponding to the version information as the model application scheme corresponding to the test requirement.
In an optional implementation manner, when generating, for the acquired model application requirement, a model application scheme matched with the model application requirement, the generating module 510 is specifically configured to:
displaying a model application scheme construction page;
and responding to the requirement information input operation in the model application scheme construction page, and generating and displaying a model application scheme matched with the model application requirement in the model application scheme construction page.
In an alternative embodiment, the apparatus further comprises a storage module for:
And responding to the release operation of the model application scheme aiming at the debugging completion, recording the version information of the model application scheme displayed in the model application scheme construction page, and storing the model application scheme and the corresponding version information.
In an optional implementation manner, the model application scheme construction page comprises a node selection area, a node configuration area, a node output area and a task link display area;
the node selection area is provided with a plurality of task node identifiers; the task node identifier is used for adding a task node corresponding to the task node identifier in the task link display area after being triggered; different task node identifiers correspond to different types of task nodes;
the node configuration area is used for adding configuration information for task nodes in the task link display area;
the node output area is used for displaying the output information of the task nodes in the task link display area.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
The embodiment of the disclosure further provides a computer device, as shown in fig. 6, which is a schematic structural diagram of the computer device provided by the embodiment of the disclosure, including:
a processor 61 and a memory 62; the memory 62 stores machine readable instructions executable by the processor 61, the processor 61 being configured to execute the machine readable instructions stored in the memory 62, the machine readable instructions when executed by the processor 61, the processor 61 performing the steps of:
generating a model application scheme matched with the model application requirement aiming at the acquired model application requirement; the model application scheme is provided with a task link comprising a plurality of task nodes; the plurality of task nodes comprise at least one model interaction node;
in response to receiving a test requirement for the model application scheme, determining at least one test dataset corresponding to the model application scheme and at least one test indicator of the model application scheme; the test data set and the test index are associated with the model application requirements;
performing model application test on the model application scheme by using the at least one test data set to obtain an output result of each test data set under the model application scheme;
Based on the output results corresponding to the test data sets, obtaining test evaluation results of the model application scheme under the at least one test index; the test evaluation result is used as reference information for judging whether the model application scheme can be used.
In an alternative embodiment, in the instructions executed by the processor 61, the performing, with the at least one test dataset, a model application test on the model application solution to obtain an output result of each test dataset under the model application solution, includes:
creating at least one round of test tasks for the test data set and the test index; wherein the number of rounds of the test task is related to the model application requirements and/or the test dataset;
and executing the at least one round of test tasks on the model application scheme by using the test data set to obtain an output result of the test data set under the model application scheme.
In an alternative embodiment, after the obtaining the output result of the test dataset under the model application scheme in the instructions executed by the processor 61, the method further includes:
after the at least one round of test tasks are executed, acquiring user inquiry information;
Executing the model application scheme based on the user inquiry information to obtain an output result of the user inquiry information under the model application scheme;
the obtaining, based on the output results corresponding to the test data sets, a test evaluation result of the model application scheme under the at least one test index includes:
and acquiring a test evaluation result of the model application scheme under the at least one test index based on the test data set and an output result corresponding to the user inquiry information.
In an alternative embodiment, in the instructions executed by the processor 61, the generating, for the obtained model application requirement, a model application scheme matched with the model application requirement includes:
aiming at the acquired model application requirements, generating an initial model application scheme matched with the model application requirements;
in response to a debugging requirement, determining an answer result provided by the initial model application scheme for user inquiry information;
and responding to the received feedback information provided by the user for the answer result, and adjusting the initial model application scheme based on the feedback information to obtain an adjusted model application scheme.
In an alternative embodiment, the instructions executed by the processor 61 in response to receiving the test requirement for the model application scheme include:
responding to a scheme test triggering operation aiming at the model application requirements, and if a plurality of model application schemes matched with the model application requirements are determined to be generated and stored, displaying version information of the plurality of model application schemes; different model application schemes have different version information;
and responding to the selection operation aiming at any version information, and taking the selected model application scheme corresponding to the version information as the model application scheme corresponding to the test requirement.
In an alternative embodiment, in the instructions executed by the processor 61, the generating, for the obtained model application requirement, a model application scheme matched with the model application requirement includes:
displaying a model application scheme construction page;
and responding to the requirement information input operation in the model application scheme construction page, and generating and displaying a model application scheme matched with the model application requirement in the model application scheme construction page.
In an alternative embodiment, the instructions executed by the processor 61 further include:
And responding to the release operation of the model application scheme aiming at the debugging completion, recording the version information of the model application scheme displayed in the model application scheme construction page, and storing the model application scheme and the corresponding version information.
In an alternative embodiment, in the instructions executed by the processor 61, the model application scheme construction page includes a node selection area, a node configuration area, a node output area, and a task link display area;
the node selection area is provided with a plurality of task node identifiers; the task node identifier is used for adding a task node corresponding to the task node identifier in the task link display area after being triggered; different task node identifiers correspond to different types of task nodes;
the node configuration area is used for adding configuration information for task nodes in the task link display area;
the node output area is used for displaying the output information of the task nodes in the task link display area.
The memory 62 includes a memory 621 and an external memory 622; the memory 621 is also referred to as an internal memory, and is used for temporarily storing operation data in the processor 61 and data exchanged with the external memory 622 such as a hard disk, and the processor 61 exchanges data with the external memory 622 via the memory 621.
The specific execution process of the above instruction may refer to the steps of the scheme generation processing method described in the embodiments of the present disclosure, which is not described herein.
The present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the scenario generation processing method described in the method embodiment described above. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
The embodiments of the present disclosure further provide a computer program product, where the computer program product carries a program code, where instructions included in the program code may be used to perform steps of a solution generating processing method described in the foregoing method embodiments, and specifically reference may be made to the foregoing method embodiments, which are not described herein in detail.
Wherein the above-mentioned computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of units is only a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of the present disclosure, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (11)

1. A scheme generation processing method, comprising:
generating, for the acquired model application requirement, a model application scheme matching the model application requirement; wherein the model application scheme has a task link comprising a plurality of task nodes, and the plurality of task nodes comprise at least one model interaction node;
in response to receiving a test requirement for the model application scheme, determining at least one test data set corresponding to the model application scheme and at least one test index of the model application scheme; wherein the test data set and the test index are associated with the model application requirement;
performing a model application test on the model application scheme by using the at least one test data set to obtain an output result of each test data set under the model application scheme; and
obtaining a test evaluation result of the model application scheme under the at least one test index based on the output result corresponding to each test data set; wherein the test evaluation result serves as reference information for determining whether the model application scheme can be used.
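The flow of claim 1 — generate a scheme whose task link contains at least one model interaction node, run each test data set through the scheme, then aggregate the outputs into a per-index evaluation result — can be sketched as follows. All names here (`TaskNode`, `generate_scheme`, the length-based placeholder model, the `avg_output_len` index) are hypothetical illustrations, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class TaskNode:
    name: str
    kind: str  # e.g. "input", "model_interaction", "output"

@dataclass
class ModelApplicationScheme:
    requirement: str
    task_link: list  # ordered TaskNodes forming the task link

def generate_scheme(requirement: str) -> ModelApplicationScheme:
    # The task link must contain at least one model interaction node.
    return ModelApplicationScheme(requirement, [
        TaskNode("input", "input"),
        TaskNode("llm_call", "model_interaction"),
        TaskNode("output", "output"),
    ])

def run_model_application_test(scheme, datasets):
    # One list of output results per test data set under the scheme;
    # taking the sample length stands in for a real model call.
    return {name: [len(sample) for sample in data]
            for name, data in datasets.items()}

def evaluate(outputs, indices):
    # Aggregate all output results into one evaluation result per test index.
    results = {}
    flat = [v for out in outputs.values() for v in out]
    for index in indices:
        if index == "avg_output_len":
            results[index] = sum(flat) / len(flat)
    return results
```

The evaluation result produced by the last step would then serve as reference information for deciding whether the scheme is usable.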
2. The method according to claim 1, wherein performing the model application test on the model application scheme by using the at least one test data set to obtain the output result of each test data set under the model application scheme comprises:
creating at least one round of test tasks for the test data set and the test index; wherein the number of rounds of the test tasks is related to the model application requirement and/or the test data set; and
executing the at least one round of test tasks on the model application scheme by using the test data set to obtain the output result of the test data set under the model application scheme.
3. The method according to claim 2, wherein after the output result of the test data set under the model application scheme is obtained, the method further comprises:
acquiring user query information after the at least one round of test tasks has been executed; and
executing the model application scheme based on the user query information to obtain an output result of the user query information under the model application scheme;
wherein obtaining the test evaluation result of the model application scheme under the at least one test index based on the output result corresponding to each test data set comprises:
obtaining the test evaluation result of the model application scheme under the at least one test index based on the output results corresponding to the test data set and the user query information.
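Claims 2 and 3 together read as: run at least one round of test tasks over the data set, then optionally feed a user query through the same scheme and evaluate over both kinds of outputs. A minimal sketch, with the hypothetical name `apply_scheme` standing in for executing the scheme's task link on one sample:

```python
def run_test_rounds(apply_scheme, dataset, num_rounds):
    # The number of rounds may depend on the model application requirement
    # and/or the test data set (claim 2).
    return [[apply_scheme(sample) for sample in dataset]
            for _ in range(num_rounds)]

def outputs_with_user_query(apply_scheme, dataset, num_rounds, user_query):
    # After the test rounds finish, a user query is executed under the same
    # scheme and its output joins the data-set outputs (claim 3).
    round_outputs = run_test_rounds(apply_scheme, dataset, num_rounds)
    flat = [out for one_round in round_outputs for out in one_round]
    return flat + [apply_scheme(user_query)]
```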
4. The method of claim 1, wherein generating, for the acquired model application requirement, a model application scheme matching the model application requirement comprises:
generating, for the acquired model application requirement, an initial model application scheme matching the model application requirement;
in response to a debugging requirement, determining query information from a user, and obtaining an answer result provided by the initial model application scheme for the query information; and
in response to receiving feedback information provided by the user for the answer result, adjusting the initial model application scheme based on the feedback information to obtain an adjusted model application scheme.
5. The method of claim 1, wherein receiving the test requirement for the model application scheme comprises:
in response to a scheme test triggering operation for the model application requirement, if it is determined that a plurality of model application schemes matching the model application requirement have been generated and stored, displaying version information of the plurality of model application schemes, wherein different model application schemes have different version information; and
in response to a selection operation for any piece of version information, taking the model application scheme corresponding to the selected version information as the model application scheme corresponding to the test requirement.
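The behavior in claim 5 — several stored schemes per requirement, each carrying distinct version information, from which the user selects one for testing — could be backed by a small store like the one below. `SchemeStore` and its method names are hypothetical; the claim does not name an implementation.

```python
class SchemeStore:
    """Stores generated model application schemes keyed by requirement and version."""

    def __init__(self):
        self._by_requirement = {}

    def save(self, requirement, version, scheme):
        self._by_requirement.setdefault(requirement, {})[version] = scheme

    def list_versions(self, requirement):
        # The version information displayed when a scheme test is triggered.
        return sorted(self._by_requirement.get(requirement, {}))

    def select(self, requirement, version):
        # The scheme chosen by the user's selection operation.
        return self._by_requirement[requirement][version]
```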
6. The method of claim 1, wherein generating, for the acquired model application requirement, a model application scheme matching the model application requirement comprises:
displaying a model application scheme construction page; and
in response to a requirement information input operation in the model application scheme construction page, generating, and displaying in the model application scheme construction page, a model application scheme matching the model application requirement.
7. The method of claim 6, further comprising:
in response to a release operation for the model application scheme whose debugging has been completed, recording version information of the model application scheme displayed in the model application scheme construction page, and storing the model application scheme together with the corresponding version information.
8. The method of claim 6, wherein the model application scheme construction page comprises a node selection area, a node configuration area, a node output area, and a task link display area;
the node selection area is provided with a plurality of task node identifiers; after being triggered, each task node identifier is used for adding, in the task link display area, the task node corresponding to that identifier; different task node identifiers correspond to different types of task nodes;
the node configuration area is used for adding configuration information to the task nodes in the task link display area; and
the node output area is used for displaying output information of the task nodes in the task link display area.
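The four areas of the construction page in claim 8 map naturally onto a simple state object: triggering an identifier in the node selection area appends the corresponding task node to the task link area, while per-node configuration and output live in their own areas. A hypothetical sketch; the identifiers and field names are illustrative, not taken from the patent.

```python
def new_build_page():
    return {
        "node_selection": ["input_node", "model_interaction_node", "output_node"],
        "task_link": [],    # task nodes added by triggering identifiers
        "node_config": {},  # task node -> configuration information
        "node_output": {},  # task node -> output information
    }

def trigger_identifier(page, node_id):
    # Adds the task node corresponding to the identifier to the task link area.
    if node_id in page["node_selection"]:
        page["task_link"].append(node_id)

def configure_node(page, node_id, config):
    # Configuration entered in the node configuration area.
    page["node_config"][node_id] = config
```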
9. A scheme generation processing apparatus, comprising:
a generation module, configured to generate, for the acquired model application requirement, a model application scheme matching the model application requirement; wherein the model application scheme has a task link comprising a plurality of task nodes, and the plurality of task nodes comprise at least one model interaction node;
a determination module, configured to determine, in response to receiving a test requirement for the model application scheme, at least one test data set corresponding to the model application scheme and at least one test index of the model application scheme; wherein the test data set and the test index are associated with the model application requirement;
a test module, configured to perform a model application test on the model application scheme by using the at least one test data set to obtain an output result of each test data set under the model application scheme; and
an evaluation module, configured to obtain a test evaluation result of the model application scheme under the at least one test index based on the output result corresponding to each test data set; wherein the test evaluation result serves as reference information for determining whether the model application scheme can be used.
10. A computer device, comprising: a processor and a memory storing machine-readable instructions executable by the processor, wherein the processor is configured to execute the machine-readable instructions stored in the memory, and the machine-readable instructions, when executed by the processor, perform the steps of the scheme generation processing method according to any one of claims 1 to 8.
11. A computer-readable storage medium, wherein the computer-readable storage medium has stored thereon a computer program which, when executed by a computer device, performs the steps of the scheme generation processing method according to any one of claims 1 to 8.
CN202311235685.3A 2023-09-22 2023-09-22 Scheme generation processing method, device, computer equipment and storage medium Pending CN117271338A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311235685.3A CN117271338A (en) 2023-09-22 2023-09-22 Scheme generation processing method, device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117271338A true CN117271338A (en) 2023-12-22

Family

ID=89210033



Similar Documents

Publication Publication Date Title
CN106446412B (en) Model-based test method for avionics system
CN109522228B (en) Interface automation test data construction method, device, platform and storage medium
Agarwal et al. Expert system and it's requirement engineering process
CN109542765A (en) Database script verification method, device, computer equipment and storage medium
CN111753075A (en) Method and device for creating question and answer data of customer service robot and computer equipment
US10380011B2 (en) Method, apparatus, and computer-readable medium for performing functional testing of software
CN111158656B (en) Test code generation method and device based on fruit tree method
CN108984208A (en) A kind of function document generating method, apparatus and system
CN113886606B (en) Data annotation method, device, medium and equipment based on knowledge graph
CN114201616A (en) Knowledge graph construction method and system based on multi-source database
CN110865806B (en) Code processing method, device, server and storage medium
CN114003451B (en) Interface testing method, device, system and medium
CN111814443A (en) Table generation method and device combining RPA and AI, computing equipment and storage medium
CN113705816B (en) Flow chart generation method, electronic device, device and readable storage medium
CN109800147B (en) Test case generation method and terminal equipment
WO2022134001A1 (en) Machine learning model framework development method and system based on containerization technology
CN113672674A (en) Method, electronic device and storage medium for automatically arranging service flow
CN110059967B (en) Data processing method and device applied to city aid decision analysis
CN112052157A (en) Test message construction method, device and system
CN117271338A (en) Scheme generation processing method, device, computer equipment and storage medium
Fatwanto Translating software requirements from natural language to formal specification
Rahiman et al. CopyPoppy–A Source Code Plagiarism Detector
CN110597874B (en) Data analysis model creation method and device, computer equipment and storage medium
CN113157551A (en) ROS-oriented differential fuzzy test method
CN110019146A (en) A kind of implementation method and device of list table maintenance function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination