CN117313736A - Implicit demand recognition method and device for comment text - Google Patents

Implicit demand recognition method and device for comment text Download PDF

Info

Publication number
CN117313736A
CN117313736A (application number CN202310974595.XA)
Authority
CN
China
Prior art keywords
implicit
requirement
demand
comment
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310974595.XA
Other languages
Chinese (zh)
Inventor
李秋丹
毛雪
苑敏洁
吴奕霖
曾大军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN202310974595.XA priority Critical patent/CN117313736A/en
Publication of CN117313736A publication Critical patent/CN117313736A/en
Pending legal-status Critical Current

Classifications

    • G06F 40/30: Semantic analysis (G06F 40/00 Handling natural language data)
    • G06F 16/355: Class or cluster creation or modification (G06F 16/35 Clustering; classification of unstructured textual data)
    • G06F 18/24: Classification techniques (G06F 18/00 Pattern recognition; G06F 18/20 Analysing)
    • G06F 40/205: Parsing (G06F 40/20 Natural language analysis)
    • G06F 40/279: Recognition of textual entities (G06F 40/20 Natural language analysis)
    • G06N 3/0455: Auto-encoder networks; encoder-decoder networks (G06N 3/04 Architecture; G06N 3/045 Combinations of networks)
    • G06N 3/08: Learning methods (G06N 3/02 Neural networks)
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The invention provides an implicit requirement identification method and device for comment text. The method includes: obtaining comment text; mining a first implicit requirement of the comment text using a requirement prediction model; reconstructing the first implicit requirement into a concretely expressed second implicit requirement; using the requirement prediction model to determine, from the second implicit requirement and the comment text, the specific requirement level and requirement strength corresponding to the first implicit requirement, where the requirement level characterizes the requirement type of the first implicit requirement and the requirement strength characterizes its intensity; and outputting an implicit requirement recognition result of the comment text, the result including: the first implicit requirement, the requirement level, and the requirement strength. Embodiments of the invention address the technical problems that existing methods struggle to deeply model users' implicit requirements and cannot jointly analyze requirement level and strength, and realize automatic deep mining of users' implicit requirements from massive comments.

Description

Implicit demand recognition method and device for comment text
Technical Field
The invention relates to the field of computers, in particular to an implicit requirement identification method and device for comment texts.
Background
In the related art, the implicit requirements contained in social media user comments generally carry deep semantics, exhibit a clear hierarchy, and are expressed with varying linguistic intensity; the complex interactions and correlations among requirements therefore place higher demands on mining and analyzing users' implicit requirements. Existing text analysis methods for user comments usually start from a single view, overlook the deep association between users' surface semantics and the actual requirements that are more helpful to merchants, and cannot simultaneously identify multi-level implicit requirements and requirement strength.
In the related art, some methods judge the sentiment polarity of comments at sentence granularity, with sentiment label categories typically defined as positive/neutral/negative, and lack deep semantic mining of the comments; others perform structured analysis of user sentiment from the perspective of aspect words by extracting "aspect word-opinion word-sentiment tendency" triples from a given sentence in a two-stage approach. However, users' requirement expressions are usually unstructured, and the aspect words directly expressed in comment text usually have no direct correspondence with real requirements, so users' actual requirements cannot be grasped quickly and directly.
In view of the above problems in the related art, an efficient and accurate solution has not been found.
Disclosure of Invention
The invention provides an implicit requirement identification method and device for comment texts, which solve the technical problems in the related art.
According to one embodiment of the present invention, there is provided an implicit requirement recognition method for comment text, including: obtaining comment text; mining a first implicit requirement of the comment text using a requirement prediction model; reconstructing the first implicit requirement into a concretely expressed second implicit requirement; using the requirement prediction model to determine, from the second implicit requirement and the comment text, the specific requirement level and requirement strength corresponding to the first implicit requirement, where the requirement level characterizes the requirement type of the first implicit requirement and the requirement strength characterizes its intensity; and outputting an implicit requirement recognition result of the comment text, the result including: the first implicit requirement, the requirement level, and the requirement strength.
Optionally, before mining the first implicit requirement of the comment text using the multi-level requirement prediction module, the method further includes: setting a first category label set of implicit requirements, a second category label set of requirement levels, and a third category label set of requirement strengths; and constructing an implicit requirement classification subtask based on the first category label set, a requirement level classification subtask based on the second category label set, and a requirement strength classification subtask based on the third category label set.
Optionally, before mining the first implicit requirement of the comment text using the requirement prediction model, the method further includes: acquiring a generative large language model; configuring trainable prompt parameters before the input sequence in each layer of the Transformer structure of the generative large language model to obtain an initial demand prediction model; acquiring sample data and iteratively executing the following steps until the difference between the predicted value and the true value is smaller than a preset threshold: inputting the sample data into the initial demand prediction model to obtain a predicted value, and calculating the difference between the predicted value and the true value using a cross entropy loss function; and after training is finished, outputting the demand prediction model.
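The trainable-prompt configuration above resembles prefix tuning; a minimal Python sketch follows, where the dimensions (PROMPT_LEN, HIDDEN, NUM_LAYERS) are illustrative assumptions and plain nested lists stand in for real tensors:

```python
import random

# Toy dimensions (illustrative assumptions, not from the patent).
PROMPT_LEN = 4   # trainable prompt vectors prepended per layer
HIDDEN = 8       # hidden size of the (toy) generative model
NUM_LAYERS = 2   # layers in the model's Transformer stack

def init_soft_prompts(num_layers=NUM_LAYERS, prompt_len=PROMPT_LEN,
                      hidden=HIDDEN, seed=0):
    """One trainable prompt matrix per Transformer layer (prefix-tuning style),
    here as plain lists standing in for real tensors."""
    rng = random.Random(seed)
    return [[[rng.uniform(-0.1, 0.1) for _ in range(hidden)]
             for _ in range(prompt_len)]
            for _ in range(num_layers)]

def prepend_prompts(layer_input, layer_prompts):
    """Place the layer's trainable prompt vectors before the input sequence."""
    return layer_prompts + layer_input

# Usage: a 3-token input extended by each layer's prefix before that layer runs.
soft_prompts = init_soft_prompts()
tokens = [[0.0] * HIDDEN for _ in range(3)]
extended = None
for layer_prompts in soft_prompts:
    extended = prepend_prompts(tokens, layer_prompts)
```

During training, only the prompt parameters would be updated while the base model stays frozen, which is the usual motivation for this configuration.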
Optionally, inputting the sample data into the initial demand prediction model to obtain a predicted value, and calculating the difference between the predicted value and the true value using a cross entropy loss function, includes: preprocessing a sample data item and constructing model input data X: X = [soft prompt, task prompt, input text], where soft prompt is the trainable prompt parameter, task prompt is the prompt template constructed for the task of each stage, and input text is the input comment text data; inputting the model input data X into the initial demand prediction model and outputting a predicted value Ŷ_i; and calculating the difference between the predicted value and the true value using the loss function Loss = (1/t) Σ_{i=1..t} CELoss(Y_i, Ŷ_i), where Y_i is the true value constructed from the true label of data item i, Ŷ_i is the predicted value output by the generative large language model for data item i, t is the amount of data used to calculate the loss, CELoss is the cross entropy loss, and Loss is the difference value.
Optionally, reconstructing the first implicit requirement into a concretely expressed second implicit requirement includes: determining reconstruction requirement information, where the reconstruction requirement information characterizes the constraint conditions of the second implicit requirement to be reconstructed; fusing the reconstruction requirement information with the first implicit requirement to construct a question-answer style guiding prompt; and inputting the guiding prompt into a language model and outputting the second implicit requirement.
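The construction of X and the loss can be sketched as below; string concatenation stands in for embedding-level concatenation, Y_i is taken as a one-hot vector, and the (1/t) averaging is an assumption about how the per-item cross entropy losses are combined:

```python
import math

def build_model_input(soft_prompt, task_prompt, input_text):
    # X = [soft prompt, task prompt, input text]; strings stand in for embeddings.
    return f"{soft_prompt} {task_prompt} {input_text}"

def ce_loss(y_true, y_pred_probs):
    """Cross entropy between a one-hot true label and predicted probabilities."""
    eps = 1e-12  # guard against log(0)
    return -sum(t * math.log(p + eps) for t, p in zip(y_true, y_pred_probs))

def batch_loss(true_labels, pred_probs):
    """Loss = (1/t) * sum_i CELoss(Y_i, Y_hat_i) over the t data items."""
    t = len(true_labels)
    return sum(ce_loss(y, p) for y, p in zip(true_labels, pred_probs)) / t

# Usage: one prompt-formatted input and a two-item batch of predictions.
x = build_model_input("<soft>", "Which user needs does this comment reflect?",
                      "The product has a flaw.")
loss = batch_loss([[1, 0], [0, 1]], [[0.9, 0.1], [0.2, 0.8]])
```

In the actual model the predicted distribution would come from the language model head rather than being supplied directly.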
Optionally, before mining the first implicit requirement of the comment text using the multi-level requirement prediction module, the method further includes: acquiring initial classification labels of the first implicit requirement, where the initial classification labels comprise a plurality of label categories, each label category corresponding to one option label; judging whether the number of initial classification labels is greater than 1; and if so, configuring for each of the plurality of label categories a mapping relation to all option labels, to obtain enhanced classification labels.
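One plausible reading of this label-enhancement step, sketched in Python; the shape of the mapping produced by `enhance_labels` is a guess at the "mapping relations ... with all option labels" described above, not a confirmed implementation:

```python
def enhance_labels(initial_labels, option_labels):
    """Enhance classification labels: when more than one label category exists,
    attach to each category a mapping over the full option-label set.

    This structure is an assumption about the patent's description, not a
    confirmed implementation detail."""
    if len(initial_labels) <= 1:
        return initial_labels
    return {category: dict(option_labels) for category in initial_labels}

# Usage: two label categories, each mapped to every option label.
options = {"A": "praise", "B": "improve quality", "C": "improve service"}
enhanced = enhance_labels(["quality", "service"], options)
```

With a single category the labels pass through unchanged, matching the "greater than 1" condition in the text.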
Optionally, mining the first implicit requirement of the comment text using a requirement prediction model includes: dividing the comment text at sentence granularity to obtain a plurality of comment sentences; and for each comment sentence, performing multi-label classification on the comment sentence based on the demand prediction model to obtain a first implicit demand of the comment sentence.
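A hedged sketch of this split-then-classify step; the label set, the regex-based splitter, and the `predict_fn` placeholder are illustrative assumptions rather than the patent's actual model:

```python
import re

# Hypothetical demand label set, loosely following the patent's examples.
LABELS = {
    "A": "praise",
    "B": "improve product quality/function",
    "C": "improve product-related services",
}

def split_sentences(comment_text):
    """Divide a comment into sentences (the granularity used above)."""
    parts = re.split(r"[.!?;]+\s*", comment_text.strip())
    return [p for p in parts if p]

def classify_sentence(sentence, predict_fn):
    """Multi-label classify one comment sentence.

    predict_fn stands in for the demand prediction model; it returns a set of
    option letters and is a caller-supplied placeholder, not a real model."""
    return {k: LABELS[k] for k in predict_fn(sentence)}

# Usage with a trivial keyword-based stand-in for the model:
fake_model = lambda s: {"B"} if "flaw" in s else set()
sentences = split_sentences("The product has a flaw. Delivery was fast!")
demands = classify_sentence(sentences[0], fake_model)
```

A real deployment would replace `fake_model` with a call to the prompt-tuned generative model described earlier in the document.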
According to another embodiment of the present invention, there is provided an implicit requirement recognition apparatus for comment text, including: a first acquisition module for obtaining comment text; a mining module for mining a first implicit requirement of the comment text using a requirement prediction model; a reconstruction module for reconstructing the first implicit requirement into a concretely expressed second implicit requirement; a judging module for using the requirement prediction model to determine, from the second implicit requirement and the comment text, the specific requirement level and requirement strength corresponding to the first implicit requirement, where the requirement level characterizes the requirement type of the first implicit requirement and the requirement strength characterizes its intensity; and an output module for outputting an implicit requirement recognition result of the comment text, the result including: the first implicit requirement, the requirement level, and the requirement strength.
Optionally, the apparatus further comprises: a setting module for setting a first category label set of implicit requirements, a second category label set of requirement levels, and a third category label set of requirement strengths before the mining module mines the first implicit requirement of the comment text using the multi-level requirement prediction module; and a building module for constructing an implicit requirement classification subtask based on the first category label set, a requirement level classification subtask based on the second category label set, and a requirement strength classification subtask based on the third category label set.
Optionally, the apparatus further comprises: a second acquisition module for acquiring a generative large language model before the mining module mines the first implicit requirement of the comment text using the requirement prediction model; an input module for configuring trainable prompt parameters before the input sequence in each layer of the Transformer structure of the generative large language model to obtain an initial demand prediction model; an iteration module for acquiring sample data and iteratively executing the following steps until the difference between the predicted value and the true value is smaller than a preset threshold: inputting the sample data into the initial demand prediction model to obtain a predicted value, and calculating the difference between the predicted value and the true value using a cross entropy loss function; and an output module for outputting the demand prediction model after training is finished.
Optionally, the iteration module includes: a construction unit for preprocessing the sample data items and constructing model input data X: X = [soft prompt, task prompt, input text], where soft prompt is the trainable prompt parameter, task prompt is the prompt template constructed for the task of each stage, and input text is the input comment text data; a processing unit for inputting the model input data X into the initial demand prediction model and outputting a predicted value Ŷ_i; and a calculating unit for calculating the difference between the predicted value and the true value using the loss function Loss = (1/t) Σ_{i=1..t} CELoss(Y_i, Ŷ_i), where Y_i is the true value constructed from the true label of data item i, Ŷ_i is the predicted value output by the generative large language model for data item i, t is the amount of data used to calculate the loss, CELoss is the cross entropy loss, and Loss is the difference value.
Optionally, the reconstruction module includes: the determining unit is used for determining reconstruction requirement information, wherein the reconstruction requirement information is used for representing constraint conditions of a second implicit requirement to be reconstructed; the fusion unit is used for fusing the reconstruction requirement information and the first implicit requirement and constructing a question-answering type guide prompt; and the processing unit is used for inputting the guide prompt into the language model and outputting a second implicit requirement.
Optionally, the apparatus further comprises: a third acquisition module for acquiring initial classification labels of the first implicit requirement before the mining module mines the first implicit requirement of the comment text using the multi-level requirement prediction module, where the initial classification labels comprise a plurality of label categories, each label category corresponding to one option label; a judging module for judging whether the number of initial classification labels is greater than 1; and a configuration module for, if the number of initial classification labels is greater than 1, configuring for each of the plurality of label categories a mapping relation to all option labels, to obtain enhanced classification labels.
Optionally, the mining module includes: the segmentation unit is used for segmenting the comment text by taking sentences as granularity to obtain a plurality of comment sentences; and the classifying unit is used for classifying the comment sentences in a multi-label mode based on the demand prediction model aiming at each comment sentence to obtain the first implicit demand of the comment sentence.
According to a further embodiment of the invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
According to a further embodiment of the invention, there is also provided an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
According to the embodiment of the invention, comment text is obtained; a first implicit demand of the comment text is mined using a demand prediction model; the first implicit demand is reconstructed into a concretely expressed second implicit demand; the demand prediction model determines, from the second implicit demand and the comment text, the specific demand level and demand strength corresponding to the first implicit demand, where the demand level characterizes the demand type of the first implicit demand and the demand strength characterizes its intensity; and an implicit demand recognition result of the comment text is output, including the first implicit demand, the demand level, and the demand strength. By mining the implicit demands of comment text with a demand prediction model, reconstructing them into concretely expressed implicit demands, and judging the demand level and demand strength from those concrete expressions together with the comment text, the embodiment solves the technical problems that existing methods struggle to deeply model users' implicit demands and cannot jointly analyze demand level and strength, and realizes automatic deep mining of users' implicit demands from massive comments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention. In the drawings:
FIG. 1 is a block diagram of the hardware architecture of a server according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of implicit need identification of comment text in accordance with an embodiment of the invention;
FIG. 3 is an overall flow chart of an implicit need identification method for comment text in an embodiment of the invention;
fig. 4 is a block diagram of an implicit requirement recognition apparatus for comment text according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present application. It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
The method embodiment provided in the first embodiment of the present application may be executed in a server, a computer, or a similar computing device. Taking the operation on a server as an example, fig. 1 is a block diagram of a hardware structure of a server according to an embodiment of the present invention. As shown in fig. 1, the server may include one or more (only one is shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a microprocessor MCU or a processing device such as a programmable logic device FPGA) and a memory 104 for storing data, and optionally, a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative, and is not intended to limit the structure of the server described above. For example, the server may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a server program, for example, a software program of an application software and a module, such as a server program corresponding to a method for identifying implicit requirements of comment text in an embodiment of the present invention, and the processor 102 executes the server program stored in the memory 104, thereby performing various functional applications and data processing, that is, implementing the method described above. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located with respect to the processor 102, which may be connected to a server via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of a server. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
In this embodiment, a method for identifying implicit requirements of comment text is provided, and fig. 2 is a flowchart of a method for identifying implicit requirements of comment text according to an embodiment of the present invention, as shown in fig. 2, where the flowchart includes the following steps:
step S202, comment text is obtained; mining a first implicit requirement of the comment text by adopting a requirement prediction model;
Alternatively, the comment text may be obtained from social media, instant messaging software, or the like. The first implicit demand may be characterized by a text set or the like. For example, for the comment text "The product has a flaw, the merchant made the replacement process very painful, and I feel like I was being played with", the first implicit demands mined are: {improve product quality and function; improve product-related services}.
Step S204, reconstructing the first implicit requirement to obtain a second implicit requirement of a specific expression;
taking the first implicit requirement "promote product related services" mined by the above comments as an example, it is specifically expressed as: "hope that merchants can be more credited and specialized in after-market services" as a second implicit requirement.
Step S206, using the demand prediction model to determine, from the second implicit demand and the comment text, the specific demand level and demand strength corresponding to the first implicit demand, where the demand level characterizes the demand type of the first implicit demand and the demand strength characterizes its intensity;
Step S208 outputs an implicit requirement recognition result of the comment text, where the implicit requirement recognition result includes: the first implicit requirement, the requirement level, the requirement strength.
Through the steps, comment texts are obtained, a first implicit demand of the comment texts is mined by adopting a demand prediction model, a second implicit demand of specific expression is obtained through reconstruction of the first implicit demand, a specific demand level and demand intensity corresponding to the first implicit demand are obtained through judgment of the demand prediction model according to the second implicit demand and the comment texts, wherein the demand level is used for representing the demand type of the first implicit demand, the demand intensity is used for representing the intensity degree of the first implicit demand, and an implicit demand identification result of the comment texts is output, wherein the implicit demand identification result comprises the first implicit demand, the demand level and the demand intensity. The implicit requirements of comment texts are mined by adopting a requirement prediction model, the implicit requirements are reconstructed to obtain the implicit requirements of specific expressions, the requirement level and the requirement strength of the implicit requirements are obtained by adopting the requirement prediction model according to the implicit requirements of the specific expressions and the comment texts, the technical problems that the implicit requirements of users are difficult to deeply model and the requirements level and the strength cannot be jointly analyzed in the related technology are solved, the implicit requirements of the users are automatically and deeply mined from massive comments, and the analysis granularity and the analysis depth of comment texts are improved.
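The steps S202 through S208 can be sketched as a small pipeline; the three callables below are placeholders standing in for the demand prediction model and the reconstruction language model, not real model APIs:

```python
def identify_implicit_demand(comment_text, mine_fn, reconstruct_fn, judge_fn):
    """Pipeline of steps S202-S208: mine -> reconstruct -> judge -> output.

    The *_fn callables stand in for calls to the demand prediction model and
    the reconstruction language model; they are illustrative placeholders."""
    first_demand = mine_fn(comment_text)                     # S202: mine first implicit demand
    second_demand = reconstruct_fn(first_demand)             # S204: concrete re-expression
    level, strength = judge_fn(second_demand, comment_text)  # S206: level and strength
    return {                                                 # S208: recognition result
        "implicit_demand": first_demand,
        "demand_level": level,
        "demand_strength": strength,
    }

# Usage with toy stand-ins for each model call:
result = identify_implicit_demand(
    "The replacement process was painful.",
    mine_fn=lambda c: "improve product-related services",
    reconstruct_fn=lambda d: "hope the merchant is more professional after-sales",
    judge_fn=lambda s, c: ("after-sales customer service issue", "strong"),
)
```

The structure makes explicit that the judging step consumes both the reconstructed demand and the original comment text, as the method describes.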
In one implementation of this embodiment, before mining the first implicit requirement of the comment text using the multi-level requirement prediction module, the method further includes: setting a first class label set of implicit requirements, a second class label set of a requirement layer, and a third class label set of requirement strength; and constructing an implicit demand classification subtask based on the first category label set, constructing a demand level classification subtask based on the second category label set, and constructing a demand intensity classification subtask based on the third category label set.
Optionally, the first category label set includes category label data such as {improve product quality and function, improve product-related services}; the second category label set includes category label data such as {product quality issue, product function issue} or {after-sales customer service issue, delivery and installation issue, price-related issue}; and the third category label set includes category label data such as {strong, weak}.
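The three label sets and their subtasks can be sketched as below; the set contents loosely follow the examples above, and the option-letter mapping is an assumption about how each subtask is presented to the model:

```python
# Illustrative label sets following the examples above (contents are assumptions).
first_set = ["improve product quality and function", "improve product-related services"]
second_set = ["product quality issue", "product function issue",
              "after-sales customer service issue", "delivery and installation issue",
              "price-related issue"]
third_set = ["strong", "weak"]

def make_subtask(name, labels):
    """Build one classification subtask per label set, mapping each label to an
    option letter (A, B, C, ...)."""
    options = {chr(ord("A") + i): lab for i, lab in enumerate(labels)}
    return {"task": name, "options": options}

subtasks = [
    make_subtask("implicit demand classification", first_set),
    make_subtask("demand level classification", second_set),
    make_subtask("demand strength classification", third_set),
]
```

Each subtask's options would then be rendered into the task prompt for its stage of prediction.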
This scheme simultaneously exploits the understanding and generation capabilities of the generative large language model, applying them respectively to the multi-label user comment demand classification task and the implicit demand expression task, thereby strengthening the model's multi-level implicit demand mining performance.
In this embodiment, mining the first implicit requirement of the comment text using a requirement prediction model includes:
s11, dividing the comment text by taking sentences as granularity to obtain a plurality of comment sentences;
In one example, a comment sentence obtained by splitting the comment text at sentence granularity is: "There is a flaw; the merchant made the replacement process very painful; it feels like being played."
S12, for each comment sentence, performing multi-label classification on the comment sentence based on the requirement prediction model to obtain the first implicit requirement of the comment sentence.
In one example, for each comment sentence, multi-label classification based on the requirement prediction model proceeds as follows: the input is the comment sentence "There is a flaw; the merchant made the replacement process very painful; it feels like being played." together with the implicit requirement category set D;
the specific input is: In the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played.", which kinds of user needs are reflected? A. Praise B. Improve product quality/function C. Improve product-related services;
the comment sentence is multi-label classified based on the requirement prediction model, and the output is {D_i}, where D_i ∈ D;
the specific output is: {D_B: improve product quality/function; D_C: improve product-related services}.
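The Stage-1 multi-label classification step just illustrated can be sketched in Python as follows. This is an illustrative sketch only, not the patent's actual implementation: the fine-tuned generative large language model is replaced by a fixed answer string, and the function names (`build_stage1_prompt`, `parse_labels`) are hypothetical.

```python
# Illustrative sketch of the Stage-1 multi-label classification prompt.
# In the real system, the prompt would be sent to the fine-tuned
# generative large language model instead of using a canned answer.

LABELS = {
    "A": "Praise",
    "B": "Improve product quality/function",
    "C": "Improve product-related services",
}

def build_stage1_prompt(comment):
    """Ask which user needs the comment reflects, listing the label options."""
    options = " ".join(f"{letter}. {name}" for letter, name in LABELS.items())
    return f'In the comment "{comment}", which kinds of user needs are reflected? {options}'

def parse_labels(model_answer):
    """Keep every option letter that appears as a standalone token in the answer."""
    tokens = model_answer.replace(".", " ").split()
    return [letter for letter in LABELS if letter in tokens]

comment = "There is a flaw; the merchant made the replacement process very painful."
prompt = build_stage1_prompt(comment)
# A plausible model answer for this comment selects options B and C:
first_implicit_needs = parse_labels("B C")
```

The parsed letters map back to the first implicit requirements {D_B, D_C} of the example.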
In this embodiment, reconstructing the first implicit requirement to obtain a second implicit requirement specifically includes:
s21, determining reconstruction requirement information, wherein the reconstruction requirement information is used for representing constraint conditions of a second implicit requirement to be reconstructed;
In one example, the reconstruction requirement information is a set of specific aspects A = {A_j}, where a specific aspect A_j further guides, constrains, and standardizes the generated content in terms such as "conciseness" and "clarity".
S22, fusing the reconstruction requirement information and the first implicit requirement to construct a question-answer type guiding prompt;
In one example, the reconstruction requirement information and the first implicit requirement are fused into the question-and-answer guidance prompt "Give a clear, concise, and useful reply".
S23, inputting the guidance prompt into a generative large language model, and outputting the second implicit requirement;
In one example, the input to the language model is a guidance prompt constructed from the comment X to be identified, the mined requirement D_i, and the specific aspects {A_j};
the specific input is: In the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played.", what is the specific reason why the user raises the need to "improve product-related services"? (Give a clear, concise, and useful reply);
the output of the language model is the supplementary reason X_i for requirement D_i;
the specific output is, for requirement D_C, the supplementary reason X_C: the user hopes that the merchant can be more trustworthy and professional in after-sales service.
In one example, the requirement level set corresponding to requirement D_C is D_CM = {D_C1, D_C2, …, D_Cm}.
In one example, the requirement prediction model judges, from the second implicit requirement and the comment text, the specific requirement level and requirement strength corresponding to the first implicit requirement; the input is the comment X to be identified, the requirement-level categories corresponding to each type of requirement, and the requirement strength category set S = {s, w};
the specific input is: In the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played.", at which level does the user raise the need to "improve product-related services"? A. After-sales customer service problem B. Express delivery/installation problem C. Price-related problem / In the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played.", how strong is the language with which the user raises the need to "improve product-related services"? A. Strong B. Weak;
The output of the language model is {D_Ci}, where D_Ci ∈ D_CM, and {z}, where z ∈ S;
the specific output is: {D_C1: after-sales customer service problem} {s: strength strong}. Similarly, for the first implicit requirement "D_B: improve product quality/function", the corresponding output {D_B1: product quality problem} {w: strength weak} can be obtained.
In this embodiment, an implicit requirement recognition result of the comment text is output, where the recognition result includes: the first implicit requirement, the requirement level, the requirement strength.
In one example, the structured triple output of the language model is {D_B: improve product quality/function, D_B1: product quality problem, w: strength weak} and {D_C: improve product-related services, D_C1: after-sales customer service problem, s: strength strong}.
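The Stage-2 judgment and the triple assembly above can be sketched as follows. The two questions (one for the requirement level, one for the strength) mirror the example input; all names are hypothetical stand-ins, and the level and strength answers are hard-coded rather than produced by an LLM.

```python
# Illustrative sketch of the Stage-2 prompt and the final triple assembly.
LEVELS = {
    "A": "After-sales customer service problem",
    "B": "Express delivery/installation problem",
    "C": "Price-related problem",
}

def build_stage2_prompt(comment, requirement):
    """Join the requirement-level question and the strength question with '/'."""
    level_options = " ".join(f"{k}. {v}" for k, v in LEVELS.items())
    level_q = (f'In the comment "{comment}", at which level does the user raise '
               f'the need to "{requirement}"? {level_options}')
    strength_q = (f'In the comment "{comment}", how strong is the language with '
                  f'which the user raises the need to "{requirement}"? A. Strong B. Weak')
    return f"{level_q} / {strength_q}"

def to_triple(requirement, level, strength):
    """Assemble the final (implicit requirement, level, strength) triple."""
    return (requirement, level, strength)

triple = to_triple("Improve product-related services",
                   "After-sales customer service problem", "Strong")
```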
In this embodiment, before the first implicit requirement of the comment text is mined by using the requirement prediction model, the method further includes:
S31, acquiring a generative large language model;
in one example, a generative large language model is used together with a parameter-efficient fine-tuning method.
S32, configuring trainable prompt parameters before the input sequence in each layer of the Transformer structure of the generative large language model to obtain an initial demand prediction model.
S33, acquiring sample data, and iteratively executing the following steps until the difference value between the predicted value and the true value is smaller than a preset threshold value: inputting the sample data into the initial demand prediction model to obtain a predicted value, and calculating a difference value between the predicted value and a true value by using a cross entropy loss function;
In one example, inputting the sample data into the initial demand prediction model to obtain a predicted value and calculating the difference value between the predicted value and the true value with a cross-entropy loss function includes: preprocessing a sample data item and constructing model input data X = [soft prompt, task prompt, input text], wherein X is the final input fed to the model, soft prompt is a trainable prompt parameter, task prompt denotes the prompt templates constructed for the tasks of the different stages, and input text is the input comment text data; inputting the model input data X into the initial demand prediction model and outputting a predicted value Ŷ_i; and calculating the difference value between the predicted value and the true value using the loss function Loss = (1/t) · Σ_{i=1}^{t} CELoss(Y_i, Ŷ_i), wherein Y_i is the true value constructed from the true label of data item i, Ŷ_i is the predicted value output by the generative large language model for data item i, t is the amount of data used to calculate the Loss, CELoss is the cross-entropy loss, and Loss is the difference value. A difference value is calculated separately on the domain data of each stage's task, where the domain data is the data specific to that task, and the model parameters at the soft prompt positions are iteratively optimized by efficient fine-tuning.
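The averaged cross-entropy loss Loss = (1/t) Σ CELoss(Y_i, Ŷ_i) can be checked with a toy pure-Python sketch. A real implementation would use a deep-learning framework's cross-entropy over the model's logits and backpropagate only into the soft-prompt parameters; this sketch only verifies the arithmetic of the formula.

```python
import math

def cross_entropy(true_dist, pred_probs, eps=1e-12):
    """CELoss between a one-hot true distribution Y_i and predicted probabilities."""
    return -sum(y * math.log(p + eps) for y, p in zip(true_dist, pred_probs))

def batch_loss(true_values, predictions):
    """Loss = (1/t) * sum_{i=1}^{t} CELoss(Y_i, Y_hat_i), t = number of data items."""
    t = len(true_values)
    return sum(cross_entropy(y, p) for y, p in zip(true_values, predictions)) / t

# Toy batch of t = 2 three-class items; training would repeat this, updating
# only the soft-prompt parameters, until the loss falls below the preset threshold.
Y = [[1, 0, 0], [0, 1, 0]]
Y_hat = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]
loss = batch_loss(Y, Y_hat)
```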
In this embodiment, before the mining the first implicit requirement of the comment text by using the multi-level requirement prediction module, the method further includes:
s41, acquiring an initial classification label of the first implicit requirement, wherein the initial classification label comprises a plurality of label categories, and each label category corresponds to one option label;
s42, judging whether the number of the initial classification labels is larger than 1.
S43, if the number of the initial classification labels is greater than 1, configuring, for each label category of the plurality of label categories, a mapping relation with every option label, to obtain enhanced classification labels;
In one example, the input for the data enhancement of the multi-label classification is: the comment X to be identified and the category labels to be classified;
the specific input is: In the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played.", which kinds of user needs are reflected? A. Praise B. Improve product quality/function C. Improve product-related services;
the output is: the comment X to be identified and the order-enhanced category labels to be classified;
the specific output is: In the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played.", which kinds of user needs are reflected? A. Improve product quality/function B. Praise C. Improve product-related services / A. Improve product quality/function B. Improve product-related services C. Praise / A. Praise B. Improve product-related services C. Improve product quality/function / A. Praise B. Improve product quality/function C. Improve product-related services / A. Improve product-related services B. Improve product quality/function C. Praise / A. Improve product-related services B. Praise C. Improve product quality/function.
Fig. 3 is an overall flowchart of the implicit requirement recognition method for comment text in an embodiment of the present invention. The whole flow takes the comment "There is a flaw; the merchant made the replacement process very painful; it feels like being played." as an example and comprises four modules in total: a multi-label classification data enhancement module, a two-stage prompt construction module, an LLM-based expression reconstruction module, and an LLM-based multi-level requirement prediction module, where LLM denotes a large language model.
The multi-label classification data enhancement module includes: in the first stage, combining each label letter (A, B, C) in turn with each option name (praise, improve product quality/function, improve product-related services) to form multiple groups of label prompts, which are used to strengthen the association between labels and data when the prompts are constructed;
The two-stage prompt construction module includes: the first-stage implicit requirement mining and the second-stage requirement level and strength judgment, which form the main flow of implicit requirement recognition for comment text. During execution of the main flow, the prompt construction modules of the two stages continuously interact with the multi-level requirement prediction module based on the generative large language model, and the multi-level requirement prediction module carries out the mining of the implicit requirements and the judgment of the requirement level and the requirement strength;
The LLM-based expression reconstruction module includes: using a general-purpose generative large language model, with the default parameters of the pre-trained model frozen, to generate a reconstructed expression under a specific implicit requirement. A guidance prompt (input text + specific requirement + specific aspect) is constructed by fusing the user comment with the specific requirement D_i and the specific aspects A = {A_j}, and the generative large language model then generates the reconstructed comment expression under the implicit requirement: "The user hopes that the merchant can be more trustworthy and professional in after-sales service."
The LLM-based multi-level requirement prediction module includes: performing domain fine-tuning on the general-purpose generative large language model to complete semantic understanding and requirement prediction. The soft prompt, the task prompt, and the input text are input, and the generative large language model, after parameter-efficient domain fine-tuning, outputs the predicted implicit requirement recognition result, which comprises the first implicit requirement, the requirement level, and the requirement strength. Here, soft prompt is a trainable prompt parameter; task prompt denotes the prompt templates constructed for the tasks of the different stages; Stage 1 Prompt is the prompt template constructed for the first-stage task; Stage 2 Prompt is the prompt template constructed for the second-stage task; input text is the input comment text data; Y_i is the true value constructed from the true label of data item i; Ŷ_i is the predicted value output by the generative large language model for data item i; CELoss is the cross-entropy loss; and Loss is the difference value. Domain fine-tuning calculates the difference value on the domain data of each stage's task, where the domain data is the data specific to that task, and iteratively optimizes the model parameters at the soft prompt positions by efficient fine-tuning.
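Putting the four modules of Fig. 3 together, the two-stage flow can be orchestrated roughly as below. Each module is stubbed with a lambda standing in for an LLM call; all names are illustrative, and the stub answers simply reproduce the running example.

```python
def recognize_implicit_needs(comment, mine, reconstruct, judge):
    """Two-stage flow sketch: Stage 1 mines first implicit requirements, the
    expression reconstruction module makes each one explicit, Stage 2 judges
    its level and strength, and (requirement, level, strength) triples are emitted."""
    triples = []
    for requirement in mine(comment):                          # Stage 1
        explicit = reconstruct(comment, requirement)           # LLM-based reconstruction
        level, strength = judge(comment, explicit, requirement)  # Stage 2
        triples.append((requirement, level, strength))
    return triples

# Stubs standing in for the LLM-backed modules:
demo = recognize_implicit_needs(
    "There is a flaw; the merchant made the replacement process very painful.",
    mine=lambda c: ["Improve product-related services"],
    reconstruct=lambda c, r: "Hopes the merchant's after-sales service is more professional.",
    judge=lambda c, e, r: ("After-sales customer service problem", "Strong"),
)
```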
By adopting the scheme of this embodiment, a two-stage method for jointly recognizing multi-level implicit requirements and requirement strength based on a generative large language model is provided. A two-stage process framework is constructed on the generative large language model to jointly detect the implicit requirement, the requirement level, and the requirement strength, realizing automatic deep mining of users' implicit requirements from massive comments. This provides enterprises and merchants with a quick way to sort out and summarize the key points for improving their products, helps them improve products and services and raise user satisfaction and loyalty, and supplies technical support for companies seeking greater competitiveness and influence.
The scheme completes deep implicit requirement mining of user comments on social platforms within a unified framework, and realizes joint recognition of multi-level requirements and requirement strength. It exploits the understanding and generation capabilities of the generative large language model, applying them respectively to the multi-label user-comment requirement classification task and the explicit reconstruction of implicit requirement expressions, enhancing the model's multi-level implicit requirement mining performance. It combines a unified view with a two-stage method to extract "implicit requirement - requirement level - requirement strength" triples, and strengthens the model's performance on the specific tasks at low resource cost through parameter-efficient fine-tuning of the generative large language model. The method has the advantages of unified-framework joint mining, hierarchical modeling of interactive associations, and deep fusion of semantic representation and expression reconstruction.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention.
Example 2
The embodiment also provides an implicit requirement recognition device for comment text, which is used for realizing the embodiment and the preferred implementation manner, and the description is omitted. As used below, the term "module" may be a combination of software and hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementations in hardware, or a combination of software and hardware, are also contemplated.
Fig. 4 is a block diagram of an implicit requirement recognition apparatus for comment text according to an embodiment of the present invention, as shown in fig. 4, including:
a first obtaining module 40, configured to obtain comment text;
a mining module 42 for mining a first implicit requirement of the comment text using a requirement prediction model;
a reconstruction module 44, configured to reconstruct the first implicit requirement to obtain a second implicit requirement that is specifically expressed;
a discriminating module 46, configured to judge, using the requirement prediction model according to the second implicit requirement and the comment text, the requirement level and the requirement strength of the second implicit requirement, where the requirement level is used to characterize the requirement type of the second implicit requirement;
the output module 48 is configured to output an implicit requirement recognition result of the comment text, where the recognition result includes: the first implicit requirement, the requirement level, the requirement strength.
Optionally, the apparatus further comprises: a setting module, configured to set a first category label set for the implicit requirements, a second category label set for the requirement levels, and a third category label set for the requirement strength before the mining module mines the first implicit requirement of the comment text using the multi-level requirement prediction module; and a building module, configured to construct an implicit requirement classification subtask based on the first category label set, a requirement level classification subtask based on the second category label set, and a requirement strength classification subtask based on the third category label set.
Optionally, the apparatus further comprises: a second acquisition module, configured to acquire a generative large language model before the mining module mines the first implicit requirement of the comment text using a requirement prediction model; an input module, configured to configure trainable prompt parameters before the input sequence in each layer of the Transformer structure of the generative large language model to obtain an initial demand prediction model; an iteration module, configured to acquire sample data and iteratively execute the following steps until the difference value between the predicted value and the true value is smaller than a preset threshold: inputting the sample data into the initial demand prediction model to obtain a predicted value, and calculating the difference value between the predicted value and the true value using a cross-entropy loss function; and an output module, configured to output the demand prediction model after training ends.
Optionally, the iteration module includes: a construction unit, configured to preprocess the sample data items and construct model input data X = [soft prompt, task prompt, input text], where soft prompt is a trainable prompt parameter, task prompt denotes the prompt templates constructed for the tasks of the different stages, and input text is the input comment text data; a processing unit, configured to input the model input data X into the initial demand prediction model and output a predicted value Ŷ_i; and a calculating unit, configured to calculate the difference value between the predicted value and the true value using the loss function Loss = (1/t) · Σ_{i=1}^{t} CELoss(Y_i, Ŷ_i), where Y_i is the true value constructed from the true label of data item i, Ŷ_i is the predicted value output by the generative large language model for data item i, t is the amount of data used to calculate the Loss, and CELoss is the cross-entropy loss.
Optionally, the reconstruction module includes: the determining unit is used for determining reconstruction requirement information, wherein the reconstruction requirement information is used for representing constraint conditions of a second implicit requirement to be reconstructed; the fusion unit is used for fusing the reconstruction requirement information and the first implicit requirement and constructing a question-answering type guide prompt; and the processing unit is used for inputting the guide prompt into the language model and outputting a second implicit requirement.
Optionally, the apparatus further comprises: the third obtaining module is used for obtaining an initial classification label of the first implicit requirement before the mining module adopts the multi-level requirement prediction module to mine the first implicit requirement of the comment text, wherein the initial classification label comprises a plurality of label categories, and each label category corresponds to one option label; the judging module is used for judging whether the number of the initial classification labels is larger than 1; and the configuration module is used for respectively configuring mapping relations with all option labels aiming at each label class in the plurality of label classes if the number of the initial classification labels is larger than 1, so as to obtain the enhanced classification labels.
Optionally, the mining module includes: the segmentation unit is used for segmenting the comment text by taking sentences as granularity to obtain a plurality of comment sentences; and the classifying unit is used for classifying the comment sentences in a multi-label mode based on the demand prediction model aiming at each comment sentence to obtain the first implicit demand of the comment sentence.
It should be noted that each of the above modules may be implemented by software or hardware, and for the latter, it may be implemented by, but not limited to: the modules are all located in the same processor; alternatively, the above modules may be located in different processors in any combination.
Example 3
An embodiment of the invention also provides a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
s1, obtaining comment texts;
s2, mining a first implicit requirement of the comment text by adopting a requirement prediction model;
s3, reconstructing the first implicit requirement to obtain a second implicit requirement of a specific expression;
S4, judging and obtaining, by the demand prediction model according to the second implicit demand and the comment text, a specific demand level and demand intensity corresponding to the first implicit demand, wherein the demand level is used for representing the demand type of the first implicit demand, and the demand intensity is used for representing the intensity degree of the first implicit demand;
s5, outputting an implicit requirement identification result of the comment text, wherein the implicit requirement identification result comprises: the first implicit requirement, the requirement level, the requirement strength.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: a usb disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing a computer program.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, obtaining comment texts;
s2, mining a first implicit requirement of the comment text by adopting a requirement prediction model;
s3, reconstructing the first implicit requirement to obtain a second implicit requirement of a specific expression;
S4, judging and obtaining, by the demand prediction model according to the second implicit demand and the comment text, a specific demand level and demand intensity corresponding to the first implicit demand, wherein the demand level is used for representing the demand type of the first implicit demand, and the demand intensity is used for representing the intensity degree of the first implicit demand;
s5, outputting an implicit requirement recognition result of the comment text, wherein the recognition result comprises: the first implicit requirement, the requirement level, the requirement strength.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments and optional implementations, and this embodiment is not described herein.
The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The above-described apparatus embodiments are merely exemplary; for example, the division of the units is merely a logical function division, and there may be another division in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (10)

1. An implicit requirement recognition method for comment text, which is characterized by comprising the following steps:
obtaining comment text;
mining a first implicit requirement of the comment text by adopting a requirement prediction model;
reconstructing the first implicit requirement to obtain a second implicit requirement of a specific expression;
judging and obtaining, by the requirement prediction model according to the second implicit requirement and the comment text, a specific requirement level and requirement strength corresponding to the first implicit requirement, wherein the requirement level is used for characterizing the requirement type of the first implicit requirement, and the requirement strength is used for characterizing the intensity degree of the first implicit requirement;
outputting an implicit requirement recognition result of the comment text, wherein the implicit requirement recognition result comprises: the first implicit requirement, the requirement level, the requirement strength.
2. The method of claim 1, wherein prior to mining the first implicit requirement of comment text using a multi-level requirement prediction module, the method further comprises:
setting a first class label set of implicit requirements, a second class label set of a requirement layer, and a third class label set of requirement strength;
And constructing an implicit demand classification subtask based on the first category label set, constructing a demand level classification subtask based on the second category label set, and constructing a demand intensity classification subtask based on the third category label set.
3. The method of claim 1, wherein prior to mining the first implicit requirement of the comment text using a requirement prediction model, the method further comprises:
acquiring a generated large language model;
configuring trainable prompt parameters before the input sequence in each layer of the Transformer structure of the generative large language model to obtain an initial demand prediction model;
sample data are acquired, and the following steps are iteratively executed until the difference value between the predicted value and the true value is smaller than a preset threshold value: inputting the sample data into the initial demand prediction model to obtain a predicted value, and calculating a difference value between the predicted value and a true value by using a cross entropy loss function;
after training is finished, the demand prediction model is output.
4. A method according to claim 3, wherein inputting the sample data into the initial demand prediction model to obtain a predicted value and calculating a difference value between the predicted value and a true value using a cross entropy loss function comprises:
preprocessing a sample data item, and constructing model input data X = [soft prompt, task prompt, input text], wherein X is the final input fed into the language model, soft prompt is a trainable prompt parameter, task prompt denotes the prompt templates constructed for the tasks of the different stages, and input text is the input comment text data;
inputting the model input data X into the initial demand prediction model, and outputting a predicted value Ŷ_i;
calculating the difference value between the predicted value and the true value using the loss function Loss = (1/t) · Σ_{i=1}^{t} CELoss(Y_i, Ŷ_i),
wherein Y_i is the true value constructed from the true label of data item i, Ŷ_i is the predicted value output by the generative large language model for data item i, t is the amount of data used for calculating the Loss, CELoss is the cross-entropy loss, and Loss is the difference value.
5. The method of claim 1, wherein reconstructing the first implicit requirement into a second implicit requirement that is specified comprises:
determining reconstruction requirement information, wherein the reconstruction requirement information is used for representing constraint conditions of a second implicit requirement to be reconstructed;
fusing the reconstruction requirement information and the first implicit requirement to construct a question-answer type guiding prompt; and
inputting the guiding prompt into a language model to output the second implicit requirement.
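The fusion step of claim 5 amounts to templating the constraint conditions and the mined requirement into one question-answer prompt. A hypothetical sketch; the template wording and function name are illustrative, not quoted from the patent.

```python
def build_guiding_prompt(first_implicit_requirement: str, constraints: list[str]) -> str:
    """Fuse reconstruction-requirement info (constraints) with the first
    implicit requirement into a question-answer style guiding prompt."""
    constraint_text = "; ".join(constraints)
    return (
        f'Q: A review implies the requirement: "{first_implicit_requirement}". '
        f"Rewrite it as a concrete, specific requirement that satisfies: {constraint_text}.\n"
        "A:"
    )

prompt = build_guiding_prompt(
    "battery does not last",
    ["state the product aspect explicitly", "use a complete sentence"],
)
```

The language model's completion after `A:` is then taken as the specified second implicit requirement.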
6. The method of claim 1, wherein prior to mining the first implicit requirement of the comment text using the requirement prediction model, the method further comprises:
acquiring an initial classification label of the first implicit requirement, wherein the initial classification label comprises a plurality of label categories, and each label category corresponds to one option label;
judging whether the number of the initial classification labels is larger than 1;
if the number of the initial classification labels is greater than 1, configuring, for each label category in the plurality of label categories, a mapping relation between the initial classification label and all the option labels, so as to obtain an enhanced classification label.
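One plausible reading of the label enhancement in claim 6 is encoding each sample's labels against the full set of option labels, so that a multi-label sample becomes a multi-hot vector. The option label names below are illustrative assumptions, not from the patent.

```python
# Full set of option labels (illustrative names).
OPTION_LABELS = ["performance", "price", "appearance", "service"]

def enhance_labels(initial_labels: list[str]) -> list[int]:
    """Map a sample's initial classification labels onto every option-label
    slot, producing an enhanced (multi-hot) classification label."""
    label_set = set(initial_labels)
    return [int(opt in label_set) for opt in OPTION_LABELS]

# A sample carrying more than one initial label gets a multi-hot vector:
enhanced = enhance_labels(["performance", "price"])
print(enhanced)  # [1, 1, 0, 0]
```

This keeps single-label and multi-label samples in one uniform target format for the cross-entropy training of claim 3.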
7. The method of claim 1, wherein mining the first implicit requirement of comment text using a requirement prediction model comprises:
dividing the comment text by taking sentences as granularity to obtain a plurality of comment sentences;
and for each comment sentence, performing multi-label classification on the comment sentence based on the demand prediction model to obtain a first implicit demand of the comment sentence.
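The sentence-granularity split plus per-sentence multi-label classification of claim 7 can be sketched as below. The keyword classifier is a stub standing in for the requirement prediction model; the keywords and requirement strings are assumptions for illustration.

```python
import re

# Stub mapping: keyword -> implied requirement (stands in for the trained model).
REQUIREMENT_KEYWORDS = {
    "battery": "longer battery life",
    "heavy": "lighter device",
}

def split_sentences(comment_text: str) -> list[str]:
    """Divide the comment text at sentence granularity (Chinese and
    English sentence-ending punctuation)."""
    parts = re.split(r"[。！？.!?]+", comment_text)
    return [p.strip() for p in parts if p.strip()]

def predict_first_implicit_requirements(sentence: str) -> list[str]:
    """Multi-label prediction stub: a sentence may carry several requirements."""
    return [req for kw, req in REQUIREMENT_KEYWORDS.items() if kw in sentence.lower()]

comment = "The battery drains fast. Also it feels heavy in hand."
sentences = split_sentences(comment)
predictions = [predict_first_implicit_requirements(s) for s in sentences]
```

Each sentence yields its own (possibly empty, possibly multi-element) list of first implicit requirements, which downstream claims then reconstruct and grade.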
8. An implicit demand recognition apparatus for comment text, comprising:
The first acquisition module is used for acquiring comment texts;
the mining module is used for mining the first implicit requirements of the comment text by adopting a multi-level requirement prediction model;
a reconstruction module, configured to reconstruct the first implicit requirement to obtain a second implicit requirement that is specifically expressed;
the judging module is used for judging, by adopting the multi-level demand prediction model and according to the second implicit demand and the comment text, a specific demand level and demand intensity corresponding to the first implicit demand, wherein the demand level is used for representing the demand type of the first implicit demand, and the demand intensity is used for representing the intensity degree of the first implicit demand;
the output module is used for outputting an implicit requirement identification result of the comment text, wherein the implicit requirement identification result comprises: the first implicit requirement, the requirement level, the requirement strength.
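The module composition of claim 8 can be wired together as a data-flow sketch. All module internals here are stubs; only the sequencing (acquire → mine → reconstruct → judge → output) mirrors the claim, and the level/strength values are placeholders.

```python
from dataclasses import dataclass

@dataclass
class RecognitionResult:
    first_implicit_requirement: str
    requirement_level: str     # demand type of the first implicit requirement
    requirement_strength: str  # intensity degree of the first implicit requirement

def recognize(comment_text: str) -> RecognitionResult:
    # mining module: multi-level requirement prediction model (stubbed)
    first = f"implicit requirement mined from: {comment_text}"
    # reconstruction module: specify the requirement (stubbed)
    second = f"specified form of ({first})"
    # judging module: level and strength from second requirement + comment text (stubbed)
    level, strength = "functional", "strong"
    # output module: bundle the identification result
    return RecognitionResult(first, level, strength)

result = recognize("screen too dim outdoors")
```

A real implementation would replace each stub with a call into the trained prediction model described in claims 3 and 4.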
9. A storage medium having a computer program stored therein, wherein the computer program is arranged to perform the method of any of claims 1 to 7 when run.
10. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the method of any of the claims 1 to 7.
CN202310974595.XA 2023-08-03 2023-08-03 Implicit demand recognition method and device for comment text Pending CN117313736A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310974595.XA CN117313736A (en) 2023-08-03 2023-08-03 Implicit demand recognition method and device for comment text


Publications (1)

Publication Number Publication Date
CN117313736A true CN117313736A (en) 2023-12-29

Family

ID=89287345

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310974595.XA Pending CN117313736A (en) 2023-08-03 2023-08-03 Implicit demand recognition method and device for comment text

Country Status (1)

Country Link
CN (1) CN117313736A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117592514A (en) * 2024-01-19 2024-02-23 中国传媒大学 Comment text viewpoint prediction method, comment text viewpoint prediction system, comment text viewpoint prediction device and storage medium
CN117592514B (en) * 2024-01-19 2024-03-26 中国传媒大学 Comment text viewpoint prediction method, comment text viewpoint prediction system, comment text viewpoint prediction device and storage medium

Similar Documents

Publication Publication Date Title
CN110168535B (en) Information processing method and terminal, computer storage medium
EP3522078A1 (en) Explainable artificial intelligence
CN110909165B (en) Data processing method, device, medium and electronic equipment
KR102318103B1 (en) Method for machine learning train set and recommendation systems to recommend the scores to match between the recruiter and job seekers, and to give the scores of matching candidates to recruiters and to give the pass scores to job seekers respectively
CN107291840B (en) User attribute prediction model construction method and device
CN109948160B (en) Short text classification method and device
CN107844558A (en) The determination method and relevant apparatus of a kind of classification information
KR20200139008A (en) User intention-analysis based contract recommendation and autocomplete service using deep learning
US10922633B2 (en) Utilizing econometric and machine learning models to maximize total returns for an entity
CN117313736A (en) Implicit demand recognition method and device for comment text
CN112434501A (en) Work order intelligent generation method and device, electronic equipment and medium
CN111782793A (en) Intelligent customer service processing method, system and equipment
CN114387061A (en) Product pushing method and device, electronic equipment and readable storage medium
CN113392179A (en) Text labeling method and device, electronic equipment and storage medium
CN113051380A (en) Information generation method and device, electronic equipment and storage medium
CN111858919A (en) Text classification method and device and computer readable storage medium
CN113011156A (en) Quality inspection method, device and medium for audit text and electronic equipment
CN113298495A (en) Resume screening method, resume screening device, terminal device and storage medium
CN116484836B (en) Questionnaire generation system and method based on NLP model, electronic equipment and medium
CN114528851B (en) Reply sentence determination method, reply sentence determination device, electronic equipment and storage medium
CN108228573A (en) Text emotion analysis method, device and electronic equipment
CN114492446A (en) Legal document processing method and device, electronic equipment and storage medium
CN113297520A (en) Page design auxiliary processing method and device and electronic equipment
CN111667306A (en) Customized production-oriented customer demand identification method, system and terminal
CN111651575A (en) Session text processing method, device, medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination