CN110826340A - Evaluation text generation method and device and electronic equipment - Google Patents

Evaluation text generation method and device and electronic equipment

Info

Publication number
CN110826340A
CN110826340A
Authority
CN
China
Prior art keywords
text
information
semantic
trained
feature vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911080696.2A
Other languages
Chinese (zh)
Inventor
雷瑞生
杨嘉华
张宏龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong 3vjia Information Technology Co Ltd
Original Assignee
Guangdong 3vjia Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong 3vjia Information Technology Co Ltd filed Critical Guangdong 3vjia Information Technology Co Ltd
Priority to CN201911080696.2A priority Critical patent/CN110826340A/en
Publication of CN110826340A publication Critical patent/CN110826340A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/31 Indexing; Data structures therefor; Storage structures
    • G06F 16/316 Indexing structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0282 Rating or review of business operators or products

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Databases & Information Systems (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Machine Translation (AREA)

Abstract

The invention provides an evaluation text generation method, an evaluation text generation device and electronic equipment, and relates to the field of artificial intelligence. The method comprises: acquiring initial text information, wherein the initial text information comprises text description information of a home decoration matching scheme; performing semantic feature extraction on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector; and generating an evaluation text from the semantic feature vector based on a pre-trained text generation model. The method and the device can generate an evaluation text that is highly associated with the initial text information, thereby effectively improving the operation efficiency of a website community or forum.

Description

Evaluation text generation method and device and electronic equipment
Technical Field
The invention relates to the field of artificial intelligence, in particular to an evaluation text generation method and device and electronic equipment.
Background
At present, professional home decoration design websites usually host dedicated communities or forums in which enterprises or users upload and share interior design renderings of home decoration matching schemes together with the corresponding textual scheme descriptions, with the aim of efficiently publicizing and marketing their design concepts, home decoration products, home decoration designs, and the like. In the early stage of operation, however, user activity is insufficient, so the scheme content shared by enterprises or users cannot obtain enough evaluation feedback. As a result, the shared content loses users' attention, and in severe cases the quality of the website's shared content keeps declining and user activity is lost.
Disclosure of Invention
The invention aims to provide an evaluation text generation method, an evaluation text generation device and electronic equipment that alleviate the problem of scarce evaluation feedback in the prior art by generating an evaluation text highly associated with the initial text information, thereby effectively improving the operation efficiency of a website community or forum.
In a first aspect, an embodiment provides an evaluation text generation method, including: acquiring initial text information, wherein the initial text information comprises text description information of a home decoration matching scheme; performing semantic feature extraction on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector; and generating an evaluation text from the semantic feature vector based on a pre-trained text generation model.
In an optional embodiment, the preset semantic understanding model is a BERT neural network model, and the step of performing semantic feature extraction on the initial text information by using the pre-trained semantic understanding model to obtain a semantic feature vector includes: performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coded information, wherein the coded information carries a word coded information embedded identifier, a position coded information embedded identifier and a paragraph coded information embedded identifier; and extracting corresponding semantic information from the coded information based on the pre-trained BERT neural network model, and obtaining the semantic feature vector based on the semantic information.
In an optional embodiment, the step of performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coded information includes: looking up index information corresponding to the initial text information according to the preset text mapping dictionary; and converting the initial text information into the corresponding coded information according to the index information.
In an alternative embodiment, the method further comprises: inputting the semantic feature vector into the pre-trained text generation model based on an Attention mechanism, so that the pre-trained text generation model outputs an evaluation text related to the initial text information.
In an alternative embodiment, the text generation model is an LSTM neural network model, and the step of generating an evaluation text from the semantic feature vector based on the pre-trained text generation model includes: inputting the semantic feature vector into the pre-trained LSTM neural network model; and processing the semantic feature vector according to a preset Beam Search maximum-probability output mechanism to obtain the evaluation text.
In an alternative embodiment, the method is applied to a community of websites or forums.
In a second aspect, an embodiment provides an evaluation text generation apparatus, including: a text acquisition module, configured to acquire initial text information, wherein the initial text information comprises text description information of a home decoration matching scheme; a semantic understanding module, configured to perform semantic feature extraction on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector; and a text generation module, configured to generate an evaluation text from the semantic feature vector based on a pre-trained text generation model.
In an optional embodiment, the preset semantic understanding model is a BERT neural network model, and the semantic understanding module is configured to: perform text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coded information, wherein the coded information carries a word coded information embedded identifier, a position coded information embedded identifier and a paragraph coded information embedded identifier; and extract corresponding semantic information from the coded information based on the pre-trained BERT neural network model, and obtain a semantic feature vector based on the semantic information.
In a third aspect, an embodiment provides an electronic device, including a processor and a memory, where the memory stores computer-executable instructions capable of being executed by the processor, and the processor executes the computer-executable instructions to implement the steps of the evaluation text generation method according to any one of the foregoing embodiments.
In a fourth aspect, embodiments provide a computer-readable storage medium, on which a computer program is stored, the computer program, when executed by a processor, performing the steps of the evaluation text generation method of any one of the preceding embodiments.
According to the evaluation text generation method and device and the electronic equipment provided by the invention, initial text information comprising text description information of a home decoration matching scheme is first acquired, semantic feature extraction is performed on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector, and an evaluation text is then generated from the semantic feature vector based on a pre-trained text generation model. Because the initial text information is first semantically understood by the pre-trained semantic understanding model and the resulting semantic feature vector is then input into the text generation model, the recognition effect of the text generation model is improved, which in turn increases the degree of association between the generated evaluation text and the initial text information. An evaluation text highly associated with the initial text information is thus generated, and the operation efficiency of a website community or forum is effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic flow chart of an evaluation text generation method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a specific neural network model according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an evaluation text generation apparatus according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the early operation of existing home decoration design communities or forums, insufficient user activity means that the scheme content shared by enterprises or users cannot obtain enough evaluation feedback; this further causes the shared content to lose users' attention and, in severe cases, leads to a continuous decline in the quality of the website's shared content and a loss of user activity. The embodiments of the present invention are proposed in view of this situation.
For ease of understanding, the evaluation text generation method provided in an embodiment of the present invention is first described in detail. Referring to the flowchart of an evaluation text generation method shown in fig. 1, the method mainly includes the following steps S102 to S106:
step S102: and acquiring initial text information, wherein the initial text information comprises text description information of the home decoration collocation scheme.
In one embodiment, the initial text information is natural-language text related to home decoration design knowledge, such as the text description information of a home decoration matching scheme on a home decoration design platform or forum, and may contain Chinese and English characters, punctuation characters, and the like. The text description information of a home decoration matching scheme generally includes descriptions of the indoor spatial layout, matching style, paving design, plumbing and electrical design, furniture matching, and so on, together with the corresponding comments from other users.
Step S104: perform semantic feature extraction on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector.
In one embodiment, the semantic understanding model may also be called a language representation model; its purpose is to quantize unstructured text information into structured coded information and to perform semantic understanding (i.e., semantic feature extraction) on the initial text information. The semantic understanding model may be a BERT language representation model, Word2Vec, GloVe, GPT, or the like. The initial text information is input into the pre-trained semantic understanding model and, through a series of processing steps, the corresponding semantic feature vector is obtained. Extracting the semantic features of the initial text information realizes semantic understanding of the initial text information and improves the relevance between the subsequently generated evaluation text and the initial text information.
Step S106: generate an evaluation text from the semantic feature vector based on a pre-trained text generation model.
In one embodiment, the text generation model (i.e., the decoding and generation model) is used to decode the semantic feature vector and generate related text; the text generation model may be an LSTM, an RNN, a GRU, or the like, and it generates an evaluation text related to the input text information. For example, if the text description information of the matching scheme reads "Bought a certain brand's television cabinet, Japanese style, designed without table legs so that cleaning is more convenient; the simple design weakens the presence of the television cabinet, but its storage capacity is not great", the generated comment text may be "The Japanese style is likeable and popular, the storage capacity is large and practical, and the original poster highly recommends this television cabinet".
According to the evaluation text generation method provided by the invention, initial text information comprising text description information of a home decoration matching scheme is first acquired, semantic feature extraction is performed on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector, and an evaluation text is then generated from the semantic feature vector based on a pre-trained text generation model. Because the initial text information is first semantically understood by the pre-trained semantic understanding model and the resulting semantic feature vector is then input into the text generation model, the recognition effect of the text generation model is improved, which in turn increases the degree of association between the generated evaluation text and the initial text information. An evaluation text highly associated with the initial text information is thus generated, and the operation efficiency of a website community or forum is effectively improved.
To facilitate understanding of the foregoing embodiment, an embodiment of the present invention provides a concrete "semantic understanding model + text generation model" implementation; see the structural schematic diagram of the neural network model shown in fig. 2. Taking a BERT neural network model as the preset semantic understanding model as an example, the publicly released pre-trained model with 12 layers, 768 hidden units and 12 attention heads disclosed by *** is first used, its network weight parameters and biases are adjusted, and the model is then trained to obtain the trained BERT neural network model. In this case, step S104 may include the following steps 1 and 2:
step 1, performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coding information.
The preset text mapping dictionary is compiled statistically from a large amount of data and is mainly used to convert common text content into specific coded information. Specifically:
(1) Word encoding embedding divides the original sentence into single characters and then obtains, for each character, the corresponding dictionary index for encoding. For example, the sentence "我和你" ("I and you") can be divided into the three independent characters "我", "和" and "你", and the code of each character is looked up in the preset dictionary (a minimal lookup is sketched after item (3) below).
(2) Position encoding embedding is a trainable initialization matrix whose main function is to encode the specific position of each character into the feature vector, which plays an important role in handling polysemous characters.
(3) Paragraph encoding embedding is used in the BERT fine-tuning stage to distinguish the sentences of two different paragraphs, with the characters of the first paragraph marked 0 and those of the second paragraph marked 1. In the final model input, only single-paragraph sentences are input, so the characters are all marked as 1.
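To make the three kinds of embedding input concrete, the following minimal Python sketch first maps a sentence to word-encoding ids with a toy mapping dictionary and then assembles the position and paragraph sequences. The dictionary contents and all ids are illustrative assumptions, not values taken from the patent.

    # Minimal sketch: character-level text mapping plus the three BERT input sequences.
    # The toy dictionary and all ids below are assumptions, not the patent's actual values.
    text_mapping_dict = {"[UNK]": 100, "[CLS]": 101, "[SEP]": 102,
                         "我": 2769, "和": 1469, "你": 872}

    def map_text_to_ids(text):
        """Split the sentence into single characters and look up each dictionary index."""
        return [text_mapping_dict.get(ch, text_mapping_dict["[UNK]"]) for ch in text]

    char_ids = map_text_to_ids("我和你")          # word encoding: [2769, 1469, 872]
    token_ids = [101] + char_ids + [102]          # add the [CLS] and [SEP] special tokens
    position_ids = list(range(len(token_ids)))    # indices into the trainable position matrix
    segment_ids = [1] * len(token_ids)            # single-paragraph input, all marked as 1 as described above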
Step 2: extract corresponding semantic information from the coded information based on the pre-trained BERT neural network model, and obtain the semantic feature vector based on the semantic information.
In one embodiment, for example, the initial text input is "Bought a certain brand's television cabinet, Japanese style, designed without table legs so that cleaning is more convenient; the simple design weakens the presence of the television cabinet, but its storage capacity is not great." After the character input codes are obtained through the preset text mapping dictionary and fed into the BERT model, the corresponding 768-dimensional contextual semantic encoding is obtained at the [CLS] position, specifically a floating-point array of shape [1, 768].
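A minimal sketch of this extraction is given below, using the open-source Hugging Face transformers library (which the patent does not name) and a publicly available 12-layer, 768-hidden, 12-head Chinese BERT checkpoint as stand-ins; the input sentence is a hypothetical, shortened version of the example above.

    # Hedged sketch: obtain the 768-dimensional [CLS] semantic vector from a
    # 12-layer / 768-hidden / 12-head BERT. Library and checkpoint are assumptions.
    import torch
    from transformers import BertTokenizer, BertModel

    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    model = BertModel.from_pretrained("bert-base-chinese")
    model.eval()

    text = "日式电视柜，无桌腿设计，清洁更方便"   # hypothetical shortened scheme description
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    cls_vector = outputs.last_hidden_state[:, 0, :]   # contextual encoding at the [CLS] position
    print(cls_vector.shape)                           # torch.Size([1, 768])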
For ease of understanding step S106, taking the text generation model as an LSTM neural network model as an example, step S106 may include the following steps (1) and (2):
and (1) inputting the semantic feature vector into a pre-trained LSTM neural network model. The LSTM is a structure with a length of 20 and a hidden neuron of 512, and is tuned by fine-tuning the network weight parameters and bias of the network structure, thereby generating an evaluation text adapted to the original text information. Referring to the network structure shown in fig. 2, the semantic feature vectors are input into the pre-trained text generation model based on the Attention mechanism, which can improve the information processing capability of the neural network, so that the generated evaluation text has stronger correlation with the initial text information.
Step (2): process the semantic feature vector according to the preset Beam Search maximum-probability output mechanism to obtain the evaluation text.
In one embodiment, processing with the preset Beam Search maximum-probability output mechanism can be understood as follows: set the length of the generated sentence to L; for each of the first K characters, randomly select, using a uniform integer distribution, one character from the N characters with the highest output probability of the current LSTM neuron and use it as the input of the next output neuron, repeating this step until K characters have been generated; and generate the remaining L-K characters as the most probable continuation using the Beam Search method. The parameter values can be set as required. This mechanism ensures both the diversity of the generated sentences and the validity of their content.
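A hedged sketch of this hybrid decoding is shown below. The concrete values of L, K, N and the beam width are placeholders, since the patent does not state them, and step_fn is a hypothetical callable that maps a token-id prefix to per-token log-probabilities.

    # Sketch of the hybrid decoding: the first K characters are sampled uniformly from the
    # top-N candidates at each step, and the remaining L-K characters use an ordinary beam search.
    # K, N and beam_width are placeholder values; step_fn is a hypothetical callable.
    import random

    def hybrid_decode(step_fn, L=20, K=5, N=3, beam_width=4, bos_id=1):
        # Phase 1: diversify the opening by uniform sampling from the top-N tokens.
        prefix = [bos_id]
        for _ in range(K):
            log_probs = step_fn(prefix)                      # log-probabilities over the vocabulary
            top_n = sorted(range(len(log_probs)), key=lambda i: log_probs[i], reverse=True)[:N]
            prefix.append(random.choice(top_n))

        # Phase 2: complete the remaining L - K characters with beam search.
        beams = [(0.0, prefix)]                              # (cumulative log-probability, token ids)
        for _ in range(L - K):
            candidates = []
            for score, seq in beams:
                log_probs = step_fn(seq)
                top = sorted(range(len(log_probs)), key=lambda i: log_probs[i], reverse=True)[:beam_width]
                candidates += [(score + log_probs[i], seq + [i]) for i in top]
            beams = sorted(candidates, key=lambda b: b[0], reverse=True)[:beam_width]
        return max(beams, key=lambda b: b[0])[1]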
For the above evaluation text generation method, an embodiment of the present invention further provides an evaluation text generation apparatus, referring to a schematic structural diagram of an evaluation text generation apparatus shown in fig. 3, where the apparatus includes the following parts:
a text obtaining module 302, configured to obtain initial text information, where the initial text information includes text description information of a home decoration matching scheme;
the semantic understanding module 304 is configured to perform semantic feature extraction on the initial text information by using a pre-trained semantic understanding model, and obtain a semantic feature vector;
and the text generation module 306 is configured to generate an evaluation text according to the semantic feature vector based on a pre-trained text generation model.
According to the evaluation text generation device provided by the invention, initial text information comprising text description information of a home decoration matching scheme is first acquired, semantic feature extraction is performed on the initial text information by using a pre-trained semantic understanding model to obtain a semantic feature vector, and an evaluation text is then generated from the semantic feature vector based on a pre-trained text generation model. Because the initial text information is first semantically understood by the pre-trained semantic understanding model and the resulting semantic feature vector is then input into the text generation model, the recognition effect of the text generation model is improved, which in turn increases the degree of association between the generated evaluation text and the initial text information. An evaluation text highly associated with the initial text information is thus generated, and the operation efficiency of a website community or forum is effectively improved.
In an embodiment, the semantic understanding module 304 is further configured to: performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coding information; the coded information carries a word coded information embedded identifier, a position coded information embedded identifier and a paragraph coded information embedded identifier; extracting corresponding semantic information from the coded information based on a pre-trained BERT neural network model, and obtaining a semantic feature vector based on the semantic information.
In an embodiment, the apparatus further includes a text mapping module, configured to look up index information corresponding to the initial text information according to a preset text mapping dictionary, and to convert the initial text information into corresponding coded information according to the index information.
In one embodiment, the above apparatus further comprises: and the input module is used for inputting the semantic feature vectors into a pre-trained text generation model based on an Attention mechanism so that the pre-trained text generation model outputs an evaluation text related to the initial text information.
In one embodiment, the text generation model is an LSTM neural network model; the text generating module 306 is further configured to input the semantic feature vector into a pre-trained LSTM neural network model, and process the semantic feature vector information according to a maximum probability output mechanism of preset Beam Search to obtain an evaluation text.
In one embodiment, the device is applied to a website community or a forum.
The invention also provides electronic equipment, which specifically comprises a processor and a storage device; the storage means has stored thereon a computer program which, when executed by the processor, performs the method of any of the above described embodiments.
Fig. 4 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present invention, where the electronic device 100 includes: a processor 40, a memory 41, a bus 42 and a communication interface 43, wherein the processor 40, the communication interface 43 and the memory 41 are connected through the bus 42; the processor 40 is arranged to execute executable modules, such as computer programs, stored in the memory 41.
The memory 41 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between the network element of this system and at least one other network element is realized through at least one communication interface 43 (wired or wireless), over the Internet, a wide area network, a local area network, a metropolitan area network, or the like.
The bus 42 may be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 4, but that does not indicate only one bus or one type of bus.
The memory 41 is used for storing a program, the processor 40 executes the program after receiving an execution instruction, and the method executed by the apparatus defined by the flow process disclosed in any of the foregoing embodiments of the present invention may be applied to the processor 40, or implemented by the processor 40.
The processor 40 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 40. The processor 40 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM or registers. The storage medium is located in the memory 41, and the processor 40 reads the information in the memory 41 and completes the steps of the above method in combination with its hardware.
The computer program product of the evaluation text generation method, device and electronic equipment provided in the embodiments of the present invention includes a computer-readable storage medium storing non-volatile program code executable by a processor; when the computer program stored on the computer-readable storage medium is executed by the processor, the method described in the foregoing method embodiments is performed.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the system described above may refer to the corresponding process in the foregoing embodiments, and is not described herein again.
The computer program product of the readable storage medium provided in the embodiment of the present invention includes a computer readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the method embodiment, which is not described herein again.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An evaluation text generation method, comprising:
acquiring initial text information, wherein the initial text information comprises text description information of a home decoration matching scheme;
extracting semantic features of the initial text information by using a pre-trained semantic understanding model, and obtaining a semantic feature vector;
and generating an evaluation text according to the semantic feature vector based on a pre-trained text generation model.
2. The method of claim 1, wherein the semantic understanding model is a BERT neural network model; the method comprises the following steps of extracting semantic features of the initial text information by using a pre-trained semantic understanding model, and obtaining a semantic feature vector, wherein the steps comprise:
performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coding information; the coded information carries a word coded information embedded identifier, a position coded information embedded identifier and a paragraph coded information embedded identifier;
extracting corresponding semantic information from the coding information based on the pre-trained BERT neural network model, and obtaining a semantic feature vector based on the semantic information.
3. The method according to claim 2, wherein the step of performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding encoded information comprises:
searching index information corresponding to the initial text information according to a preset text mapping dictionary;
and converting the initial text information into corresponding coding information according to the index information.
4. The method of claim 1, further comprising: inputting the semantic feature vector into the pre-trained text generation model based on an Attention mechanism, so that the pre-trained text generation model outputs an evaluation text related to the initial text information.
5. The method of claim 1, wherein the text generation model is an LSTM neural network model; the step of generating an evaluation text according to the semantic feature vector based on a pre-trained text generation model comprises the following steps:
inputting the semantic feature vector into the LSTM neural network model which is trained in advance;
and carrying out information processing on the semantic feature vector according to a maximum probability output mechanism of preset Beam Search to obtain the evaluation text.
6. The method of any one of claims 1 to 5, wherein the method is applied to a website community or forum.
7. An evaluation text generation device characterized by comprising:
a text acquisition module, configured to acquire initial text information, wherein the initial text information comprises text description information of a home decoration matching scheme;
the semantic understanding module is used for extracting semantic features of the initial text information by using a pre-trained semantic understanding model and obtaining a semantic feature vector;
and the text generation module is used for generating an evaluation text according to the semantic feature vector based on a pre-trained text generation model.
8. The apparatus of claim 7,
the semantic understanding model is a BERT neural network model; the semantic understanding module is used for:
performing text mapping on the initial text information according to a preset text mapping dictionary to obtain corresponding coding information; the coded information carries a word coded information embedded identifier, a position coded information embedded identifier and a paragraph coded information embedded identifier;
extracting corresponding semantic information from the coding information based on the pre-trained BERT neural network model, and obtaining a semantic feature vector based on the semantic information.
9. An electronic device comprising a processor and a memory, the memory storing computer-executable instructions executable by the processor, the processor executing the computer-executable instructions to implement the steps of the rating text generating method of any of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the evaluation text generation method according to any one of claims 1 to 6.
CN201911080696.2A 2019-11-06 2019-11-06 Evaluation text generation method and device and electronic equipment Pending CN110826340A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911080696.2A CN110826340A (en) 2019-11-06 2019-11-06 Evaluation text generation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911080696.2A CN110826340A (en) 2019-11-06 2019-11-06 Evaluation text generation method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN110826340A true CN110826340A (en) 2020-02-21

Family

ID=69553085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911080696.2A Pending CN110826340A (en) 2019-11-06 2019-11-06 Evaluation text generation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110826340A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111460800A (en) * 2020-03-27 2020-07-28 深圳价值在线信息科技股份有限公司 Event generation method and device, terminal equipment and storage medium
CN112417539A (en) * 2020-11-16 2021-02-26 杭州群核信息技术有限公司 Method, device and system for designing house type based on language description
CN112733507A (en) * 2021-01-16 2021-04-30 江苏网进科技股份有限公司 Method for automatically generating legal text marking event
CN112733515A (en) * 2020-12-31 2021-04-30 贝壳技术有限公司 Text generation method and device, electronic equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106951516A (en) * 2017-03-18 2017-07-14 深圳市彬讯科技有限公司 A kind of finishing selection intelligent sorting method based on big data
CN107273487A (en) * 2017-06-13 2017-10-20 北京百度网讯科技有限公司 Generation method, device and the computer equipment of chat data based on artificial intelligence
CN108363753A (en) * 2018-01-30 2018-08-03 南京邮电大学 Comment text sentiment classification model is trained and sensibility classification method, device and equipment
CN109471915A (en) * 2018-10-09 2019-03-15 科大讯飞股份有限公司 A kind of text evaluation method, device, equipment and readable storage medium storing program for executing
US20190287142A1 (en) * 2018-02-12 2019-09-19 Baidu Online Network Technology (Beijing) Co., Ltd. Method, apparatus for evaluating review, device and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106951516A (en) * 2017-03-18 2017-07-14 深圳市彬讯科技有限公司 A kind of finishing selection intelligent sorting method based on big data
CN107273487A (en) * 2017-06-13 2017-10-20 北京百度网讯科技有限公司 Generation method, device and the computer equipment of chat data based on artificial intelligence
CN108363753A (en) * 2018-01-30 2018-08-03 南京邮电大学 Comment text sentiment classification model is trained and sensibility classification method, device and equipment
US20190287142A1 (en) * 2018-02-12 2019-09-19 Baidu Online Network Technology (Beijing) Co., Ltd. Method, apparatus for evaluating review, device and storage medium
CN109471915A (en) * 2018-10-09 2019-03-15 科大讯飞股份有限公司 A kind of text evaluation method, device, equipment and readable storage medium storing program for executing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GAO Yang, Beijing Institute of Technology Press *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111460800A (en) * 2020-03-27 2020-07-28 深圳价值在线信息科技股份有限公司 Event generation method and device, terminal equipment and storage medium
CN111460800B (en) * 2020-03-27 2024-03-22 深圳价值在线信息科技股份有限公司 Event generation method, device, terminal equipment and storage medium
CN112417539A (en) * 2020-11-16 2021-02-26 杭州群核信息技术有限公司 Method, device and system for designing house type based on language description
CN112417539B (en) * 2020-11-16 2023-10-03 杭州群核信息技术有限公司 House type design method, device and system based on language description
CN112733515A (en) * 2020-12-31 2021-04-30 贝壳技术有限公司 Text generation method and device, electronic equipment and readable storage medium
CN112733507A (en) * 2021-01-16 2021-04-30 江苏网进科技股份有限公司 Method for automatically generating legal text marking event
CN112733507B (en) * 2021-01-16 2023-06-09 江苏网进科技股份有限公司 Method for automatically generating legal text marking event

Similar Documents

Publication Publication Date Title
CN110826340A (en) Evaluation text generation method and device and electronic equipment
US20180336193A1 (en) Artificial Intelligence Based Method and Apparatus for Generating Article
WO2020073673A1 (en) Text analysis method and terminal
CN111767796B (en) Video association method, device, server and readable storage medium
CN105975459B (en) A kind of the weight mask method and device of lexical item
CN106682170B (en) Application search method and device
CN107918778B (en) Information matching method and related device
CN112667780B (en) Comment information generation method and device, electronic equipment and storage medium
TW201804341A (en) Character string segmentation method, apparatus and device
EP3759621A1 (en) Content editing using ai-based content modeling
CN110782308B (en) Push method and device for recommended package, electronic equipment and readable storage medium
CN109325146A (en) A kind of video recommendation method, device, storage medium and server
CN110795935A (en) Training method and device for character word vector model, terminal and storage medium
CN111310037B (en) Household material recommendation method and device and electronic equipment
CN112380319A (en) Model training method and related device
CN113283238A (en) Text data processing method and device, electronic equipment and storage medium
CN109993216B (en) Text classification method and device based on K nearest neighbor KNN
CN112199606A (en) Social media-oriented rumor detection system based on hierarchical user representation
CN111597326A (en) Method and device for generating commodity description text
CN106569989A (en) De-weighting method and apparatus for short text
CN111651674B (en) Bidirectional searching method and device and electronic equipment
CN114492669B (en) Keyword recommendation model training method, recommendation device, equipment and medium
CN113821592A (en) Data processing method, device, equipment and storage medium
CN111506717B (en) Question answering method, device, equipment and storage medium
CN117764669A (en) Article recommendation method, device, equipment, medium and product

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination