CN110597966A - Automatic question answering method and device


Info

Publication number
CN110597966A
CN110597966A
Authority
CN
China
Prior art keywords
question, sentence, sample, sentences, statement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810502726.3A
Other languages
Chinese (zh)
Inventor
陈华杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Gridsum Technology Co Ltd
Original Assignee
Beijing Gridsum Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Gridsum Technology Co Ltd filed Critical Beijing Gridsum Technology Co Ltd
Priority to CN201810502726.3A priority Critical patent/CN110597966A/en
Priority to PCT/CN2019/073662 priority patent/WO2019223362A1/en
Publication of CN110597966A publication Critical patent/CN110597966A/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33: Querying

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

According to the automatic question-answering method and device provided by the invention, the user question sentence and a candidate question set are obtained first. A question model, obtained by training an attention neural network on question-sentence training samples, then determines a question sentence matching the user question sentence from the candidate question set as the target question sentence. Finally, answer information corresponding to the target question sentence is used to respond to the user question sentence. The characteristics of the attention neural network make the automatic question-answering process and results realized by the question model interpretable and improve the automatic question-answering effect.

Description

Automatic question answering method and device
Technical Field
The invention relates to the field of information processing, in particular to an automatic question answering method and device.
Background
Automatic question-answering technology automatically answers the questions users encounter when handling related affairs, in the form of intelligent assistants or intelligent customer service. It greatly reduces the workload of manual customer service while giving users faster question-answering service.
In the traditional automatic question-answering scheme, the automatic question-answering process is converted into retrieval of the user question in a frequently-asked-question bank. Specifically, natural language processing tools analyze the user question sentence and identify its keywords, and matching questions and their answers are retrieved from the question bank according to those keywords. However, the machine learning methods adopted in such schemes are usually based on a bag-of-words model, which ignores the order of the words in question sentences and loses part of their semantic information. As a result, the machine learning process and its results lack interpretability, and so do the automatic question-answering process and its results.
Therefore, a practical and effective solution is needed to improve the interpretability of the automatic question-answering process and its results.
Disclosure of Invention
In view of the above, the present invention has been made to provide an automatic question-answering method and apparatus that overcome or at least partially solve the above-mentioned problems.
To achieve the above object, the invention provides the following technical scheme:
an automatic question answering method, comprising:
obtaining question sentences of a user and a candidate question set, wherein the question sentences in the candidate question set correspond to preset answer information;
determining question sentences matched with the question sentences of the user from the candidate question set by using a preset question model as target question sentences; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
and responding to the question sentence of the user by adopting answer information corresponding to the target question sentence.
Preferably, the obtaining of the user question statement and the candidate question set includes:
acquiring a question sentence of a user;
adopting BM25 algorithm to retrieve at least one question sentence associated with the user question sentence from a preset question library as a candidate question set; and all question sentences in the preset question bank correspond to preset answer information.
Preferably, the determining, by using a preset problem model, a problem statement matched with the user problem statement from the candidate problem set, as a target problem statement, includes:
acquiring the similarity between the question sentences of the user and the question sentences in the candidate question set by using a preset question model;
and determining the question sentences of which the similarity accords with a preset similarity condition in the candidate question set as question sentences matched with the question sentences of the user as target question sentences.
Preferably, the training process of the problem model includes:
obtaining a training sample, wherein the training sample comprises a sample question statement;
and training an attention neural network by adopting the sample question sentence to obtain the question model.
Preferably, the acquiring training samples includes:
acquiring a first question sentence;
acquiring a second question statement with the same semantic as the first question statement and a third question statement with the different semantic from the first question statement;
taking the second question statement as a positive sample of the first question statement, and taking the third question statement as a negative sample of the first question statement; wherein the positive sample and the negative sample are training samples.
Preferably, the training of the attention neural network by using the sample question statement to obtain the problem model includes:
obtaining word vectors corresponding to all words in the sample question sentences;
extracting a feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector;
determining weight information corresponding to each moment in the sample question sentence by using an attention mechanism;
determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences;
and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
Preferably, before the training of the attention neural network using the sample question statement, the method further includes:
acquiring a text sentence of a target field;
performing word segmentation processing on the text sentence to obtain a word segmentation result of the text sentence;
performing word vector training on the word segmentation result of the text sentence to obtain a word vector model;
Correspondingly,
the obtaining of the word vector corresponding to each word in the sample question sentence includes:
performing word segmentation processing on the sample question sentence to obtain a word segmentation result of the sample question sentence;
and obtaining a word vector corresponding to each word in the sample question sentence according to the word vector model.
An automatic question answering device comprising:
the question acquisition unit is used for acquiring question sentences of a user and a candidate question set, wherein the question sentences in the candidate question set correspond to preset answer information;
a problem determining unit, configured to determine, from the candidate problem set, a problem statement matched with the user problem statement as a target problem statement by using a preset problem model; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
and the question responding unit is used for responding to the question sentences of the user by adopting answer information corresponding to the target question sentences.
A storage medium, the storage medium comprising a stored program, wherein, when the program runs, a device in which the storage medium is located is controlled to execute the above automatic question answering method.
A processor for executing a program, wherein the program executes the above automatic question answering method.
By means of the above technical scheme, the automatic question-answering method and device provided by the invention obtain the user question sentence and a candidate question set first, then use a question model, obtained by training an attention neural network on question-sentence training samples, to determine a question sentence matching the user question sentence from the candidate question set as the target question sentence, and finally respond to the user question sentence with the answer information corresponding to the target question sentence. The characteristics of the attention neural network make the automatic question-answering process and results interpretable and improve the automatic question-answering effect.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart of an automatic question answering method according to an embodiment of the present application;
fig. 2 is another flowchart of an automatic question answering method according to an embodiment of the present application;
FIG. 3 is a flowchart of a problem model training process provided by an embodiment of the present application;
FIG. 4 is another flow chart of a problem model training process provided by embodiments of the present application;
FIG. 5 is a diagram illustrating sentence encoding provided by an embodiment of the present application;
FIG. 6 is a flowchart of a problem model training process provided by an embodiment of the present application;
fig. 7 is a schematic structural diagram of an automatic question answering device according to an embodiment of the present application;
fig. 8 is another schematic structural diagram of an automatic question answering device according to an embodiment of the present application;
fig. 9 is an exemplary diagram of an automatic question answering service flow provided in an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, fig. 1 is a flowchart of an automatic question answering method according to an embodiment of the present application.
As shown in fig. 1, the method includes:
s101: and acquiring a user question sentence and a candidate question set.
The user question sentence is a sentence representing a question posed by a user, and the user question sentence can be a question sentence in a text form directly input by the user or a question sentence obtained by converting question voice spoken by the user into a text form.
The candidate question set is a question set used for matching question sentences of the user, and the question sentences in the candidate question set correspond to preset answer information.
S102: and determining question sentences matched with the question sentences of the user from the candidate question set by using a preset question model to serve as target question sentences.
The question model is obtained by training an attention neural network with historical question sentences as training samples. When a question sentence is encoded as a time sequence, each word position corresponds to a moment. The attention neural network can learn weight information for the words at different moments, effectively raising the weight of important words and reducing interference from useless information in the question sentence. When a sentence vector is then generated from this per-moment weight information, it carries the semantic information of the question sentence, which improves the interpretability of the sentence vector.
The problem model is actually an attention neural network model, which has the characteristics of an attention neural network. Therefore, by using the problem model to match the user problem statement from the candidate problem set, the interpretability of the matching result can be improved.
A question sentence in the candidate question set that matches the user question sentence should have the same semantics and express the same meaning; that is, the target question sentence and the user question sentence are actually the same question. Their semantics should be the same, but their expression forms may be the same or different. For example, the target question sentence may be a deformed sentence of the user question sentence, having the same semantics but a different expression.
S103: and responding to the question sentence of the user by adopting answer information corresponding to the target question sentence.
Because the target question sentence matches the user question sentence, the two are actually the same question; and because the target question sentence corresponds to preset answer information, that answer information is in fact the answer information for the user question sentence.
After the answer information of the user question sentences is determined, the answer information is adopted to respond to the user question sentences, so that the automatic question answering process is realized.
In the automatic question-answering method provided by this embodiment, the user question sentence and a candidate question set are obtained first. A question model, obtained by training an attention neural network on question-sentence training samples, then determines a question sentence matching the user question sentence from the candidate question set as the target question sentence. Finally, the answer information corresponding to the target question sentence is used to respond to the user question sentence. Thanks to the characteristics of the attention neural network, the automatic question-answering process and its results are interpretable, which improves the automatic question-answering effect.
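The three steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation; all function and variable names (`answer`, `retrieve`, `match`, the toy question bank) are hypothetical.

```python
# Sketch of the S101-S103 pipeline. The retriever and matcher are trivial
# stand-ins; in the patent they are BM25 retrieval and the question model.

def answer(user_question, question_bank, retrieve, match):
    """Return the preset answer info for the question matched to user_question."""
    candidates = retrieve(user_question, question_bank)   # S101: candidate set
    target = match(user_question, candidates)             # S102: target question
    # S103: respond with the answer preset for the target question
    return question_bank[target] if target is not None else None

# Toy question bank mapping question sentences to preset answer information.
bank = {"how do I reset my password": "Click 'Forgot password' on the login page."}
retrieve = lambda q, bank: list(bank)                     # trivial retriever
match = lambda q, cands: cands[0] if cands else None      # trivial matcher

print(answer("how do I reset my password", bank, retrieve, match))
```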
Referring to fig. 2, fig. 2 is another flowchart of an automatic question answering method according to an embodiment of the present application.
As shown in fig. 2, the method includes:
s201: and acquiring a question sentence of the user.
S202: and using the BM25 algorithm to retrieve at least one question sentence associated with the user question sentence from a preset question library as a candidate question set.
The preset question bank (FAQ, Frequently Asked Questions) is a question bank constructed from question sentences collected in advance. Specifically, it may cover a single field or multiple fields.
In one example, the question statements in the preset question bank may include: a frequently asked question statement of a target domain, and a deformed statement of the frequently asked question statement.
A deformed sentence of a frequently asked question sentence is a question sentence with the same semantics but a different expression form; such sentences can be obtained, for example, by crawling internet search results for the frequently asked question sentence. Deformed sentences enrich the expression forms of the frequently asked question sentences and improve the hit rate for user question sentences.
In one example, the target domain may refer to a judicial domain, a financial domain, a computer technology domain, or other domains.
The question sentences in the preset question bank correspond to preset answer information, and the preset answer information can be manually summarized and written or acquired by other methods.
The BM25 (Best Match 25) algorithm evaluates the relevance between search terms and documents, with high retrieval efficiency and good retrieval quality. Using BM25 to retrieve at least one question sentence associated with the user question sentence from the preset question bank as the candidate question set, instead of using the whole preset question bank as the candidate set, reduces the data volume of the candidate question set and speeds up matching the user question sentence against it.
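As a sketch of how such a retrieval step might look, the following implements the standard Okapi BM25 scoring formula over a tokenized question bank. The function names and toy data are illustrative, and the patent does not specify BM25 parameters such as k1 and b:

```python
import math
from collections import Counter

def bm25_scores(query_tokens, corpus_tokens, k1=1.5, b=0.75):
    """Okapi BM25 score of each document for the query (higher = more relevant)."""
    N = len(corpus_tokens)
    avgdl = sum(len(d) for d in corpus_tokens) / N
    df = Counter(t for d in corpus_tokens for t in set(d))  # document frequency
    scores = []
    for doc in corpus_tokens:
        tf = Counter(doc)
        s = 0.0
        for t in query_tokens:
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1)
            denom = tf[t] + k1 * (1 - b + b * len(doc) / avgdl)
            s += idf * tf[t] * (k1 + 1) / denom
        scores.append(s)
    return scores

# Rank a tiny tokenized question bank and keep the top-2 candidates.
bank = [["reset", "password"], ["change", "email"], ["password", "rules"]]
scores = bm25_scores(["reset", "password"], bank)
top = sorted(range(len(bank)), key=lambda i: scores[i], reverse=True)[:2]
```

The `top` indices would then index into the question bank to form the candidate question set.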
S203: and acquiring the similarity between the question sentences of the user and the question sentences in the candidate question set by using a preset question model.
By using the problem model, the user problem statement and any problem statement in the candidate problem set can be converted into corresponding sentence vectors, and the similarity between the user problem statement and any problem statement in the candidate problem set can be obtained by calculating the similarity between the sentence vectors.
The question model is obtained by training an attention neural network with historical question sentences as training samples. Sentence vectors produced by the model contain the semantic information of the question sentences, so the more similar the semantics of two question sentences, the higher the similarity of their sentence vectors.
S204: and determining the question sentences of which the similarity accords with a preset similarity condition in the candidate question set as question sentences matched with the question sentences of the user as target question sentences.
After the similarity between the user question statement and each question statement in the candidate question set is obtained, the question statement with the highest similarity in the candidate question set may be determined as a target question statement, or the question statement with the highest similarity in the candidate question set and larger than a preset similarity threshold may be determined as a target question statement.
Correspondingly, the preset similarity condition is as follows: the similarity is the highest, or the similarity is the highest and is greater than a preset similarity threshold.
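A minimal sketch of this matching step, assuming sentence vectors have already been produced by the question model and using cosine similarity with a "highest and above threshold" condition. The threshold value and the function names are hypothetical:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two sentence vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def pick_target(user_vec, candidate_vecs, threshold=0.8):
    """Return the index of the most similar candidate if it clears the
    preset similarity threshold, otherwise None (no matching question)."""
    sims = [cosine(user_vec, c) for c in candidate_vecs]
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None

# Toy sentence vectors (in practice produced by the question model).
user = np.array([1.0, 0.0, 1.0])
cands = [np.array([1.0, 0.1, 0.9]), np.array([0.0, 1.0, 0.0])]
```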
S205: and responding to the question sentence of the user by adopting answer information corresponding to the target question sentence.
In the automatic question-answering method provided by this embodiment, after the user question sentence is obtained, the BM25 algorithm first retrieves at least one question sentence associated with it from the preset question bank as the candidate question set, instead of taking the whole preset question bank as the candidate set. This reduces the data volume of the candidate question set, speeds up matching the user question sentence with the question model, shortens the response time of the whole automatic question-answering process, and fully meets the high-concurrency, quick-response requirements of the automatic question-answering domain.
Referring to fig. 3, fig. 3 is a flowchart of a problem model training process according to an embodiment of the present application.
The problem model training process provided by this embodiment is a process of training an attention neural network by using a problem statement training sample.
As shown in fig. 3, the problem model training process includes:
s301: training samples are obtained.
The training samples comprise sample question sentences, and the sample question sentences are question sentences serving as training samples.
In an example, the sample question statement may be a question statement in a preset question bank (FAQ), or may be a question statement acquired through another channel.
In one example, the process of obtaining training samples may include:
a1, acquiring a first question statement;
the first question statement may be any question statement in the preset question bank.
a2, acquiring a second question statement with the same semantic as the first question statement, and a third question statement with the different semantic from the first question statement;
a3, taking the second question sentence as a positive sample of the first question sentence, and taking the third question sentence as a negative sample of the first question sentence; wherein the positive sample and the negative sample are training samples.
The second question statement and the third question statement may also be question statements in the preset question bank; the number of the second question sentences may be one or more; likewise, the number of the third question sentences may be one or more.
By using the steps a 1-a 3, question sentences with the same semantic meaning can be used as positive samples, and question sentences with different semantic meanings can be used as negative samples for each question sentence in the preset question library.
In one example, confusing question sentences with different semantics may also be added as negative samples by unsupervised methods, such as at least one of: a TF-IDF (term frequency-inverse document frequency) vector space model, BM25, and the WMD (Word Mover's Distance) algorithm.
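The sample construction in steps a1 to a3 can be sketched as follows; the grouping of equivalent phrasings and all names here are illustrative assumptions, not the patent's data format:

```python
import random

def build_triplets(groups, rng=random.Random(0)):
    """Build (anchor, positive, negative) training triples from groups of
    semantically equivalent question sentences (steps a1-a3).
    `groups` maps a question id to its list of equivalent phrasings."""
    ids = list(groups)
    triplets = []
    for qid in ids:
        phrasings = groups[qid]
        for anchor in phrasings:                      # a1: first question sentence
            for positive in phrasings:                # a2/a3: same semantics -> positive
                if positive == anchor:
                    continue
                other = rng.choice([i for i in ids if i != qid])
                negative = rng.choice(groups[other])  # different semantics -> negative
                triplets.append((anchor, positive, negative))
    return triplets

groups = {
    "reset": ["how do I reset my password", "forgot my password, what now"],
    "hours": ["what are your opening hours", "when are you open"],
}
triples = build_triplets(groups)
```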
S302: and training an attention neural network by adopting the sample question sentence to obtain the question model.
In one example, the attention neural network can be trained with TensorFlow on the obtained training samples. The goal of training is to make question sentences with similar semantics have similar sentence vectors (question sentences with the same semantics should have the same sentence vector); similarity can be measured by cosine similarity.
In other examples, the attention neural network may also be trained with other machine learning frameworks, such as Theano, Keras, or Torch.
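One common way to realize the stated training goal (similar semantics, similar sentence vectors; dissimilar semantics, dissimilar vectors) is a triplet hinge loss on cosine similarity. The patent does not name its loss function, so the following numpy sketch is an assumption:

```python
import numpy as np

def cos(u, v):
    """Cosine similarity between two sentence vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def triplet_cosine_loss(anchor, positive, negative, margin=0.2):
    """Hinge loss that pushes cos(anchor, positive) above
    cos(anchor, negative) by at least `margin`; zero once satisfied."""
    return max(0.0, margin - cos(anchor, positive) + cos(anchor, negative))
```

During training, this loss would be minimized over the (anchor, positive, negative) triples so that semantically equivalent question sentences end up with nearly identical sentence vectors.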
In the problem model training process provided by this embodiment, after the training samples are obtained, the attention neural network is trained by using the sample problem sentences, so that the problem sentences with similar semantics can have similar sentence vectors, and thus, both the machine learning process and the learning result have strong interpretability, and the problem model with interpretability is obtained.
Referring to fig. 4, fig. 4 is another flowchart of a problem model training process according to an embodiment of the present application.
As shown in fig. 4, the problem model training process includes:
s401: training samples are obtained.
The training samples comprise sample question sentences, and the sample question sentences are question sentences serving as training samples.
S402: and acquiring a word vector corresponding to each word in the sample question sentence.
When training the attention neural network, each word in the sample question sentence needs to be converted into a word vector form as input data of the attention neural network.
S403: and extracting the feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector.
When a sentence is analyzed based on time sequence, words in the sentence can be regarded as a time sequence, and each word in the sentence corresponds to a time.
By utilizing the recurrent neural network with the bidirectional gate structure, the information at the current moment can be coded by fully combining the context relationship of the word vector, so that the feature vector with more semantic information is extracted. The recurrent neural network of the bidirectional gate structure may include a Long Short-Term Memory network (LSTM), a Gated Recurrent Unit (GRU), and the like.
S404: and determining the weight information corresponding to each moment in the sample question sentence by using an attention mechanism.
Different words in a sentence have different importance, so that the weight information of different moments in the sentence can be learned by using an attention mechanism according to the importance of the different words.
S405: and determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences.
The feature vectors at all moments in the sample question sentence are weighted by the corresponding weight information and summed, yielding the sentence vector of the sample question sentence. Of course, other calculation methods may also be used to determine the sentence vector; they are not described here again.
As shown in fig. 5, the implementation process of steps S403 to S405 may be to perform feature extraction of a bidirectional gate structure on a word vector corresponding to each word in any sample question sentence, determine weights w 1-wn of a time corresponding to each word vector, and finally determine a sentence vector of the sample question sentence according to the weights w 1-wn and each extracted feature vector.
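The weighted summarization of steps S404 to S405 can be sketched in numpy as follows; normalizing raw attention scores into the weights w1 to wn with a softmax is a common choice, not something the patent spells out:

```python
import numpy as np

def softmax(x):
    """Normalize raw attention scores into weights that sum to 1."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def sentence_vector(features, scores):
    """Combine per-moment feature vectors into one sentence vector.
    `features` is (n_steps, dim), the bidirectional-RNN output per moment;
    `scores` are unnormalized attention scores, one per moment."""
    weights = softmax(np.asarray(scores))   # w1..wn as in fig. 5
    return weights @ np.asarray(features)   # weighted sum over moments

# Three moments with 2-dim features; the second word gets most attention.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
vec = sentence_vector(feats, scores=[0.0, 5.0, 0.0])
```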
S406: and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
The sentence vector obtained by encoding with the bidirectional-gate recurrent neural network and the attention mechanism fully captures the semantic information of the sample question sentence and reduces interference from useless information. Sample question sentences with the same semantics should have the same or similar sentence vectors, so the network parameters of the attention neural network are adjusted continuously according to the similarity between the sentence vectors of the sample question sentences, until a set of network parameters is found under which sample question sentences with the same semantics do have the same or similar sentence vectors, yielding a trained question model.
In the question model training process provided by this embodiment, a recurrent neural network with a bidirectional gate structure first extracts, from the word vectors of the sample question sentence, the feature vector for each moment; an attention mechanism then determines the weight information for each moment; a sentence vector is computed from these feature vectors and weights; and finally the network parameters of the attention neural network are determined from the sentence vectors to obtain the question model. The sentence-vector computation fully mines the semantic information of the sample question sentences, further improving the interpretability of the machine learning process and its results and yielding a more interpretable question model.
Referring to fig. 6, fig. 6 is a flowchart illustrating a problem model training process according to an embodiment of the present disclosure.
As shown in fig. 6, the problem model training process includes:
s501: and acquiring the text sentence of the target field.
The target field is the application field of the automatic question-answering device. To improve the device's accuracy, accurate word vectors for the words in sample question sentences need to be obtained. Taking the judicial field as an example, a large number of judicial-field text sentences can be collected as a judicial-field text sentence library in order to obtain word vectors better suited to that field.
S502: and performing word segmentation processing on the text sentence to obtain a word segmentation result of the text sentence.
The word segmentation may be performed with an open-source word segmentation tool, such as the Language Technology Platform (LTP) of Harbin Institute of Technology, to segment the text sentences in the text sentence library and obtain word segmentation results. The word segmentation result of a text sentence contains each word in the sentence.
S503: and performing word vector training on the word segmentation result of the text sentence to obtain a word vector model.
For the word segmentation results of the text sentences, a word vector training tool may be employed to generate the corresponding word vector model. Word2vec may be adopted as the word vector training tool, or another word vector training tool may be used.
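The word vector training of S503 can be sketched with a minimal skip-gram model in plain NumPy. A real system would run a library implementation such as Word2vec over the full text sentence library; the toy corpus, vector dimension, and hyperparameters below are illustrative assumptions:

```python
import numpy as np

def train_skipgram(corpus, dim=8, epochs=30, lr=0.05, seed=0):
    """Minimal skip-gram word2vec (context window 1, full softmax).

    corpus: list of segmented sentences (lists of words).
    Returns a dict mapping each word to its vector.
    """
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # word vectors
    W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context vectors
    for _ in range(epochs):
        for sent in corpus:
            for i, w in enumerate(sent):
                for j in (i - 1, i + 1):          # neighbors within window 1
                    if 0 <= j < len(sent):
                        v = W_in[idx[w]].copy()
                        p = np.exp(W_out @ v)
                        p /= p.sum()              # softmax over vocabulary
                        p[idx[sent[j]]] -= 1.0    # gradient of cross-entropy
                        W_in[idx[w]] -= lr * (W_out.T @ p)
                        W_out -= lr * np.outer(p, v)
    return {w: W_in[idx[w]] for w in vocab}

model = train_skipgram([["合同", "违约", "责任"], ["合同", "纠纷"]])
```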
S504: and acquiring a training sample, wherein the training sample comprises a sample question statement.
S505: and performing word segmentation processing on the sample question sentence to obtain a word segmentation result of the sample question sentence.
The word segmentation result of the sample question sentence comprises each word in the sample question sentence.
S506: and obtaining a word vector corresponding to each word in the sample question sentence according to the word vector model.
The word vector corresponding to a word can be obtained from the word vector model, so for each word in the sample question sentence, its word vector can be looked up in the word vector model.
In an example, when one or more words in a sample question sentence have no corresponding word vector in the word vector model, these words are called out-of-vocabulary (OOV) words, and a random word vector (e.g., the word vector of a special UNK token) is specified for them as their word vector.
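A minimal sketch of this lookup with a UNK fallback, assuming the word vector model is held as a plain dict and that all out-of-vocabulary words share one randomly initialized vector:

```python
import numpy as np

def lookup_vectors(words, wv_model, dim=8, seed=42):
    """Return a (len(words) x dim) matrix of word vectors.

    Words missing from wv_model (out-of-vocabulary) all map to a
    single randomly initialized UNK vector, as described above.
    """
    rng = np.random.default_rng(seed)
    unk = rng.normal(scale=0.1, size=dim)  # shared UNK vector (an assumption)
    return np.stack([wv_model.get(w, unk) for w in words])

# Hypothetical model containing only one word:
wv = {"合同": np.zeros(8)}
matrix = lookup_vectors(["合同", "某生僻词"], wv)  # second word is OOV
```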
S507: and extracting the feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector.
S508: and determining the weight information corresponding to each moment in the sample question sentence by using an attention mechanism.
S509: determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences;
S510: and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
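Steps S507 to S509 can be sketched numerically as follows. The GRU-style gate equations, the attention parameterization with a single learned query vector `u`, and all dimensions are assumptions made for illustration; the text only specifies a recurrent network with a bidirectional gate structure combined with an attention mechanism:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_params(d_in, d_h, rng):
    # Wz, Uz, Wr, Ur, Wh, Uh for one GRU direction
    shapes = [(d_h, d_in), (d_h, d_h)] * 3
    return tuple(rng.normal(scale=0.1, size=s) for s in shapes)

def gru_states(X, params):
    """Hidden state at each moment for one direction (T x d_h)."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    h = np.zeros(Uz.shape[0])
    states = []
    for x in X:                                  # one moment per word vector
        z = sigmoid(Wz @ x + Uz @ h)             # update gate
        r = sigmoid(Wr @ x + Ur @ h)             # reset gate
        h_new = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
        h = (1 - z) * h + z * h_new
        states.append(h)
    return np.array(states)

def sentence_vector(X, p_fwd, p_bwd, u):
    """S507: bidirectional features per moment; S508: attention
    weights; S509: weighted sum -> sentence vector."""
    H = np.concatenate([gru_states(X, p_fwd),
                        gru_states(X[::-1], p_bwd)[::-1]], axis=1)  # T x 2d_h
    scores = H @ u                      # one score per moment
    a = np.exp(scores - scores.max())
    a /= a.sum()                        # softmax attention weights (S508)
    return a @ H                        # sentence vector (S509)
```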
In the problem model training process provided by this embodiment, text sentences of the target field are obtained and subjected to word segmentation, and word vector training is then performed on the segmentation results to obtain a word vector model. This improves the efficiency of obtaining the word vector for each word in the sample problem sentences, and thus the efficiency of problem model training.
Corresponding to the automatic question answering method, an embodiment of the present invention further provides a corresponding automatic question answering device. The technical content of the automatic question answering device described below and that of the automatic question answering method described above may be referred to in correspondence with each other.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an automatic question answering device according to an embodiment of the present application.
The automatic question answering device of the present embodiment is used for implementing the automatic question answering method of the foregoing embodiment, and as shown in fig. 7, the device includes:
the question obtaining unit 100 is configured to obtain question sentences of a user and a candidate question set, where the question sentences in the candidate question set all correspond to preset answer information.
A problem determining unit 200, configured to determine, from the candidate problem set, a problem statement matched with the user problem statement as a target problem statement by using a preset problem model; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
the question responding unit 300 is configured to respond to the question sentence of the user by using answer information corresponding to the target question sentence.
In an example, the question obtaining unit 100 is specifically configured to:
acquiring a question sentence of a user;
adopting BM25 algorithm to retrieve at least one question sentence associated with the user question sentence from a preset question library as a candidate question set; and all question sentences in the preset question bank correspond to preset answer information.
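A self-contained sketch of this retrieval step, using the standard Okapi BM25 formula with commonly used parameter values (k1 = 1.5, b = 0.75; these values are assumed, as the text names only the algorithm):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Okapi BM25 score of each candidate question against the user
    question. query and docs are lists of words produced by the
    word segmentation step."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter()                       # document frequency per word
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for w in query:
            if w not in tf:
                continue
            idf = math.log((N - df[w] + 0.5) / (df[w] + 0.5) + 1.0)
            s += idf * tf[w] * (k1 + 1) / (
                tf[w] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

# Hypothetical question bank: the first question shares words with the query.
docs = [["合同", "违约", "责任"], ["今天", "天气"]]
print(bm25_scores(["违约", "责任"], docs))  # first score is highest
```

The top-scoring questions would then form the candidate question set.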
In an example, the problem determining unit 200 is specifically configured to:
acquiring the similarity between the question sentences of the user and the question sentences in the candidate question set by using a preset question model;
and determining the question sentences of which the similarity accords with a preset similarity condition in the candidate question set as question sentences matched with the question sentences of the user as target question sentences.
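The matching step above can be sketched as cosine similarity between sentence vectors followed by a threshold check. The cosine measure and the 0.8 threshold are assumed stand-ins for the "preset similarity condition", which the text does not specify:

```python
import numpy as np

def pick_target(user_vec, cand_vecs, threshold=0.8):
    """Return the index of the candidate question whose sentence
    vector is most similar (cosine) to the user question's vector,
    or None when no candidate meets the similarity condition."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    sims = [cos(user_vec, c) for c in cand_vecs]
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None
```

The selected index identifies the target question sentence, whose preset answer information is then returned to the user.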
The automatic question-answering device provided in this embodiment first obtains a user question sentence and a candidate question set; it then uses a question model, obtained by training an attention neural network on question sentence training samples, to determine from the candidate question set a question sentence matched with the user question sentence as the target question sentence; and it finally responds to the user question sentence with the answer information corresponding to the target question sentence. By exploiting the characteristics of the attention neural network, the automatic question-answering process and its results realized with the question model are made interpretable, which improves the automatic question-answering effect.
Referring to fig. 8, fig. 8 is another schematic structural diagram of an automatic question answering device according to an embodiment of the present application.
As shown in fig. 8, the automatic question answering device of the present embodiment includes, in addition to the question acquisition unit 100, the question determination unit 200, and the question answering unit 300 of the previous embodiments, further: a model training unit 400 and a question bank unit 500.
The model training unit 400 is configured to obtain a training sample, where the training sample includes a sample question statement; and training an attention neural network by adopting the sample question sentence to obtain the question model.
The question bank unit 500 is configured to collect historical question sentences, configure a question bank according to the collected historical question sentences, and obtain a preset question bank.
In one example, the process of obtaining training samples by the model training unit 400 includes:
acquiring a first question sentence;
acquiring a second question statement with the same semantic as the first question statement and a third question statement with the different semantic from the first question statement;
taking the second question statement as a positive sample of the first question statement, and taking the third question statement as a negative sample of the first question statement; wherein the positive sample and the negative sample are training samples.
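One common way to use such positive and negative samples is a triplet hinge loss over the sentence vectors of the first, second, and third question sentences. The loss form and margin value below are assumptions for illustration, since the text only states that positive and negative samples serve as training samples:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge loss on sentence vectors: pull the same-meaning pair
    (anchor, positive) together and push the different-meaning pair
    (anchor, negative) apart by at least `margin` (assumed value)."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, margin + d_pos - d_neg)
```

During training, the network parameters would be adjusted to drive this loss toward zero across the training samples.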
In an example, the model training unit 400 trains the attention neural network by using the sample question statement, and obtaining the question model includes:
obtaining word vectors corresponding to all words in the sample question sentences;
extracting a feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector;
determining weight information corresponding to each moment in the sample question sentence by using an attention mechanism;
determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences;
and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
In an example, the model training unit 400 is further configured to, before training the attention neural network using the sample question statement:
acquiring a text sentence of a target field;
performing word segmentation processing on the text sentence to obtain a word segmentation result of the text sentence;
and performing word vector training on the word segmentation result of the text sentence to obtain a word vector model.
Accordingly,
the process of the model training unit 400 obtaining the word vector corresponding to each word in the sample question sentence includes:
performing word segmentation processing on the sample question sentence to obtain a word segmentation result of the sample question sentence;
and obtaining a word vector corresponding to each word in the sample question sentence according to the word vector model.
According to the word vectors corresponding to the words in the sample question sentences, the automatic question answering device provided by this embodiment extracts the feature vector corresponding to each moment in the sample question sentences using a recurrent neural network with a bidirectional gate structure; it determines the weight information corresponding to each moment using an attention mechanism; and it finally determines the network parameters of the attention neural network from the resulting sentence vectors to obtain the problem model. Because the sentence vectors are determined in this way, the semantic information of the sample problem sentences is fully mined, which improves the interpretability of the machine learning process and its results and yields a problem model with stronger interpretability.
Referring to fig. 9, fig. 9 is a diagram illustrating an example of an automatic question answering service flow according to an embodiment of the present application.
As shown in fig. 9, the automatic question answering business process of this embodiment includes:
First, a user query is acquired, and preprocessing and user intention recognition are performed on it to obtain the user question sentence; the preprocessing may include personalized word segmentation and keyword extraction.
Then, at least one question sentence associated with the user question sentence is preliminarily retrieved from a knowledge base (i.e., a preset question base) as a candidate question set. The acquisition mode of the question sentences in the knowledge base can comprise at least one of network intelligent crawling, user data mining and user self-definition.
And finally, carrying out semantic matching on the user question sentences and the candidate question set by using the trained question model, determining target question sentences and answer information thereof, and responding to the user question sentences by adopting the answer information.
The automatic question answering device provided by the embodiment of the invention comprises a processor and a memory. The question acquisition unit 100, the question determination unit 200, the question responding unit 300, the model training unit 400, the question bank unit 500, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize the corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels may be provided; by adjusting the kernel parameters, the technical problem that existing automatic question-answering schemes lack interpretability is addressed.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium on which a program is stored, the program implementing the automatic question answering method when executed by a processor.
The embodiment of the invention provides a processor, which is used for running a program, wherein the automatic question answering method is executed when the program runs.
The embodiment of the invention provides equipment, which comprises a processor, a memory and a program which is stored on the memory and can run on the processor, wherein the processor executes the program and realizes the following steps:
obtaining question sentences of a user and a candidate question set, wherein the question sentences in the candidate question set correspond to preset answer information;
determining question sentences matched with the question sentences of the user from the candidate question set by using a preset question model as target question sentences; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
and responding to the question sentence of the user by adopting answer information corresponding to the target question sentence.
Preferably, the obtaining of the user question statement and the candidate question set includes:
acquiring a question sentence of a user;
adopting BM25 algorithm to retrieve at least one question sentence associated with the user question sentence from a preset question library as a candidate question set; and all question sentences in the preset question bank correspond to preset answer information.
Preferably, the determining, by using a preset problem model, a problem statement matched with the user problem statement from the candidate problem set, as a target problem statement, includes:
acquiring the similarity between the question sentences of the user and the question sentences in the candidate question set by using a preset question model;
and determining the question sentences of which the similarity accords with a preset similarity condition in the candidate question set as question sentences matched with the question sentences of the user as target question sentences.
Preferably, the training process of the problem model includes:
obtaining a training sample, wherein the training sample comprises a sample question statement;
and training an attention neural network by adopting the sample question sentence to obtain the question model.
Preferably, the acquiring training samples includes:
acquiring a first question sentence;
acquiring a second question statement with the same semantic as the first question statement and a third question statement with the different semantic from the first question statement;
taking the second question statement as a positive sample of the first question statement, and taking the third question statement as a negative sample of the first question statement; wherein the positive sample and the negative sample are training samples.
Preferably, the training of the attention neural network by using the sample question statement to obtain the problem model includes:
obtaining word vectors corresponding to all words in the sample question sentences;
extracting a feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector;
determining weight information corresponding to each moment in the sample question sentence by using an attention mechanism;
determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences;
and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
Preferably, before the training of the attention neural network using the sample question statement, the method further includes:
acquiring a text sentence of a target field;
performing word segmentation processing on the text sentence to obtain a word segmentation result of the text sentence;
performing word vector training on the word segmentation result of the text sentence to obtain a word vector model;
Accordingly,
the obtaining of the word vector corresponding to each word in the sample question sentence includes:
performing word segmentation processing on the sample question sentence to obtain a word segmentation result of the sample question sentence;
and obtaining a word vector corresponding to each word in the sample question sentence according to the word vector model.
The device herein may be a server, a PC, a tablet (PAD), a mobile phone, etc.
The present application further provides a computer program product adapted to perform a program for initializing the following method steps when executed on a data processing device:
obtaining question sentences of a user and a candidate question set, wherein the question sentences in the candidate question set correspond to preset answer information;
determining question sentences matched with the question sentences of the user from the candidate question set by using a preset question model as target question sentences; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
and responding to the question sentence of the user by adopting answer information corresponding to the target question sentence.
Preferably, the obtaining of the user question statement and the candidate question set includes:
acquiring a question sentence of a user;
adopting BM25 algorithm to retrieve at least one question sentence associated with the user question sentence from a preset question library as a candidate question set; and all question sentences in the preset question bank correspond to preset answer information.
Preferably, the determining, by using a preset problem model, a problem statement matched with the user problem statement from the candidate problem set, as a target problem statement, includes:
acquiring the similarity between the question sentences of the user and the question sentences in the candidate question set by using a preset question model;
and determining the question sentences of which the similarity accords with a preset similarity condition in the candidate question set as question sentences matched with the question sentences of the user as target question sentences.
Preferably, the training process of the problem model includes:
obtaining a training sample, wherein the training sample comprises a sample question statement;
and training an attention neural network by adopting the sample question sentence to obtain the question model.
Preferably, the acquiring training samples includes:
acquiring a first question sentence;
acquiring a second question statement with the same semantic as the first question statement and a third question statement with the different semantic from the first question statement;
taking the second question statement as a positive sample of the first question statement, and taking the third question statement as a negative sample of the first question statement; wherein the positive sample and the negative sample are training samples.
Preferably, the training of the attention neural network by using the sample question statement to obtain the problem model includes:
obtaining word vectors corresponding to all words in the sample question sentences;
extracting a feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector;
determining weight information corresponding to each moment in the sample question sentence by using an attention mechanism;
determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences;
and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
Preferably, before the training of the attention neural network using the sample question statement, the method further includes:
acquiring a text sentence of a target field;
performing word segmentation processing on the text sentence to obtain a word segmentation result of the text sentence;
performing word vector training on the word segmentation result of the text sentence to obtain a word vector model;
Accordingly,
the obtaining of the word vector corresponding to each word in the sample question sentence includes:
performing word segmentation processing on the sample question sentence to obtain a word segmentation result of the sample question sentence;
and obtaining a word vector corresponding to each word in the sample question sentence according to the word vector model.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. An automatic question answering method is characterized by comprising the following steps:
obtaining question sentences of a user and a candidate question set, wherein the question sentences in the candidate question set correspond to preset answer information;
determining question sentences matched with the question sentences of the user from the candidate question set by using a preset question model as target question sentences; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
and responding to the question sentence of the user by adopting answer information corresponding to the target question sentence.
2. The method of claim 1, wherein obtaining the user question statement and the set of candidate questions comprises:
acquiring a question sentence of a user;
adopting BM25 algorithm to retrieve at least one question sentence associated with the user question sentence from a preset question library as a candidate question set; and all question sentences in the preset question bank correspond to preset answer information.
3. The method of claim 1, wherein the identifying a question statement from the candidate question set that matches the user question statement using a preset question model as a target question statement comprises:
acquiring the similarity between the question sentences of the user and the question sentences in the candidate question set by using a preset question model;
and determining the question sentences of which the similarity accords with a preset similarity condition in the candidate question set as question sentences matched with the question sentences of the user as target question sentences.
4. The method of claim 1, wherein the training process of the problem model comprises:
obtaining a training sample, wherein the training sample comprises a sample question statement;
and training an attention neural network by adopting the sample question sentence to obtain the question model.
5. The method of claim 4, wherein the obtaining training samples comprises:
acquiring a first question sentence;
acquiring a second question statement with the same semantic as the first question statement and a third question statement with the different semantic from the first question statement;
taking the second question statement as a positive sample of the first question statement, and taking the third question statement as a negative sample of the first question statement; wherein the positive sample and the negative sample are training samples.
6. The method of claim 4, wherein the training an attention neural network with the sample question statements to obtain the question model comprises:
obtaining word vectors corresponding to all words in the sample question sentences;
extracting a feature vector corresponding to each moment in the sample question sentence by utilizing a recurrent neural network with a bidirectional gate structure according to the word vector;
determining weight information corresponding to each moment in the sample question sentence by using an attention mechanism;
determining sentence vectors corresponding to the sample question sentences according to the feature vectors and the weight information corresponding to all the moments in the sample question sentences;
and determining network parameters of the attention neural network according to the sentence vectors to obtain the problem model.
7. The method of claim 6, wherein prior to said training an attention neural network with said sample question statement, said method further comprises:
acquiring a text sentence of a target field;
performing word segmentation processing on the text sentence to obtain a word segmentation result of the text sentence;
performing word vector training on the word segmentation result of the text sentence to obtain a word vector model;
Accordingly,
the obtaining of the word vector corresponding to each word in the sample question sentence includes:
performing word segmentation processing on the sample question sentence to obtain a word segmentation result of the sample question sentence;
and obtaining a word vector corresponding to each word in the sample question sentence according to the word vector model.
8. An automatic question answering device, comprising:
the question acquisition unit is used for acquiring question sentences of a user and a candidate question set, wherein the question sentences in the candidate question set correspond to preset answer information;
a problem determining unit, configured to determine, from the candidate problem set, a problem statement matched with the user problem statement as a target problem statement by using a preset problem model; the problem model is obtained by training an attention neural network by taking a historical problem statement as a training sample;
and the question responding unit is used for responding to the question sentences of the user by adopting answer information corresponding to the target question sentences.
9. A storage medium characterized by comprising a stored program, wherein a device on which the storage medium is located is controlled to execute the automatic question-answering method according to any one of claims 1 to 7 when the program is executed.
10. A processor, configured to run a program, wherein the program, when running, executes the automatic question answering method according to any one of claims 1 to 7.
CN201810502726.3A 2018-05-23 2018-05-23 Automatic question answering method and device Pending CN110597966A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810502726.3A CN110597966A (en) 2018-05-23 2018-05-23 Automatic question answering method and device
PCT/CN2019/073662 WO2019223362A1 (en) 2018-05-23 2019-01-29 Automatic answering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810502726.3A CN110597966A (en) 2018-05-23 2018-05-23 Automatic question answering method and device

Publications (1)

Publication Number Publication Date
CN110597966A true CN110597966A (en) 2019-12-20

Family

ID=68616527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810502726.3A Pending CN110597966A (en) 2018-05-23 2018-05-23 Automatic question answering method and device

Country Status (2)

Country Link
CN (1) CN110597966A (en)
WO (1) WO2019223362A1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106649612B * 2016-11-29 2020-05-01 *** Co., Ltd. Method and device for automatically matching question and answer templates

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080052078A1 (en) * 1999-11-12 2008-02-28 Bennett Ian M Statistical Language Model Trained With Semantic Variants
US20170039271A1 (en) * 2015-08-03 2017-02-09 International Business Machines Corporation Scoring Automatically Generated Language Patterns for Questions using Synthetic Events
CN107545003A (en) * 2016-06-28 2018-01-05 中兴通讯股份有限公司 Automatic question-answering method and system
CN106484664A * 2016-10-21 2017-03-08 竹间智能科技(上海)有限公司 Method for calculating similarity between short texts
CN106649868A (en) * 2016-12-30 2017-05-10 首都师范大学 Method and device for matching between questions and answers
CN107358948A (en) * 2017-06-27 2017-11-17 上海交通大学 Language in-put relevance detection method based on attention model
CN107980130A * 2017-11-02 2018-05-01 深圳前海达闼云端智能科技有限公司 Automatic answering method and apparatus, storage medium, and electronic device
CN107967302A (en) * 2017-11-08 2018-04-27 江苏名通信息科技有限公司 Game customer service conversational system based on deep neural network
CN108052588A * 2017-12-11 2018-05-18 浙江大学城市学院 Construction method of a document automatic question answering system based on convolutional neural networks

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Yi Jun et al., "Python Machine Learning in Practice" (《Python机器学习实战》), Beijing: Scientific and Technical Documentation Press, 31 March 2018 *
Xin Yang et al., "Principles and Practice of Big Data Technology" (《大数据技术原理与实践》), Beijing: Beijing University of Posts and Telecommunications Press, 31 January 2018 *
Deng Chong, "Research on a Construction Method for Automatic Question Answering *** Based on CNN Semantic Matching", China Masters' Theses Full-text Database, Information Science and Technology *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113268572A (en) * 2020-02-14 2021-08-17 华为技术有限公司 Question answering method and device
CN111737424A (en) * 2020-02-21 2020-10-02 北京沃东天骏信息技术有限公司 Question matching method, device, equipment and storage medium
CN111666376A * 2020-05-21 2020-09-15 武汉大学 Answer generation method and device based on paragraph boundary scan prediction and word mover's distance cluster matching
CN111666376B * 2020-05-21 2023-07-18 武汉大学 Answer generation method and device based on paragraph boundary scan prediction and word mover's distance cluster matching
CN111666770A (en) * 2020-06-02 2020-09-15 泰康保险集团股份有限公司 Semantic matching method and device
CN111984763A (en) * 2020-08-28 2020-11-24 海信电子科技(武汉)有限公司 Question answering processing method and intelligent equipment
CN111984763B (en) * 2020-08-28 2023-09-19 海信电子科技(武汉)有限公司 Question answering processing method and intelligent device
CN112380328A (en) * 2020-11-11 2021-02-19 广州知图科技有限公司 Safety emergency response robot interaction method and system
CN112380328B (en) * 2020-11-11 2024-02-06 广州知图科技有限公司 Interaction method and system for safety emergency response robot
CN114691815A (en) * 2020-12-25 2022-07-01 科沃斯商用机器人有限公司 Model training method and device, electronic equipment and storage medium
CN113553412A (en) * 2021-06-30 2021-10-26 北京百度网讯科技有限公司 Question and answer processing method and device, electronic equipment and storage medium
CN113553412B (en) * 2021-06-30 2023-07-25 北京百度网讯科技有限公司 Question-answering processing method, question-answering processing device, electronic equipment and storage medium
CN113486165A (en) * 2021-07-08 2021-10-08 山东新一代信息产业技术研究院有限公司 FAQ automatic question answering method, equipment and medium for cloud robot

Also Published As

Publication number Publication date
WO2019223362A1 (en) 2019-11-28

Similar Documents

Publication Publication Date Title
CN110597966A (en) Automatic question answering method and device
CN110377911B (en) Method and device for identifying intention under dialog framework
CN110427463B (en) Search statement response method and device, server and storage medium
CN110442718B (en) Statement processing method and device, server and storage medium
CN110781687B (en) Same intention statement acquisition method and device
CN114625858A (en) Intelligent government affair question-answer replying method and device based on neural network
CN110717021A Method for obtaining input text for artificial intelligence interviews, and related device
CN110968776A Policy knowledge recommendation method, device, storage medium and processor
CN116150306A (en) Training method of question-answering robot, question-answering method and device
CN109522920B (en) Training method and device of synonymy discriminant model based on combination of semantic features
CN109408175B (en) Real-time interaction method and system in general high-performance deep learning calculation engine
CN112667803A (en) Text emotion classification method and device
CN111881264B (en) Method and electronic equipment for searching long text in question-answering task in open field
CN111563381A (en) Text processing method and device
CN116680368B (en) Water conservancy knowledge question-answering method, device and medium based on Bayesian classifier
CN111708870A (en) Deep neural network-based question answering method and device and storage medium
CN116028626A (en) Text matching method and device, storage medium and electronic equipment
CN110287396A (en) Text matching technique and device
Karpagam et al. Deep learning approaches for answer selection in question answering system for conversation agents
CN110851597A (en) Method and device for sentence annotation based on similar entity replacement
CN110852103A (en) Named entity identification method and device
CN114239555A (en) Training method of keyword extraction model and related device
CN115809663A (en) Exercise analysis method, exercise analysis device, exercise analysis equipment and storage medium
CN113536790A (en) Model training method and device based on natural language processing
CN112579768A (en) Emotion classification model training method, text emotion classification method and text emotion classification device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191220
