CN114065770B - Method and system for constructing semantic knowledge base based on graph neural network


Info

Publication number
CN114065770B
CN114065770B
Authority
CN
China
Prior art keywords
semantic
network
node
semantic knowledge
unit
Prior art date
Legal status
Active
Application number
CN202210046579.XA
Other languages
Chinese (zh)
Other versions
CN114065770A (en)
Inventor
邹华
姚军
王楠
丁原
徐志国
宋永生
李军
郭晓华
周红
Current Assignee
Jiang Sushengdanganguan
Jiangsu United Industrial Ltd By Share Ltd
Original Assignee
Jiangsu United Industrial Ltd By Share Ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu United Industrial Ltd By Share Ltd
Priority to CN202210046579.XA
Publication of CN114065770A
Application granted
Publication of CN114065770B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 - Handling natural language data
    • G06F 40/30 - Semantic analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 - Information retrieval of unstructured textual data
    • G06F 16/35 - Clustering; Classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods


Abstract

The invention discloses a method and a system for constructing a semantic knowledge base based on a graph neural network. The method comprises the following steps: obtaining first large-scale corpus information; identifying and extracting text objects from the first large-scale corpus information with an object extractor based on NLP technology to obtain a first text object set; constructing an object hierarchical network; performing semantic element analysis on each object in the first text object set according to the object hierarchical network to construct a first semantic element set; networking the first semantic element set with a graph neural network to obtain a first semantic knowledge network; performing reinforcement learning on the first semantic knowledge network to obtain a second semantic knowledge network; and outputting the second semantic knowledge network as a semantic knowledge base. The method solves the technical problems in the prior art that a multi-level semantic knowledge base constructed from a large-scale corpus is incomplete and that its construction is poorly automated.

Description

Method and system for constructing semantic knowledge base based on graph neural network
Technical Field
The invention relates to the field of semantic recognition, and in particular to a method and a system for constructing a semantic knowledge base based on a graph neural network.
Background
At present, with the deepening of internet technology and the continuous development of artificial intelligence, a number of computational methods have been developed for language understanding. A semantic knowledge base is the medium and carrier for understanding text content and is widely applied in many fields, raising the degree of intelligence of computer execution. A well-developed semantic knowledge base can make the data processing of a computer more intelligent, and as an important part of language processing systems for intelligent computing, the semantic knowledge base remains a research hotspot.
However, the prior art suffers from the technical problems that a multi-level semantic knowledge base constructed from a large-scale corpus is insufficiently complete and that the level of automation of its construction is low.
Disclosure of Invention
In view of the defects of the prior art, the method and the system for constructing a semantic knowledge base based on a graph neural network solve the technical problems that, in the prior art, the multi-level semantic knowledge base is incomplete and its construction is poorly automated, and achieve the technical effect of automatically constructing a multi-level semantic knowledge base by automatically building a semantic knowledge network that takes objects as nodes and the relationships between objects as edges.
In one aspect, the present application provides a method for constructing a semantic knowledge base based on a graph neural network, the method including: obtaining first large-scale corpus information; identifying and extracting text objects from the first large-scale corpus information with an object extractor based on NLP technology to obtain a first text object set; constructing an object hierarchical network; performing semantic element analysis on each object in the first text object set according to the object hierarchical network to construct a first semantic element set; networking the first semantic element set with a graph neural network to obtain a first semantic knowledge network; performing reinforcement learning on the first semantic knowledge network to obtain a second semantic knowledge network; and outputting the second semantic knowledge network in the form of a semantic knowledge base, and inputting the semantic knowledge base into the object hierarchical network to form closed-loop compensation.
In another aspect, the present application further provides a system for constructing a semantic knowledge base based on a graph neural network, the system including: a first obtaining unit, configured to obtain first large-scale corpus information; a second obtaining unit, configured to identify and extract text objects from the first large-scale corpus information with an object extractor based on NLP technology to obtain a first text object set; a first construction unit, configured to construct an object hierarchical network; a second construction unit, configured to perform semantic element analysis on each object in the first text object set according to the object hierarchical network to construct a first semantic element set; a first networking unit, configured to network the first semantic element set with a graph neural network to obtain a first semantic knowledge network; a first reinforcement unit, configured to perform reinforcement learning on the first semantic knowledge network to obtain a second semantic knowledge network; and a first output unit, configured to output the second semantic knowledge network in the form of a semantic knowledge base and input the semantic knowledge base into the object hierarchical network to form closed-loop compensation.
In a third aspect, the present application provides a system for building a semantic knowledge base based on a graph neural network, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of the first aspect when executing the program.
One or more technical solutions provided in the present application have at least the following technical effects or advantages:
Because sentence semantic annotation is performed on the first large-scale corpus information, the object extraction rule is constructed on the basis of the semantic annotation information and neural-network deep learning, the object classification rule of the object extractor is configured, and object information is extracted along two dimensions according to that rule, the logical soundness and information completeness of semantic element construction are improved.
The event behavior hierarchy key information and the event property hierarchy key information of the first event node are analyzed separately to determine the branch nodes constructed for a given event semantic element, and the positions at which the semantic elements are constructed are determined from the event behavior compliance key information and the time sequence of the environment variable set, thereby realizing the structured construction of semantic elements.
Because reinforcement learning is performed on the well-organized first semantic knowledge network and a loss function is introduced for analysis, a stable second semantic knowledge network can be output, and the part of the resulting semantic knowledge base that exceeds the definition domain of the object hierarchical network is fed back to supplement the object hierarchical network, thereby expanding its interpretation range.
The foregoing is only an overview of the technical solutions of the present application. So that the technical means of the present application can be understood more clearly and implemented according to the contents of the description, and so that the above and other objects, features, and advantages of the present application become more readily apparent, a detailed description of the present application is given below.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a schematic flow chart of a method for building a semantic knowledge base based on a graph neural network according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of constructing the object hierarchical network in the method for constructing a semantic knowledge base based on a graph neural network according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of obtaining the first text object set in the method for constructing a semantic knowledge base based on a graph neural network according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of constructing the first semantic element in the method for constructing a semantic knowledge base based on a graph neural network according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of the first semantic element in the method for constructing a semantic knowledge base based on a graph neural network according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a system for constructing a semantic knowledge base based on a graph neural network according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an exemplary electronic device according to an embodiment of the present application.
Detailed Description
The embodiment of the application solves the technical problems that, in the prior art, a multi-level semantic knowledge base is incomplete and its construction is poorly automated by providing a method and a system for constructing a semantic knowledge base based on a graph neural network, and achieves the technical effect of automatically constructing a multi-level semantic knowledge base by automatically building a semantic knowledge network with objects as nodes and relationships among objects as edges.
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are merely some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited to the example embodiments described herein.
According to the technical scheme, the data acquisition, storage, use, processing and the like meet relevant regulations of national laws and regulations.
At present, most knowledge bases or knowledge graphs in academia and industry take ontologies as their theoretical basis; that is, they are networks formed by taking concepts as nodes and the relationships between concepts as edges. Such knowledge graphs are good at dealing with static concepts and their interrelationships and have many uses in search engines and relevance recommendation, but they do not focus on dealing with semantics centered on objects (i.e., combinations of concepts and behaviors). Semantic knowledge networks that take objects as nodes and the mutual relationships between objects as edges are rarely seen in the industry, and a semantic knowledge network automatically constructed by computer from a large-scale text corpus has not been seen. The invention uses a graph neural network to automatically construct, from a large-scale text corpus, a semantic knowledge network with objects as nodes and the relationships between objects as edges.
In view of the above technical problems, the technical solution provided by the present application has the following general idea:
the method is a semantic knowledge network with things and mutual relations of things as sides, and the semantic knowledge network is automatically constructed from large-scale text corpora by using a computer. The method is characterized in that a graph neural network mode of judging a central node is realized by fully utilizing information of all nodes connected with the central node and information of edges connected with the nodes, a semantic knowledge network with objects as nodes and relationships between the objects as edges is automatically constructed from a large-scale text corpus, wherein the information of the edges between the objects can be obtained from key information of the objects identified by an object extractor based on NLP technology, and can also be obtained from an object hierarchical network modified by a Chinese classification dictionary. The object level network is used as a priori knowledge support graph neural network group graph, unknown object nodes are added in the process of the group graph, and the new object nodes are used for expanding the object level network after the reasonability of the unknown object nodes is confirmed through a neighbor matrix. The semantic knowledge network constructed by the method can effectively support analysis or induction of object layers, judgment of object properties and inference of object processes.
For better understanding of the above technical solutions, the following detailed descriptions will be provided in conjunction with the drawings and the detailed description of the embodiments.
Example one
As shown in fig. 1, an embodiment of the present application provides a method for building a semantic knowledge base based on a graph neural network, where the method includes:
step S100: obtaining first large-scale corpus information;
specifically, the first large-scale corpus information is information obtained from a corpus, the corpus comprises a plurality of corpus collection types, and corpus information with a large corpus magnitude is obtained by setting a corpus object domain, wherein the corpus stores linguistic materials of which languages appear in actual use, and the linguistic materials are processed and analyzed through an electronic text library, so that relevant language theory and application are developed by means of the corpus, and corpus screening can be performed on the information in the corpus by constructing a corpus screening unit, and the accuracy and effectiveness of obtaining the first large-scale corpus information are ensured.
Step S200: an object extractor based on an NLP technology identifies and extracts text objects from the first large-scale corpus information to obtain a first text object set;
specifically, the Natural Language Processing (NLP) technology is a Language information Processing technology that allows a computer to understand and process the first large-scale corpus information, and therefore, an object extractor based on the NLP technology performs text object identification and extraction from the first large-scale corpus to obtain the first text object set, where the first text object set includes object core elements, object key information, and object environment variables, and the first text object set constitutes side information between objects, and can be obtained from object key information identified by the object extractor based on the NLP technology, or can be obtained from an object hierarchical network modified from a chinese classification dictionary, so that the first large-scale corpus can be logically output for object identification.
In detail, the continuous character strings of the sentences in the first large-scale corpus information are input into the object extractor for word segmentation and part-of-speech tagging; the sentences are then semantically tagged on the basis of the segmentation and part-of-speech information so that named entities and compound words are recognized; object environment variables are identified from the recognized information; and the sentence-pattern logic tree is identified from the recognized environment-variable sentences, yielding the core elements and key information of the objects in the sentences. In the NLP process, semantic tagging replaces part-of-speech tagging, and a neural-network deep-learning technique is used to obtain all the elements of the objects in the sentences and the sentence semantic logic tree, so that object information is identified and extracted and the first text object set is obtained. The object set is extracted in a targeted manner, which facilitates the subsequent effective analysis of the data.
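The extraction flow just described can be pictured as a small pipeline. The sketch below is only illustrative: the extractor object and its helper methods (segment_and_tag, semantic_label, recognize_entities, recognize_environment, build_sentence_logic_tree) are assumed names standing in for NLP components that the text does not specify.

# Illustrative sketch of the object-extraction pipeline; the `extractor`
# interface is an assumption, not an API defined by the patent.
from dataclasses import dataclass


@dataclass
class TextObject:
    core_elements: dict          # subject, object, resource, behavior
    key_information: dict        # composition, attributes, relations, hierarchy
    environment_variables: dict  # time, space, quantity, cause, result


def extract_text_objects(sentence: str, extractor) -> list:
    tokens = extractor.segment_and_tag(sentence)        # segmentation + POS tags
    labelled = extractor.semantic_label(tokens)          # semantic tags replace POS tags
    entities = extractor.recognize_entities(labelled)    # named entities, compound words
    env = extractor.recognize_environment(labelled, entities)
    logic_tree = extractor.build_sentence_logic_tree(labelled, env)
    return [TextObject(core_elements=node.core,
                       key_information=node.key_info,
                       environment_variables=env)
            for node in logic_tree.event_nodes()]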
Step S300: constructing an object hierarchical network;
further, as shown in fig. 2, step S300 in the embodiment of the present application further includes:
step S310: constructing object classification rules, wherein the object classification rules comprise a first classification rule and a second classification rule, wherein the first classification rule is a narrow classification, and the second classification rule is a broad classification;
step S320: constructing an object narrow-sense hierarchical network according to the first classification rule;
step S330: constructing an object generalized hierarchical network according to the second classification rule;
step S340: and constructing the object hierarchical network according to the object narrow hierarchical network and the object generalized hierarchical network.
Specifically, the object classification rule is a rule for classifying the objects of the first large-scale corpus information along two dimensions, and it can reorganize the object classification of a Chinese classification dictionary into an object narrow-sense hierarchical network and an object generalized hierarchical network. Here, a "classification" differs from a "hierarchical network" in that the branches of a classification are independent of each other, whereas the branches of a hierarchical network are not; that is, there are objects and processes in which one node of the hierarchical network belongs to different branches and even to different layers. The "narrow-sense hierarchy" refers to the hierarchy formed by nesting between behavior processes, namely: a process consists of a series of steps, and one process may itself be a step of another process. The "generalized hierarchy" refers to the hierarchical relationship between an abstract description of the nature of an object and its concrete instances. For example, events that affect social stability (an abstract qualitative description) include concrete instances such as events that cause social disputes or that compromise national confidence.
The object hierarchical network is constructed according to the object narrow-sense hierarchical network and the object generalized hierarchical network. The object hierarchical network can thus be supplemented and perfected, and it further provides event information along different edge dimensions for the semantic knowledge network, improving the construction quality of the semantic knowledge network and expanding its applicability.
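As a rough sketch (assuming that a directed graph with typed edges is an adequate container, which the text does not prescribe), the two hierarchies can be stored together and queried by edge kind. The "event affecting social stability" example comes from the passage above; the archive-handling node names are purely illustrative.

# Sketch of the object hierarchical network as one directed graph with two edge
# kinds: "narrow" for process/step nesting, "broad" for abstract nature -> instance.
import networkx as nx

hierarchy = nx.DiGraph()

# narrow-sense hierarchy: a process is made of steps, and a process can itself
# be a step of another process (illustrative node names)
hierarchy.add_edge("handle archive request", "register request", kind="narrow")
hierarchy.add_edge("handle archive request", "retrieve file", kind="narrow")

# generalized hierarchy: abstract description of a nature -> concrete instances
hierarchy.add_edge("event affecting social stability",
                   "event causing social disputes", kind="broad")

def lower_nodes(node, kind):
    # lower-level nodes reached through edges of the requested kind
    return [v for _, v, k in hierarchy.out_edges(node, data="kind") if k == kind]

print(lower_nodes("handle archive request", "narrow"))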
Step S400: performing semantic element analysis on each object in the first text object set according to the object hierarchical network, and constructing a first semantic element set;
specifically, the first semantic element set includes a plurality of semantic elements, that is, a first semantic element, a second semantic element …, an nth semantic element, and the like, and each semantic element in the first semantic element set has a first structure, and for each thing obtained from the first large-scale corpus information, a corresponding semantic element (or referred to as a neuron) is generated, and the semantic element performs node processing on the thing determined from the first text thing set, where, in the same hierarchy in the first semantic element set, things that do not belong to the same process are independent of each other. The method comprises the steps of determining a central node of an object, determining nodes connected with the central node according to behaviors and properties of the object, defining the position of the node in a process sequence of the peer object according to analysis of a time sequence, forming a semantic element by a first object node and all connected nodes according to a certain connection sequence, and constructing a first semantic element set according to all object nodes.
Step S500: networking the first semantic element set by using a graph neural network to obtain a first semantic knowledge network;
specifically, the process of networking the first semantic element set is to perform node connection on all semantic elements in the first semantic element set, and in order that the first semantic knowledge network has the characteristic of dynamic automatic networking after networking, that is, the networking process is a process of assembling a graph, nodes are added to the graph or the position of an existing node is adjusted, each time the graph is assembled is called as an iteration, and the semantic element state changes once being adjusted. In detail, the state of a semantic element (node v) is defined by
Figure DEST_PATH_IMAGE001
Is expressed by the formula
Figure DEST_PATH_IMAGE002
Wherein the state of the semantic element (v) represents
Figure 93847DEST_PATH_IMAGE001
(ii) related to the following elements (f): the characteristics of the semantic element itself
Figure DEST_PATH_IMAGE003
(ii) a The semantic element leads to an edge adjacent to the semantic element
Figure DEST_PATH_IMAGE004
(i.e., the connection with the concept or thing at the top, the step of the concept or thing at the bottom, the concept or thing at the same level at the left and right, [ v ]]More than one meaning); state representation of adjacent semantic elements
Figure DEST_PATH_IMAGE005
(ii) a And features of adjacent semantic elements
Figure DEST_PATH_IMAGE006
Now a new concrete semantic element is converted into a numerical state (determined by the above-mentioned series of elements)
Figure 128536DEST_PATH_IMAGE001
This transfer function is f.
The collective representation of the states of all semantic elements in this semantic knowledge graph (all h_v) is H, namely

H = (h_1, h_2, ..., h_N)

where N is the total number of semantic elements in the graph. Similarly, the collective representation of all node features is X, and the collective representation of the transfer functions f of all semantic elements is F. Over the set of all nodes, the formulas above are written collectively as

H = F(H, X)
the first step of the graph neural network learning process is a group graph process, each group graph is called an iteration, each iteration is performed as follows,
finding superior nodes through hierarchical information or an object hierarchical network in the object key information, adding the superior nodes into a node group with the same superior node, and if the same semantic element exists in the group, not operating;
comparing the sequence or time with adjacent nodes according to the time quantum in the key information or the environment variable of the object sequence, if the sequence or the time sequence is correct, the position is not adjusted, and if the sequence is incorrect, the position of the node in the graph is adjusted until the sequence or the time sequence is correct;
if a new node has no hierarchical information and no sequential or time sequence information, the node is added into the semantic knowledge network as an independent node, has no connecting edge with other nodes, and is aligned and disambiguated with other newly added nodes. If the new node aligned with the new node carries the hierarchy, order or timing information, repeating the steps 1 and 2, and adjusting the node;
and if the object behavior of a new node is not within the range defined by the object hierarchy network, the node is regarded as an unknown node. One thing is composed of an entity concept and a behavior sent by the entity concept, and if the behavior is unknown, the thing is unknown, even if the entity concept is known; and the entity concept is unknown, the behavior is known, unknown things can not be counted, and only one kind of things (of the default entity concept) can be counted. For example, "shoot" is an object, we only know that the action is shooting, and anybody who shoots is the same object. Of course, the effects of the same kind of things produced by different entity concepts may be very different, and thus the nature of things is classified differently, and their locations in the hierarchical network of things may also be different. And aiming at the unknown nodes, adding a semantic knowledge network by taking the names of things and behaviors as the names of the whole nodes. If the node has hierarchy information of key information and order information or time sequence information of environment variables, repeating the steps 1 and 2 to adjust the node; if the node does not have hierarchical, sequential or timing information, step 3 is repeated.
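A compact, non-authoritative sketch of the four placement rules above follows; the graph and hierarchy interfaces (behavior_known, superior_of, attach, reorder_until_consistent, and so on) are assumed names, since the patent describes the rules rather than a concrete API.

# Sketch of one graph-assembly (group-graph) step for a single incoming node.
def place_node(graph, hierarchy, node):
    # Rule 4: behaviour outside the hierarchy's definition domain -> unknown node,
    # named with the combined object and behaviour name.
    if not hierarchy.behavior_known(node.behavior):
        node = node.renamed(f"{node.concept}:{node.behavior}", unknown=True)
    # Rule 1: attach under the superior node found via key information / hierarchy.
    superior = hierarchy.superior_of(node)
    if superior is not None:
        if graph.contains_same_element(node, under=superior):
            return                              # duplicate: no operation
        graph.attach(node, under=superior)
    # Rule 2: adjust the position until the order / time sequence is consistent.
    if node.has_order_or_time_info():
        graph.reorder_until_consistent(node)
    elif superior is None:
        # Rule 3: no hierarchy and no order info -> isolated node, aligned and
        # disambiguated against other newly added nodes.
        graph.add_isolated(node)
        graph.align_and_disambiguate(node)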
The state of a semantic element changes each time it is adjusted, so the collective representation of the semantic element states is a function of time:

H^{t+1} = F(H^t, X)

namely, the next state H^{t+1} of the semantic element set (at time t+1) is related, through F, to the current state H^t of the semantic element set (at time t) and to the set X of all node features.
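A minimal numerical sketch of this update is given below. The mean aggregation over neighbours used as the transfer function f is an assumption (the text does not fix its form); the stopping test anticipates the gradient function L = H^{t+1} - H^t introduced later.

# Sketch of the state update H^{t+1} = F(H^t, X) with a simple assumed form of f.
import numpy as np

def update_states(H, X, A, W_self, W_neigh):
    # every node state is recomputed from its own features and the mean of its
    # neighbours' states (rows of the adjacency matrix A select the neighbours)
    neigh_mean = (A @ H) / np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    return np.tanh(X @ W_self + neigh_mean @ W_neigh)

def iterate_to_fixed_point(H, X, A, W_self, W_neigh, tol=1e-4, max_iter=100):
    for _ in range(max_iter):
        H_next = update_states(H, X, A, W_self, W_neigh)
        if np.linalg.norm(H_next - H) < tol:    # stop when H^{t+1} is close to H^t
            return H_next
        H = H_next
    return H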
Step S600: acquiring a second semantic knowledge network by performing reinforcement learning on the first semantic knowledge network;
specifically, the second step of the neural network learning process is an unsupervised reinforcement learning process, namely: and (5) through repeated iteration, a learning target is achieved. The goal of reinforcement learning is to clearly sort all object processes in the semantic knowledge network and the narrow and broad levels thereof, and semantic elements have integrity and consistency without internal conflict. The optimized semantic elements and network structures meeting the requirements are solidified to a semantic knowledge base in a physical form, and nodes in the semantic knowledge base are composed of
Figure 746151DEST_PATH_IMAGE012
Is expressed by the formula
Figure 431342DEST_PATH_IMAGE013
Namely: whether a semantic element is a node in a reasonable semantic knowledge base
Figure 752602DEST_PATH_IMAGE012
State with this semantic element
Figure 772510DEST_PATH_IMAGE001
And the characteristics of this semantic element
Figure 289948DEST_PATH_IMAGE003
(ii) related to (g). The set of this formula is expressed as,
Figure 78913DEST_PATH_IMAGE014
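As an illustrative sketch only, g can be realised as a small scoring function over the learned state and the raw features; the linear-plus-sigmoid form below is an assumption, since the text states only that o_v depends on h_v and x_v.

# Sketch of the readout o_v = g(h_v, x_v); collectively O = G(H, X).
import numpy as np

def readout(H, X, w_h, w_x, b=0.0):
    # one plausibility score per semantic element: is it a reasonable node of
    # the semantic knowledge base?
    logits = H @ w_h + X @ w_x + b
    return 1.0 / (1.0 + np.exp(-logits))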
the reward mechanism of reinforcement learning is a no-operation strategy in iteration, namely: if the same semantic element exists in the correct hierarchy, the operation is not performed, and if the same semantic element exists in the correct position of the object process (step sequence), the operation is not performed. This enables semantic Web State at time t +1
Figure 39915DEST_PATH_IMAGE010
From time t
Figure 547120DEST_PATH_IMAGE011
The difference is minimal. In the reinforcement learning process, all steps of the same object are gradually aggregated under the same object name, and the incomplete object process of one step is gradually supplemented through the supplement and repeated adjustment of a large number of semantic elements. When the number of the object processes which can be found is not changed no matter how many nodes are added and how many times of iteration are performed, the step sequence of each found process is not changed, the hierarchy is not changed, and then a stable semantic knowledge network, namely the second semantic knowledge network, can be input.
Step S700: outputting the second semantic knowledge network in the form of a semantic knowledge base, and inputting the semantic knowledge base into the object hierarchical network to form closed-loop compensation.
Specifically, the second semantic knowledge network is the stabilized semantic knowledge network obtained after reinforcement learning, and it is therefore output in the form of a semantic knowledge base. On the other hand, the second semantic knowledge network may also be fed back into step S300 to optimize the object hierarchical network: in detail, the part of the semantic knowledge base that exceeds the definition domain of the object hierarchical network is fed back to supplement that network and expand its interpretation range. Because unknown semantic elements are added, the number of nodes covered by the semantic knowledge network may exceed the definition domain of the object hierarchical network; the excess part can be fed back to the object hierarchical network, expanding its definition domain and improving its ability to analyze new semantic elements. The closed-loop compensation formed in this way further improves the quality of the constructed semantic knowledge network and achieves the technical effect of automatically constructing a multi-level semantic knowledge base.
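The feedback step itself is small. The sketch below is hedged: covers, superior_of, and add_node are assumed interfaces used only to illustrate how nodes beyond the hierarchical network's definition domain flow back into it.

# Sketch of the closed-loop compensation: nodes of the stabilised knowledge
# network that fall outside the object hierarchical network extend that network.
def feed_back(knowledge_network, hierarchy):
    for node in knowledge_network.nodes():
        if not hierarchy.covers(node):
            hierarchy.add_node(node, parent=knowledge_network.superior_of(node))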
Further, as shown in fig. 3, the object extractor based on the NLP technology performs text object identification and extraction from the first large-scale corpus information to obtain a first text object set, and step S200 in this embodiment of the present application further includes:
step S210: constructing an object extraction rule based on the sentence semantic labeling information and a neural network deep learning technology by performing sentence semantic labeling on the first large-scale corpus information, wherein the object extraction rule comprises a first unit and a second unit, the first unit is an environment identification unit, and the second unit is an element identification unit;
step S220: obtaining a first set of environmental variables according to the first unit;
step S230: the second unit identifies object core elements and event key information of the first large-scale corpus according to the sentence pattern logic tree to obtain a first core element set and a first key information set;
step S240: and generating the first text object set according to the first core element set, the first key information set and the first environment variable set.
Specifically, in the NLP process, semantic tagging replaces part-of-speech tagging and a neural-network deep-learning technique is used to obtain the elements of the objects in the sentences and the sentence semantic logic tree, so that object (event) information is identified and extracted. An event is an object that occurs under specific environment variables; for example, kicking a ball is an object, while kicking a ball on a certain field on a certain day is an event. The object extraction rule is the extraction rule of the object extractor: the extractor labels sentences through part-of-speech tagging and character segmentation and recognizes all event information according to the meaning of the sentence labels.
Further, the object extraction rule includes a first unit and a second unit. The first unit is used to identify and extract environment variables, which include the time, space, quantity, cause (condition), and result (effect) of the occurrence of an event, thereby obtaining the first environment variable set. The second unit is used to extract core elements and key information, where the core elements include the subject (behavior sender), object (behavior receiver), resources (event participants), and behavior, and the key information includes the composition, attributes, capability, and relationships (including hierarchy) of the subject, object, and resources, as well as the nature of the object as a whole and the level of that nature, and the mode, direction, and relationships (including order, hierarchy, and the like) of the object behavior. The first core element set and the first key information set are thus obtained from the second unit, and the first text object set is generated.
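The field inventory above maps naturally onto a few record types. The dataclasses below are a sketch: the field names transcribe that inventory, while the grouping into three classes is an assumption.

# Sketch of what the two extraction-rule units return.
from dataclasses import dataclass, field
from typing import Optional, Dict, List


@dataclass
class EnvironmentVariables:              # output of the first (environment) unit
    time: Optional[str] = None
    space: Optional[str] = None
    quantity: Optional[str] = None
    cause: Optional[str] = None          # condition
    result: Optional[str] = None         # effect


@dataclass
class CoreElements:                      # part of the second (element) unit's output
    subject: Optional[str] = None        # behaviour sender
    object: Optional[str] = None         # behaviour receiver
    resources: List[str] = field(default_factory=list)  # event participants
    behavior: Optional[str] = None


@dataclass
class KeyInformation:                    # remaining output of the second unit
    composition: Dict[str, str] = field(default_factory=dict)
    attributes: Dict[str, str] = field(default_factory=dict)
    capability: Dict[str, str] = field(default_factory=dict)
    relations: Dict[str, str] = field(default_factory=dict)   # incl. order, hierarchy
    nature: Optional[str] = None         # nature of the whole object and its level
    behavior_mode: Optional[str] = None
    behavior_direction: Optional[str] = None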
Further, as shown in fig. 4, step S240 in the embodiment of the present application further includes:
step S241: determining a first event node by performing class node alignment and disambiguation processing on the first core element set and the first key information set;
step S242: obtaining a first connection node group by analyzing the event behavior hierarchy key information of the first event node and the narrow-meaning hierarchical network;
step S243: analyzing the event property hierarchical key information of the first event node and the generalized hierarchical network to obtain a second connection node group;
step S244: and constructing a first semantic element by taking the first event node as a center and taking the first connecting node group and the second connecting node group as branches, wherein the first semantic element set comprises the first semantic element.
Further, step S244 in this embodiment of the present application further includes:
step S2441: determining the node position of the first connection node group by analyzing the transaction behavior compliance key information of the first event node, wherein the first connection node group comprises an upper layer name node and a lower layer name node;
step S2442: analyzing according to the time sequence of the first environment variable set, and determining the node position of the second connecting node group, wherein the second connecting node group comprises a former step node and a latter step node;
step S2443: and constructing the first semantic element according to the node position of the first connecting node group and the node position of the first connecting node group.
Specifically, the first event node is the central node for constructing the first semantic element: the event node and the branch nodes of the first semantic element are determined first, and the branch nodes are connected in a defined positional order, thereby constructing the first semantic element. Because the semantic knowledge network is a directed graph with objects as nodes, objects that do not belong to the same process are mutually independent at the same level. First, each object extracted from the large-scale text corpus is aligned or disambiguated against other nodes through its core elements and key information; then the superior (or subordinate) object process node is found through the object behavior hierarchy key information and the narrow-sense hierarchical network; and then the superior (or subordinate) classification node of the object is found through the object property hierarchy key information and the generalized hierarchical network. The event node and branch nodes of the first semantic element can thereby be determined.
Further, as shown in fig. 5, the positions of all branch nodes in the same-level object process sequence are defined by the object behavior compliance key information and the time sequence of the environment variables: the object behavior compliance key information defines the upper-layer name node and the lower-layer name node of the event node, and the time sequence of the environment variables defines the former-step node and the latter-step node of the event node, so that a semantic element is formed from the nodes to which the object node is connected. High-quality construction and layout of semantic elements can thus be realized through bidirectional analysis of the time nodes, improving the quality of the automatically constructed semantic knowledge network.
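Building on the SemanticElement sketch given earlier, the bidirectional layout step can be pictured as follows; the hierarchy and timeline lookup helpers are assumed interfaces, not part of the patent.

# Sketch of laying out one semantic element: vertical neighbours come from the
# behaviour-compliance key information, horizontal neighbours from the time
# sequence of the environment variables.
def lay_out_semantic_element(event_node, key_info, env_vars, hierarchy, timeline):
    element = SemanticElement(event_node=event_node)   # see the earlier sketch
    # vertical placement: upper-layer and lower-layer name nodes
    element.upper_name_node = hierarchy.process_containing(event_node, key_info)
    element.lower_name_nodes = hierarchy.steps_of(event_node, key_info)
    # horizontal placement: former-step and latter-step nodes from the time sequence
    element.previous_step, element.next_step = timeline.neighbours(event_node,
                                                                   env_vars.time)
    return element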
Further, in the embodiment of the present application, obtaining the second semantic knowledge network by performing reinforcement learning on the first semantic knowledge network in step S600 further includes:
step S610: obtaining a first neighbor matrix of a first newly added node, wherein the first neighbor matrix is a matrix formed by the first newly added node and weights of all connection point edges;
step S620: obtaining N newly added nodes;
step S630: obtaining N neighbor matrixes according to the N newly added nodes;
step S640: and performing sequence position matching on the N nodes input in real time according to the N neighbor matrixes to obtain the second semantic knowledge network.
Further, a gradient function L is constructed for the iteration used to obtain the second semantic knowledge network, where the gradient function L is

L = H^{t+1} - H^t

where t denotes the current state and t+1 the next state, H^{t+1} represents the next state of the semantic element set, and H^t represents the current state of the semantic element set.
Specifically, because the object hierarchical network serves as prior knowledge supporting graph assembly by the graph neural network, unknown object nodes are added during graph assembly, and after their reasonableness is confirmed through the edge information (the neighbor matrix), these new object nodes are used to expand the object hierarchical network. Reinforcement learning is performed on the first semantic knowledge network so as to obtain a stable semantic knowledge network and increase its usability. The reinforcement mechanism of the reinforcement learning is to obtain, with the support of the object key information and the object hierarchical network, a neighbor matrix S for each newly added node, namely a matrix formed from the weights of the edges connecting that node to all of its connected nodes. By matching against the neighbor matrix S, a new node is quickly located at the correct level and the correct sequence position, minimizing the work of each iteration of the semantic knowledge network.
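A small sketch of this mechanism follows. Representing each neighbor matrix as a row of edge weights and matching rows by cosine similarity are assumptions; the text states only that the matrices are matched to position new nodes quickly.

# Sketch of building and matching the neighbor matrix S for a newly added node.
import numpy as np

def neighbour_row(new_node_edges, node_index, n_nodes):
    # new_node_edges: {connected node name: edge weight}; returns one row of S
    row = np.zeros(n_nodes)
    for name, weight in new_node_edges.items():
        row[node_index[name]] = weight
    return row

def best_position(s_new, S_existing):
    # match the new node's row against the rows of already placed nodes (cosine)
    norms = np.linalg.norm(S_existing, axis=1) * np.linalg.norm(s_new)
    scores = (S_existing @ s_new) / np.where(norms == 0, 1.0, norms)
    return int(np.argmax(scores))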
In the reinforcement learning process, all steps of the same object are gradually aggregated under the same object name, and object processes with missing steps are gradually completed through the addition and repeated adjustment of a large number of semantic elements; establishing and concatenating the order or time vector sequences also clearly combs out all steps of an object process, so that by continuously iterating the learning, the object process sequences are discovered one by one. Each iteration either adds a new node to the graph, adjusts the position of an existing node in the graph, or both.
The difference H^{t+1} - H^t can therefore be used to construct the gradient function (loss function) L, namely

L = H^{t+1} - H^t

The objective of the iterative learning is to drive the gradient function to zero. In the ideal case, when no new node can be added (all discoverable object processes have been found and all nodes of every process have been filled in) and no existing node needs its position adjusted (i.e., the steps of all object processes are in the correct order and all levels are correct), the gradient function is zero; that is, no matter how many nodes are added and how many further iterations are performed, the number of discoverable object processes does not change and the step order and hierarchy of each found process do not change. At this point

H^{t+1} = H^t

and when the learning process reaches this state, a stable semantic knowledge network and semantic knowledge base are obtained.
Compared with the prior art, the invention has the following beneficial effects:
1. Because sentence semantic annotation is performed on the first large-scale corpus information, the object extraction rule is constructed on the basis of the semantic annotation information and neural-network deep learning, the object classification rule of the object extractor is configured, and object information is extracted along two dimensions according to that rule, the logical soundness and information completeness of semantic element construction are improved.
2. The event behavior hierarchy key information and the event property hierarchy key information of the first event node are analyzed separately to determine the branch nodes constructed for a given event semantic element, and the positions at which the semantic elements are constructed are determined from the event behavior compliance key information and the time sequence of the environment variable set, thereby realizing the structured construction of semantic elements.
3. Because reinforcement learning is performed on the well-organized first semantic knowledge network and a loss function is introduced for analysis, a stable second semantic knowledge network can be output, and the part of the resulting semantic knowledge base that exceeds the definition domain of the object hierarchical network is fed back to supplement the object hierarchical network, thereby expanding its interpretation range.
Example two
Based on the same inventive concept as the method for constructing the semantic knowledge base based on the graph neural network in the foregoing embodiment, the present invention further provides a system for constructing the semantic knowledge base based on the graph neural network, as shown in fig. 6, the system includes:
a first obtaining unit 11, where the first obtaining unit 11 is configured to obtain first large-scale corpus information;
a second obtaining unit 12, where the second obtaining unit 12 is configured to perform text object identification and extraction from the first large-scale corpus information by using an object extractor based on an NLP technology, so as to obtain a first text object set;
a first constructing unit 13, wherein the first constructing unit 13 is used for constructing an object hierarchical network;
a second construction unit 14, where the second construction unit 14 is configured to perform semantic meta-analysis on each object in the first text object set according to the object hierarchy network to construct a first semantic meta-set;
a first networking unit 15, where the first networking unit 15 is configured to perform networking on the first semantic element set by using a graph neural network to obtain a first semantic knowledge network;
the first reinforcement unit 16, the first reinforcement unit 16 is configured to perform reinforcement learning on the first semantic knowledge network to obtain a second semantic knowledge network;
a first output unit 17, where the first output unit 17 is configured to output the second semantic knowledge network as a semantic knowledge base, and input the semantic knowledge base into the object hierarchical network to form closed-loop compensation.
Further, the system further comprises:
a third construction unit, configured to construct an object classification rule, where the object classification rule includes a first classification rule and a second classification rule, where the first classification rule is a narrow classification and the second classification rule is a broad classification;
a fourth construction unit, configured to construct an object narrowly defined hierarchical network according to the first classification rule;
a fifth constructing unit, configured to construct an object generalized hierarchical network according to the second classification rule;
a sixth construction unit for constructing the thing hierarchy network according to the thing narrow hierarchy network and the thing wide hierarchy network.
Further, the system further comprises:
a seventh construction unit, configured to construct an object extraction rule based on the information labeled by the sentence semantics and a neural network deep learning technique by performing sentence semantics labeling on the first large-scale corpus information, where the object extraction rule includes a first unit and a second unit, the first unit is an environment identification unit, and the second unit is an element identification unit;
a third obtaining unit, configured to obtain a first set of environment variables according to the first unit;
a fourth obtaining unit, configured to perform, by the second unit, object core element identification and event key information identification on the first large-scale corpus according to a sentence pattern logic tree, and obtain a first core element set and a first key information set;
a first generating unit, configured to generate the first text object set according to the first core element set, the first key information set, and the first environment variable set.
Further, the system further comprises:
a first determining unit, configured to determine a first event node by performing class node alignment and disambiguation processing on the first core element set and the first key information set;
a fifth obtaining unit configured to obtain a first connection node group by analyzing the event behavior hierarchy key information of the first event node and the narrow-sense hierarchical network;
a sixth obtaining unit, configured to obtain a second connection node group by analyzing event property hierarchical key information of the first event node and the generalized hierarchical network;
an eighth constructing unit, configured to construct a first semantic element with the first event node as a center and the first connection node group and the second connection node group as branches, where the first semantic element set includes the first semantic element.
Further, the system further comprises:
a second determining unit, configured to determine a node position of the first connection node group by analyzing transaction behavior compliance key information of the first event node, where the first connection node group includes an upper-layer name node and a lower-layer name node;
a third determining unit, configured to perform analysis according to the time series of the first environment variable set, and determine a node position of the second connected node group, where the second connected node group includes a preceding step node and a succeeding step node;
a ninth construction unit, configured to construct the first semantic element according to the node position of the first connection node group and the node position of the second connection node group.
Further, the system further comprises:
the first judging unit is used for judging whether the second early warning attribute information and the first early warning attribute information have an attribute dependency relationship according to the first attribute connection map;
a seventh obtaining unit, configured to obtain a first neighbor matrix of a first newly added node, where the first neighbor matrix is a matrix formed by the first newly added node and weights of all connection point edges;
an eighth obtaining unit, configured to obtain N newly added nodes;
a ninth obtaining unit, configured to obtain N neighbor matrices according to the N newly added nodes;
a tenth obtaining unit, configured to perform sequence position matching on the N nodes input in real time according to the N neighbor matrices, and obtain the second semantic knowledge network.
The various changes and specific examples of the method for constructing a semantic knowledge base based on a graph neural network in the first embodiment of fig. 1 are also applicable to the system for constructing a semantic knowledge base based on a graph neural network of this embodiment. Through the foregoing detailed description of the method, those skilled in the art can clearly understand how the system of this embodiment is implemented, so for brevity of the description the details are not repeated here.
EXAMPLE III
The electronic device of the present application is described below with reference to fig. 7.
Fig. 7 illustrates a schematic structural diagram of an electronic device according to the present application.
Based on the inventive concept of the method for constructing a semantic knowledge base based on a graph neural network in the foregoing embodiments, the present invention further provides a system for constructing a semantic knowledge base based on a graph neural network on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of any one of the foregoing methods for constructing a semantic knowledge base based on a graph neural network.
In fig. 7, a bus architecture is represented by bus 300. Bus 300 may include any number of interconnected buses and bridges and links together various circuits, including one or more processors represented by processor 302 and memory represented by memory 304. Bus 300 may also link together various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and therefore are not described further herein. A bus interface 305 provides an interface between bus 300 and the receiver 301 and transmitter 303. The receiver 301 and the transmitter 303 may be the same element, i.e., a transceiver, providing a means for communicating with various other systems over a transmission medium. The processor 302 is responsible for managing bus 300 and general processing, and the memory 304 may be used for storing data used by the processor 302 in performing operations.
The embodiment of the application provides a method for constructing a semantic knowledge base based on a graph neural network, the method including: obtaining first large-scale corpus information; identifying and extracting text objects from the first large-scale corpus information with an object extractor based on NLP technology to obtain a first text object set; constructing an object hierarchical network; performing semantic element analysis on each object in the first text object set according to the object hierarchical network to construct a first semantic element set; networking the first semantic element set with a graph neural network to obtain a first semantic knowledge network; performing reinforcement learning on the first semantic knowledge network to obtain a second semantic knowledge network; and outputting the second semantic knowledge network in the form of a semantic knowledge base, and inputting the semantic knowledge base into the object hierarchical network to form closed-loop compensation. This solves the technical problems that, in the prior art, a multi-level semantic knowledge base is incomplete and its construction is poorly automated, and achieves the technical effect of automatically constructing a multi-level semantic knowledge base by automatically building a semantic knowledge network with objects as nodes and relationships among objects as edges.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create a system for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (9)

1. A method for building a semantic knowledge base based on a graph neural network is characterized by comprising the following steps:
obtaining first large-scale corpus information;
an object extractor based on an NLP technology identifies and extracts text objects from the first large-scale corpus information to obtain a first text object set;
constructing an object hierarchical network; wherein said constructing an object hierarchical network comprises:
constructing object classification rules, wherein the object classification rules comprise a first classification rule and a second classification rule, wherein the first classification rule is a narrow classification, and the second classification rule is a broad classification;
constructing an object narrow-sense hierarchical network according to the first classification rule, wherein the narrow-sense hierarchy refers to the hierarchy formed by nesting among behavior processes;
constructing an object generalized hierarchical network according to the second classification rule, wherein the generalized hierarchy refers to the hierarchical relationship between the abstract description of an object property and its concrete examples;
constructing the object hierarchical network according to the object narrow-sense hierarchical network and the object generalized hierarchical network;
performing semantic element analysis on each object in the first text object set according to the object hierarchical network, and constructing a first semantic element set;
networking the first semantic element set by using a graph neural network to obtain a first semantic knowledge network;
acquiring a second semantic knowledge network by performing reinforcement learning on the first semantic knowledge network;
and outputting the second semantic knowledge network in the form of a semantic knowledge base, and inputting the semantic knowledge base into the object hierarchical network to form closed-loop compensation.
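By way of illustration of the two classification rules, the following hypothetical Python snippet encodes a narrow-sense hierarchy (nesting among behavior processes) and a generalized hierarchy (abstract property descriptions over concrete examples) and merges them into a single object hierarchical network; the entries and the edge labels are invented for the example and are not taken from the claims.

    # Narrow-sense hierarchy: behavior process -> nested sub-processes (assumed data).
    narrow_hierarchy = {
        "archive a file": ["register file", "classify file", "store file"],
        "classify file": ["assign category", "assign keywords"],
    }
    # Generalized hierarchy: abstract property description -> concrete examples (assumed data).
    broad_hierarchy = {
        "storage medium": ["paper folder", "disk array"],
        "document": ["contract", "meeting minutes"],
    }

    def object_hierarchy(narrow, broad):
        # Merge both views into one labelled edge list standing in for the object hierarchical network.
        edges = [(parent, child, "nests") for parent, children in narrow.items() for child in children]
        edges += [(abstract, example, "instance_of") for abstract, examples in broad.items() for example in examples]
        return edges

    for parent, child, relation in object_hierarchy(narrow_hierarchy, broad_hierarchy):
        print(parent, "-[" + relation + "]->", child)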
2. The method of claim 1, wherein the object extractor based on the NLP technology performs text object identification and extraction from the first large-scale corpus information to obtain the first text object set, and the method further comprises:
performing sentence semantic labeling on the first large-scale corpus information, and constructing an object extraction rule based on the sentence semantic labeling information and a neural network deep learning technique, wherein the object extraction rule comprises a first unit and a second unit, the first unit being an environment identification unit and the second unit being an element identification unit;
obtaining a first environment variable set according to the first unit;
identifying, by the second unit, object core elements and event key information of the first large-scale corpus information according to a sentence pattern logic tree, to obtain a first core element set and a first key information set;
and generating the first text object set according to the first core element set, the first key information set and the first environment variable set.
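As a hypothetical illustration of the two-unit extraction rule, the snippet below separates an environment identification unit (pulling context variables such as dates) from an element identification unit (pulling candidate core elements); the regular expression and the capitalisation heuristic are crude stand-ins chosen for the example, not the sentence-pattern logic of the claim.

    import re

    def environment_unit(sentence):
        # Environment variables: here, simply date-like tokens.
        return re.findall(r"\b\d{4}-\d{2}-\d{2}\b", sentence)

    def element_unit(sentence):
        # Core elements / key information: capitalised tokens as a crude proxy.
        return [token for token in sentence.split() if token[:1].isupper()]

    sentence = "On 2022-01-17 the Archive transferred the Contract to Nanjing"
    print("environment variables:", environment_unit(sentence))
    print("core elements / key information:", element_unit(sentence))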
3. The method of claim 2, wherein the method further comprises:
determining a first event node by performing class node alignment and disambiguation processing on the first core element set and the first key information set;
obtaining a first connection node group by analyzing the event behavior hierarchy key information of the first event node and the object narrow-sense hierarchical network;
obtaining a second connection node group by analyzing the event property hierarchy key information of the first event node and the object generalized hierarchical network;
and constructing a first semantic element by taking the first event node as a center and taking the first connection node group and the second connection node group as branches, wherein the first semantic element set comprises the first semantic element.
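A possible data structure for such a semantic element is sketched below: an event node at the centre with one branch of connection nodes drawn from the narrow-sense (behavior) hierarchy and one from the generalized (property) hierarchy. The field names and sample values are assumptions made purely for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class SemanticElement:
        event_node: str
        behaviour_branch: list = field(default_factory=list)   # first connection node group
        property_branch: list = field(default_factory=list)    # second connection node group

    element = SemanticElement(
        event_node="transfer contract",
        behaviour_branch=["archive a file", "store file"],     # narrow-sense hierarchy neighbours
        property_branch=["document", "contract"],              # generalized hierarchy neighbours
    )
    print(element)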
4. The method of claim 3, wherein the method further comprises:
determining the node position of the first connection node group by analyzing the object behavior compliance key information of the first event node, wherein the first connection node group comprises an upper-layer name node and a lower-layer name node;
determining the node position of the second connection node group by analyzing the time sequence of the first environment variable set, wherein the second connection node group comprises a former step node and a latter step node;
and constructing the first semantic element according to the node position of the first connection node group and the node position of the second connection node group.
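One hypothetical way to read the node-position step is sketched below: the first connection node group is ordered into upper- and lower-layer name nodes by an assumed hierarchy depth, and the second group is ordered into former and latter step nodes by the time sequence of the environment variables. The depth values and timestamps are invented for the example.

    behaviour_nodes = [("store file", 2), ("archive a file", 1)]            # (name, assumed depth)
    step_nodes = [("notify recipient", "2022-01-18"), ("transfer contract", "2022-01-17")]

    ranked = sorted(behaviour_nodes, key=lambda item: item[1])
    upper_node, lower_node = ranked[0][0], ranked[-1][0]
    ordered_steps = [name for name, stamp in sorted(step_nodes, key=lambda item: item[1])]
    print("upper-layer node:", upper_node, "| lower-layer node:", lower_node)
    print("former step:", ordered_steps[0], "| latter step:", ordered_steps[-1])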
5. The method of claim 1, wherein acquiring the second semantic knowledge network by performing reinforcement learning on the first semantic knowledge network further comprises:
obtaining a first neighbor matrix of a first newly added node, wherein the first neighbor matrix is a matrix formed by the weights of the edges between the first newly added node and all of its connection points;
obtaining N newly added nodes;
obtaining N neighbor matrices according to the N newly added nodes;
and performing sequence position matching on the N newly added nodes input in real time according to the N neighbor matrices to obtain the second semantic knowledge network.
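For illustration, the snippet below builds the edge-weight row of a newly added node and places it next to the most similar existing node; the cosine-similarity criterion is an assumption, since the claim only specifies sequence position matching on the neighbor matrices.

    import numpy as np

    # Edge-weight rows of nodes already in the first semantic knowledge network (assumed values).
    existing = np.array([[0.0, 0.8, 0.1],
                         [0.8, 0.0, 0.5],
                         [0.1, 0.5, 0.0]])
    # Neighbor-matrix row of the newly added node: weights of its edges to the connection points.
    new_node_weights = np.array([0.7, 0.1, 0.4])

    def best_position(weights, rows):
        # Cosine similarity against every existing row; the most similar row gives the position.
        sims = rows @ weights / (np.linalg.norm(rows, axis=1) * np.linalg.norm(weights) + 1e-9)
        return int(np.argmax(sims))

    print("insert the new node next to existing node", best_position(new_node_weights, existing))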
6. The method according to claim 1, characterized in that the second semantic knowledge network is obtained by iterative construction using a gradient function L, the gradient function L being:
L = δ(H^(t+1) - H^t)
wherein t is the current state; t+1 is the next state; H^(t+1) represents the semantic element set in the next state; and H^t represents the semantic element set in the current state.
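A purely numerical reading of this criterion is sketched below: H^t and H^(t+1) are taken as the embedding matrices of the semantic element set in the current and next state, δ as a scaling factor, and the norm of their scaled difference as L, so that a small L indicates the iterative construction has converged. Treating δ as a scalar and using a Frobenius norm are assumptions made for the example.

    import numpy as np

    def gradient_l(h_next, h_current, delta=1.0):
        # L = delta * ||H^(t+1) - H^t||: magnitude of the state-to-state change.
        return delta * np.linalg.norm(h_next - h_current)

    h_t = np.random.rand(4, 8)                      # current-state representations (assumed shape)
    h_t1 = h_t + 0.01 * np.random.rand(4, 8)        # next-state representations
    print("L =", gradient_l(h_t1, h_t))             # small L suggests convergence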
7. A system for building a semantic knowledge base based on a graph neural network, the system comprising:
a first obtaining unit, configured to obtain first large-scale corpus information;
a second obtaining unit, configured to perform text object identification and extraction from the first large-scale corpus information by using an object extractor based on an NLP technology, so as to obtain a first text object set;
a first construction unit, configured to construct an object hierarchical network, wherein the first construction unit comprises:
a third construction unit, configured to construct object classification rules, wherein the object classification rules comprise a first classification rule and a second classification rule, the first classification rule being a narrow classification and the second classification rule being a broad classification;
a fourth construction unit, configured to construct an object narrow-sense hierarchical network according to the first classification rule, wherein the narrow-sense hierarchy refers to the hierarchy formed by nesting among behavior processes;
a fifth construction unit, configured to construct an object generalized hierarchical network according to the second classification rule, wherein the generalized hierarchy refers to the hierarchical relationship between the abstract description of an object property and its concrete examples;
a sixth construction unit, configured to construct the object hierarchical network according to the object narrow-sense hierarchical network and the object generalized hierarchical network;
the second construction unit is used for performing semantic element analysis on each object in the first text object set according to the object hierarchical network to construct a first semantic element set;
the first networking unit is used for networking the first semantic element set by utilizing a graph neural network to obtain a first semantic knowledge network;
the first reinforcement unit is used for carrying out reinforcement learning on the first semantic knowledge network to obtain a second semantic knowledge network;
and the first output unit is used for outputting the second semantic knowledge network as a semantic knowledge base and inputting the semantic knowledge base into the object hierarchical network to form closed-loop compensation.
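As an illustrative mapping only, the class below arranges a few of the claimed units as methods of one system object; every method body is a trivial stub standing in for the corresponding unit, and none of it is taken from the patent.

    class SemanticKnowledgeBaseSystem:
        def first_obtaining_unit(self):
            # Obtain large-scale corpus information (stub data).
            return ["archive stores document"]

        def second_obtaining_unit(self, corpus):
            # NLP-based object extraction (stub: unique whitespace tokens).
            return sorted({token for sentence in corpus for token in sentence.split()})

        def first_construction_unit(self):
            # Object hierarchical network (stub: empty narrow-sense and generalized views).
            return {"nests": {}, "instance_of": {}}

        def run(self):
            corpus = self.first_obtaining_unit()
            objects = self.second_obtaining_unit(corpus)
            hierarchy = self.first_construction_unit()
            return objects, hierarchy

    print(SemanticKnowledgeBaseSystem().run())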
8. An electronic device, comprising a processor and a memory, wherein:
the memory is configured to store a computer program;
the processor is configured to perform the method of any one of claims 1 to 6 by calling the computer program stored in the memory.
9. A computer program product comprising a computer program and/or instructions, characterized in that the computer program and/or instructions, when executed by a processor, implement the steps of the method of any one of claims 1-6.
CN202210046579.XA 2022-01-17 2022-01-17 Method and system for constructing semantic knowledge base based on graph neural network Active CN114065770B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210046579.XA CN114065770B (en) 2022-01-17 2022-01-17 Method and system for constructing semantic knowledge base based on graph neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210046579.XA CN114065770B (en) 2022-01-17 2022-01-17 Method and system for constructing semantic knowledge base based on graph neural network

Publications (2)

Publication Number Publication Date
CN114065770A CN114065770A (en) 2022-02-18
CN114065770B true CN114065770B (en) 2022-04-15

Family

ID=80231040

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210046579.XA Active CN114065770B (en) 2022-01-17 2022-01-17 Method and system for constructing semantic knowledge base based on graph neural network

Country Status (1)

Country Link
CN (1) CN114065770B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116011461B (en) * 2023-03-02 2023-07-21 文灵科技(北京)有限公司 Concept abstraction system and method based on event classification model

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103699663B (en) * 2013-12-27 2017-02-08 中国科学院自动化研究所 Hot event mining method based on large-scale knowledge base
CN105677913B (en) * 2016-02-29 2019-04-26 哈尔滨工业大学 A kind of construction method of the Chinese semantic knowledge-base based on machine translation
CN111930906A (en) * 2020-07-29 2020-11-13 北京北大软件工程股份有限公司 Knowledge graph question-answering method and device based on semantic block

Also Published As

Publication number Publication date
CN114065770A (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN103207855B (en) For the fine granularity sentiment analysis system and method for product review information
US20230362175A1 (en) Malicious behavior identification method and system for weighted heterogeneous graph, and storage medium
CN108595708A (en) A kind of exception information file classification method of knowledge based collection of illustrative plates
RU2679988C1 (en) Extracting information objects with the help of a classifier combination
CN109992784B (en) Heterogeneous network construction and distance measurement method fusing multi-mode information
CN113515632A (en) Text classification method based on graph path knowledge extraction
CN108021557A (en) Irregular entity recognition method based on deep learning
EP3948501A1 (en) Hierarchical machine learning architecture including master engine supported by distributed light-weight real-time edge engines
Miao et al. A dynamic financial knowledge graph based on reinforcement learning and transfer learning
CN114841140A (en) Dependency analysis model and Chinese combined event extraction method based on dependency analysis
CN110569355B (en) Viewpoint target extraction and target emotion classification combined method and system based on word blocks
CN114065770B (en) Method and system for constructing semantic knowledge base based on graph neural network
Satapathy et al. Subjectivity detection in nuclear energy tweets
Kalo et al. Knowlybert-hybrid query answering over language models and knowledge graphs
CN113486143A (en) User portrait generation method based on multi-level text representation and model fusion
CN111930892A (en) Scientific and technological text classification method based on improved mutual information function
CN117473054A (en) Knowledge graph-based general intelligent question-answering method and device
CN114491029B (en) Short text similarity calculation method based on graph neural network
CN113656556B (en) Text feature extraction method and knowledge graph construction method
Chen English translation template retrieval based on semantic distance ontology knowledge recognition algorithm
CN112329440A (en) Relation extraction method and device based on two-stage screening and classification
CN109299442A (en) Chinese chapter primary-slave relation recognition methods and system
Ottens et al. Dynamic ontology co-evolution from texts: Principles and case study
CN117056458B (en) Method for carrying out front-end retrieval based on vector space algorithm
CN117216193B (en) Controllable text generation method and device based on large language model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220707

Address after: 210000 No.1, Qingdao Road, Gulou District, Nanjing, Jiangsu Province

Patentee after: Jiang Sushengdanganguan

Patentee after: Jiangsu United Industrial Limited by Share Ltd.

Address before: Room 1502, Tongfu building, 501 Zhongshan South Road, Qinhuai District, Nanjing, Jiangsu 210006

Patentee before: Jiangsu United Industrial Limited by Share Ltd.
