CN105808568B - Context distributed reasoning method and device - Google Patents


Info

Publication number
CN105808568B
CN105808568B (application CN201410844207.7A)
Authority
CN
China
Prior art keywords
context
reasoning
user
inference
inferred
Prior art date
Legal status
Active
Application number
CN201410844207.7A
Other languages
Chinese (zh)
Other versions
CN105808568A (en)
Inventor
孟祥武
李珂
张玉洁
王凡
Current Assignee
Huawei Technologies Co Ltd
Beijing University of Posts and Telecommunications
Original Assignee
Huawei Technologies Co Ltd
Beijing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd, Beijing University of Posts and Telecommunications filed Critical Huawei Technologies Co Ltd
Priority to CN201410844207.7A priority Critical patent/CN105808568B/en
Publication of CN105808568A publication Critical patent/CN105808568A/en
Application granted granted Critical
Publication of CN105808568B publication Critical patent/CN105808568B/en

Landscapes

  • Information Transfer Between Computers (AREA)

Abstract

The embodiment of the invention discloses a context distributed reasoning method and a context distributed reasoning device. The method comprises the following steps: acquiring a large amount of context data of various types from a plurality of computing nodes, and performing modeling processing on the acquired context data to obtain an inferred context set, wherein the various contexts in the inferred context set have a uniform representation form, and "a large amount of context data" means that the scale of the context data reaches a specific data amount; acquiring an inference rule for performing distributed inference on the inferred context set; analyzing the inference rule to generate an inference plan of the inference rule; and performing distributed inference on the inferred context set according to the inference plan and a pre-configured context inference algorithm for the inference rule to generate an inference result of the inference plan. The embodiment of the invention can improve the efficiency of context inference.

Description

Context distributed reasoning method and device
Technical Field
The present invention relates to the field of communications, and in particular, to a context distributed inference method and apparatus.
Background
Currently, in the field of communications, enterprise communications, collaboration tools, and communication systems generate large amounts of contextual data every day, such as email, instant messaging, text messaging, call logs, and social media data. How to provide communication services on a suitable device, in a suitable mode, and at a suitable time and place, by collecting and reasoning over the enterprise user context, communication context, meeting context, and the like, is a current research hotspot. The enterprise user context may include: user basic information (e.g., gender, occupation, age, education level, professional knowledge) and user life information (e.g., preference information); the communication context may include: user behavior information (e.g., the user's telephone and short message records); the meeting context may include: the meeting start time and end time, etc.
The current contextual reasoning methods all adopt centralized reasoning, wherein the centralized reasoning is carried out in a single-machine environment. Thus, when the amount of context data is large, there is a problem that inference efficiency is low.
Disclosure of Invention
The invention provides a context distributed reasoning method and device, which can improve the reasoning efficiency of contexts.
In a first aspect, the present invention provides a context distributed inference method, including:
acquiring a large amount of and various types of context data from a plurality of computing nodes, and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
acquiring an inference rule for performing distributed inference on the inferred context set;
analyzing the inference rule to generate an inference plan of the inference rule;
and carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan.
In a first possible implementation manner of the first aspect, the modeling the acquired context data to obtain an inferred context set includes:
and modeling the acquired context data to obtain an inferred context set comprising the mark, the attribute content and the occurrence time.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the set of inferred contexts includes a user context and a meeting context:
the method further comprises the following steps:
acquiring a first inference model for performing conditional inference on the inferred context set, wherein the first inference model comprises two inference conditions;
the distributed reasoning is carried out on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and a reasoning result of the reasoning plan is generated, and the method comprises the following steps:
performing conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included by the first reasoning model, wherein the context information is used for representing users participating in a conference;
and carrying out distributed reasoning on the inferred context set corresponding to the users participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and generating the conference state information of the users according with the reasoning rule.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the set of inferred contexts includes a user context and a communication context;
the method further comprises the following steps:
acquiring a second inference model for performing importance inference on the inferred context set;
the distributed reasoning is carried out on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and a reasoning result of the reasoning plan is generated, and the method comprises the following steps:
performing communication importance distributed reasoning on the communication context by using the second reasoning model to obtain communication importance information between two users in the communication context;
performing job importance distributed inference on the user context by using the second inference model to obtain job importance information among users in the user context;
and carrying out distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
With reference to the first aspect or the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the method further includes:
acquiring user conference state information and important relationship information of a user pair, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have important relationship;
the distributed reasoning is carried out on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and a reasoning result of the reasoning plan is generated, and the method comprises the following steps:
according to the user conference state information and the important relationship information of the user pairs, carrying out distributed reasoning on a reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting users who have important relationships and are in a meeting in each reasoned context; and/or
According to the user conference state information and the important relationship information of the user pairs, carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting conference room information of the user participating in the conference in the reasoned context set; and/or
And performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs and according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to obtain whether the incoming call user of the user in the conference is the important user of the user in the conference, and if so, outputting prompt information for indicating that the incoming call user is the important user of the user in the conference.
In a second aspect, the present invention provides a context distributed reasoning apparatus, comprising a modeling unit, a first acquisition unit, an analysis unit and an inference unit, wherein:
the modeling unit is used for acquiring a large amount of context data of various types from a plurality of computing nodes and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
the first acquisition unit is used for acquiring an inference rule for performing distributed inference on the inferred context set;
the analysis unit is used for analyzing the inference rule to generate an inference plan of the inference rule;
and the reasoning unit is used for carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule to generate a reasoning result of the reasoning plan.
In a first possible implementation manner of the second aspect, the modeling unit is configured to perform modeling processing on the acquired context data to obtain an inferred context set including a flag, attribute content, and occurrence time.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, the set of inferred contexts includes a user context and a meeting context:
the device further comprises:
the second acquisition unit is used for acquiring a first inference model used for carrying out condition distributed inference on the inferred context set, and the first inference model comprises two inference conditions;
the inference unit includes:
the first reasoning subunit is configured to perform conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included in the first reasoning model, where the context information is used to represent users participating in a conference;
and the second reasoning subunit is used for reasoning the inferred context set corresponding to the user participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and generating the conference state information of the user according with the reasoning rule.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the set of inferred contexts includes a user context and a communication context;
the device further comprises:
a third obtaining unit, configured to obtain a second inference model for performing importance inference on the inferred context set;
the inference unit includes:
the third reasoning subunit is used for carrying out communication importance distributed reasoning on the communication context by using the second reasoning model to obtain communication importance information between two users in the communication context;
the fourth reasoning subunit is used for performing job importance distributed reasoning on the user context by using the second reasoning model to obtain job importance information among the users in the user context;
and the fifth reasoning subunit is used for performing distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
With reference to the second aspect or the first possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the apparatus further includes:
the fourth acquisition unit is used for acquiring conference state information of the users and important relationship information of the user pairs, wherein the important relationship information of the user pairs is used for indicating that two users in the user pairs have important relationships;
the reasoning unit is used for carrying out distributed reasoning on the reasoned context set according to the user conference state information and the important relationship information of the user pairs and a context reasoning algorithm which is configured in advance and used for the reasoning rule, and outputting the users who have important relationships and are in a meeting in each reasoned context; and/or
The reasoning unit is used for carrying out distributed reasoning on the reasoned context set according to the user conference state information and the important relationship information of the user pairs and a preset context reasoning algorithm for the reasoning rule and outputting conference room information of the user participating in the conference in the reasoned context set; and/or
The reasoning unit is used for carrying out distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs and according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to obtain whether the calling user of the user in the conference is the important user of the user in the conference or not, and if so, outputting prompt information for indicating that the calling user is the important user of the user in the conference.
In the technical scheme, a large amount of and various types of context data are obtained from a plurality of computing nodes, modeling processing is carried out on the obtained context data, and an inferred context set is obtained, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount; acquiring an inference rule for performing distributed inference on the inferred context set; analyzing the inference rule to generate an inference plan of the inference rule; and carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan. In this way, distributed reasoning can be performed on a large amount of context data of a plurality of computing nodes, compared with the prior art in which context reasoning is performed in a single machine environment. The invention can improve the context reasoning efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flow chart diagram of a context distributed inference method according to an embodiment of the present invention;
FIG. 2 is a flow diagram illustrating another context-distributed inference method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a context-distributed inference apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an alternative context-distributed inference engine according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an alternative context-distributed inference engine according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an alternative context-distributed inference engine according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of another context-distributed inference apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating a context distributed inference method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
101. acquiring a large amount of and various types of context data from a plurality of computing nodes, and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount.
The computing node may be any device or means that stores context data. For example: the computing node has stored thereon a user data set, or the computing node has stored thereon a conference data set, wherein the user data set may comprise user data, such as: user basic information (such as information of gender, occupation, age, cultural degree, professional knowledge and the like), user life information (such as preference information) or user behavior information (such as information of user telephone, short message communication records and the like) and the like; the conference data set may include conference room identification, conference start time and end time, etc. information.
In addition, various types of context data may be acquired; for example, different context data may be acquired from different computing nodes. The modeling process may organize the acquired context data into elements with a unified rule or representation form. "A large amount" may mean that the size of the context data reaches the TB level, where 1024 B = 1 KB, 1024 KB = 1 MB, 1024 MB = 1 GB, and 1024 GB = 1 TB.
102. And acquiring an inference rule for performing distributed inference on the inferred context set.
The inference rule may be an inference rule input by a user or an inference rule automatically generated by the device.
103. And analyzing the inference rule to generate an inference plan of the inference rule.
The analyzing of the inference rule may be understood as analyzing the inference rule to obtain the inference content of the inference rule. The inference plan may be understood as an inference purpose when the inference rule performs inference.
104. And carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan.
The above-mentioned distributed reasoning for the inferred context set according to the reasoning plan and the pre-configured context reasoning algorithm for the reasoning rule may be to reason about user information satisfying the reasoning rule in the inferred context set, for example: and deducing state information of users participating in the conference, or deducing important relationship information between the users and the like. In addition, the above-mentioned context inference algorithm can be used for different inference rules, and of course, the steps performed by the inference algorithm are different for different inference rules. The reasoning algorithm carries out distributed reasoning on the reasoned context set so as to obtain a reasoning result meeting the reasoning plan.
Optionally, the method may be implemented based on a distributed file system, that is, the method may be implemented in any device including a distributed file system, where the distributed file system may be the Hadoop Distributed File System (HDFS).
In this embodiment, a large amount of and various types of context data are acquired from a plurality of computing nodes, and modeling processing is performed on the acquired context data to obtain a set including an inferred context, where the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of context data reaching a specific data amount; acquiring an inference rule for performing distributed inference on the inferred context set; analyzing the inference rule to generate an inference plan of the inference rule; and carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan. Therefore, distributed reasoning can be performed on a large amount of context data of a plurality of computing nodes, and compared with the prior art in which context reasoning is performed in a single-machine environment, the method can improve the context reasoning efficiency.
Referring to fig. 2, fig. 2 is a flow chart illustrating another context distributed inference method according to an embodiment of the present invention, as shown in fig. 2, including the following steps:
201. obtaining a large amount of and various types of context data from a plurality of computing nodes, and carrying out modeling processing on the obtained context data to obtain a set of inferred contexts, wherein each inferred context in the set of inferred contexts comprises a mark, attribute content and occurrence time.
Optionally, the inferred context may be represented by the following triplet:
context = {cid; body; time}
wherein context represents the inferred context, cid is the identification (e.g. id) of the inferred context, the identification is unique and can identify different contexts, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context.
Optionally, the user context may be described in a format:
user context contextU = {userid; {username, phone, position, location}; time}
Here userid is the user id, and body is {username, phone, position, location}, where username, phone, position, and location are the user's name, telephone number, position, and location, respectively; time is the starting time at which the user is at the location.
For example: the following example of a user context:
user0 user_name0 17958143769 fellow outdoor 2014-05-13 15:09:13
The fields correspond in order to the fields of the user context, where fellow indicates the user's position (researcher) and outdoor indicates the user's location.
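By way of illustration only (not part of the claimed method), the triplet representation and the user context record above can be sketched in Python as follows; the Context class, the dictionary field names, and the whitespace-separated parsing are assumptions made for this sketch:

    from dataclasses import dataclass

    @dataclass
    class Context:
        cid: str    # unique identifier of the inferred context
        body: dict  # attribute content of the inferred context
        time: str   # occurrence time of the inferred context

    def parse_user_record(line: str) -> Context:
        """Parse a whitespace-separated user record such as
        'user0 user_name0 17958143769 fellow outdoor 2014-05-13 15:09:13'."""
        userid, username, phone, position, location, date, clock = line.split()
        return Context(cid=userid,
                       body={"username": username, "phone": phone,
                             "position": position, "location": location},
                       time=f"{date} {clock}")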
Optionally, the conference context may be described in a format:
meeting context contextM = {meetingid; {endtime, place}; starttime}
Here meetingid is the meeting id, body is {endtime, place}, and starttime, endtime, and place are the meeting start time, end time, and meeting place, respectively.
For example: the following example of a meeting context:
meeting1 2014-05-13 10:00:00 2014-05-13 11:00:00 meetingRoom1
meeting1 corresponds to meetingid, 2014-05-13 10:00:00 corresponds to starttime, 2014-05-13 11:00:00 corresponds to the endtime of the body part, and meetingRoom1 corresponds to the place field of the body part.
Optionally, the communication context may be described in a format:
the communication context contextC = {cid; {phonei, contact, description, duration}; time}
Here cid is the communication record id, and body = {phonei, contact, description, duration}, where phonei and contact are the two telephone numbers involved, description indicates the type of communication (call or short message), time is the time of the call or short message between phonei and contact, and duration is the call duration (null for a short message). phonei may represent telephone number i, and contact represents the telephone number in contact with telephone number i.
For example: the following two examples of communication contexts:
1: 15747699284 14692258870 2014-02-50 00:22:44 SMS NULL
2: 15147692225 14692255847 2014-01-37 21:43:21 call 236
The first field in record 1 corresponds to cid, the id of the communication context; 15747699284 and 14692258870 correspond to phonei and contact of the body part; 2014-02-50 00:22:44 corresponds to time; SMS corresponds to description; and NULL corresponds to duration, since a short message has no communication duration. For a telephone call the description content is call; correspondingly, the field 236 in record 2 is the call duration.
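For illustration, such a communication record could be turned into the contextC triplet roughly as follows (a sketch reusing the Context class from above; the parsing and field handling are assumptions):

    def parse_comm_record(line: str) -> Context:
        """Parse a communication record such as
        '2: 15147692225 14692255847 2014-01-37 21:43:21 call 236'."""
        cid, phonei, contact, date, clock, description, duration = line.split()
        return Context(cid=cid.rstrip(":"),
                       body={"phonei": phonei, "contact": contact,
                             "description": description,
                             "duration": None if duration.upper() == "NULL" else int(duration)},
                       time=f"{date} {clock}")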
Optionally, the mail context may be described in a format:
contextE={eid;{email,contact,title,content};time}
Here eid is the mail record id, and body = {email, contact, title, content}, where email is the sender's address, contact is the receiver's address, title is the mail subject, and content is the mail body; time is the time at which the mail was sent.
202. And acquiring an inference rule for performing distributed inference on the inferred context set.
203. And analyzing the inference rule to generate an inference plan of the inference rule.
204. And carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan.
As an alternative embodiment, the inferred context may include a user context and a meeting context. The above method may further comprise the steps of:
acquiring a first inference model for performing conditional inference on the inferred context set, wherein the first inference model comprises two inference conditions;
step 204 may include:
performing conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included by the first reasoning model, wherein the context information is used for representing users participating in a conference;
and carrying out distributed reasoning on the inferred context set corresponding to the users participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and generating the conference state information of the users according with the reasoning rule.
The users participating in the conference may be users currently in a conference or users who are in a conference at a certain time.
Optionally, the first inference model may include one or more of the following:
an early-arriving conference user, an on-time-arriving conference user, and a late-arriving conference user.
For example: the first inference model described above can be described by the following formula:
[First inference model: contextU.location == contextM.place, combined with condition 1, condition 2, or condition 3 below.]
where contextU.location == contextM.place indicates that the location of the user context U is the conference room of the conference context M;
condition 1 is contextM.starttime − contextU.time ∈ [0, 60), where contextM.starttime denotes the start time of the conference context M and contextU.time denotes the time at which the user entered the conference room; condition 1 represents a user who entered the conference room up to 60 minutes before the start;
condition 2 is contextM.endtime − contextU.time ∈ (0, meeting duration), where contextM.endtime denotes the end time of the conference context M; condition 2 represents a user who entered the conference room on time or late (after the start but before the end);
condition 3 is contextM.starttime − contextU.time ∈ [0, 60) combined with contextM.endtime − contextU.time ∈ (0, conference duration], representing the union of condition 1 and condition 2 described above.
In this way, the model takes the user context and the conference context as input, performs distributed processing, and outputs the context information that satisfies both contextU.location == contextM.place and condition 1 (or condition 2 or condition 3).
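A minimal sketch of this condition check (assuming the room identifiers in contextU.location and contextM.place are already normalized to a common form, and timestamps follow the format of the example records):

    from datetime import datetime

    FMT = "%Y-%m-%d %H:%M:%S"

    def minutes_between(later: str, earlier: str) -> float:
        """(later - earlier) in minutes; both arguments are timestamps in FMT."""
        delta = datetime.strptime(later, FMT) - datetime.strptime(earlier, FMT)
        return delta.total_seconds() / 60.0

    def meets_first_model(user_location: str, user_time: str,
                          place: str, starttime: str, endtime: str,
                          duration_min: float = 60.0) -> bool:
        """contextU.location == contextM.place together with condition 1 or condition 2."""
        if user_location != place:                       # location must be the meeting room
            return False
        early = minutes_between(starttime, user_time)    # contextM.starttime - contextU.time
        remaining = minutes_between(endtime, user_time)  # contextM.endtime - contextU.time
        cond1 = 0 <= early < 60                          # entered up to 60 minutes before the start
        cond2 = 0 < remaining < duration_min             # entered after the start but before the end
        return cond1 or cond2

    # meets_first_model("meetingRoom8", "2014-05-13 10:50:42",
    #                   "meetingRoom8", "2014-05-13 10:00:00", "2014-05-13 11:00:00")  # True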
For example, for the first model input described above:
user context:
user2 user_name2 14036921444 fellow office 2014-05-13 09:37:07
user3 user_name3 14625881103 fellow meeting_room8 2014-05-13 10:50:42
user9 user_name9 15147692225 fellow meeting_room13 2014-05-13 09:45:21
user10 user_name10 18706699281 fellow office 2014-05-13 10:08:56
user11 user_name11 18069981103 fellow meeting_room20 2014-05-13 10:06:00
user12 user_name12 16814476992 leader meeting_room27 2014-05-13 09:14:15
user13 user_name13 13581147739 leader office 2014-05-13 10:15:50
inputting a meeting context:
meeting8 2014-05-13 10:00:00 2014-05-13 11:00:00 meetingRoom8
meeting13 2014-05-13 10:00:00 2014-05-13 11:00:00 meetingRoom13
meeting20 2014-05-13 09:00:00 2014-05-13 10:00:00 meetingRoom20
meeting27 2014-05-13 09:00:00 2014-05-13 11:00:00 meetingRoom27
the inferred context information obtained after the first model processes the user context and the conference context is:
user3 is in meeting8
user9 is in meeting13
user12 is in meeting27
wherein, user3 is in meeting8 shows that user3 is in meeting room8, user9 is in meeting13 shows that user9 is in meeting room13, and user12 is in meeting27 shows that user12 is in meeting room 27.
The analysis of inference context information may be as follows:
the time that user3 enters meetingrom 8 is within the beginning time and the end of meeting8, and is a conference user.
The time that the user9 enters meetingrom 13 is before the starting time of meeting13, and if the current time is 2014-05-1310: 10:00, it can be obtained that the user9 enters the conference room for a meeting 15 minutes ahead, and is the user.
The time when the user11 enters meetingroom20 is later than the meeting end time, and the meeting room is not seen to have a meeting currently, so that the user does not belong to the conference user.
The time when the user12 enters the meetingroom27 is within the starting time and the ending time of the meeting27, and belongs to the conference user.
However, whether the user is a late user, an early user or a punctual user, how early the user enters the conference room and how late the user enters the conference room, needs to be determined according to inference rules.
In addition, the inference rule may include any one of:
1. starttime-time, a, where a is an integer variable with value range [0, 60); starttime-time denotes the conference start time of the conference context minus the time at which the user entered the conference room in the user context;
2. endtime-time, b, where b is an integer variable with value range (0, 120); endtime-time denotes the conference end time of the conference context minus the time at which the user entered the conference room in the user context. Because some conferences last 60 minutes and others 120 minutes, if b ∈ (0, 60) users of both kinds of conferences can be obtained, and the smaller the value of b, the more users in a meeting are obtained by inference; if b ∈ (60, 120), only conference users of conferences lasting 120 minutes can be obtained.
3. starttime-time, a, endtime-time, b, where the value ranges of a and b are the same as above; that is, inference rule 3 is the union of inference rule 1 and inference rule 2.
In this embodiment, the inference plans of different inference rules also differ. For example, if inference rule 1 above is used, the inference plan is to infer users who arrived early, that is, users whose time of entering the conference room is earlier than the conference start time; the larger the value of a, the more users in a meeting are obtained.
If inference rule 2 above is used, the inference plan is to infer conference users who were late or who entered the conference room on time, that is, users whose time of entering the conference room is later than or equal to the start time but earlier than the end time of the conference.
If inference rule 3 above is used, with the same parameters as the two rules above, the inference plan is to infer users who arrived early, arrived late, or are in a meeting, i.e., the union of the two cases above.
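For illustration, parsing such a rule string into (head, parameter) pairs and reading off the corresponding plan could look like the following sketch; the string format follows the rule examples given later in this embodiment, and the function names are assumptions:

    def parse_inference_rule(rule: str):
        """Turn 'starttime-time,59' or 'starttime-time,59,endtime-time,10'
        into a list of (head, parameter) pairs."""
        tokens = [t.strip() for t in rule.split(",")]
        return [(head, int(param)) for head, param in zip(tokens[0::2], tokens[1::2])]

    def describe_plan(plan) -> str:
        """Map the parsed rule to the inference purpose described above."""
        heads = {head for head, _ in plan}
        if heads == {"starttime-time"}:
            return "infer users who entered the conference room up to a minutes early"
        if heads == {"endtime-time"}:
            return "infer users who entered the conference room on time or late"
        return "infer the union: early, on-time and late conference users"

    # parse_inference_rule("starttime-time,59,endtime-time,10")
    # -> [('starttime-time', 59), ('endtime-time', 10)]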
This embodiment takes the user context, the conference context, and the inference rule as input. The user context and the conference context are the input of the first inference model, which is a combination of two conditions whose description and representation are related to the information in the context data. Reasoning over the inferred contexts corresponding to the users participating in a conference by using the inference rule processes this input data, and the output of the inference is the users who are in a meeting at a certain moment.
For example: in this embodiment, performing distributed inference on the inferred context set corresponding to the user participating in the conference according to the inference plan and the pre-configured context inference algorithm for the inference rule may include:
Step 1) Use the location field of the user context corresponding to the users participating in the conference and the place field of the conference context as corresponding fields, and join the two tables.
Step 2) For the output of step 1), compute the differences starttime-time and endtime-time of the three fields time, starttime, and endtime, and append the results to the output of step 1).
Step 3) Obtain the analysis result of the inference rule and generate the conference state information of the users that satisfy the inference rule.
For example, if the inference rule obtained in step 202 is a single rule whose head content is starttime-time with input parameter a ∈ [0, 60), and the difference between the conference start time and the time at which the user entered the conference room belongs to [0, a), then userid + meetingid can be output (users who entered the conference room on time or up to a minutes before the conference start time).
If the inference rule obtained in step 202 is a single rule whose head content is endtime-time with input parameter b ∈ (0, meeting duration), and the difference between the conference end time and the time of entering the conference room belongs to (b, meeting duration), then userid + meetingid can be output (late users, i.e., users who entered the conference room later than the conference start time).
If the inference rules obtained in step 202 are two (with no ordering requirement), rule 1 with head content starttime-time and input parameter a ∈ [0, 60) and rule 2 with head content endtime-time and input parameter b ∈ (0, conference duration), and the difference between the conference start time and the user's time of entering the conference room belongs to [0, a), or that difference is less than 0 and the difference between the conference end time and the user's time of entering the conference room belongs to (b, conference duration), then userid + meetingid can be output (on-time, early, and late users).
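A compact sketch of steps 1)-3) for the single rule starttime-time, a (room names are assumed normalized; minutes_between is the helper from the earlier sketch; the endtime-time rule would be applied analogously with its own interval; the dictionary record layout is an assumption):

    def infer_early_users(user_ctxs, meeting_ctxs, a: int):
        """Join user and meeting contexts on location == place, compute
        starttime - time, and keep users who entered within [0, a) minutes
        before the start, outputting 'userid is in meetingid'."""
        by_place = {m["place"]: m for m in meeting_ctxs}          # step 1: join on place
        out = []
        for u in user_ctxs:
            m = by_place.get(u["location"])
            if m is None:
                continue
            early = minutes_between(m["starttime"], u["time"])    # step 2: starttime - time
            if 0 <= early < a:                                    # step 3: apply the rule
                out.append(f"{u['userid']} is in {m['meetingid']}")
        return out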
For example, input a user context:
user17 user_name17 16814477062 fellow office 2014-05-13 09:55:54
user18 user_name18 16847700325 fellow meeting_room29 2014-05-13 09:10:37
user19 user_name19 10281103325 fellow meeting_room20 2014-05-13 09:33:24
user20 user_name20 10958870392 fellow meeting_room1 2014-05-13 09:18:48
user21 user_name21 17925477092 fellow meeting_room11 2014-05-13 09:02:43
meeting context:
meeting1 2014-05-13 10:00:00 2014-05-13 11:00:00 meetingRoom1
meeting11 2014-05-13 10:00:00 2014-05-13 12:00:00 meetingRoom11
meeting20 2014-05-13 09:00:00 2014-05-13 10:00:00 meetingRoom20
meeting29 2014-05-13 10:00:00 2014-05-13 11:00:00 meetingRoom29
Performing conditional reasoning on the user context and the conference context with the first inference model yields:
user18 is in meeting29
user19 is in meeting20
user20 is in meeting1
user21 is in meeting11
If the inference rule is: starttime-time, 59
And reasoning the reasoning context corresponding to the user participating in the conference by using the reasoning rule, and generating the conference state information of the user according with the reasoning rule as follows:
user18 is in meeting29
user20 is in meeting1
user21 is in meeting11
Therefore, with the inference rule starttime-time, 59, the output is the conference users who arrived up to 59 minutes early.
If the inference rule is: starttime-time, 45
And reasoning the reasoning context corresponding to the user participating in the conference by using the reasoning rule, and generating the conference state information of the user according with the reasoning rule as follows:
user20 is in meeting1
Therefore, with the inference rule starttime-time, 45, the output is the conference users who arrived up to 45 minutes early.
If the inference rule is: endtime-time, 60
And reasoning the reasoning context corresponding to the user participating in the conference by using the reasoning rule, and generating the conference state information of the user according with the reasoning rule as follows:
user19 is in meeting20
Therefore, with the inference rule endtime-time, 60, the output is the late conference users.
If the inference rule is: starttime-time, 59, endtime-time, 10
And reasoning the reasoning context corresponding to the user participating in the conference by using the reasoning rule, and generating the conference state information of the user according with the reasoning rule as follows:
user18 is in meeting29
user19 is in meeting20
user20 is in meeting1
user21 is in meeting11
in this embodiment: and taking the user context, the conference context and the inference rule as input, carrying out distributed inference according to the first inference model and the inference rule, and outputting the state of the user at a certain moment. 3 kinds of inference rules are provided, and when the inference rules are different, the output results are also different; corresponding to the same reasoning result, when the parameters are different, the obtained reasoning results are also different.
As an alternative embodiment, the set of inferred contexts can include a user context and a communication context. The above method may further comprise the steps of:
acquiring a second inference model for performing importance inference on the inferred context set;
step 204 may include:
performing communication importance inference on the communication context by using the second inference model to obtain communication importance information between two users in the communication context;
performing job importance reasoning on the user context by using the second reasoning model to obtain job importance information among users in the user context;
and carrying out distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
It should be noted that the important relationship information of a user pair may be understood to mean that one user in the pair is an important user of the other, but not necessarily vice versa. For example, if the user pair includes a user i and a user j, and the important relationship information of the pair indicates that user i is an important user of user j, user j is not necessarily an important user of user i.
Optionally, the second inference model may be described by the following formula:
[Second inference model: the combination of the communication importance condition connImportance(phonei, phonej) and the job importance condition positionImportance(phonei, phonej).]
the phonei represents a user i, the phonej represents a user j, and the importance among the communication users and the importance of the positions of the users can be obtained through the second reasoning model.
When the communication importance connImportance(phonei, phonej) is inferred in a distributed manner, a method is provided that assigns different weight parameters to different contexts and maps the context values using marginal utility theory; this method is included in the connImportance(phonei, phonej) condition of the second inference model.
The method for assigning different weight parameters to the context may include the following three steps:
1. different weight parameters are set.
The communication context can be divided into a phone context and a short message context, i.e.
contextC={cid;{phonei,contact,tel,duration};time}
contextC={cid;{phonei,contact,sm,0};time}
For example: the following two examples of phone context and short message context:
25 18069981103 17581147769 2014-03-38 12:59:27 sm NULL
26 16141417406 16248700325 2013-12-38 16:08:15 tel 509
Similarly, the first field 25 in record 25 corresponds to cid, the id of the communication context; 18069981103 and 17581147769 correspond to phonei and contact of the body part; 2014-03-38 12:59:27 corresponds to time; sm corresponds to description and NULL to duration, since a short message has no communication duration (for a phone call the description is tel). Likewise, the field 509 (minutes) in record 26 is the call duration.
When the importance of communication between users is inferred, the invention mainly considers the telephone communication duration total_{phonei,phonej}, the number of telephone communications telnum_{phonei,phonej}, and the number of short message communications smnum_{phonei,phonej} between phonei and phonej, and assigns different weight parameters to the telephone and the short message according to their different influence on the communication importance between users. Here the telephone communication duration total_{phonei,phonej} may be the total duration of telephone communication.
2. And mapping the results of the telephone communication duration, the telephone communication times and the short message communication times by using a logarithmic function by adopting a marginal utility theory.
The actual values of the communication duration, the number of telephone communications, and the number of short messages between phonei and phonej may be large or small and are not regularly distributed within a certain interval; if they are not mapped by some method into a certain range, measuring the importance between users becomes difficult.
The influence of the telephone communication duration, the number of telephone communications, and the number of short messages on the importance between users is, on the whole, logarithmically distributed: the importance between users does not increase linearly with these quantities but levels off after increasing to a certain degree, similar to a logarithmic function.
Therefore, with reference to the marginal utility theory, a logarithmic function is used to map the telephone communication duration, the number of telephone communications, and the number of short messages between phonei and phonej so that the actual values fall into a certain interval, which effectively improves the accuracy of the inference result.
For example, suppose the telephone communication duration total_{phonei,phonej} between phonei and phonej is 2056 minutes, the number of telephone communications telnum_{phonei,phonej} is 64, and the number of short message communications smnum_{phonei,phonej} is 128. If these values are not mapped before being weighted, the large value of total_{phonei,phonej} dominates the result, so that the influence of the numbers of calls and short messages is reduced or even ignored, which does not match the actual situation. If a base-2 logarithm is applied before the computation, total_{phonei,phonej}, telnum_{phonei,phonej}, and smnum_{phonei,phonej} become approximately 11, 6, and 7, respectively.
In this way, the influence of a very large telephone communication duration on the communication importance between users is effectively reduced. After all, if the call duration between users is very long in real life but the numbers of calls and short messages are very small, the relationship between the users is not necessarily close; the caller may, for example, be advertising promotion, and is not necessarily an important telephone contact of the user.
3. Calculation.
The calculation formula is as follows:
[Formula: connImportance(phonei, phonej) is computed as the weighted sum, with weight parameters α, β, and γ, of the log-mapped telephone communication duration, number of telephone communications, and number of short message communications between phonei and phonej.]
α, β, and γ respectively denote the weight parameters; each takes a value in [0, 1], and α + β + γ = 1.
The weighting parameter can take the following two principles:
different contexts are treated equally, and the weight parameters are the same. This approach is not scientific because different contexts affect the importance of communications between users differently.
Different contexts are assigned different weight parameters. Considering that the number of telephone communications has a smaller influence on the communication importance between users than the number of short message communications and the telephone communication duration, the weight parameters of telnum_{phonei,phonej}, total_{phonei,phonej}, and smnum_{phonei,phonej} are set here to 0.3, 0.4, and 0.3, respectively.
Result analysis: experiments on a data set, comparing equal and unequal weight parameters for the telephone communication duration, the number of telephone communications, and the number of short messages, show that the communication importance values obtained between the user groups differ as the weight parameters differ, and that the largest values are obtained when the weights of telnum_{phonei,phonej}, total_{phonei,phonej}, and smnum_{phonei,phonej} are 0.3, 0.4, and 0.3, respectively.
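A small sketch of the log2 mapping and weighting described above (the exact connImportance formula appears only as an image in the original, so the unnormalized weighted sum below is an assumption used purely to illustrate the mapping; it does not reproduce the 0-1 values of the examples that follow):

    import math

    def log2_map(x: float) -> float:
        """Marginal-utility style mapping of a raw context value."""
        return math.log2(x) if x > 0 else 0.0

    def conn_importance_sketch(total_min: float, telnum: int, smnum: int,
                               alpha: float = 0.4, beta: float = 0.3, gamma: float = 0.3) -> float:
        """Weighted combination of the log2-mapped call duration, call count and
        SMS count between phonei and phonej (alpha + beta + gamma = 1)."""
        return alpha * log2_map(total_min) + beta * log2_map(telnum) + gamma * log2_map(smnum)

    # log2_map(2056), log2_map(64), log2_map(128) are roughly 11, 6 and 7,
    # matching the mapped values discussed above.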
For example: inputting a communication context
0 14692258870 14625881103 2013-02-28 18:34:31 SMS NULL
1 15747699284 14692258870 2014-02-50 00:22:44 SMS NULL
2 15147692225 14692255847 2014-01-37 21:43:21 call 236
Then, using the method and the calculation formula provided in the connImportance(phonei, phonej) condition of the second inference model, which assign different weights to different contexts and map the context results with a logarithmic function, the calculation results are as follows:
14692258870 14625881103 0.35
15747699284 14692258870 0.35
15147692225 14692255847 0.65
The value of the last field is the communication importance between the users. However, whether the calling user is an important telephone contact still needs to be determined by the parameters of the inference rule input by the user; see in particular the distributed inference algorithm for important telephone contacts described below.
In this embodiment, different weight parameters can be assigned to different contexts, and the context values are mapped with a logarithmic function based on marginal utility theory. Experiments show that the communication importance values obtained with the weights 0.3, 0.4, and 0.3 are the largest, so that with the same communication importance threshold more important users can be obtained, including some important users that would otherwise be filtered out by the threshold. This improves the accuracy of the inference result to a certain extent.
The condition of the above positionImportance(phonei, phonej) is as follows:
if(phonei.position==leader)
positionImportance(phonei,phonej)=yes
That is, if the position of the calling user phonei is leader, phonei is an important telephone contact of phonej. It should be noted that the job importance is one-way rather than mutual; for example, positionImportance(phonei, phonej) = yes indicates that user phonei is an important telephone contact of user phonej, but does not indicate that user phonej is an important telephone contact of user phonei.
For example: the user context is as follows:
user94 user_name94 15747699284 fellow meeting_room501 2014-05-13 10:15:26
user9 user_name9 15147692225 leader office 2014-05-13 09:55:23
user97 user_name97 14692258870 fellow office 2014-05-13 08:58:31
Then the positionImportance condition of the second inference model performs distributed inference on the users' job importance, and the result is:
15147692225 14692255847 yes
That is, since the position of 15147692225 in the user context is leader, when 15147692225 calls any other user it is an important telephone contact of that user.
The inferred context for this embodiment can obtain the following information through the second inference model:
14692258870 14625881103 0.35
15747699284 14692258870 0.35
15147692225 14692255847 0.65
15747699284 19170033628 0.3
15147692225 phonej yes
The second inference model performs distributed inference with the user context and the communication context as input and outputs the union of the results satisfying its two conditions. When inferring the communication importance between users, a method is provided that assigns different weights to different contexts and maps the original context values using marginal utility theory, which improves the accuracy of the inference result to a certain extent.
In this embodiment, the inference rule may include any one of the following items:
1. conn, a, where the value range of a is determined by the value interval of the communication importance and is tentatively (0, 1).
2. position, b, where b is a string constant taking one of the two values yes or no.
3. conn, a, position, b, where the value ranges of a and b are the same as above.
Here, conn represents communication, and position represents position.
In this embodiment, the inference plans of different inference rules also differ. For example, if inference rule 1 is used, the inference plan is to infer users of communication importance.
If inference rule 2 is used, the inference plan is to infer users according to job importance. When the input rule is position, yes, the output is the important users determined by job importance; when the input rule is position, no, the output is the non-important users determined by job importance. In general, inferring job-important users is more meaningful.
If inference rule 3 is used, the inference plan is to infer communication-important users and job-important (or non-important) users; with the same input parameters, the result obtained is in theory the union of the two rules above.
In this embodiment, the communication context, the user context, and the inference rule may be taken as input. The communication context and the user context are the input of the second inference model, which is a combination of two conditions, of which satisfying either one is sufficient. Condition 1 proposes the method of assigning different weight parameters to different contexts and mapping the context results with marginal utility theory in order to measure the importance between communicating users; condition 2 describes the setting of the users' job importance. The distributed inference algorithm processes the two kinds of input data according to this model, and the output of the inference is the importance between users.
For example: the step of performing distributed inference on the obtained communication importance information and job importance information according to the inference plan and a pre-configured context inference algorithm for the inference rule, and generating the important relationship information of the user pairs whose communication importance and job importance satisfy the inference rule may include the following steps:
1. and calculating the telephone communication times telnumphonei, phonj, the telephone communication time length totalphonei, phonj and the short message communication times smphonei and phonj of the phonei and phonej in the communication context.
2. And calculating the telephone communication times telnumphonej, the telephone communication time length totalphonej and the short message communication time length smphonej of the phonej in the communication context.
3. And connecting the two tables by using the phonej field output in the step 1 and the phonej field output in the step 2 as a correspondence.
4. Assign different weight parameters to the number of telephone communications, the communication duration, and the number of short messages through the following formula:
Figure GDA0002273422070000201
wherein telnumphonei,phonej,smnumphonei,phonej,totalphonei,phonejRespectively representing the telephone times, short message times and telephone duration of phonei and phonej; telnumphonei,phonej,smnumphonei,phonej,totalphonei,phonejRespectively representing the number of telephone calls, the number of short messages and the telephone duration of phonej, a, β and gamma respectively represent weight parameters, and the value intervals are all [0,1]A + β + γ ═ 1. if important (phonei, phonej)>Threshold value
Figure GDA0002273422070000211
Phonei is a communication important contact of phonej.
Figure GDA0002273422070000212
The parameter a can be set by the user, i.e. is input in the inference rule.
5. The phonei field of the user context and the phonei field of the communication context are used as corresponding fields, and the two tables are connected.
6. In the output of step 5, if the position of phonei is leader, phonei is a position-important contact of phonej; otherwise, phonei is a non-important contact.
7. Connect the tables, using phonei and phonej in the outputs of steps 4 and 6 above as the correspondence.
8. Acquire the analysis result of the inference rule, and generate the important-relationship information of the user pairs whose communication importance and job importance satisfy the inference rule.
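To make steps 1 to 8 concrete, the following is a minimal single-machine Python sketch of the importance computation; the record layout, the helper names, and the omission of the marginal-utility mapping and of the per-user totals of step 2 are assumptions for illustration, not the distributed implementation described above (which runs as MapReduce jobs over the Hadoop cluster).

from collections import defaultdict

# Hypothetical communication-context layout: (id, phonei, phonej, date, time, type, duration)
comm_records = [
    ("0", "14692258870", "14625881103", "2013-02-28", "18:34:31", "SMS", None),
    ("2", "15147692225", "14692255847", "2014-01-37", "21:43:21", "call", 236),
]
# Hypothetical user context: phone -> position
positions = {"15147692225": "leader", "14692258870": "fellow"}

ALPHA, BETA, GAMMA = 0.4, 0.3, 0.3  # weight parameters, ALPHA + BETA + GAMMA = 1

def communication_importance(records):
    """Aggregate call count, SMS count and call duration per (phonei, phonej) pair
    and combine them with the weight parameters (cf. steps 1 and 4 above)."""
    telnum, smnum, total = defaultdict(int), defaultdict(int), defaultdict(int)
    for _, phonei, phonej, _, _, kind, duration in records:
        pair = (phonei, phonej)
        if kind == "call":
            telnum[pair] += 1
            total[pair] += duration or 0
        elif kind == "SMS":
            smnum[pair] += 1
    pairs = set(telnum) | set(smnum) | set(total)
    return {p: ALPHA * telnum[p] + BETA * smnum[p] + GAMMA * total[p] for p in pairs}

def position_importance(phonei):
    """phonei is a position-important contact if its position is leader (cf. steps 5 and 6 above)."""
    return positions.get(phonei) == "leader"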
For example: if the inference rule is a single rule whose head content is conn, with input parameter a ∈ (0, 0.1), then for every pair in which the importance w of phonei to phonej satisfies w > a, phonei phonej conn yes (communication-important user) can be output.
If the inference rule is a single rule whose head content is position, with parameter b taking the value yes/no, then whenever the content of the last field in the output of the second inference model is the same as the parameter in the input rule, phonei phonej position yes/no (position-important/non-important user) can be output.
If the inference rule contains two rules (with no ordering requirement), wherein the head content of rule 1 is conn with input parameter a ∈ (0, 0.1) and the head content of rule 2 is position with input parameter b set to yes/no, then for every pair in which the importance w of phonei to phonej satisfies w > a, phonei phonej conn yes (communication-important user) can be output; and, according to the second inference model, whenever the content of the last field of the output is the same as the parameter in the input rule, phonei phonej position yes/no (position-important/non-important user) can be output.
For example: user context:
user94 user_name94 15747699284 fellow meeting_room501 2014-05-13 10:15:26
user9 user_name9 15147692225 leader office 2014-05-13 09:55:23
user97 user_name97 14692258870 fellow office 2014-05-13
communication context:
0 14692258870 14625881103 2013-02-28 18:34:31 SMS NULL
1 15747699284 14692258870 2014-02-50 00:22:44 SMS NULL
2 15147692225 14692255847 2014-01-37 21:43:21 call 236
And obtaining communication importance information and job importance information between the users according to a second reasoning model:
14692258870 14625881103 0.35
15747699284 14692258870 0.35
15147692225 14692255847 0.65
15147692225 14692255847 yes
if the inference rule is conn,0.7 and the obtained communication importance information and job importance information are used for inference, the following important relationship information of the user pairs whose communication importance and job importance meet the inference rule can be generated:
15147692225 14692255847 yes
wherein 0.7 is the specific value of the parameter a, i.e. the threshold of the communication importance important(phonei, phonej). Different values of a give different results.
Although the communication importance between 15147692225 and 14692255847 is 0.65 < 0.7, 15147692225 is a job-important user of 14692255847; 15147692225 is therefore a telephone-important contact of 14692255847.
If the inference rule is conn,0.3 and the obtained communication importance information and job importance information are used for inference, the following important relationship information of the user pairs whose communication importance and job importance meet the inference rule can be generated:
14692258870 14625881103 yes
15747699284 14692258870 yes
15147692225 14692255847 yes
if the reasoning rule is position, yes, and inference is performed using the obtained communication importance information and job importance information, the following important-relationship information of the user pairs whose communication importance and job importance satisfy the inference rule can be generated:
15147692225 14692255847 yes
The position of 15147692225 is leader, which meets the positionImportance(phonei, phonej) condition of the second reasoning model.
If the reasoning rule is conn,0.3, position, yes, and inference is performed using the obtained communication importance information and job importance information, the following important-relationship information of the user pairs whose communication importance and job importance satisfy the inference rule can be generated:
14692258870 14625881103 yes
15747699284 14692258870 yes
15147692225 14692255847 yes
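The rule evaluation illustrated by the examples above can be sketched in Python as follows; the table layout, the dict-based rule representation and the function name evaluate are illustrative assumptions rather than the patented distributed algorithm, and the behaviour that a position-important pair also passes a conn rule below the threshold follows the conn,0.7 example above.

# Output of the second inference model: (phonei, phonej, communication importance, position-important flag)
importance_table = [
    ("14692258870", "14625881103", 0.35, False),
    ("15747699284", "14692258870", 0.35, False),
    ("15147692225", "14692255847", 0.65, True),
]

def evaluate(rules, table):
    """rules is a dict such as {"conn": 0.7}, {"position": "yes"} or both;
    a pair is output when it satisfies either condition (the model is a union)."""
    result = []
    for phonei, phonej, w, pos in table:
        satisfied = False
        if "conn" in rules:
            # communication-important, or position-important even below the threshold
            satisfied |= w > rules["conn"] or pos
        if "position" in rules:
            satisfied |= pos == (rules["position"] == "yes")
        if satisfied:
            result.append((phonei, phonej, "yes"))
    return result

print(evaluate({"conn": 0.7}, importance_table))  # only the 15147692225 / 14692255847 pair
print(evaluate({"conn": 0.3}, importance_table))  # all three pairs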
In this implementation, the user context, the communication context and the inference rule are taken as input, distributed inference is performed according to the second inference model and the inference rule, and the telephone-important contacts are output. There are three kinds of inference rules; different inference rules produce different outputs, and the same inference rule with different parameters also produces different inference results.
As an optional implementation, the method may further include the steps of:
acquiring user conference state information and important relationship information of a user pair, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have important relationship.
Step 204 may include:
according to the user conference state information and the important relationship information of the user pairs, carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting the users who have important relationships and are in a meeting in the reasoned context set; and/or
According to the user conference state information and the important relationship information of the user pairs, carrying out distributed reasoning on a reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting conference room information of the user participating in the conference in each reasoned context; and/or
And performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs and according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to obtain whether the incoming call user of the user in the conference is the important user of the user in the conference, and if so, outputting prompt information for indicating that the incoming call user is the important user of the user in the conference.
Here, "and/or" means that step 204 may include any one or more of the above three steps. In addition, different inference results use different inference rules: when the output of the inference is the users who have an important relationship and are in a meeting in the inferred context set, the first inference rule may be used; when the output is the conference room information of the users participating in a conference in the inferred context set, the second inference rule may be used; and when inferring whether the incoming call user of a user in a meeting is an important user of that user, the third inference rule may be used.
That is, the inference rule in step 204 may include any one or more of:
a first inference rule, a second inference rule, and a third inference rule.
The user conference state information and the important relationship information of the user pair may be obtained through the two embodiments described above, that is, the method further includes the following steps:
acquiring a first inference model for performing conditional inference on the inferred context set, wherein the first inference model comprises two inference conditions;
performing conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included by the first reasoning model, wherein the context information is used for representing users participating in a conference;
and carrying out distributed reasoning on the context information corresponding to the users participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and generating the conference state information of the users according with the reasoning rule.
And may further comprise:
acquiring a second inference model for performing importance inference on the inferred context set;
performing communication importance distributed reasoning on the communication context by using the second reasoning model to obtain communication importance information between two users in the communication context;
performing job importance distributed inference on the user context by using the second inference model to obtain job importance information among users in the user context;
and carrying out distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
Optionally, for the first inference rule, the users who have an important relationship and are in a meeting in the inferred context set should be understood as the users who are currently in a meeting and for whom the target user in the first inference rule is an important contact. For example: the user in the first inference rule is user i, and user i is an important contact of user j, user t and user w; at inference time only user j is in a meeting, so user j can be inferred through the first inference rule, while user t and user w, although user i is also their important contact, are not currently in a meeting and are not output.
Optionally, the above-mentioned outputting the conference room information of the users participating in the conference in each inferred context may be understood as outputting the conference room identifier of the user in the conference.
In this embodiment, optionally, the method may further include the following steps:
when a user in a meeting receives a call from an incoming call user, judging whether the incoming call user is an important user of the user in the meeting, and if so, sending a message prompt to the user in the meeting so as to prompt that the incoming call user is calling.
If the incoming call user is judged not to be an important user of the user in the meeting, a message indicating that the called user is currently in a meeting can be sent to the incoming call user; or, if the incoming call user is judged not to be an important user of the user in the meeting, the incoming call user is prohibited from calling the user in the meeting.
Therefore, when a certain incoming call user calls a user in a meeting, the user in the meeting can be reminded only if the incoming call user is an important user of the user in the meeting, and otherwise, the user is not reminded.
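The incoming-call handling described above can be sketched as the following decision function; the message strings and parameter names are illustrative assumptions.

def handle_incoming_call(caller, callee, users_in_meeting, important_contacts):
    """users_in_meeting: set of users currently in a meeting;
    important_contacts: dict mapping a user to the set of users who are important to that user."""
    if callee not in users_in_meeting:
        return f"connect {caller} to {callee}"  # normal call handling
    if caller in important_contacts.get(callee, set()):
        # remind the user in the meeting that an important contact is calling
        return f"Important contact {caller} is calling, please answer the phone!"
    # otherwise auto-reply to the caller (the call could also simply be blocked)
    return "I'm in the meeting, please call me later"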
In this embodiment, the first inference rule may include:
calling,useri/phonei
wherein useri/phonei are respectively the user identifier (userid) and the telephone identifier (phone) of the user useri; only one of them needs to be input; their value range is the userids in the user data set and the phones corresponding to them; calling indicates an incoming call.
The second inference rule may include:
meeting,userj/phonej
wherein userj/phonej are respectively the userid and phone of the user userj; only one of them needs to be input; their value range is the userids in the user data set and the phones corresponding to them; meeting indicates the above-mentioned conference.
The third inference rule may include:
calling,useri/phonei
meeting,userj/phonej
wherein the value ranges of useri/phonei and userj/phonej are the same as above.
If the inference rule is the first inference rule, the inference plan is to infer the users who are in a meeting and for whom useri is an important telephone contact; if there are none, a no-information prompt (e.g. no information) is returned.
If the inference rule is the second inference rule, the inference plan is to infer whether the user userj is in a meeting; if so, the meeting id of the conference is returned, and if not, a no-information prompt (e.g. no information) is returned.
If the inference rule is the third inference rule, the inference plan is to infer whether the incoming call user useri is an important telephone contact of the user userj who is in a meeting; if so, it is returned that useri is an important contact of userj (e.g. useri userj important contact), and if not, a no-information prompt (e.g. no information) is returned.
In this embodiment, the user context, the communication context, the conference context, and the inference rule are used as inputs, and the corresponding user alert is output.
For example: step 204 may include the steps of:
1. Connect the two tables, using the phone field in the user context and the corresponding phone field in the output of the distributed inference algorithm for telephone-important contacts as the correspondence;
2. Connect the two tables, using the phonej field output in step 1 and the phone field in the user context as the correspondence;
3. Connect the two tables, using the userj field output in step 2 and the userid field in the inference rule as the correspondence;
4. Acquire the analysis result of the inference rule and generate the inference result of the inference plan.
For example: if the inference rule is a single rule whose head content is calling, and the input parameter useri (phonei) is the same as the first field (second field) in the output of reduce3, then useri + phonei + phonej + conn + position + userj can be output.
If the inference rule is a single rule whose head content is meeting, and the input parameter userj (phonej) is the same as the 3rd field (6th field) in the output of the reduce3 function, then phonej + userj + meetingid can be output.
If the inference rule contains two rules (with no ordering requirement), where the head content of rule 1 is calling and the input parameter useri (phonei) is the same as the first field (second field) in the output of reduce3,
and the head content of rule 2 is meeting and the input parameter userj (phonej) is the same as the 3rd field (6th field) in the output of the reduce3 function, then it can be output that user i is an important contact of user j (useri userj important contact), or that user i is not an important contact of user j (useri not userj important contact).
For example: the acquired user conference state information is as follows:
user19 is in meeting20
user21 is in meeting11
the obtained important relationship information of the user pairs is as follows:
10281144736 10281103325 yes
17581147769 17925477092 yes
the user context is as follows:
user33 user_name33 10281144736 leader office 2014-05-13 10:20:55
user34 user_name34 17581147769 fellow office 2014-05-13 12:19:31
user19 user_name19 10281103325 leader meeting_room20 2014-05-13 09:07:56
user21 user_name21 17925477092 fellow meeting_room11 2014-05-13 13:44:58
if the inference rule is as follows:
user33,calling
user19,meeting
the result of this step 204 may be as follows:
the calling user33 is an important contact of user19 (calling user33 10281144736 user19 10281103325 important contact)
A short message can then be sent to the meeting user19, for example, outputting: Important contact user33 is calling, please answer the phone!
If the inference rule is as follows:
user78,calling
user19,meeting
the result of this step 204 may be as follows:
in a meeting, please call later (I'm in the meeting, please call me later)
This is output because the incoming call user78 is not a telephone-important contact of user19, who is in a meeting.
If the inference rule is as follows:
user33,calling
the result of this step 204 may be as follows:
user19 is in a meeting at conference room20 (user19 is in meeting20)
The result is a meeting user with the incoming user as an important telephone contact.
If the inference rule is as follows:
user19,meeting
the result of this step 204 may be as follows:
user33's information (e.g. user33 user_name33 10281144736)
The result is the user information of the important contact of user19, who is in a meeting.
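Applied to the example above, the join-based plan of steps 1 to 4 can be sketched in Python as follows; plain dictionaries stand in for the intermediate tables that the MapReduce stages (e.g. the reduce3 output) would produce, and the record layouts and output formats are simplified assumptions.

users = {  # assumed userid -> phone mapping from the user context
    "user33": "10281144736", "user19": "10281103325",
    "user34": "17581147769", "user21": "17925477092",
}
important_pairs = {  # (phonei, phonej): phonei is an important contact of phonej
    ("10281144736", "10281103325"), ("17581147769", "17925477092"),
}
meetings = {"user19": "meeting20", "user21": "meeting11"}  # userid -> meeting id

def infer(calling=None, meeting=None):
    """calling / meeting carry the useri / userj of the first, second or third inference rule."""
    if calling and meeting:  # third rule: is the caller an important contact of the meeting user?
        pair = (users[calling], users[meeting])
        if pair in important_pairs and meeting in meetings:
            return f"{calling} {meeting} important contact"
        return "no information"
    if calling:  # first rule: meeting users for whom the caller is an important contact
        hits = [uj for uj in meetings if (users[calling], users[uj]) in important_pairs]
        return hits or "no information"
    if meeting:  # second rule: is userj in a meeting? return the meeting id
        return meetings.get(meeting, "no information")
    return "no information"

print(infer(calling="user33", meeting="user19"))  # user33 user19 important contact
print(infer(calling="user33"))                    # ['user19']
print(infer(meeting="user19"))                    # meeting20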
In this embodiment, the user context, the conference context and the communication context are taken as input, distributed inference is performed according to the inference rule, the user conference state information and the important-relationship information of the user pairs, and the corresponding result is output. There are three kinds of inference rules; different inference rules give different inference results, and the same rule with different parameters also gives different results.
It should be noted that the embodiment of the invention can improve the reasoning efficiency of a large amount of context data in the enterprise communication and cooperation field and save the reasoning time. The experimental data are as follows:
For 20000 users, 100 conference records and five million communication records as input, the running time in a 4-node Hadoop cluster (one master and three worker nodes) is 678464 milliseconds, while the running time in a 3-node Hadoop cluster (one master and two worker nodes) is 783020 milliseconds; the inference time is therefore obviously reduced and the inference efficiency obviously improved.
For 100 users, 30 conference records and 15000 communication records as input, the running time in a 3-node Hadoop cluster (one master and two worker nodes) is less than the running time with 10000 users, 100 conference records and five million communication records as input on the same cluster.
In addition, the embodiment of the invention can also improve the accuracy of context distributed reasoning in the fields of enterprise communication and cooperation. The experimental data are as follows:
For 10000 users, 100 conference records and five million communication records as input, the method runs in a 3-node Hadoop cluster (one master and two worker nodes).
When the communication importance is inferred with the number of calls, the call duration and the number of short messages treated equally:
important(phonei, phonej) = telnum(phonei, phonej) + smnum(phonei, phonej) + total(phonei, phonej)
With the method introduced in this embodiment, different weight parameters are instead set for the number of calls, the call duration and the number of short messages. Experiments show that the accuracy of the latter is 5 to 10 percent higher than that of the former.
In addition, the embodiment of the invention can also reduce the memory occupancy rate when a large amount of context data in the fields of enterprise communication and cooperation are inferred. The experimental data are as follows:
(1) 100 users, 30 conference data and 15000 communication records are taken as input and run in a 3-node Hadoop cluster (one master and two worker nodes); the CPU occupancy rate varies in the range of 0-50%, because although the input data volume is small, certain resources are still occupied when the Hadoop cluster starts and runs.
(2) 100 users, 30 conference data and 15000 communication records are taken as input and run in a 4-node Hadoop cluster (one master and three worker nodes); the CPU occupancy rate is lower than in experiment (1), for the same reason as above.
(3) 10000 users, 100 conference records and five million communication records are taken as input and run in a 3-node Hadoop cluster (one master and two worker nodes); the CPU occupancy rate varies more than in experiment (1), because the input data volume is larger and the Hadoop cluster occupies more resources when starting and processing the data.
(4) 10000 users, 100 conference records and five million communication records are taken as input and run in a 4-node Hadoop cluster (one master and three worker nodes); the CPU occupancy rate varies less than in experiment (3), because the number of computing nodes is increased while the data volume is the same.
It should be noted that the above description only takes the conference scenario as an example. The present embodiment may also be applied to fields other than conference scenarios, for example a personal business intelligent assistant. For example: in this scenario, userj is in the office and wants to use the time to handle some urgent work without being disturbed by calls from unimportant users. When useri calls at this moment, the personal business intelligent assistant starts to judge whether useri is an important contact of userj; if so, it outputs "Important contact useri is calling, please answer the phone!"; if not, it automatically hangs up the call and sends a short message to useri, outputting "I'm in the meeting, please call me later". Or, when userj misses useri's call, a message reminder can be sent according to the importance of useri: "Missing important/unimportant contact useri calls, please answer".
The distributed inference module for telephone-important contacts can be called to obtain an intermediate result, which serves as the basis for the inference of the personal business intelligent assistant and provides services for the user.
In this embodiment, various optional implementations are added to the embodiment shown in fig. 1, and all of them can improve the efficiency of context inference.
For convenience of description, only the relevant parts of the embodiments of the present invention are shown, and details of the specific technology are not disclosed.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a context distributed inference apparatus according to an embodiment of the present invention, as shown in fig. 3, including: a modeling unit 31, a first obtaining unit 32, an analyzing unit 33, and an inference unit 34, wherein:
the modeling unit 31 is configured to acquire a large amount of context data of multiple types from multiple computing nodes, and perform modeling processing on the acquired context data to obtain an inferred context set, where the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to a size of reaching a specific data amount.
A first obtaining unit 32, configured to obtain an inference rule for performing distributed inference on the inferred context set.
And the analyzing unit 33 is configured to analyze the inference rule and generate an inference plan of the inference rule.
And the reasoning unit 34 is used for performing distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan.
Optionally, the apparatus may be implemented based on a distributed File System, that is, the apparatus may be implemented in any device including the distributed File System, where the distributed File System may be a Hadoop Distributed File System (HDFS).
In this embodiment, a large amount of and various types of context data are acquired from a plurality of computing nodes, and modeling processing is performed on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of context data reaching a specific data amount; acquiring an inference rule for performing distributed inference on the inferred context set; analyzing the inference rule to generate an inference plan of the inference rule; and carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan. Therefore, distributed reasoning can be performed on a large amount of context data of a plurality of computing nodes, and compared with the prior art that the context reasoning is performed in a single machine environment. The embodiment can improve the context reasoning efficiency.
Referring to fig. 4, fig. 4 is a schematic structural diagram of another context-distributed inference apparatus according to an embodiment of the present invention, as shown in fig. 4, including: a modeling unit 41, a first acquisition unit 42, an analysis unit 43, and an inference unit 44, wherein:
the modeling unit 41 is configured to obtain a large amount and a variety of context data from a plurality of computing nodes, and perform modeling processing on the obtained context data to obtain a set of inferred contexts, where each inferred context in the set of inferred contexts includes a flag, an attribute content, and an occurrence time.
Optionally, the inferred context may be represented by the following triplet:
context={cid;body;time}
wherein context represents the inferred context, cid is the identification (e.g. id) of the inferred context, the identification is unique and can identify different contexts, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context.
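A minimal sketch of this triplet as a Python data structure; representing the attribute content as a dict is an assumption for illustration.

from dataclasses import dataclass

@dataclass
class Context:
    cid: str    # unique identification of the inferred context
    body: dict  # attribute content, e.g. user name, phone, position, location
    time: str   # occurrence time of the inferred context

# one user-context record in this representation (values taken from the examples above)
ctx = Context(cid="user19",
              body={"name": "user_name19", "phone": "10281103325",
                    "position": "leader", "location": "meeting_room20"},
              time="2014-05-13 09:07:56")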
A first obtaining unit 42, configured to obtain an inference rule for performing distributed inference on the inferred context set.
And an analyzing unit 43, configured to analyze the inference rule, and generate an inference plan of the inference rule.
And the reasoning unit 44 is used for performing distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan.
Optionally, the set of inferred contexts may include a user context and a meeting context:
the apparatus may further include:
a second obtaining unit 45, configured to obtain a first inference model for performing conditional inference on the inferred context set, where the first inference model includes two inference conditions;
the inference unit 44 may include:
a first reasoning subunit 441, configured to perform conditional distributed reasoning on the inferred context set by using the first reasoning model, so as to obtain context information that satisfies reasoning conditions included in the first reasoning model, where the context information is used to represent users participating in a conference;
and a second inference subunit 442, configured to perform distributed inference on the context corresponding to the user participating in the conference according to the inference plan and a pre-configured context inference algorithm for the inference rule, and generate conference state information of the user meeting the inference rule.
Optionally, the first inference model may include one or more of the following:
an early-arriving conference user, an on-time-arriving conference user, and a late-arriving conference user.
For example: the first inference model described above can be described as follows. A user context U satisfies the model if contextU.location == contextM.place holds together with one of the following conditions, wherein contextU.location == contextM.place indicates that the location in the user context U is the conference room of the conference context M:
condition 1 is contextM.strtime - contextU.time ∈ [0, 60), wherein contextM.strtime represents the start time of the conference context M and contextU.time represents the time at which the user enters the conference room; that is, condition 1 represents the users who enter the conference room within 60 minutes before the conference starts;
condition 2 is contextM.endtime - contextU.time ∈ (0, meeting duration), wherein contextM.endtime represents the end time of the conference context M; condition 2 represents the users who arrive late or enter the conference room on time;
condition 3 is contextM.strtime - contextU.time ∈ [0, 60) && contextM.endtime - contextU.time ∈ [0, meeting duration], representing the union of condition 1 and condition 2.
In this way, the above model can perform distributed processing on the contexts with the user context and the conference context as input, and output the context information that satisfies both contextU.location == contextM.place and condition 1 (or condition 2 or condition 3).
In addition, the inference rule may include any one of:
1. strtime-time, a, wherein a is an integer variable with value range [0, 60); strtime-time represents the conference start time of the conference context minus the time at which the user enters the conference room in the user context;
2. endtime-time, b, wherein b is an integer variable with value range (0, 120); endtime-time represents the conference end time of the conference context minus the time at which the user enters the conference room in the user context. Because some conferences last 60 minutes and some last 120 minutes, if b ∈ (0, 60) the users of both kinds of conferences can be obtained; the smaller the value of b, the more users in a meeting are obtained by inference; if b ∈ (60, 120), only the users of conferences lasting 120 minutes can be obtained.
3. strtime-time, a, endtime-time, b, wherein the value ranges of the variables a and b are the same as above; that is, inference rule 3 is the union of inference rule 1 and inference rule 2.
In this embodiment, the inference plans of the different inference rules are also different, for example: if the inference rule 1 is the above inference rule, the inference plan is to infer the user who arrives earlier, that is, the time when the user enters the conference room is earlier than the conference start time. In addition, the larger the value of a is, the more users in a meeting are obtained.
If the inference rule 2 is above, the inference plan is to infer the late conference user and the conference user who enters the conference room on time, that is, the time when the user enters the conference room is later than the start time or the same, but earlier than the end time of the conference.
If the inference rule 3 is the above inference rule, the inference plan is to infer the users who arrive earlier, arrive later and are in a meeting according to the meeting entering the meeting room, namely the union of the above two cases, under the condition that the same parameters are taken as the former two rules.
The embodiment can realize that the user context, the conference context and the inference rule are used as input, wherein the user context and the conference context are input by the first inference model, the first inference model is a combination of two conditions, the description and the representation of the conditions are related to information in context data, the inference of the inferred context corresponding to the user participating in the conference by using the inference rule is to process the input data, and the output of the inference is the user in the meeting at a certain moment.
In the embodiment, the user context, the conference context and the inference rule are used as input, distributed inference is carried out according to the first inference model and the inference rule, and the state of the user at a certain moment is output. 3 kinds of inference rules are provided, and when the inference rules are different, the output results are also different; corresponding to the same reasoning result, when the parameters are different, the obtained reasoning results are also different.
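A minimal Python sketch of the condition check behind the first inference model and inference rules 1 to 3; times are taken as minutes from a common origin, and the dict-based contexts, parameter defaults and function name are illustrative assumptions.

def meeting_state(user_ctx, meeting_ctx, rule, a=60, b=None):
    """Return True if user_ctx satisfies the selected inference rule against meeting_ctx."""
    if user_ctx["location"] != meeting_ctx["place"]:
        return False
    duration = meeting_ctx["endtime"] - meeting_ctx["strtime"]
    b = duration if b is None else b
    early = meeting_ctx["strtime"] - user_ctx["time"]        # condition 1: arrives early
    before_end = meeting_ctx["endtime"] - user_ctx["time"]   # condition 2: enters before the end
    cond1 = 0 <= early < a
    cond2 = 0 < before_end < b
    if rule == 1:
        return cond1
    if rule == 2:
        return cond2
    return cond1 or cond2  # rule 3: union of rule 1 and rule 2

# e.g. a meeting from 540 to 600 (09:00-10:00) and a user entering the room at 535 (08:55)
meeting = {"place": "meeting_room20", "strtime": 540, "endtime": 600}
user = {"location": "meeting_room20", "time": 535}
print(meeting_state(user, meeting, rule=1))  # True: the user arrives 5 minutes early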
As an alternative embodiment, the set of inferred contexts can include a user context and a communication context. As shown in fig. 5, the apparatus may further include:
a third obtaining unit 46, configured to obtain a second inference model for performing importance inference on the inferred context set;
the inference unit 44 includes:
the third reasoning sub-unit 443 is configured to perform distributed reasoning on the communication importance of the communication context by using the second reasoning model, so as to obtain communication importance information between two users in the communication context;
a fourth reasoning subunit 444, configured to perform role importance distributed reasoning on the user context using the second reasoning model, so as to obtain role importance information between users in the user context;
and a fifth inference subunit 445, configured to perform distributed inference on the obtained communication importance information and job importance information according to the inference plan and a pre-configured context inference algorithm for the inference rule, and generate the important relationship information of a user pair whose communication importance and job importance satisfy the inference rule, where the important relationship information of the user pair is used to indicate that two users in the user pair have an important relationship.
Optionally, the second inference model may be described as follows. The model consists of two parts: the user communication importance connImportance(phonei, phonej) and the position importance positionImportance(phonei, phonej), wherein phonei represents user useri and phonej represents user userj; the importance between communicating users and the importance of user positions can be obtained through the second inference model.
When the user communication importance connImportance(phonei, phonej) is inferred in a distributed manner, a method is provided that assigns different weight parameters to different contexts and maps the context values using marginal utility theory; this method is included in connImportance(phonei, phonej) of the second inference model.
The condition of the position importance positionImportance(phonei, phonej) is as follows:
if(phonei.position==leader)
positionImportance(phonei,phonej)=yes
That is, if the position of the calling user phonei is leader, phonei is a telephone-important contact of phonej. It should be noted that the position importance is relative rather than mutual, for example: positionImportance(phonei, phonej) = yes indicates that the user phonei is a telephone-important contact of the user phonej, but does not indicate that the user phonej is a telephone-important contact of the user phonei.
In this embodiment, the inference rule may include any one of the following items:
1. conn, a, wherein the value range of a is determined according to the value interval of the communication importance and is tentatively set to (0, 1).
2. position, b, wherein b is a string constant whose value range is yes/no.
3. conn, a, position, b, wherein the value ranges of a and b are the same as above.
Here, conn represents communication, and position represents position.
In this embodiment, different inference rules have different inference plans. For example: if the inference rule is inference rule 1, the inference plan is to infer the communication-important users.
If the inference rule is inference rule 2, the inference plan is to infer users according to job importance. When the input rule is position, yes, the output is the important users judged according to job importance; when the input rule is position, no, the output is the non-important users judged according to job importance. In general, inferring the job-important users is more meaningful.
If the inference rule is inference rule 3, the inference plan is to infer both the communication-important users and the job-important (or non-important) users; with the same input parameters, the result is theoretically the union of the results of the above two rules.
In this implementation, the user context, the communication context and the inference rule are taken as input, distributed inference is performed according to the second inference model and the inference rule, and the telephone-important contacts are output. There are three kinds of inference rules; different inference rules produce different outputs, and the same inference rule with different parameters also produces different inference results.
As an alternative embodiment, as shown in fig. 6, the apparatus may further include:
a fourth obtaining unit 47, configured to obtain conference state information of a user and important relationship information of a user pair, where the important relationship information of the user pair is used to indicate that two users in the user pair have an important relationship;
the reasoning unit 44 is configured to perform distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pair and according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and output the user who has an important relationship and is in a meeting in each inferred context; and/or
The inference unit 44 is configured to perform distributed inference on the inferred context set according to the user conference state information and the important relationship information of the user pair and according to the inference plan and a pre-configured context inference algorithm for the inference rule, and output meeting room information of the user participating in the conference in each inferred context; and/or
The inference unit 44 is configured to perform distributed inference on the inferred context set according to the user conference state information and the important relationship information of the user pair and according to the inference plan and a pre-configured context inference algorithm for the inference rule, to obtain whether an incoming call user of the user in a meeting is an important user of the user in the meeting, and if so, output prompt information indicating that the incoming call user is an important user of the user in the meeting.
Optionally, the inference rules used for different inference results are also different, where the first inference rule may be used when the inference outputs the user who has an important relationship and is in a meeting in each inferred context; the second inference rule is used when the inference outputs the meeting room information of the user participating in the meeting in each inferred context; the third inference rule may be used when inferring whether the incoming call user of the user in the meeting is an important user of the user in the meeting. The inference rule may include any one or more of:
a first inference rule, a second inference rule, and a third inference rule.
In this embodiment, optionally, the apparatus may be further configured to determine, when a user in a meeting receives a call from an incoming call user, whether the incoming call user is an important user of the user in the meeting, and if so, send a message prompt to the user in the meeting so as to prompt that the incoming call user is calling.
If the incoming call user is judged not to be an important user of the user in the meeting, a message indicating that the called user is currently in a meeting may be sent to the incoming call user; or, if the incoming call user is judged not to be an important user of the user in the meeting, the incoming call user is prohibited from calling the user in the meeting.
Therefore, when a certain incoming call user calls a user in a meeting, only if the incoming call user is an important user of the user in the meeting, the incoming call information is output to the user in the meeting.
In this embodiment, the first inference rule may include:
calling,useri/phonei
wherein useri/phonei are respectively the user identifier (userid) and the telephone identifier (phone) of the user useri; only one of them needs to be input; their value range is the userids in the user data table and the phones corresponding to them; calling indicates an incoming call.
The second inference rule may include:
meeting,userj/phonej
wherein userj/phonej are respectively the userid and phone of the user userj; only one of them needs to be input; their value range is the userids in the user data set and the phones corresponding to them; meeting indicates the above-mentioned conference.
The third inference rule may include:
calling,useri/phonei
meeting,userj/phonej
wherein the value ranges of useri/phonei and userj/phonej are the same as above.
If the inference rule is the first inference rule, the inference plan is to infer the users who are in a meeting and for whom useri is an important telephone contact; if there are none, a no-information prompt (e.g. no information) is returned.
If the inference rule is the second inference rule, the inference plan is to infer whether the user userj is in a meeting; if so, the meeting id of the conference is returned, and if not, a no-information prompt (e.g. no information) is returned.
If the inference rule is the third inference rule, the inference plan is to infer whether the incoming call user useri is an important telephone contact of the user userj who is in a meeting; if so, it is returned that useri is an important contact of userj (e.g. useri userj important contact), and if not, a no-information prompt (e.g. no information) is returned.
In this embodiment, the user context, the communication context, the conference context, and the inference rule are used as inputs, and the corresponding user alert is output.
In this embodiment, various optional implementations are added to the embodiment shown in fig. 3, and all of them can achieve the purpose of improving the efficiency of context inference.
Referring to fig. 7, fig. 7 is a schematic structural diagram of another context distributed inference apparatus according to an embodiment of the present invention, as shown in fig. 7, including: a memory 71 and a processor 72, wherein the memory 71 is used for storing a set of program codes, and the processor 72 is used for calling the codes stored in the memory 71 to execute the following operations:
acquiring a large amount of and various types of context data from a plurality of computing nodes, and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
acquiring an inference rule for performing distributed inference on the inferred context set;
analyzing the inference rule to generate an inference plan of the inference rule;
and carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan.
Optionally, the processor 72 performs an operation of modeling the acquired context data to obtain an inferred context set, which may include:
and modeling the acquired context data to obtain an inferred context set comprising the mark, the attribute content and the occurrence time.
Optionally, the set of inferred contexts includes a user context and a meeting context: processor 72 may perform operations that may further include:
acquiring a first inference model for performing conditional inference on the inferred context set, wherein the first inference model comprises two inference conditions;
optionally, the operation performed by the processor 72 for performing distributed inference on the inferred context set according to the inference plan and a pre-configured context inference algorithm for the inference rule to generate the inference result of the inference plan may include:
performing conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included by the first reasoning model, wherein the context information is used for representing users participating in a conference;
and reasoning the context information corresponding to the users participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and generating the conference state information of the users according with the reasoning rule.
Optionally, the set of inferred contexts includes a user context and a communication context; the operations performed by the processor 72 may further include:
acquiring a second inference model for performing importance inference on the inferred context set;
the operation performed by the processor 72 to perform distributed inference on said set of inferred contexts according to said inference plan and a pre-configured context inference algorithm for said inference rule, and generate inference results of said inference plan may include:
performing communication importance distributed reasoning on the communication context by using the second reasoning model to obtain communication importance information between two users in the communication context;
performing job importance distributed reasoning on the user context by using the second reasoning model to obtain job importance information among users;
and carrying out distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
Optionally, the operations performed by the processor 72 may further include:
acquiring user conference state information and important relationship information of a user pair, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have important relationship;
the operation performed by processor 72 to perform distributed inference on the inferred context set according to the inference plan and a pre-configured context inference algorithm for the inference rule to generate an inference result of the inference plan may include:
according to the user conference state information and the important relationship information of the user pairs, carrying out distributed reasoning on a reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting users who have important relationships and are in a meeting in each reasoned context; and/or
According to the user conference state information and the important relationship information of the user pairs, carrying out distributed reasoning on a reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting conference room information of the user participating in the conference in each reasoned context; and/or
And performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs and according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to obtain whether the incoming call user of the user in the conference is the important user of the user in the conference, and if so, outputting prompt information for indicating that the incoming call user is the important user of the user in the conference.
Optionally, the apparatus may be implemented based on a distributed File System, that is, the apparatus may be implemented in any device including the distributed File System, where the distributed File System may be a Hadoop Distributed File System (HDFS).
In this embodiment, a large amount of and various types of context data are acquired from a plurality of computing nodes, and modeling processing is performed on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of context data reaching a specific data amount; acquiring an inference rule for performing distributed inference on the inferred context set; analyzing the inference rule to generate an inference plan of the inference rule; and carrying out distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan. In this way, distributed reasoning can be performed on a large amount of context data of a plurality of computing nodes, compared with the prior art in which context reasoning is performed in a single machine environment. The embodiment can improve the context reasoning efficiency.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure is only a preferred embodiment of the present invention and certainly cannot be taken to limit the scope of the claims of the present invention; equivalent changes made according to the claims of the present invention still fall within the scope of the invention.

Claims (14)

1. A method of context-distributed reasoning, comprising:
acquiring a large amount of and various types of context data from a plurality of computing nodes, and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
acquiring an inference rule for performing distributed inference on the inferred context set;
analyzing the inference rule to generate an inference plan of the inference rule;
performing distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan;
wherein the set of inferred contexts includes a user context and a meeting context:
wherein the inferred context is represented by the following triplets:
context={cid;body;time};
wherein context represents the inferred context, cid is the identification of the inferred context, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context;
the method further comprises the following steps:
acquiring a first inference model for performing conditional inference on the inferred context set, wherein the first inference model comprises two inference conditions;
the distributed reasoning is carried out on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and a reasoning result of the reasoning plan is generated, and the method comprises the following steps:
performing conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included by the first reasoning model, wherein the context information is used for representing users participating in a conference;
and carrying out distributed reasoning on the inferred context set corresponding to the users participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and generating the conference state information of the users according with the reasoning rule.
2. The method of claim 1, wherein said modeling the obtained context data to obtain a set of inferred contexts comprises:
and modeling the acquired context data to obtain an inferred context set comprising the mark, the attribute content and the occurrence time.
3. A method of context-distributed reasoning, comprising:
acquiring a large amount of and various types of context data from a plurality of computing nodes, and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
acquiring an inference rule for performing distributed inference on the inferred context set;
analyzing the inference rule to generate an inference plan of the inference rule;
performing distributed reasoning on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan;
wherein the inferred context is represented by the following triplet: context = {cid; body; time};
wherein context represents the inferred context, cid is the identification of the inferred context, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context;
wherein the set of inferred contexts includes a user context and a communication context;
the method further comprises the following steps:
acquiring a second inference model for performing importance inference on the inferred context set;
the distributed reasoning is carried out on the reasoned context set according to the reasoning plan and a pre-configured context reasoning algorithm used for the reasoning rule, and a reasoning result of the reasoning plan is generated, and the method comprises the following steps:
performing communication importance distributed reasoning on the communication context by using the second reasoning model to obtain communication importance information between two users in the communication context;
performing job importance distributed inference on the user context by using the second inference model to obtain job importance information among users in the user context;
and carrying out distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
4. The method of claim 3, wherein said modeling the obtained context data to obtain a set of inferred contexts comprises: and modeling the acquired context data to obtain an inferred context set comprising the mark, the attribute content and the occurrence time.
5. A method of context-distributed reasoning, comprising:
acquiring a large amount of context data of various types from a plurality of computing nodes, and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
acquiring an inference rule for performing distributed inference on the inferred context set;
analyzing the inference rule to generate an inference plan of the inference rule;
performing distributed reasoning on the inferred context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan;
wherein the inferred context is represented by the following triplet: context = {cid; body; time};
wherein context represents the inferred context, cid is the identification of the inferred context, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context;
wherein the method further comprises:
acquiring user conference state information and important relationship information of a user pair, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship;
wherein the performing distributed reasoning on the inferred context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan comprises:
performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs, according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting the users who have an important relationship and are in a meeting in each inferred context; and/or
performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs, according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and outputting conference room information of the users participating in the conference in the inferred context set; and/or
performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs, according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, to determine whether an incoming caller of a user in the conference is an important user of the user in the conference, and if so, outputting prompt information indicating that the incoming caller is an important user of the user in the conference.
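The third branch of claim 5 (prompting when an important contact calls a user who is in a meeting) can be pictured with the sketch below; the meeting_state and important_pairs structures are assumed to come from the earlier reasoning steps, and their formats are hypothetical.

```python
def incoming_call_hint(caller, callee, meeting_state, important_pairs):
    """Return a prompt if a user who is in a meeting is called by an important contact.

    meeting_state   : dict mapping user -> 'in_meeting' / 'free'
    important_pairs : set of frozensets of user pairs with an important relationship
    """
    if meeting_state.get(callee) != "in_meeting":
        return None
    if frozenset((caller, callee)) in important_pairs:
        return f"{caller} is an important contact of {callee}, who is currently in a meeting."
    return None

# Hypothetical usage
state = {"alice": "in_meeting"}
pairs = {frozenset(("alice", "bob"))}
print(incoming_call_hint("bob", "alice", state, pairs))
```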
6. The method of claim 5, wherein said modeling the obtained context data to obtain a set of inferred contexts comprises:
modeling the acquired context data to obtain an inferred context set comprising the identification, the attribute content and the occurrence time.
7. A context distributed reasoning apparatus, comprising: a modeling unit, a first acquisition unit, an analysis unit and an inference unit, wherein:
the modeling unit is used for acquiring a large amount of context data of various types from a plurality of computing nodes and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
the first acquisition unit is used for acquiring an inference rule for performing distributed inference on the inferred context set;
the analysis unit is used for analyzing the inference rule to generate an inference plan of the inference rule;
the reasoning unit is used for performing distributed reasoning on the inferred context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan;
wherein the set of inferred contexts includes a user context and a meeting context;
wherein the inferred context is represented by the following triplet: context = {cid; body; time};
wherein context represents the inferred context, cid is the identification of the inferred context, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context;
the device further comprises:
a second acquisition unit, used for acquiring a first inference model for performing conditional distributed inference on the inferred context set, wherein the first inference model comprises two inference conditions;
the inference unit includes:
the first reasoning subunit is configured to perform conditional distributed reasoning on the inferred context set by using the first reasoning model to obtain context information meeting reasoning conditions included in the first reasoning model, where the context information is used to represent users participating in a conference;
and the second reasoning subunit is used for performing distributed reasoning on the inferred context set corresponding to the users participating in the conference according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, and generating conference state information of the users that conforms to the reasoning rule.
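The claims do not specify how the analysis unit encodes an inference plan or how the inference unit spreads work across computing nodes; as a purely assumed arrangement, the sketch below turns a rule into a map/reduce-style plan and applies it to partitions of the inferred context set in parallel.

```python
from concurrent.futures import ThreadPoolExecutor

def make_plan(rule):
    """A toy inference plan: a per-partition map step plus a merge step.
    The plan structure is an assumption, not taken from the patent."""
    return {
        "map": lambda partition: [ctx for ctx in partition if rule(ctx)],
        "reduce": lambda partial: [ctx for part in partial for ctx in part],
    }

def run_plan(plan, partitions, workers=4):
    """Apply the map step to each partition (one partition per computing node
    in the real system) and merge the partial results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partial = list(pool.map(plan["map"], partitions))
    return plan["reduce"](partial)

# Hypothetical usage: select meeting contexts from two partitions.
partitions = [
    [{"cid": "1", "body": {"type": "meeting"}, "time": "09:00"}],
    [{"cid": "2", "body": {"type": "call"}, "time": "09:10"}],
]
result = run_plan(make_plan(lambda c: c["body"]["type"] == "meeting"), partitions)
```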
8. The apparatus of claim 7, wherein the modeling unit is configured to model the obtained context data to obtain a set of inferred contexts including the identification, the attribute content, and the occurrence time.
9. A context distributed reasoning apparatus, comprising: a modeling unit, a first acquisition unit, an analysis unit and an inference unit, wherein:
the modeling unit is used for acquiring a large amount of context data of various types from a plurality of computing nodes and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
the first acquisition unit is used for acquiring an inference rule for performing distributed inference on the inferred context set;
the analysis unit is used for analyzing the inference rule to generate an inference plan of the inference rule;
the reasoning unit is used for performing distributed reasoning on the inferred context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan;
wherein the inferred context is represented by the following triplet: context = {cid; body; time};
wherein context represents the inferred context, cid is the identification of the inferred context, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context;
wherein the set of inferred contexts includes a user context and a communication context;
the device further comprises:
a third acquisition unit, used for acquiring a second inference model for performing importance inference on the inferred context set;
the inference unit includes:
the third reasoning subunit is used for carrying out communication importance distributed reasoning on the communication context by using the second reasoning model to obtain communication importance information between two users in the communication context;
the fourth reasoning subunit is used for performing job importance distributed reasoning on the user context by using the second reasoning model to obtain job importance information among the users in the user context;
and the fifth reasoning subunit is used for performing distributed reasoning on the obtained communication importance information and job importance information according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate the important relationship information of a user pair with the communication importance and job importance meeting the reasoning rule, wherein the important relationship information of the user pair is used for indicating that two users in the user pair have an important relationship.
10. The apparatus of claim 9, wherein the modeling unit is configured to model the obtained context data to obtain a set of inferred contexts including the identification, the attribute content, and the occurrence time.
11. A context distributed reasoning apparatus, comprising: a modeling unit, a first acquisition unit, an analysis unit and an inference unit, wherein:
the modeling unit is used for acquiring a large amount of context data of various types from a plurality of computing nodes and carrying out modeling processing on the acquired context data to obtain an inferred context set, wherein the representation forms of various contexts in the inferred context set are uniform, and the large amount of context data refers to the scale of the context data reaching a specific data amount;
the first acquisition unit is used for acquiring an inference rule for performing distributed inference on the inferred context set;
the analysis unit is used for analyzing the inference rule to generate an inference plan of the inference rule;
the reasoning unit is used for performing distributed reasoning on the inferred context set according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule to generate a reasoning result of the reasoning plan;
wherein the inferred context is represented by the following triplet: context = {cid; body; time};
wherein context represents the inferred context, cid is the identification of the inferred context, body is the attribute content contained in the inferred context, and time is the occurrence time of the inferred context;
wherein the apparatus further comprises:
a fourth acquisition unit, used for acquiring conference state information of users and important relationship information of user pairs, wherein the important relationship information of a user pair is used for indicating that two users in the user pair have an important relationship;
the reasoning unit is used for performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs and a pre-configured context reasoning algorithm for the reasoning rule, and outputting the users who have an important relationship and are in a meeting in each inferred context; and/or
the reasoning unit is used for performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs and a pre-configured context reasoning algorithm for the reasoning rule, and outputting conference room information of the users participating in the conference in the inferred context set; and/or
the reasoning unit is used for performing distributed reasoning on the inferred context set according to the user conference state information and the important relationship information of the user pairs, according to the reasoning plan and a pre-configured context reasoning algorithm for the reasoning rule, to determine whether an incoming caller of a user in the conference is an important user of the user in the conference, and if so, outputting prompt information indicating that the incoming caller is an important user of the user in the conference.
12. The apparatus of claim 11, wherein the modeling unit is configured to model the obtained context data to obtain a set of inferred contexts including the identification, the attribute content, and the occurrence time.
13. A computer-readable storage medium storing a computer program for instructing associated hardware to perform the method of any one of claims 1 to 6.
14. A context distributed reasoning apparatus, comprising:
a memory for storing a set of program code, and a processor for calling the program code stored in the memory to perform the method of any one of claims 1 to 6.
CN201410844207.7A 2014-12-30 2014-12-30 Context distributed reasoning method and device Active CN105808568B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410844207.7A CN105808568B (en) 2014-12-30 2014-12-30 Context distributed reasoning method and device

Publications (2)

Publication Number Publication Date
CN105808568A CN105808568A (en) 2016-07-27
CN105808568B true CN105808568B (en) 2020-02-14

Family

ID=56419984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410844207.7A Active CN105808568B (en) 2014-12-30 2014-12-30 Context distributed reasoning method and device

Country Status (1)

Country Link
CN (1) CN105808568B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109447269B (en) * 2018-10-10 2022-02-18 广州极天信息技术股份有限公司 Inference rule configuration method and device
CN109684485B (en) * 2018-12-25 2021-09-28 福州大学 Method for designing arbitrary rule inference engine based on SWRL
CN112784989B (en) * 2019-11-08 2024-05-03 阿里巴巴集团控股有限公司 Inference system, inference method, electronic device, and computer storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1742300A (en) * 2003-01-30 2006-03-01 电影教学***股份有限公司 System for learning language through embedded content on a single medium
US7613670B2 (en) * 2005-06-29 2009-11-03 Microsoft Corporation Precomputation of context-sensitive policies for automated inquiry and action under uncertainty

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101645789B (en) * 2009-08-26 2011-05-11 北京邮电大学 Meeting notice system and method based on context service
CN101694629B (en) * 2009-10-23 2014-03-26 北京邮电大学 Context sensing application platform based on main body and work method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant