CN107634901B - Session expression pushing method and device and terminal equipment - Google Patents

Session expression pushing method and device and terminal equipment

Info

Publication number
CN107634901B
Authority
CN
China
Prior art keywords
user
current
emotion type
conversation
heart rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710846245.XA
Other languages
Chinese (zh)
Other versions
CN107634901A (en)
Inventor
Tang Tao (唐涛)
Tang Lihua (唐丽华)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd
Priority to CN201710846245.XA
Publication of CN107634901A
Application granted
Publication of CN107634901B
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the technical field of instant messaging and provides a session expression pushing method and device, a terminal device, and a computer-readable storage medium. The pushing method comprises the following steps: acquiring, under a session interface of an intelligent terminal, the current electrodermal response data of the user of the intelligent terminal, and determining the current emotion type of the user based on the acquired electrodermal response data; and, based on the determined emotion type, pushing a session expression corresponding to the determined emotion type on the session interface so that the user can select the pushed session expression as session content. The invention spares the user the operations of typing characters or picking a session expression out of a large number of session expressions, and improves the convenience of using session expressions in instant messaging.

Description

Session expression pushing method and device and terminal equipment
Technical Field
The invention belongs to the technical field of instant messaging, and particularly relates to a session expression pushing method and device, a terminal device, and a computer-readable storage medium.
Background
With the development of society, the functions of intelligent terminals have become richer and richer, and conversation and chat between people through intelligent terminals have become an indispensable mode of communication in daily life. In a conversation, besides expressing one's mood in text, a user can also select a corresponding session expression (chat emoticon); session expressions reflect the user's mood more intuitively and make conversation and chat more interesting. In the prior art, session expressions are pushed passively according to the characters entered by the user that are associated with a session expression; for example, when the user types "happy", a "smiling face" expression pops up. Alternatively, the user can enter a session expression library and select from it an expression that suits his or her mood.
However, neither way in the prior art is convenient enough: the first requires the user to manually type the characters associated with the session expression, and the second requires the user to manually select the corresponding expression; since an expression library usually contains a great many session expressions, selecting the desired one takes a certain amount of time and reduces the user's input efficiency.
Disclosure of Invention
In view of this, embodiments of the present invention provide a session expression pushing method and device, a terminal device, and a computer-readable storage medium, which can push to the user, while the user is in a session, a session expression corresponding to the user's current emotion, so that the user can select the pushed expression as session content; this improves the convenience of instant messaging with session expressions.
A first aspect of the embodiments of the present invention provides a session expression pushing method, the method comprising:
acquiring, under a session interface of an intelligent terminal, current electrodermal response data of a user of the intelligent terminal, wherein the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started;
determining the current emotion type of the user based on the acquired electrodermal response data; and
based on the determined emotion type, pushing a session expression corresponding to the determined emotion type on the session interface so that the user selects the pushed session expression as session content.
A second aspect of the embodiments of the present invention provides a session expression pushing device, the device comprising:
an electrodermal response data acquisition unit, configured to acquire, under a session interface of an intelligent terminal, current electrodermal response data of a user of the intelligent terminal, wherein the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started;
an emotion type determining unit, configured to determine the current emotion type of the user based on the electrodermal response data acquired by the electrodermal response data acquisition unit; and
a session expression pushing unit, configured to push, on the session interface and based on the emotion type determined by the emotion type determining unit, a session expression corresponding to the determined emotion type so that the user can select the pushed session expression as session content.
A third aspect of the embodiments of the present invention provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the session expression pushing method according to any one of the above.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the session expression pushing method according to any one of the above.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects:
The current electrodermal response data of the user of the intelligent terminal are obtained under the session interface of the intelligent terminal. Because a user's electrodermal response data differ under different emotions, different electrodermal response data can reflect different emotions of the user, so the user's current emotion type can be determined from the acquired data. A session expression corresponding to the determined emotion type is then pushed on the session interface, so that the user can select the pushed expression as session content. The user is thus spared the operations of typing characters or picking a session expression out of a large expression library, which improves the convenience of instant messaging with session expressions.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flowchart of a session expression pushing method according to an embodiment of the present invention;
fig. 2 is a flowchart of a session expression pushing method according to another embodiment of the present invention;
fig. 3 is a schematic structural diagram of a session expression pushing device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical solutions of the present invention, specific embodiments are described below.
Fig. 1 shows a flowchart of a session expression pushing method provided by an embodiment of the present invention, detailed as follows:
Step 11: acquire, under a session interface of the intelligent terminal, the current electrodermal response data of the user of the intelligent terminal, wherein the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started.
In this embodiment, an instant messaging application is installed on the intelligent terminal, and the user converses with other users through it. When it is detected that the application has been started and the display of the intelligent terminal shows the session interface, it can be determined that the user is in a session state; in this state, the current electrodermal response data of the user are obtained.
Optionally, the electrodermal response data can be acquired through a galvanic skin response sensor arranged on the intelligent terminal. For example, when the sensor on the intelligent terminal is in contact with the user's skin, the intelligent terminal can acquire the user's electrodermal response data through it.
As an alternative embodiment of the present invention, acquiring the current electrodermal response data of the user of the intelligent terminal in step 11 includes:
acquiring the current electrodermal response data of the user of the intelligent terminal through a wearable device connected to the intelligent terminal.
In this embodiment, the user's electrodermal response data can also be acquired through a wearable device equipped with a galvanic skin response sensor. For example, the user wears an earphone on which such a sensor is arranged; under the session interface of the intelligent terminal, when the earphone is connected to the user's intelligent terminal, the intelligent terminal acquires the user's electrodermal response data through the earphone.
Step 12: determine the current emotion type of the user based on the acquired electrodermal response data.
The electrodermal response data differ when the human body is in different emotion types (emotional states); therefore, in step 12, the current emotion type of the user can be determined from the acquired data.
For example, when a person is excited, sweat-gland activity strengthens and more sweat is secreted; because sweat contains salt, skin conductivity rises and the electrodermal readings are higher. When a person is emotionally calm, sweat-gland activity is at a normal level and the readings are normal. When a person is in a low mood, sweat-gland activity is weak and the readings are low. The user's emotion type can therefore be roughly determined from the value of the acquired electrodermal response data, as in the sketch below.
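To make this coarse mapping concrete, a minimal Python sketch follows. It is not part of the patent; the threshold values and the microsiemens unit are illustrative assumptions.
```python
# Minimal sketch of the coarse mapping described above: classify an emotion
# type from a single electrodermal reading. Thresholds and the microsiemens
# unit are illustrative assumptions, not values given in the patent.

def classify_emotion_from_gsr(conductance_us: float) -> str:
    """Roughly map one skin-conductance reading to an emotion type."""
    if conductance_us > 8.0:   # strengthened sweat-gland activity -> excited
        return "excited"
    if conductance_us >= 2.0:  # normal sweat-gland activity -> calm
        return "calm"
    return "low"               # weak sweat-gland activity -> low mood
```
As the description notes next, absolute readings vary between individuals, which is why the following embodiment prefers the trend over a preset duration to a single value.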
As an optional embodiment of the present invention, the acquired electrodermal response data comprise at least the electrodermal response data from a time point before the current time up to the current time, the duration from that time point to the current time equalling a preset duration. Step 12 can then be realized by the following steps:
generating the user's current electrodermal response curve based on the acquired electrodermal response data; and
determining the current emotion type of the user based on the generated electrodermal response curve and a preset correspondence between electrodermal response curves and emotion types.
It should be noted that the baseline level of the electrodermal response differs between individuals; determining the user's emotion type from the variation trend of the electrodermal response within the preset duration therefore makes the determination more accurate.
In this embodiment, the acquired electrodermal response data comprise at least the data of a time period ending at the current time whose duration is the preset duration; the preset duration can be set flexibly and is generally on the order of seconds. For example, the acquired data may cover the 10 seconds preceding the current time, as in the buffering sketch below.
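A minimal sketch of such a rolling window in Python; the class name and the 10-second default are assumptions, since the patent only requires a preset duration on the order of seconds.
```python
import time
from collections import deque
from typing import Deque, List, Optional, Tuple

class GsrWindow:
    """Keep only the electrodermal samples from the preset duration ending now."""

    def __init__(self, preset_duration_s: float = 10.0):
        self.preset_duration_s = preset_duration_s
        self._samples: Deque[Tuple[float, float]] = deque()  # (timestamp, reading)

    def add(self, reading: float, now: Optional[float] = None) -> None:
        now = time.time() if now is None else now
        self._samples.append((now, reading))
        # Drop samples older than the preset duration.
        while self._samples and now - self._samples[0][0] > self.preset_duration_s:
            self._samples.popleft()

    def curve(self) -> List[float]:
        """The ordered readings forming the current electrodermal response curve."""
        return [reading for _, reading in self._samples]
```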
Specifically, the user's current electrodermal response curve is generated from the acquired data; the curve reflects the variation trend of the user's electrodermal response over the corresponding time period and can therefore reflect changes in the user's emotion type. For example, when the user's current emotion type is excited, the corresponding curve rises steeply.
In this embodiment, a one-to-one correspondence between electrodermal response curves and emotion types is preset. For example, a number of electrodermal response curves of users whose emotion type is happy are obtained by pre-sampling, and a reference electrodermal response curve for the happy emotion type is derived by comprehensively analysing the trend characteristics of those curves; reference curves for the other emotion types can be obtained in the same way, for instance one reference curve each for the emotion types calm, happy, angry, and low. The one-to-one correspondence between the different emotion types and their reference curves serves as the preset correspondence. Similarity matching is then performed between the generated electrodermal response curve and each reference curve, the reference curve with the highest similarity is selected, and the emotion type corresponding to that reference curve is determined as the user's current emotion type, as sketched below.
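A minimal sketch of this matching step in Python; the reference curves, the resampling, and Pearson correlation as the similarity measure are assumptions, since the patent does not fix a particular similarity metric.
```python
import numpy as np

# Hypothetical reference curves, one per emotion type, each with 50 points;
# in practice these would come from the pre-sampling and analysis described.
REFERENCE_CURVES = {
    "calm":  np.linspace(3.0, 3.2, 50),
    "happy": np.concatenate([np.linspace(3.0, 6.0, 25), np.linspace(6.0, 5.5, 25)]),
    "angry": np.linspace(3.0, 9.0, 50),
    "low":   np.linspace(3.0, 1.5, 50),
}

def resample(curve, n=50):
    """Resample to n points so curves of different lengths can be compared."""
    curve = np.asarray(curve, dtype=float)
    old_x = np.linspace(0.0, 1.0, len(curve))
    return np.interp(np.linspace(0.0, 1.0, n), old_x, curve)

def match_emotion(gsr_curve):
    """Return the emotion type whose reference curve is most similar."""
    sample = resample(gsr_curve)
    def similarity(ref):
        return float(np.corrcoef(sample, ref)[0, 1])  # Pearson correlation
    return max(REFERENCE_CURVES, key=lambda e: similarity(REFERENCE_CURVES[e]))
```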
Step 13: based on the determined emotion type, push a session expression corresponding to the determined emotion type on the session interface so that the user can select the pushed session expression as session content.
In this embodiment, after the user's current emotion type is determined, a session expression corresponding to it is pushed on the session interface of the intelligent terminal; the user can directly select the pushed expression and send it as session content, without searching for a suitable expression in a library containing a large number of expressions.
Specifically, the pushed session expression is a preset expression matching the user's current emotion type; it may come from the user's local expression library or be a popular session expression from the network.
It should be noted that the number of pushed session expressions may be one, two, or more, as in the sketch below.
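A minimal sketch of the push step; the emotion-to-expression mapping contents and the push count are illustrative assumptions.
```python
# Sketch of the push step: look up the preset session expressions for the
# determined emotion type and surface a few of them on the session interface.
# The mapping contents and the default count of 3 are assumptions.

PRESET_EXPRESSIONS = {
    "happy": ["smiling_face", "party_popper", "thumbs_up"],
    "angry": ["pouting_face", "face_with_symbols"],
    "low":   ["crying_face", "pensive_face"],
    "calm":  ["slightly_smiling_face"],
}

def expressions_to_push(emotion: str, max_count: int = 3) -> list:
    """Return up to max_count candidate expressions for the emotion type."""
    return PRESET_EXPRESSIONS.get(emotion, [])[:max_count]
```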
As an alternative embodiment of the present invention, the following steps are further included after step 13:
detecting whether the session expression pushed this time is selected by the user; and
if the pushed session expression is not selected by the user, acquiring the session expression actually selected by the user and adding it to the session expressions corresponding to the user's current emotion type.
Specifically, different users have different individual requirements and preferences. To make the pushed session expressions more accurate, after the expression corresponding to the determined emotion type has been pushed, it can be detected whether it is selected by the user; if not, the expression actually selected by the user is added to the session expressions corresponding to the user's current emotion type. Thus, the next time the same emotion type is detected, the pushed expressions come closer to the user's selection habits, as in the sketch below.
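A minimal sketch of this feedback step; the data structures and names are assumptions.
```python
# Sketch of the feedback step: if the user ignores the pushed expressions and
# sends a different one, remember that choice under the current emotion type
# so later pushes match the user's habits. Data structures are assumptions.

def record_selection(mapping: dict, emotion: str, pushed: list, selected: str) -> None:
    """Add the actually selected expression to the emotion's candidates."""
    if selected in pushed:
        return  # the push was accurate; nothing to learn
    candidates = mapping.setdefault(emotion, [])
    if selected not in candidates:
        candidates.append(selected)

# Example: the user ignored the pushed "smiling_face" and sent "thumbs_up";
# "thumbs_up" joins the candidates pushed next time the user is happy.
mapping = {"happy": ["smiling_face"]}
record_selection(mapping, "happy", pushed=["smiling_face"], selected="thumbs_up")
```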
In this way, the current electrodermal response data of the user of the intelligent terminal are obtained under the session interface of the intelligent terminal; because the user's electrodermal response data differ under different emotions, different data can reflect different emotions, and the user's current emotion type can be determined from them. A session expression corresponding to the determined emotion type is then pushed on the session interface, so that the user can select the pushed expression as session content. The user is thus spared the operations of typing characters or picking an expression out of a large expression library, which improves the convenience of instant messaging with session expressions.
Fig. 2 shows a flowchart of a session expression pushing method provided by another embodiment of the present invention, detailed as follows:
Step 21: acquire, under a session interface of the intelligent terminal, the current electrodermal response data and heart rate data of the user of the intelligent terminal.
Here the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started. The acquired electrodermal response data comprise at least the electrodermal response data from a time point before the current time up to the current time, the duration from that time point to the current time equalling a preset duration; likewise, the acquired heart rate data comprise at least the heart rate data from a time point before the current time up to the current time, the duration from that time point to the current time equalling the preset duration.
In this embodiment, an instant messaging application is installed on the intelligent terminal and the user converses through it. When it is detected that the application has been started and the display of the intelligent terminal shows the session interface, it can be determined that the user is in a session state; in this state, the current electrodermal response data and heart rate data of the user are obtained.
In this embodiment, acquiring the electrodermal response data of the user under the session interface of the intelligent terminal may specifically refer to step 11 of the embodiment shown in fig. 1, which is not described again here.
The heart rate data of the user are also acquired under the session interface of the intelligent terminal. Optionally, they can be acquired through a heart rate sensor arranged on the intelligent terminal; for example, when the heart rate sensor on the intelligent terminal is in contact with the user's body, the intelligent terminal can acquire the user's heart rate data through it.
In addition, the heart rate data can also be acquired through a wearable device equipped with a heart rate sensor: under the session interface of the intelligent terminal, the heart rate data are acquired through a wearable device connected to the intelligent terminal. For example, the user wears an earphone on which a heart rate sensor is arranged; when the earphone is connected to the user's intelligent terminal, the intelligent terminal acquires the user's heart rate data through the earphone.
Step 22: generate the user's current electrodermal response curve based on the acquired electrodermal response data.
In this embodiment, the acquired electrodermal response data comprise at least the data of a time period ending at the current time whose duration is the preset duration; the preset duration can be set flexibly and is generally on the order of seconds. For example, the acquired data may cover the 10 seconds preceding the current time.
Specifically, the user's current electrodermal response curve is generated from the acquired data; the curve reflects the variation trend of the user's electrodermal response over the corresponding time period and can therefore reflect changes in the user's emotion type. For example, when the user's current emotion type is excited, the corresponding curve rises steeply.
Step 23: generate the user's current heart rate curve based on the acquired heart rate data.
The heart rate can reflect the user's emotion type to a certain extent. In this embodiment, the acquired heart rate data comprise at least the data of a time period ending at the current time whose duration is the preset duration; the preset duration can be set flexibly and is generally on the order of seconds. For example, the acquired heart rate data may cover the 10 seconds preceding the current time.
Specifically, the user's current heart rate curve is generated from the acquired heart rate data; the curve reflects the variation trend of the user's heart rate over the corresponding time period and can therefore reflect changes in the user's emotion type to a certain extent. For example, when the user's current emotion type is excited, the corresponding heart rate curve rises steeply.
Step 24: determine the user's current emotion type based on the generated electrodermal response curve, the generated heart rate curve, and a preset correspondence among electrodermal response curves, heart rate curves, and emotion types.
In this embodiment, a correspondence among electrodermal response curves, heart rate curves, and emotion types is preset. For example, the electrodermal response curves and heart rate curves of a large number of users whose emotion type is happy are obtained by pre-sampling; a reference electrodermal response curve for the happy emotion type is derived by comprehensively analysing the trend characteristics of the electrodermal response curves, and a reference heart rate curve for the happy emotion type is derived in the same way from the heart rate curves. Reference curves for the other emotion types can be obtained in the same manner, for instance one reference electrodermal response curve and one reference heart rate curve each for the emotion types calm, happy, angry, and low. The correspondence among the different emotion types, their reference electrodermal response curves, and their reference heart rate curves serves as the preset correspondence.
Similarity matching is performed between the generated electrodermal response curve and heart rate curve on the one hand and the reference electrodermal response curve and reference heart rate curve of each emotion type on the other, yielding two similarity scores per emotion type. An emotion type whose two similarity scores are both higher than a preset threshold is selected and determined as the user's current emotion type, as in the sketch below.
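A minimal sketch of this combined decision in Python; the similarity measure, the threshold value, and the tie-breaking rule among qualifying emotion types are assumptions (the patent only requires both scores to exceed a preset threshold).
```python
import numpy as np

def pearson(a, b):
    """Pearson correlation as an assumed similarity measure between curves."""
    return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

def match_emotion_dual(gsr_curve, hr_curve, refs, threshold=0.8):
    """refs maps emotion type -> (reference GSR curve, reference heart-rate
    curve), all resampled to the same length as the inputs. An emotion type
    qualifies only if BOTH similarity scores exceed the threshold; among the
    qualifiers, the one with the best worst-case score wins (assumed rule)."""
    best, best_score = None, -1.0
    for emotion, (ref_gsr, ref_hr) in refs.items():
        s_gsr, s_hr = pearson(gsr_curve, ref_gsr), pearson(hr_curve, ref_hr)
        if s_gsr > threshold and s_hr > threshold:
            score = min(s_gsr, s_hr)
            if score > best_score:
                best, best_score = emotion, score
    return best  # None if no emotion type passes both thresholds
```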
In this embodiment, the user's current emotion type is judged by combining the electrodermal response curve with the heart rate curve, which further improves the accuracy of the judgment.
Step 25: based on the determined emotion type, push a session expression corresponding to the determined emotion type on the session interface so that the user selects the pushed session expression as session content.
In this embodiment, step 25 may specifically refer to step 13 of the embodiment shown in fig. 1, which is not described again here.
In this way, the current electrodermal response data of the user are obtained under the session interface of the intelligent terminal, the user's current emotion type is determined from them, and a session expression corresponding to the determined emotion type is pushed on the session interface so that the user can select it as session content; the user is spared the operations of typing characters or picking an expression out of a large expression library, which improves the convenience of instant messaging with session expressions.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present invention.
Fig. 3 shows a schematic structural diagram of a session expression pushing device provided by an embodiment of the present invention; for convenience of description, only the parts related to this embodiment are shown, detailed as follows:
As shown in fig. 3, the session expression pushing device 3 includes: an electrodermal response data acquisition unit 31, an emotion type determining unit 32, and a session expression pushing unit 33.
The electrodermal response data acquisition unit 31 is configured to acquire, under a session interface of an intelligent terminal, the current electrodermal response data of the user of the intelligent terminal, where the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started.
The emotion type determining unit 32 is configured to determine the current emotion type of the user based on the electrodermal response data acquired by the electrodermal response data acquisition unit 31.
The session expression pushing unit 33 is configured to push, on the session interface and based on the emotion type determined by the emotion type determining unit 32, a session expression corresponding to the determined emotion type, so that the user can select the pushed session expression as session content.
Optionally, the electrodermal response data acquired by the electrodermal response data acquisition unit 31 comprise at least the electrodermal response data from a time point before the current time up to the current time, the duration from that time point to the current time equalling a preset duration;
the session expression pushing device 3 then further comprises:
an electrodermal response curve generating unit, configured to generate the user's current electrodermal response curve based on the electrodermal response data acquired by the electrodermal response data acquisition unit 31.
The emotion type determining unit 32 is further configured to determine the user's current emotion type based on the electrodermal response curve generated by the electrodermal response curve generating unit and a preset correspondence between electrodermal response curves and emotion types.
Optionally, the session expression pushing device 3 further includes:
a heart rate data acquisition unit, configured to acquire the current heart rate data of the user of the intelligent terminal, where the acquired heart rate data comprise at least the heart rate data from a time point before the current time up to the current time, the duration from that time point to the current time equalling the preset duration; and
a heart rate curve generating unit, configured to generate the user's current heart rate curve based on the heart rate data acquired by the heart rate data acquisition unit.
The emotion type determining unit 32 is further configured to determine the user's current emotion type based on the electrodermal response curve generated by the electrodermal response curve generating unit, the heart rate curve generated by the heart rate curve generating unit, and a preset correspondence among electrodermal response curves, heart rate curves, and emotion types.
In this way, the current electrodermal response data of the user are obtained under the session interface of the intelligent terminal, the user's current emotion type is determined from them, and a session expression corresponding to the determined emotion type is pushed on the session interface so that the user can select it as session content; the user is spared the operations of typing characters or picking an expression out of a large expression library, which improves the convenience of instant messaging with session expressions.
Fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 4, the terminal device 4 of this embodiment includes: a processor 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processor 40. The processor 40, when executing the computer program 42, implements the steps in the above embodiments of the session expression pushing method, for example steps 11 to 13 shown in fig. 1; alternatively, it implements the functions of the modules/units in the above device embodiments, for example the functions of the units 31 to 33 shown in fig. 3.
Illustratively, the computer program 42 may be partitioned into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the segments being used to describe the execution of the computer program 42 in the terminal device 4. For example, the computer program 42 may be divided into an electrodermal response data acquisition unit, an emotion type determining unit, and a session expression pushing unit, the specific functions of each unit being as follows:
the system comprises a picoelectricity response data acquisition unit, a control unit and a display unit, wherein the picoelectricity response data acquisition unit is used for acquiring the current picoelectricity response data of a user of an intelligent terminal under a session interface of the intelligent terminal, and the session interface is the session interface after an instant messaging application program of the intelligent terminal is started;
the emotion type determining unit is used for determining the current emotion type of the user based on the electrodermal response data acquired by the electrodermal response data acquiring unit;
and the conversation expression pushing unit is used for pushing the conversation expression corresponding to the determined emotion type on the conversation interface based on the emotion type determined by the emotion type determining unit so that the user can select the pushed conversation expression as conversation content.
The terminal device 4 may be a desktop computer, a notebook, a palmtop computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of the terminal device 4 and does not constitute a limitation on it; the terminal device may include more or fewer components than shown, combine some components, or use different components, and may for example also include input/output devices, network access devices, buses, and the like.
The processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the terminal device 4, such as a hard disk or memory of the terminal device 4. The memory 41 may also be an external storage device of the terminal device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 4. Further, the memory 41 may include both an internal storage unit and an external storage device of the terminal device 4. The memory 41 is used to store the computer program and the other programs and data required by the terminal device, and may also be used to temporarily store data that has been or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of the functional units and modules is illustrated as an example; in practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the device may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described again here.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the above embodiments may also be completed by a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them; although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not depart in substance from the spirit and scope of the embodiments of the present invention, and shall all be included within the protection scope of the present invention.

Claims (7)

1. A session expression pushing method, characterized in that the session expression pushing method comprises the following steps:
acquiring, under a session interface of an intelligent terminal, current electrodermal response data of a user of the intelligent terminal, wherein the acquired electrodermal response data comprise at least the electrodermal response data from a time point before the current time up to the current time, the duration from that time point to the current time equalling a preset duration, and the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started;
generating a current electrodermal response curve of the user based on the acquired electrodermal response data, the electrodermal response curve reflecting the variation trend of the user's electrodermal response within the preset duration;
determining the current emotion type of the user based on the generated electrodermal response curve and a preset correspondence between electrodermal response curves and emotion types, specifically: performing similarity matching between the generated electrodermal response curve and the reference electrodermal response curve corresponding to each emotion type, selecting the reference electrodermal response curve with the highest similarity to the generated curve, and determining the emotion type corresponding to that reference curve as the current emotion type of the user;
based on the determined emotion type, pushing a session expression corresponding to the determined emotion type on the session interface so that the user selects the pushed session expression as session content;
detecting whether the session expression pushed this time is selected by the user; and
if the pushed session expression is not selected by the user, acquiring the session expression actually selected by the user, and adding the session expression actually selected by the user to the session expressions corresponding to the current emotion type of the user.
2. The pushing method according to claim 1, characterized in that, under the session interface of the intelligent terminal, the pushing method further comprises:
acquiring current heart rate data of the user of the intelligent terminal, wherein the acquired heart rate data comprise at least the heart rate data from a time point before the current time up to the current time, the duration from that time point to the current time equalling the preset duration; and
generating a current heart rate curve of the user based on the acquired heart rate data;
wherein determining the current emotion type of the user based on the generated electrodermal response curve and the preset correspondence between electrodermal response curves and emotion types specifically comprises:
determining the current emotion type of the user based on the generated electrodermal response curve, the generated heart rate curve, and a preset correspondence among electrodermal response curves, heart rate curves, and emotion types.
3. The pushing method according to claim 1 or 2, characterized in that acquiring the current electrodermal response data of the user of the intelligent terminal comprises:
acquiring the current electrodermal response data of the user of the intelligent terminal through a wearable device connected to the intelligent terminal.
4. A session expression pushing device, characterized in that the pushing device comprises:
an electrodermal response data acquisition unit, configured to acquire, under a session interface of an intelligent terminal, current electrodermal response data of a user of the intelligent terminal, wherein the electrodermal response data acquired by the electrodermal response data acquisition unit comprise at least the electrodermal response data from a time point before the current time up to the current time, the duration from that time point to the current time equalling a preset duration, and the session interface is the session interface displayed after an instant messaging application of the intelligent terminal is started;
an electrodermal response curve generating unit, configured to generate a current electrodermal response curve of the user based on the electrodermal response data acquired by the electrodermal response data acquisition unit, the electrodermal response curve reflecting the variation trend of the user's electrodermal response within the preset duration;
an emotion type determining unit, configured to determine the current emotion type of the user based on the electrodermal response curve generated by the electrodermal response curve generating unit and a preset correspondence between electrodermal response curves and emotion types, specifically: performing similarity matching between the generated electrodermal response curve and the reference electrodermal response curve corresponding to each emotion type, selecting the reference electrodermal response curve with the highest similarity to the generated curve, and determining the emotion type corresponding to that reference curve as the current emotion type of the user; and
a session expression pushing unit, configured to push, on the session interface and based on the emotion type determined by the emotion type determining unit, a session expression corresponding to the determined emotion type so that the user selects the pushed session expression as session content; to detect whether the session expression pushed this time is selected by the user; and, if the pushed session expression is not selected by the user, to acquire the session expression actually selected by the user and add it to the session expressions corresponding to the current emotion type of the user.
5. The pushing device of claim 4, further comprising:
a heart rate data acquisition unit, configured to acquire current heart rate data of the user of the intelligent terminal, wherein the acquired heart rate data comprise at least the heart rate data from a time point before the current time up to the current time, the duration from that time point to the current time equalling the preset duration; and
a heart rate curve generating unit, configured to generate a current heart rate curve of the user based on the heart rate data acquired by the heart rate data acquisition unit;
wherein the emotion type determining unit is further configured to determine the current emotion type of the user based on the electrodermal response curve generated by the electrodermal response curve generating unit, the heart rate curve generated by the heart rate curve generating unit, and a preset correspondence among electrodermal response curves, heart rate curves, and emotion types.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 3 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 3.
CN201710846245.XA 2017-09-19 2017-09-19 Session expression pushing method and device and terminal equipment Active CN107634901B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710846245.XA CN107634901B (en) 2017-09-19 2017-09-19 Session expression pushing method and device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710846245.XA CN107634901B (en) 2017-09-19 2017-09-19 Session expression pushing method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN107634901A CN107634901A (en) 2018-01-26
CN107634901B true CN107634901B (en) 2020-07-07

Family

ID=61103360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710846245.XA Active CN107634901B (en) 2017-09-19 2017-09-19 Session expression pushing method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN107634901B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108512995B (en) * 2018-03-01 2020-08-18 Oppo广东移动通信有限公司 Electronic device, brain wave control method and related product
CN109447234B (en) * 2018-11-14 2022-10-21 腾讯科技(深圳)有限公司 Model training method, method for synthesizing speaking expression and related device
CN111190493A (en) * 2018-11-15 2020-05-22 中兴通讯股份有限公司 Expression input method, device, equipment and storage medium
CN109842546B (en) * 2018-12-25 2021-09-28 创新先进技术有限公司 Conversation expression processing method and device
CN110971424B (en) * 2019-11-29 2021-10-29 广州市百果园信息技术有限公司 Message processing method, device and system, computer equipment and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104063683B (en) * 2014-06-06 2017-05-17 北京搜狗科技发展有限公司 Expression input method and device based on face identification
US20160042648A1 (en) * 2014-08-07 2016-02-11 Ravikanth V. Kothuri Emotion feedback based training and personalization system for aiding user performance in interactive presentations
CN104598127B (en) * 2014-12-31 2018-01-26 广东欧珀移动通信有限公司 A kind of method and device in dialog interface insertion expression
CN104824931B (en) * 2015-06-03 2017-03-08 深圳市欧珀通信软件有限公司 Intelligent watch and its control method
CN105447164A (en) * 2015-12-02 2016-03-30 小天才科技有限公司 Method and apparatus for automatically pushing chat expressions
CN105528917A (en) * 2016-02-15 2016-04-27 小天才科技有限公司 Method, device and system for feeding back network teaching effect
CN106175727B (en) * 2016-07-25 2018-11-20 广东小天才科技有限公司 Expression pushing method applied to wearable device and wearable device
CN106725473A (en) * 2016-12-29 2017-05-31 杭州联络互动信息科技股份有限公司 A kind of method and device that emotional state is adjusted based on intelligent wearable device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cracking the Code of "Lie Detection": The Application of Psychophysiological Testing in Criminal Investigation; Fu Youzhi et al.; China People's Public Security University Press; 2006-03-31; pp. 134-144 *

Also Published As

Publication number Publication date
CN107634901A (en) 2018-01-26

Similar Documents

Publication Publication Date Title
CN107634901B (en) Session expression pushing method and device and terminal equipment
JP6594534B2 (en) Text information processing method and device
CN109514586B (en) Method and system for realizing intelligent customer service robot
WO2017084541A1 (en) Method and apparatus for sending expression image during call session
CN108255316B (en) Method for dynamically adjusting emoticons, electronic device and computer-readable storage medium
CN108334202B (en) Wallpaper updating method and related product
CN108573306B (en) Method for outputting reply information, and training method and device for deep learning model
CN106470110B (en) Method and device for sending messages to multiple users in user list in group mode
CN107909011B (en) Face recognition method and related product
CN105183347A (en) Input method operation method and electronic terminal
CN107102744A (en) A kind of recommendation method and electronic equipment for inputting vocabulary
CN115982323A (en) Big data analysis method and artificial intelligence system applied to cloud online service
CN109101956B (en) Method and apparatus for processing image
CN108920085B (en) Information processing method and device for wearable device
CN112532507A (en) Method and device for presenting expression image and method and device for sending expression image
CN108170292B (en) Expression management method, expression management device and intelligent terminal
CN106940710B (en) Information pushing method and device
CN111047332B (en) Model training and risk identification method, device and equipment
CN108399118A (en) file system test data processing method, device, storage medium and terminal
CN109726550A (en) Abnormal operation behavioral value method, apparatus and computer readable storage medium
CN114666291A (en) Message sending method and device
CN110730323B (en) Conference interaction information processing method and device, computer device and storage medium
CN110347247B (en) Man-machine interaction method and device, storage medium and electronic equipment
CN107219968A (en) A kind of drafting method, system, mobile terminal and readable storage medium storing program for executing
CN108777058B (en) Posture reminding method, electronic device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant