CN115113781A - Interactive icon display method, device, medium and electronic equipment - Google Patents

Interactive icon display method, device, medium and electronic equipment

Info

Publication number
CN115113781A
CN115113781A (application CN202210753908.4A)
Authority
CN
China
Prior art keywords
interactive
content
target
emotion
coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210753908.4A
Other languages
Chinese (zh)
Inventor
江子龙 (Jiang Zilong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202210753908.4A
Publication of CN115113781A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L 25/48 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L 25/51 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G10L 25/63 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Signal Processing (AREA)
  • Computational Linguistics (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Child & Adolescent Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an interactive icon display method, an interactive icon display apparatus, a medium, and an electronic device, and relates to the field of computer technology. The interactive icon display method comprises the following steps: determining target interactive content; performing emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content; determining an interactive icon associated with the target interactive content according to the emotion coefficient; and displaying the interactive icon associated with the target interactive content at a preset display position of the target interactive content. The present disclosure provides a personalized display scheme for interactive icons and improves the information acquisition efficiency of interactive content.

Description

Interactive icon display method, device, medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interactive icon display method, an interactive icon display apparatus, a computer-readable storage medium, and an electronic device.
Background
With the maturation of Internet technology, social networks have become a common interaction medium. A social network allows users to express their own attitudes or views on interactive content published by other users; for example, a user may forward, comment on, or like the interactive content.
In the related art, a user may like interactive content through an interactive icon displayed on a device, but the interactive icons displayed in the related art are usually designed in advance by the social platform, so the styles of the interactive icons displayed on the social network are relatively fixed.
Disclosure of Invention
The present disclosure provides an interactive icon display method, an interactive icon display apparatus, a medium, and an electronic device, thereby improving the information acquisition efficiency of interactive content.
According to a first aspect of the present disclosure, there is provided an interactive icon display method, including:
determining target interactive content;
performing emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content;
determining an interactive icon associated with the target interactive content according to the emotion coefficient;
and displaying an interactive icon associated with the target interactive content at a preset display position of the target interactive content.
Optionally, the determining, according to the emotion coefficient, an interactive icon associated with the target interactive content includes:
determining a target emotion coefficient interval to which the emotion coefficient of the target interactive content belongs based on a pre-established correspondence between the emotion coefficient interval and the interactive icon;
and acquiring an interactive icon corresponding to the target emotion coefficient interval to obtain an interactive icon associated with the target interactive content.
Optionally, the interactive icons associated with the target interactive content include a plurality of interactive icons, the target interactive content is interactive content to be published, and after the interactive icons associated with the target interactive content are determined according to the emotion coefficient, the method further includes:
displaying a plurality of interactive icons associated with the target interactive content;
in response to the selection operation of a target interactive icon in the plurality of interactive icons, determining the target interactive icon as an interactive icon associated with the target interactive content.
Optionally, the target interactive content includes at least one target interactive sub-content, and performing emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content includes:
performing emotion analysis on at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content;
and determining the emotion coefficient of the target interactive content according to the emotion coefficient of each target interactive sub-content.
Optionally, the target interactive sub-content includes interactive text, and performing emotion analysis on at least one target interactive sub-content to obtain an emotion coefficient of each target interactive sub-content includes:
performing emotion analysis on the interactive text to obtain an emotion coefficient of the interactive text.
Optionally, the target interactive sub-content includes an interactive picture, and the obtaining of the emotion coefficient of each target interactive sub-content by performing emotion analysis on at least one target interactive sub-content includes:
performing text recognition on the interactive picture;
if the interactive picture is determined to contain first text information, performing emotion analysis on the first text information to obtain a text emotion coefficient of the interactive picture;
performing emotion analysis on the image information of the interactive picture to obtain an image emotion coefficient of the interactive picture;
and determining the emotion coefficient of the interactive picture according to the text emotion coefficient and the image emotion coefficient of the interactive picture.
Optionally, the target interactive sub-content includes interactive audio, and performing emotion analysis on at least one target interactive sub-content to obtain an emotion coefficient of each target interactive sub-content includes:
extracting second text information and first audio information in the interactive audio;
performing emotion analysis on the second text information to obtain a text emotion coefficient of the interactive audio;
performing emotion analysis on the first audio information to obtain an audio emotion coefficient of the interactive audio;
and determining the emotion coefficient of the interactive audio according to the text emotion coefficient and the audio emotion coefficient of the interactive audio.
Optionally, the target interactive sub-content includes an interactive video, and performing emotion analysis on at least one target interactive sub-content to obtain an emotion coefficient of each target interactive sub-content includes:
extracting third text information, second audio information and video frame picture information in the interactive video;
and determining the emotion coefficient of the interactive video according to the third text information, the second audio information and the video frame picture information.
According to a second aspect of the present disclosure, there is provided an interactive icon display device, comprising:
a first determining module configured to determine target interactive content;
the analysis module is configured to perform emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content;
a second determining module configured to determine an interactive icon associated with the target interactive content according to the emotion coefficient;
the first display control module is configured to display an interactive icon associated with the target interactive content at a preset display position of the target interactive content.
Optionally, the second determining module is configured to:
determining a target emotion coefficient interval to which the emotion coefficient of the target interactive content belongs based on a pre-established correspondence between the emotion coefficient interval and the interactive icon;
and acquiring an interactive icon corresponding to the target emotion coefficient interval to obtain an interactive icon associated with the target interactive content.
Optionally, the interactive icons associated with the target interactive content include a plurality of interactive icons, the target interactive content is interactive content to be published, and the apparatus further includes a second display control module configured to:
displaying a plurality of interactive icons associated with the target interactive content;
in response to the selection operation of a target interactive icon in the plurality of interactive icons, determining the target interactive icon as an interactive icon associated with the target interactive content.
Optionally, the analysis module is configured to:
performing emotion analysis on at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content;
and determining the emotion coefficient of the target interactive content according to the emotion coefficient of each target interactive sub-content.
Optionally, the target interactive sub-content includes interactive text, and the analysis module is configured to:
performing emotion analysis on the interactive text to obtain an emotion coefficient of the interactive text.
Optionally, the target interactive sub-content includes an interactive picture, and the analysis module is configured to:
performing text recognition on the interactive picture;
if the interactive picture is determined to contain first text information, performing emotion analysis on the first text information to obtain a text emotion coefficient of the interactive picture;
performing emotion analysis on the image information of the interactive picture to obtain an image emotion coefficient of the interactive picture;
and determining the emotion coefficient of the interactive picture according to the text emotion coefficient and the image emotion coefficient of the interactive picture.
Optionally, the target interactive sub-content includes interactive audio, and the analysis module is configured to:
extracting second text information and first audio information in the interactive audio;
performing emotion analysis on the second text information to obtain a text emotion coefficient of the interactive audio;
performing emotion analysis on the first audio information to obtain an audio emotion coefficient of the interactive audio;
and determining the emotion coefficient of the interactive audio according to the text emotion coefficient and the audio emotion coefficient of the interactive audio.
Optionally, the target interactive sub-content includes an interactive video, and the analysis module is configured to:
extracting third text information, second audio information and video frame picture information in the interactive video;
and determining the emotion coefficient of the interactive video according to the third text information, the second audio information and the video frame picture information.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of the first aspect via execution of the executable instructions.
The technical scheme of the disclosure has the following beneficial effects:
the interactive icon display method, the interactive icon display device, the interactive icon display medium and the electronic equipment can determine target interactive content, and conduct emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content; and determining an interactive icon associated with the target interactive content according to the emotional coefficient, and displaying the interactive icon associated with the target interactive content at a preset display position of the target interactive content. The corresponding interactive icon can be determined according to the interactive content, so that the interactive icon is more personalized and diversified in display; the interactive icons are determined according to the emotional information of the interactive content, so that a user can visually acquire the emotional information of the interactive content through the interactive icons, and the information acquisition efficiency of the interactive content is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure. It is apparent that the drawings in the following description are only some embodiments of the present disclosure, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 shows a schematic architecture diagram of an interactive icon display system in this exemplary embodiment;
FIG. 2 is a flowchart illustrating an interactive icon display method according to the exemplary embodiment;
FIG. 3 is a flowchart illustrating a method for determining emotion coefficients of target interactive content in the exemplary embodiment;
FIG. 4 is a schematic diagram of an interactive content display interface in the exemplary embodiment;
FIG. 5 is a schematic structural diagram of an interactive icon display device according to the exemplary embodiment;
FIG. 6 shows a schematic structural diagram of an electronic device in the present exemplary embodiment.
Detailed Description
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. The exemplary embodiments, however, may be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
In the related art, for interactive content, the social platform may design a display style of the interactive icon in advance, and display the interactive icon at a preset display position of the interactive content.
However, the display style of the interactive icon designed by the social platform is usually fixed: the same interactive icon is displayed no matter what the interactive content is, and the display style of the interactive icon is not associated with the interactive content. As a result, the user cannot intuitively acquire information related to the interactive content, and the user's reading experience of the interactive content is poor.
In view of the above problems, exemplary embodiments of the present disclosure provide an interactive icon display method. The application scenarios of the interactive icon display method include but are not limited to: in a social network or a social platform, emotion analysis can be performed on target interaction content to obtain an emotion coefficient of the target interaction content; determining an interactive icon associated with the target interactive content according to the emotion coefficient; and displaying an interactive icon associated with the target interactive content at a preset display position of the target interactive content.
In order to implement the above-described interactive icon display method, an exemplary embodiment of the present disclosure provides an interactive icon display system. Fig. 1 shows a schematic architecture diagram of the interactive icon display system. As shown in fig. 1, the interactive icon display system 100 may include a server 110 and a terminal device 120. The server 110 is a backend server deployed by a service provider (a social platform or social network). The terminal device 120 is a terminal device used by a user who publishes interactive content on the social network or social platform, and may include a smart phone, a personal computer, a tablet computer, a wearable device, and the like. The server 110 and the terminal device 120 may establish a connection through a network.
It should be understood that the server 110 may be a single server or a cluster formed by a plurality of servers; the present disclosure does not limit the specific architecture of the server 110.
The interactive icon display method is explained below from the perspective of the terminal device. Fig. 2 shows an exemplary flow of the interactive icon display method performed by the terminal device, which may include steps S201 to S204:
step S201, determining target interactive content;
step S202, carrying out emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content;
step S203, determining an interactive icon associated with the target interactive content according to the emotion coefficient;
step S204, displaying an interactive icon associated with the target interactive content at a preset display position of the target interactive content.
In summary, the interactive icon display method provided by the embodiments of the present disclosure can determine target interactive content and perform emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content; an interactive icon associated with the target interactive content is then determined according to the emotion coefficient and displayed at a preset display position of the target interactive content. Because the corresponding interactive icon is determined according to the interactive content, the display of interactive icons is more personalized and diversified; and because the interactive icon is determined according to the emotion information of the interactive content, a user can intuitively acquire the emotion information of the interactive content through the interactive icon, which improves the information acquisition efficiency of the interactive content.
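As an illustration of this four-step flow, a minimal self-contained sketch follows. All names (Content, analyze_emotion, select_icon) are hypothetical stand-ins, not part of the disclosure, and the toy word lists stand in for a real emotion analysis model:

    from dataclasses import dataclass

    @dataclass
    class Content:
        text: str
        preset_display_position: str = "below_content"

    POSITIVE = {"good", "great", "happy"}
    NEGATIVE = {"bad", "sad", "worried"}

    def analyze_emotion(content: Content) -> float:
        """Toy stand-in for step S202: emotion analysis of the content text."""
        words = content.text.lower().split()
        score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        return max(-1.0, min(1.0, score / max(len(words), 1)))

    def select_icon(t: float) -> str:
        """Toy stand-in for step S203: map the emotion coefficient to an icon."""
        return "crying_face" if t < 0 else "heart"

    def display_interactive_icon(content: Content) -> None:
        t = analyze_emotion(content)           # step S202
        icon = select_icon(t)                  # step S203
        # Step S204: "display" the icon at the preset display position.
        print(f"[{content.preset_display_position}] {icon}")

    # Step S201: the target interactive content is determined (here, constructed).
    display_interactive_icon(Content("today's weather is really good!"))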
Each step in fig. 2 is explained in detail below:
in step S201, the terminal device may determine the target interactive content.
In the embodiment of the present disclosure, the target interactive content may be interactive content to be published or interactive content to be displayed.
In an optional implementation, the process by which the terminal device determines the target interactive content may include: the terminal device, in response to a publish operation on interactive content to be published, determines the interactive content to be published as the target interactive content. The interactive content to be published may be user input acquired by the terminal device in an interactive content editing interface. The publish operation may be a trigger operation on a publish button displayed in the interactive content editing interface, or a preset trigger gesture detected in the interactive content editing interface; the preset trigger gesture may be determined based on actual needs, which is not limited by the embodiments of the present disclosure.
In an optional implementation, the process by which the terminal device determines the target interactive content may include: the terminal device, in response to a refresh operation on an interactive content display page, sends a display content acquisition instruction to the server; the server acquires, according to the display content acquisition instruction, the interactive content to be displayed corresponding to the terminal device and sends it to the terminal device; and the terminal device determines the interactive content to be displayed as the target interactive content. The server may determine the published interactive content of other user accounts followed by the user account as the interactive content to be displayed corresponding to the terminal device; alternatively, the server may obtain interactive content matching the browsing keywords of the user account to obtain the interactive content to be displayed corresponding to the terminal device.
In step S202, the terminal device may perform emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content.
In the embodiments of the present disclosure, the emotion coefficient represents the degree of emotional positivity of the interactive content: the greater the emotion coefficient, the more positive the emotion of the target interactive content. The target interactive content may include at least one target interactive sub-content, and a target interactive sub-content may be an interactive picture, interactive audio, an interactive video, or interactive text.
In an optional implementation, as shown in fig. 3, the process by which the terminal device performs emotion analysis on the target interactive content to obtain the emotion coefficient of the target interactive content may include steps S301 to S302:
step S301, performing emotion analysis on at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content;
in an optional implementation manner, the target interactive sub-content includes interactive text, and the process of obtaining the emotion coefficient of each target interactive sub-content by the terminal device performing emotion analysis on at least one target interactive sub-content respectively may include: and carrying out emotion analysis on the interactive characters to obtain the emotion coefficients of the interactive characters. The process of obtaining the emotion coefficient of the interactive characters by performing emotion analysis on the interactive characters can be realized based on a predetermined emotion dictionary or a pre-trained text emotion recognition model, which is not limited by the embodiment of the disclosure, and emotion analysis can be performed on the interactive text to further determine interactive icons associated with the interactive text, so that the interactive icon personalized display requirement of the interactive text is met.
In an optional implementation, the target interactive sub-content includes an interactive picture, and the process by which the terminal device performs emotion analysis on at least one target interactive sub-content to obtain the emotion coefficient of each target interactive sub-content may include: performing text recognition on the interactive picture; if the interactive picture is determined to contain first text information, performing emotion analysis on the first text information to obtain a text emotion coefficient of the interactive picture; performing emotion analysis on the image information of the interactive picture to obtain an image emotion coefficient of the interactive picture; and then determining the emotion coefficient of the interactive picture according to the text emotion coefficient and the image emotion coefficient of the interactive picture. If the interactive picture does not contain text information, emotion analysis is performed on the image information of the interactive picture alone to obtain the emotion coefficient of the interactive picture. By performing emotion analysis on the interactive picture, the interactive icon associated with the interactive picture can be determined, meeting the actual display requirement of the interactive picture.
The process by which the terminal device performs emotion analysis on the first text information to obtain the text emotion coefficient of the interactive picture may be implemented based on a predetermined emotion dictionary or a pre-trained text emotion recognition model, which is not limited by the embodiments of the present disclosure; the process by which the terminal device performs emotion analysis on the image information of the interactive picture to obtain the image emotion coefficient of the interactive picture may be implemented based on a pre-trained image emotion recognition model.
In an optional implementation, the process by which the terminal device determines the emotion coefficient of the interactive picture according to the text emotion coefficient and the image emotion coefficient of the interactive picture may include: determining the sum of the text emotion coefficient and the image emotion coefficient of the interactive picture to obtain the emotion coefficient of the interactive picture. Alternatively, this process may include: determining a first product of the text emotion coefficient of the interactive picture and a first weight value; determining a second product of the image emotion coefficient of the interactive picture and a second weight value; and determining the sum of the first product and the second product to obtain the emotion coefficient of the interactive picture. The first weight value represents the importance of the first text information in the interactive picture in determining the emotion coefficient of the interactive picture; the second weight value represents the importance of the image information in the interactive picture in determining the emotion coefficient of the interactive picture. The larger a weight value, the higher the importance; the weight values may be determined based on actual needs, which is not limited by the embodiments of the present disclosure.
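A sketch of the picture branch under the same assumptions: run_ocr and image_emotion_model are stubs standing in for a real OCR engine and the pre-trained image emotion recognition model, text_emotion_coefficient reuses the dictionary sketch above, and the 0.5/0.5 weights are illustrative:

    def run_ocr(picture: dict) -> str:
        """Stub for text recognition on the picture (a real OCR engine goes here)."""
        return picture.get("embedded_text", "")

    def image_emotion_model(picture: dict) -> float:
        """Stub for the pre-trained image emotion recognition model."""
        return picture.get("image_score", 0.0)

    def picture_emotion_coefficient(picture: dict,
                                    w_text: float = 0.5,
                                    w_image: float = 0.5) -> float:
        image_t = image_emotion_model(picture)
        text = run_ocr(picture)
        if not text:                      # no first text information recognized
            return image_t                # image emotion coefficient alone
        text_t = text_emotion_coefficient(text)     # dictionary sketch above
        return w_text * text_t + w_image * image_t  # weighted sum of the two

    picture = {"embedded_text": "good day", "image_score": 0.4}
    print(picture_emotion_coefficient(picture))  # 0.5 * 0.8 + 0.5 * 0.4 ≈ 0.6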
In an optional implementation, the target interactive sub-content includes interactive audio, and the process by which the terminal device performs emotion analysis on at least one target interactive sub-content to obtain the emotion coefficient of each target interactive sub-content may include: extracting second text information and first audio information from the interactive audio; performing emotion analysis on the second text information to obtain a text emotion coefficient of the interactive audio; performing emotion analysis on the first audio information to obtain an audio emotion coefficient of the interactive audio; and determining the emotion coefficient of the interactive audio according to the text emotion coefficient and the audio emotion coefficient of the interactive audio. By performing emotion analysis on the interactive audio, the interactive icon associated with the interactive audio can be determined, meeting the personalized display requirement for interactive icons of interactive audio.
The process of obtaining the text emotion coefficient of the interactive audio by performing emotion analysis on the second text information may be implemented based on a predetermined emotion dictionary or a pre-trained text emotion recognition model, which is not limited by the embodiments of the present disclosure; the process of obtaining the audio emotion coefficient of the interactive audio by performing emotion analysis on the first audio information may be implemented based on a pre-trained audio emotion analysis model.
In an optional implementation, the process by which the terminal device determines the emotion coefficient of the interactive audio according to the text emotion coefficient and the audio emotion coefficient of the interactive audio may include: determining the sum of the text emotion coefficient and the audio emotion coefficient of the interactive audio to obtain the emotion coefficient of the interactive audio. Alternatively, this process may include: determining a third product of the text emotion coefficient of the interactive audio and a third weight value; determining a fourth product of the audio emotion coefficient of the interactive audio and a fourth weight value; and determining the sum of the third product and the fourth product to obtain the emotion coefficient of the interactive audio. The third weight value represents the importance of the second text information in the interactive audio in determining the emotion coefficient of the interactive audio; the fourth weight value represents the importance of the first audio information in the interactive audio in determining the emotion coefficient of the interactive audio.
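In formula form, the weighted variant can be written as t_audio = w3 · t_text + w4 · t_acoustic, where t_text is the text emotion coefficient, t_acoustic the audio emotion coefficient, and w3, w4 the third and fourth weight values; the normalization w3 + w4 = 1 is a natural assumption that keeps t_audio in the same range as its inputs, not something the disclosure mandates.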
In an optional implementation, the target interactive sub-content includes an interactive video that contains text information, audio information, and video frame pictures. The process by which the terminal device performs emotion analysis on at least one target interactive sub-content to obtain the emotion coefficient of each target interactive sub-content may then include: extracting third text information, second audio information, and video frame picture information from the interactive video; and determining the emotion coefficient of the interactive video according to the third text information, the second audio information, and the video frame picture information. This determination may be implemented based on a pre-trained multimodal emotion analysis model. By performing emotion analysis on the interactive video, the interactive icon associated with the interactive video can be determined, meeting the personalized display requirement for interactive icons of interactive video.
In an optional implementation, the interactive video contains audio information and video frame pictures, and the process by which the terminal device performs emotion analysis on at least one target interactive sub-content to obtain the emotion coefficient of each target interactive sub-content may include: extracting second audio information and video frame picture information from the interactive video; and determining the emotion coefficient of the interactive video according to the second audio information and the video frame picture information. This determination may likewise be implemented based on a pre-trained multimodal emotion analysis model.
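A sketch of such a fusion, assuming per-modality coefficients have already been computed; the equal weights are an illustrative stand-in for whatever a pre-trained multimodal model would learn:

    def video_emotion_coefficient(subtitle_t: float,
                                  audio_t: float,
                                  frame_t: float,
                                  weights: tuple = (1/3, 1/3, 1/3)) -> float:
        """Equal-weight fusion as a stand-in for the multimodal emotion model."""
        w1, w2, w3 = weights
        return w1 * subtitle_t + w2 * audio_t + w3 * frame_t

    # Per-modality coefficients would come from text, audio, and frame analysis.
    print(video_emotion_coefficient(0.6, 0.3, 0.9))  # ≈ 0.6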
Step S302, determining the emotion coefficient of the target interactive content according to the emotion coefficient of each target interactive sub-content.
In an optional implementation, the process by which the terminal device determines the emotion coefficient of the target interactive content according to the emotion coefficient of each target interactive sub-content may include: determining the sum of the emotion coefficients of the target interactive sub-contents to obtain the emotion coefficient of the target interactive content.
In an optional implementation, this process may instead include: determining the product of the emotion coefficient of each target interactive sub-content and the weight value of that target interactive sub-content to obtain the weighted emotion coefficient of each target interactive sub-content; and determining the sum of the weighted emotion coefficients of the target interactive sub-contents to obtain the emotion coefficient of the target interactive content. The weight value of a target interactive sub-content represents the importance of that sub-content in determining the emotion coefficient of the interactive content and may be determined based on actual needs. In this way, emotion analysis can be performed on target interactive content containing sub-contents of various types, and the interactive icon of the target interactive content can be determined, meeting the personalized display requirement for interactive icons of target interactive content of various types.
For example, the target interactive content includes interactive text and an interactive picture. The terminal device may determine the emotion coefficient of the interactive text and the emotion coefficient of the interactive picture according to the method provided in step S301, and may then determine the sum of the two to obtain the emotion coefficient of the target interactive content. Alternatively, the terminal device may determine the product of the emotion coefficient of the interactive text and the weight value of the interactive text to obtain the weighted emotion coefficient of the interactive text; determine the product of the emotion coefficient of the interactive picture and the weight value of the interactive picture to obtain the weighted emotion coefficient of the interactive picture; and determine the sum of the two weighted emotion coefficients to obtain the emotion coefficient of the target interactive content.
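A sketch of both aggregation variants (plain sum and weighted sum); the coefficients and weights mirror the worked example above and are illustrative:

    def content_emotion_coefficient(sub_coefficients, weights=None):
        """Plain sum, or weighted sum when weight values are supplied."""
        if weights is None:
            return sum(sub_coefficients)
        return sum(w * t for w, t in zip(weights, sub_coefficients))

    # Worked example: interactive text (0.8) plus interactive picture (0.4),
    # with illustrative weight values 0.6 and 0.4.
    print(content_emotion_coefficient([0.8, 0.4], weights=[0.6, 0.4]))  # ≈ 0.64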
In step S203, the terminal device may determine an interactive icon associated with the target interactive content according to the emotion coefficient.
In the embodiments of the present disclosure, the interactive icon associated with the target interactive content can convey the emotion information of the interactive content. For example, if the emotion expressed in the target interactive content is happiness, the interactive icon associated with the target interactive content is an interactive icon expressing happiness; if the emotion expressed in the target interactive content is anger, the interactive icon associated with the target interactive content is an interactive icon expressing anger.
In an optional implementation, the terminal device may store a pre-established correspondence between emotion coefficient intervals and interactive icons, where the correspondence may include a plurality of emotion coefficient intervals and the interactive icon corresponding to each emotion coefficient interval. The process by which the terminal device determines the interactive icon associated with the target interactive content according to the emotion coefficient may include: determining the target emotion coefficient interval to which the emotion coefficient of the target interactive content belongs, based on the pre-established correspondence; and acquiring the interactive icon corresponding to the target emotion coefficient interval to obtain the interactive icon associated with the target interactive content. The interactive icon associated with the target interactive content can be rapidly determined based on the pre-established correspondence, which improves the efficiency of determining the interactive icon associated with the target interactive content.
For example, the correspondence between emotion coefficient intervals and interactive icons may include: for the interval -1 ≤ t < -0.5, an interactive icon representing anger; for the interval -0.5 ≤ t < 0, an interactive icon representing displeasure; for the interval 0 ≤ t < 0.5, an interactive icon representing happiness; and for the interval 0.5 ≤ t < 1, an interactive icon representing great happiness. Here t is the emotion coefficient.
Assuming the emotion coefficient of the target interactive content is 0.3, the target emotion coefficient interval to which it belongs can be determined to be 0 ≤ t < 0.5, and the interactive icon representing happiness that corresponds to this interval is determined to be the interactive icon associated with the target interactive content.
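A sketch of the interval lookup using the example intervals above; the icon names are placeholders, and the handling of t = 1.0 (which the half-open intervals leave uncovered) is an added assumption:

    # Interval endpoints follow the example above; icon names are placeholders.
    ICON_INTERVALS = [
        (-1.0, -0.5, "anger_icon"),
        (-0.5,  0.0, "displeasure_icon"),
        ( 0.0,  0.5, "happy_icon"),
        ( 0.5,  1.0, "very_happy_icon"),
    ]

    def icon_for_coefficient(t: float) -> str:
        for lower, upper, icon in ICON_INTERVALS:
            if lower <= t < upper:
                return icon
        # t == 1.0 falls outside the half-open intervals; treating the boundary
        # as maximally positive is an assumption, not stated in the text.
        return "very_happy_icon" if t >= 1.0 else "anger_icon"

    print(icon_for_coefficient(0.3))  # happy_icon, matching the 0.3 example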
In an optional implementation, the server may store the pre-established correspondence between emotion coefficient intervals and interactive icons, and the process by which the terminal device determines the interactive icon associated with the target interactive content according to the emotion coefficient may include: sending an interactive icon acquisition request, which may include the emotion coefficient of the target interactive content, to the server. After receiving the request, the server may determine the target emotion coefficient interval to which the emotion coefficient of the target interactive content belongs based on the pre-established correspondence; acquire the interactive icon corresponding to the target emotion coefficient interval to obtain the interactive icon associated with the target interactive content; and send the interactive icon associated with the target interactive content to the terminal device.
In an optional implementation, the interactive icons associated with the target interactive content include a plurality of interactive icons, and the target interactive content is interactive content to be published. After determining the interactive icons associated with the target interactive content according to the emotion coefficient, the terminal device may further: display the plurality of interactive icons associated with the target interactive content; and, in response to a selection operation on a target interactive icon among the plurality of interactive icons, determine the target interactive icon as the interactive icon associated with the target interactive content. The target interactive icon is any one of the plurality of interactive icons associated with the target interactive content. In this way, while publishing the interactive content to be published, the user can select the interactive icon that meets his or her needs from the plurality of interactive icons, meeting the personalized display requirement for interactive icons.
In an optional implementation, the interactive icons associated with the target interactive content include a plurality of interactive icons, and the target interactive content is interactive content to be displayed. After determining the interactive icons associated with the target interactive content according to the emotion coefficient, the terminal device may further: randomly select an interactive icon from the plurality of interactive icons associated with the target interactive content to obtain the interactive icon associated with the target interactive content. In this way, an interactive icon can be randomly selected from the plurality of interactive icons while displaying the interactive content to be displayed, meeting the diversified display requirement for interactive icons.
In step S204, the terminal device may display the interactive icon associated with the target interactive content at the preset display position of the target interactive content.
In the embodiments of the present disclosure, the preset display position of the target interactive content may be determined based on actual needs, which is not limited by the embodiments of the present disclosure. For example, fig. 4 shows a display interface 400 of interactive content on a terminal device; the display interface includes an interactive content display area 401 for displaying the target interactive content, and an interactive icon display area 402 for displaying the interactive icon associated with the target interactive content. The target interactive content of user 1 is "Today's weather is really good!", and the interactive icon associated with it is a heart-shaped interactive icon; the target interactive content of user 2 is "My cat is not feeling well today, and I am worried!", and the interactive icon associated with it is a crying-face interactive icon.
In an optional implementation manner, the process of displaying, by the terminal device, the interactive icon associated with the target interactive content at the preset display position of the target interactive content may include: determining a display area of the target interactive content on a display interface, and displaying the target interactive content in the display area of the target interactive content; and displaying the interactive icon associated with the target interactive content in the interactive icon display area associated with the display area of the target interactive content.
An embodiment of the present disclosure provides an interactive icon display device, as shown in fig. 5, an interactive icon display device 500, including:
a first determining module 501 configured to determine target interactive content;
an analysis module 502 configured to perform emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content;
a second determining module 503, configured to determine an interactive icon associated with the target interactive content according to the emotion coefficient;
the first display control module 504 is configured to display an interactive icon associated with the target interactive content at a preset display position of the target interactive content.
Optionally, the second determining module 503 is configured to:
determining a target emotion coefficient interval to which the emotion coefficient of the target interactive content belongs based on a pre-established correspondence between the emotion coefficient interval and the interactive icon;
and acquiring an interactive icon corresponding to the target emotion coefficient interval to obtain an interactive icon associated with the target interactive content.
Optionally, the interactive icons associated with the target interactive content include a plurality of interactive icons, and the target interactive content is interactive content to be published. As shown in fig. 5, the interactive icon display apparatus 500 further includes a second display control module 505 configured to:
displaying a plurality of interactive icons associated with the target interactive content;
and in response to a selection operation on a target interactive icon among the plurality of interactive icons, determining the target interactive icon as the interactive icon associated with the target interactive content.
Optionally, the analysis module 502 is configured to:
performing emotion analysis on at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content;
and determining the emotion coefficient of the target interactive content according to the emotion coefficient of each target interactive sub-content.
Optionally, the target interactive sub-content includes interactive text, and the analysis module 502 is configured to:
and performing emotion analysis on the interactive text to obtain the emotion coefficient of the interactive text.
Optionally, the target interactive sub-content includes an interactive picture, and the analysis module 502 is configured to:
performing text recognition on the interactive picture;
if the interactive picture is determined to contain the first text information, performing emotion analysis on the first text information to obtain a text emotion coefficient of the interactive picture;
performing emotion analysis on the image information of the interactive picture to obtain an image emotion coefficient of the interactive picture;
and determining the emotion coefficient of the interactive picture according to the text emotion coefficient and the image emotion coefficient of the interactive picture.
Optionally, the target interactive sub-content includes interactive audio, and the analysis module 502 is configured to:
extracting second text information and first audio information in the interactive audio;
performing emotion analysis on the second text information to obtain a text emotion coefficient of the interactive audio;
performing emotion analysis on the first audio information to obtain an audio emotion coefficient of the interactive audio;
and determining the emotion coefficient of the interactive audio according to the text emotion coefficient and the audio emotion coefficient of the interactive audio.
Optionally, the target interactive sub-content includes an interactive video, and the analysis module 502 is configured to:
extracting third text information, second audio information and video frame picture information in the interactive video;
and determining the emotion coefficient of the interactive video according to the third text information, the second audio information and the video frame picture information.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium, which may be implemented in the form of a program product, including program code for causing an electronic device to perform the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned "exemplary method" section of this specification, when the program product is run on the electronic device. In one embodiment, the program product may be embodied as a portable compact disc read only memory (CD-ROM) and include program code, and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Exemplary embodiments of the present disclosure also provide an electronic device, which may be a terminal device. The electronic device is explained below with reference to fig. 6. It should be understood that the electronic device 600 shown in fig. 6 is only one example and should not bring any limitations to the functionality or scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 600 takes the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, and a bus 630 that couples the various system components, including the storage unit 620 and the processing unit 610.
The storage unit stores program code, which may be executed by the processing unit 610 so that the processing unit 610 performs the steps according to the various exemplary embodiments of the present disclosure described in the above-mentioned "exemplary methods" section of this specification. For example, the processing unit 610 may perform the method steps shown in fig. 2.
The storage unit 620 may include volatile storage units, such as a random access memory unit (RAM) 621 and/or a cache memory unit 622, and may further include a read-only memory unit (ROM) 623.
The storage unit 620 may also include a program/utility 624 having a set (at least one) of program modules 625. Such program modules 625 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data; each of these, or some combination thereof, may include an implementation of a network environment.
The bus 630 may include a data bus, an address bus, and a control bus.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.) through the input/output (I/O) interface 640. The electronic device 600 may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 650. As shown, the network adapter 650 communicates with the other modules of the electronic device 600 via the bus 630. It should be appreciated that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to exemplary embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the following claims.

Claims (11)

1. An interactive icon display method is characterized by comprising the following steps:
determining target interactive content;
performing emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content;
determining an interactive icon associated with the target interactive content according to the emotion coefficient;
and displaying the interactive icon associated with the target interactive content at a preset display position of the target interactive content.
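By way of illustration only, the following minimal Python sketch shows one way the four steps of claim 1 might be realized end to end; the toy word lexicon, the icon names, and every function below are hypothetical stand-ins, not the claimed method itself:

    POSITIVE = {"great", "love", "happy", "nice"}
    NEGATIVE = {"bad", "sad", "angry"}

    def analyze_emotion(text: str) -> float:
        """Toy emotion analysis: return a coefficient in [-1.0, 1.0]."""
        words = text.lower().split()
        raw = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        return max(-1.0, min(1.0, raw / max(len(words), 1) * 5))

    def select_icon(coefficient: float) -> str:
        """Choose an icon from the coefficient (claim 2 refines this step)."""
        return "icon_smile" if coefficient >= 0 else "icon_frown"

    def display_interactive_icon(target_content: str) -> str:
        """Claim 1 end to end: analyze, pick an icon, show it at a preset position."""
        coefficient = analyze_emotion(target_content)
        icon = select_icon(coefficient)
        # A real client would render the icon; here the preset display
        # position is simply the end of the content string.
        return f"{target_content} [{icon}]"

    print(display_interactive_icon("I love this happy update"))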
2. The method of claim 1, wherein the determining the interactive icon associated with the target interactive content according to the emotion coefficient comprises:
determining a target emotion coefficient interval to which the emotion coefficient of the target interactive content belongs, based on a pre-established correspondence between emotion coefficient intervals and interactive icons;
and acquiring an interactive icon corresponding to the target emotion coefficient interval to obtain an interactive icon associated with the target interactive content.
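A minimal sketch of the interval lookup of claim 2, assuming a hypothetical correspondence table; `bisect` locates the target interval in O(log n):

    import bisect

    # Hypothetical pre-established correspondence: interval upper bounds over
    # [-1, 1] paired with icons; ICONS[i] covers coefficients up to BOUNDS[i].
    BOUNDS = [-0.5, 0.0, 0.5, 1.0]
    ICONS = ["icon_angry", "icon_neutral", "icon_like", "icon_heart"]

    def icon_for(coefficient: float) -> str:
        """Locate the target emotion coefficient interval and return its icon."""
        index = bisect.bisect_left(BOUNDS, coefficient)
        return ICONS[min(index, len(ICONS) - 1)]

    assert icon_for(-0.8) == "icon_angry"
    assert icon_for(0.7) == "icon_heart"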
3. The method of claim 1, wherein a plurality of interactive icons are associated with the target interactive content and the target interactive content is interactive content to be released, and wherein, after determining the interactive icons associated with the target interactive content according to the emotion coefficient, the method further comprises:
displaying the plurality of interactive icons associated with the target interactive content;
and in response to a selection operation on a target interactive icon among the plurality of interactive icons, determining the target interactive icon as the interactive icon associated with the target interactive content.
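An illustrative sketch of claim 3's flow for content that has not yet been released: several candidate icons are displayed, and the user's selection fixes the association (the candidate sets and names are invented for the example):

    def candidate_icons(coefficient: float) -> list:
        """Toy rule returning several icons near the coefficient's interval."""
        if coefficient >= 0.5:
            return ["icon_like", "icon_heart", "icon_clap"]
        return ["icon_neutral", "icon_think"]

    def on_user_select(candidates: list, chosen_index: int) -> str:
        """Selection operation: the chosen target icon becomes the association."""
        return candidates[chosen_index]

    shown = candidate_icons(0.9)           # display the candidate icons
    associated = on_user_select(shown, 1)  # the user taps "icon_heart"
    print(associated)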
4. The method according to any one of claims 1 to 3, wherein the target interactive content comprises at least one target interactive sub-content, and the performing emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content comprises:
performing emotion analysis on the at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content;
and determining the emotion coefficient of the target interactive content according to the emotion coefficient of each target interactive sub-content.
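One plausible aggregation rule for claim 4 is a weighted mean over the sub-content coefficients; the modality weights below are assumptions, not prescribed by the claim:

    def combine_coefficients(sub_coefficients: dict, weights: dict) -> float:
        """Weighted mean of per-sub-content emotion coefficients."""
        total = sum(weights[kind] for kind in sub_coefficients)
        return sum(score * weights[kind]
                   for kind, score in sub_coefficients.items()) / total

    overall = combine_coefficients({"text": 0.8, "picture": 0.4},
                                   {"text": 0.6, "picture": 0.4})
    print(overall)  # (0.8*0.6 + 0.4*0.4) / 1.0 = 0.64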
5. The method of claim 4, wherein the target interactive sub-content comprises interactive text, and the performing emotion analysis on the at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content comprises:
performing emotion analysis on the interactive text to obtain an emotion coefficient of the interactive text.
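A toy analyzer for claim 5's text branch; a production system would use a trained sentiment model, and the lexicon and negation rule below are only stand-ins:

    import re

    LEXICON = {"good": 0.8, "fine": 0.3, "awful": -0.9, "terrible": -0.8}

    def text_emotion_coefficient(text: str) -> float:
        """Sum lexicon scores over the words, flipping the sign after a negator."""
        score, negate = 0.0, False
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in ("not", "never"):
                negate = True
                continue
            value = LEXICON.get(word, 0.0)
            score += -value if negate else value
            negate = False
        return max(-1.0, min(1.0, score))

    print(text_emotion_coefficient("not good, quite terrible"))  # -0.8 - 0.8 -> clamped to -1.0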
6. The method of claim 4, wherein the target interactive sub-content comprises an interactive picture, and the performing emotion analysis on the at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content comprises:
performing text recognition on the interactive picture;
if the interactive picture is determined to contain first text information, performing emotion analysis on the first text information to obtain a text emotion coefficient of the interactive picture;
performing emotion analysis on the image information of the interactive picture to obtain an image emotion coefficient of the interactive picture;
and determining the emotion coefficient of the interactive picture according to the text emotion coefficient and the image emotion coefficient of the interactive picture.
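A sketch of claim 6's two-branch analysis; `ocr`, `image_model`, and `text_analyzer` are hypothetical callables standing in for a real OCR engine, an image-emotion model, and the text analyzer of claim 5, and the equal weighting is an assumption:

    def picture_emotion_coefficient(picture: bytes, ocr, image_model,
                                    text_analyzer, text_weight: float = 0.5) -> float:
        """Combine text and image emotion coefficients of an interactive picture."""
        first_text = ocr(picture)                   # text recognition on the picture
        image_score = image_model(picture)          # image emotion coefficient
        if first_text:                              # picture contains first text information
            text_score = text_analyzer(first_text)  # text emotion coefficient
            return text_weight * text_score + (1.0 - text_weight) * image_score
        return image_score                          # no text: image signal only

    # Trivial stand-ins for a usage example:
    print(picture_emotion_coefficient(b"...", ocr=lambda p: "good day",
                                      image_model=lambda p: 0.2,
                                      text_analyzer=lambda t: 0.8))  # 0.5*0.8 + 0.5*0.2 = 0.5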
7. The method of claim 4, wherein the target interactive sub-content comprises interactive audio, and the performing emotion analysis on the at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content comprises:
extracting second text information and first audio information in the interactive audio;
performing emotion analysis on the second text information to obtain a text emotion coefficient of the interactive audio;
performing emotion analysis on the first audio information to obtain an audio emotion coefficient of the interactive audio;
and determining the emotion coefficient of the interactive audio according to the text emotion coefficient and the audio emotion coefficient of the interactive audio.
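A sketch of claim 7 under the assumption of two hypothetical models: `transcribe` (speech-to-text) and `prosody_model` (emotion from acoustic features such as pitch and energy); the equal-weight fusion is likewise an assumption:

    def audio_emotion_coefficient(waveform, transcribe, prosody_model,
                                  text_analyzer) -> float:
        """Fuse transcript-based and acoustics-based emotion coefficients."""
        second_text = transcribe(waveform)           # extract the second text information
        text_score = text_analyzer(second_text)      # text emotion coefficient
        audio_score = prosody_model(waveform)        # audio emotion coefficient
        return 0.5 * text_score + 0.5 * audio_score  # equal-weight fusion (assumed)

    print(audio_emotion_coefficient([0.0] * 16000,
                                    transcribe=lambda w: "so happy",
                                    prosody_model=lambda w: 0.4,
                                    text_analyzer=lambda t: 0.9))  # 0.65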
8. The method of claim 4, wherein the target interactive sub-content comprises an interactive video, and the performing emotion analysis on the at least one target interactive sub-content respectively to obtain an emotion coefficient of each target interactive sub-content comprises:
extracting third text information, second audio information and video frame picture information in the interactive video;
and determining the emotion coefficient of the interactive video according to the third text information, the second audio information and the video frame picture information.
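Claim 8 leaves the fusion rule open; the sketch below assumes one choice (a mean over sampled frame scores, then a weighted sum across the three modalities), with all weights invented for illustration:

    def video_emotion_coefficient(text_score: float, audio_score: float,
                                  frame_scores: list,
                                  weights=(0.4, 0.3, 0.3)) -> float:
        """Fuse text, audio-track, and frame-picture emotion coefficients."""
        frame_score = sum(frame_scores) / len(frame_scores)  # mean over sampled frames
        w_text, w_audio, w_frame = weights
        return w_text * text_score + w_audio * audio_score + w_frame * frame_score

    print(video_emotion_coefficient(0.6, 0.2, [0.1, 0.5, 0.3]))  # 0.4*0.6 + 0.3*0.2 + 0.3*0.3 = 0.39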
9. An interactive icon display device, comprising:
a first determining module configured to determine target interactive content;
an analysis module configured to perform emotion analysis on the target interactive content to obtain an emotion coefficient of the target interactive content;
a second determining module configured to determine an interactive icon associated with the target interactive content according to the emotion coefficient;
and a first display control module configured to display the interactive icon associated with the target interactive content at a preset display position of the target interactive content.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 8.
11. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 8 via execution of the executable instructions.
CN202210753908.4A 2022-06-28 2022-06-28 Interactive icon display method, device, medium and electronic equipment Pending CN115113781A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210753908.4A CN115113781A (en) 2022-06-28 2022-06-28 Interactive icon display method, device, medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210753908.4A CN115113781A (en) 2022-06-28 2022-06-28 Interactive icon display method, device, medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115113781A (en) 2022-09-27

Family

ID: 83331098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210753908.4A Pending CN115113781A (en) 2022-06-28 2022-06-28 Interactive icon display method, device, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115113781A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103473555A (en) * 2013-08-26 2013-12-25 中国科学院自动化研究所 Horrible video scene recognition method based on multi-view and multi-instance learning
US20130346067A1 (en) * 2012-06-26 2013-12-26 International Business Machines Corporation Real-time message sentiment awareness
CN104200804A (en) * 2014-09-19 2014-12-10 合肥工业大学 Various-information coupling emotion recognition method for human-computer interaction
CN109446378A (en) * 2018-11-08 2019-03-08 北京奇艺世纪科技有限公司 Information recommendation method, Sentiment orientation determine method and device and electronic equipment
CN109766476A (en) * 2018-12-27 2019-05-17 西安电子科技大学 Video content sentiment analysis method, apparatus, computer equipment and storage medium
CN110473571A (en) * 2019-07-26 2019-11-19 北京影谱科技股份有限公司 Emotion identification method and device based on short video speech
CN112667120A (en) * 2021-01-22 2021-04-16 百果园技术(新加坡)有限公司 Display method and device of interactive icon and electronic equipment
WO2021174757A1 (en) * 2020-03-03 2021-09-10 深圳壹账通智能科技有限公司 Method and apparatus for recognizing emotion in voice, electronic device and computer-readable storage medium
WO2022089192A1 (en) * 2020-10-28 2022-05-05 北京有竹居网络技术有限公司 Interaction processing method and apparatus, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
CN107832433B (en) Information recommendation method, device, server and storage medium based on conversation interaction
CN110020411B (en) Image-text content generation method and equipment
CN102016905B (en) Intelligent autocompletion
KR20230141907A (en) Analyzing web pages to facilitate automatic navigation
US11126794B2 (en) Targeted rewrites
CN109389365B (en) Multi-person collaborative document processing method and device and electronic equipment
CN111324252B (en) Display control method and device in live broadcast platform, storage medium and electronic equipment
CN107977155B (en) Handwriting recognition method, device, equipment and storage medium
WO2015043442A1 (en) Method, device and mobile terminal for text-to-speech processing
CN112667118A (en) Method, apparatus and computer readable medium for displaying historical chat messages
JP7050857B2 (en) Summary generation method and equipment
JP2022093317A (en) Computer-implemented method, system and computer program product (recognition and restructuring of previously presented information)
CN113419711A (en) Page guiding method and device, electronic equipment and storage medium
CN114092608B (en) Expression processing method and device, computer readable storage medium and electronic equipment
CN110580648A (en) financial information processing method and device based on artificial intelligence
CN112492399A (en) Information display method and device and electronic equipment
CN115113781A (en) Interactive icon display method, device, medium and electronic equipment
CN115328362A (en) Book information display method, device, equipment and storage medium
CN115081423A (en) Document editing method and device, electronic equipment and storage medium
CN110658974B (en) Page sliding method and device, electronic equipment and storage medium
CN113220297A (en) Webpage style dynamic generation method and device, storage medium and electronic equipment
CN113961680A (en) Human-computer interaction based session processing method and device, medium and electronic equipment
CN111259181B (en) Method and device for displaying information and providing information
CN113535018A (en) Human-computer interaction method and device for evaluating cognitive speed
CN112308745A (en) Method and apparatus for generating information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination