CN105630846B - Head portrait updating method and device

Head portrait updating method and device

Info

Publication number
CN105630846B
Authority
CN
China
Prior art keywords
message
user
interactive
friend
updating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410664670.3A
Other languages
Chinese (zh)
Other versions
CN105630846A (en)
Inventor
钟聆芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Tencent Computer Systems Co Ltd
Original Assignee
Shenzhen Tencent Computer Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Tencent Computer Systems Co Ltd filed Critical Shenzhen Tencent Computer Systems Co Ltd
Priority to CN201410664670.3A priority Critical patent/CN105630846B/en
Publication of CN105630846A publication Critical patent/CN105630846A/en
Application granted granted Critical
Publication of CN105630846B publication Critical patent/CN105630846B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an avatar (head portrait) updating method and device, belonging to the technical field of the internet. The method comprises the following steps: acquiring, on a message interaction interface, the current interactive message between the user and a friend; judging whether the interactive message is an avatar update message; when it is, acquiring an image matched with the interactive message; and updating the user's avatar according to the image. According to the invention, after the current interactive message between the user and the friend is acquired on the message interaction interface, if the interactive message is an avatar update message, an image matched with the interactive message is acquired and the user's avatar is updated according to it, so the avatar is updated automatically during message interaction without manual operation.

Description

Head portrait updating method and device
Technical Field
The invention relates to the technical field of internet, in particular to a method and a device for updating a head portrait.
Background
With the continuous development of internet technology, social applications are increasingly favored by users. Social applications may provide a host of personalized settings; for example, the user may set his or her avatar, signature, chat interface skin, and so on. A personalized avatar not only reflects the user's personality and recent mood, but also helps others recognize the user. It is therefore desirable to update the user's avatar in a timely manner.
In the prior art, updating the avatar requires manual operation by the user. In one way, the user selects an avatar pendant and manually updates the avatar through it. In another way, the user manually updates the avatar using an avatar from an avatar list or a locally uploaded image.
In the process of implementing the invention, the inventor finds that the prior art has at least the following problems:
because the user has to update the avatar manually, updating is slow and untimely, the flexibility is low, and the interaction lacks interest.
Disclosure of Invention
In order to solve the problems in the prior art, embodiments of the present invention provide an avatar updating method and apparatus. The technical scheme is as follows:
in one aspect, a method for updating an avatar is provided, the method comprising:
acquiring the interactive message of the current user and the friend on a message interactive interface;
judging whether the interactive message is an avatar updating message;
when the interactive message is the head portrait updating message, acquiring an image matched with the interactive message;
and updating the head portrait of the user according to the image.
In another aspect, an avatar update apparatus is provided, the apparatus including:
the interactive message acquisition module is used for acquiring the interactive message of the current user and the friend on the message interactive interface;
the updating message judging module is used for judging whether the interactive message is an avatar updating message;
the image acquisition module is used for acquiring an image matched with the interactive message when the interactive message is the head portrait updating message;
and the head portrait updating module is used for updating the head portrait of the user according to the image.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
after the current interactive message between the user and the friend is acquired on the message interaction interface, if the interactive message is an avatar update message, an image matched with the interactive message is acquired and the user's avatar is updated according to the image, so the avatar is updated automatically and in time during message interaction, without manual operation by the user.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1a is a schematic structural diagram of an implementation environment related to an avatar updating method according to an embodiment of the present invention;
FIG. 1b is a diagram of a message interaction interface according to an embodiment of the present invention;
FIG. 2 is a flowchart of an avatar updating method according to an embodiment of the present invention;
FIG. 3 is a flowchart of an avatar updating method according to an embodiment of the present invention;
FIG. 4a is a schematic diagram of a message interaction interface according to an embodiment of the present invention;
FIG. 4b is a diagram of a message interaction interface according to an embodiment of the present invention;
FIG. 5a is a schematic diagram of a message interaction interface according to an embodiment of the present invention;
FIG. 5b is a diagram of a message interaction interface according to an embodiment of the present invention;
FIG. 6a is a diagram illustrating a message interaction interface according to an embodiment of the present invention;
FIG. 6b is a diagram illustrating a message interaction interface according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an avatar updating apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Before explaining the embodiment of the present invention in detail, an application scenario of the embodiment of the present invention will be explained.
Referring to fig. 1a, a schematic structural diagram of an implementation environment related to the avatar updating method provided by the embodiment of the present invention is shown. The implementation environment includes a user terminal and a friend terminal. Where a buddy typically refers to a contact located in a chain of user relationships. The user can add any contact person into the user relationship chain through friend adding operation to become a friend of the user. After the two parties become friends, operations such as chatting and file transmission can be performed through the message interaction interface shown in fig. 1 b. The user terminal and the friend terminal are both provided with the same social application or the same instant messaging application. The user terminal and the friend terminal can be smart phones or tablet computers and the like, and the types of the terminals are not particularly limited in the embodiment of the invention.
The user terminal is used for acquiring, on the message interaction interface, the current interactive message between the user and the friend; judging whether the interactive message is an avatar update message; when it is, acquiring an image matched with the interactive message; and updating the user's avatar according to the image. The friend terminal is used for sending interactive messages to the user terminal or receiving interactive messages sent by the user terminal. With the avatar updating method provided by the embodiment of the invention, the user's avatar or the friend's avatar can be updated according to the content of the interactive messages exchanged during message interaction with the friend, which enhances the interactive experience.
Fig. 2 is a flowchart of an avatar updating method according to an embodiment of the present invention. Referring to fig. 2, a method flow provided by the embodiment of the present invention includes:
201. Acquiring, on the message interaction interface, the current interactive message between the user and the friend.
202. Judging whether the interactive message is an avatar update message.
203. When the interactive message is an avatar update message, acquiring an image matched with the interactive message.
204. Updating the user's avatar according to the image.
According to the method provided by the embodiment of the invention, after the current interactive message between the user and the friend is acquired on the message interaction interface, if the interactive message is an avatar update message, an image matched with the interactive message is acquired and the user's avatar is updated according to the image, so the avatar is updated automatically without manual operation.
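The four steps above can be pictured with a short, non-normative sketch. The Python code below is only an illustration under assumptions that are not part of the invention: the keyword-to-image correspondence is held in a plain dict, whitespace splitting stands in for real word segmentation, and the names and file names are made up. Steps 201 and 204, which read the current message from the interaction interface and replace the displayed avatar, are UI-bound and only indicated in comments.

    # Minimal sketch of steps 202-203 (illustrative only; not the patented implementation).
    from typing import Dict, Optional

    KEYWORD_TO_IMAGE: Dict[str, str] = {"zombie": "zombie.png", "hug": "hug.gif"}  # toy correspondence table

    def new_avatar_for(message_text: str) -> Optional[str]:
        # Step 202: the message is an avatar update message if any segment is a stored keyword.
        segments = message_text.split()            # stand-in for a real word segmenter
        hits = [s for s in segments if s in KEYWORD_TO_IMAGE]
        if not hits:
            return None                            # not an avatar update message
        # Step 203: the image that the matching keyword is mapped to.
        return KEYWORD_TO_IMAGE[hits[0]]

    # Step 201 would read message_text from the message interaction interface, and
    # step 204 would show the returned image in place of the user's avatar there.
    print(new_avatar_for("a zombie is coming"))    # -> zombie.png
    print(new_avatar_for("see you tomorrow"))      # -> None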
Optionally, obtaining the current interaction message between the user and the friend includes:
acquiring the interactive message currently sent by the friend to the user; or,
acquiring the interactive message currently sent by the user to the friend; or,
when the interval duration of the first time and the second time is less than a specified time threshold, determining the interactive message sent by the user to the friend as the interactive message of the user and the friend currently, wherein the first time is the time when the friend sends the interactive message to the user, and the second time is the time when the user sends the interactive message to the friend.
Optionally, the determining whether the interactive message is an avatar update message includes:
performing word segmentation processing on the interactive message to obtain a plurality of segments;
for each of the plurality of segments, judging whether the segment is contained in the stored avatar update keywords;
and when at least one segment of the interactive message is contained in the stored avatar update keywords, determining that the interactive message is the avatar update message.
Optionally, acquiring an image matching the interactive message includes:
determining a specified updating keyword included in the interactive message;
and according to the specified updating key words, traversing and searching in the corresponding relation between the stored head portrait updating key words and the images, and determining the images corresponding to the specified updating key words as the images matched with the interactive messages.
Optionally, updating the user avatar according to the image includes:
replacing the original head portrait of the user with an image;
and after the head portrait is updated, replacing the image with the original head portrait of the user after a specified time length.
All the above-mentioned optional technical solutions can be combined arbitrarily to form the optional embodiments of the present invention, and are not described herein again.
Fig. 3 is a flowchart of an avatar updating method according to an embodiment of the present invention. Referring to fig. 3, a method flow provided by the embodiment of the present invention includes:
301. Acquiring, on the message interaction interface, the current interactive message between the user and the friend.
The message interaction interface refers to an interface for displaying interaction messages between the user and the friends. The form of the message interaction interface can be seen in fig. 1 b. After the user executes the corresponding trigger operation, the message interaction interface can be displayed on the screen of the terminal. For example, when a user wants to perform message interaction with a friend, the head portrait of the friend may be clicked in the user relationship chain. When the terminal detects the head portrait clicking operation, displaying a detail information page of the friend on a screen of the terminal; and after the user clicks a 'send message' key on the page, triggering a message interaction interface to be displayed on a screen of the terminal. Or, after receiving an interactive message sent by a friend in the user relationship chain, the user may trigger the message interactive interface to be displayed on the screen of the terminal by clicking the prompt message of the interactive message.
In the embodiment of the present invention, since both the interactive message sent by the friend to the user and the interactive message sent by the user to the friend can trigger updating of the user's avatar, it is necessary to determine which party's interactive message the avatar update should follow. The method provided in the embodiment of the present invention therefore includes a step of obtaining the current interactive message between the user and the friend, which may specifically be done in any of the following ways:
in the first mode, the interactive message sent by the friend to the user currently is obtained.
For the first way, referring to fig. 4a, the session between the user and the buddy is initiated by the buddy, but the user has not initiated any interactive message at this time. Therefore, in this case, after receiving the interactive message and displaying the interactive message on the message interactive interface, the user terminal directly determines the interactive message "zombie" initiated by the friend as the interactive message between the user and the friend currently. That is, in the subsequent process, when the user avatar is updated, updating is performed according to the interactive message "zombie".
And in the second mode, the interactive message which is currently sent to the friend by the user is obtained.
For the second way, referring to fig. 5a, the session between the user and the friend is initiated by the user, while the friend has not sent any interactive message at this time. Therefore, in this case, after displaying the interactive message on the message interaction interface, the user terminal directly determines the interactive message "when are you coming, everyone is waiting for you" as the current interactive message between the user and the friend. That is, in the subsequent process, the user's avatar is updated according to the interactive message "when are you coming, everyone is waiting for you".
In the third mode, when the interval duration between the first time and the second time is less than the specified time threshold, the interactive message sent by the user to the friend is determined as the interactive message between the user and the friend currently, the first time is the time when the friend sends the interactive message to the user, and the second time is the time when the user sends the interactive message to the friend.
For the third way, in daily conversation with a friend it often happens that both parties send interactive messages to each other at almost the same time, or the interval between the messages sent by the two parties is very short; for example, the user sends an interactive message immediately after the friend has sent one. For this situation, in order to determine which party's interactive message the subsequent avatar update should follow, the embodiment of the present invention divides the cases according to the interval between the moments at which the two parties send their interactive messages. If the moment at which the user sends the interactive message is earlier than the moment at which the friend sends the interactive message, the interactive message sent by the user prevails; if the interval between the first time and the second time is less than the specified time threshold, the interactive message sent by the user also prevails; in all other cases, the interactive message sent by the friend prevails.
The specified time threshold may be 2s or 5s, and the like, and the size of the specified time threshold is not particularly limited in the embodiment of the present invention.
It should be noted that, when the current interactive message between the user and the friend is obtained, any one of the three manners may be adopted for implementation; different obtaining modes can be adopted at different times according to actual conditions, and the embodiment of the invention is not particularly limited in this respect. Of course, besides the three ways of acquiring the interactive message, other ways of acquiring may also be adopted, and this is not specifically limited in the embodiment of the present invention.
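The decision between the three ways can be summarised in a small helper. The sketch below is a hypothetical illustration only: the function name, the datetime representation and the 2 s threshold (one of the example values mentioned above) are assumptions, not part of the invention.

    from datetime import timedelta

    SPECIFIED_TIME_THRESHOLD = timedelta(seconds=2)   # example value; the patent leaves it open

    def current_interactive_message(friend_msg, friend_time, user_msg, user_time):
        """Return the message that the avatar update should follow.

        friend_time (the first time) and user_time (the second time) are datetimes;
        a message argument is None when that party has not sent anything yet."""
        if user_msg is None:                          # way 1: only the friend has sent a message
            return friend_msg
        if friend_msg is None:                        # way 2: only the user has sent a message
            return user_msg
        # way 3: both parties have sent messages
        if user_time <= friend_time:                  # the user's message came first
            return user_msg
        if user_time - friend_time < SPECIFIED_TIME_THRESHOLD:
            return user_msg                           # nearly simultaneous: the user's message prevails
        return friend_msg                             # otherwise the friend's message prevails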
302. Judging whether the interactive message is an avatar update message; if the interactive message is an avatar update message, the following step 303 is executed; if the interactive message is not an avatar update message, the process ends here.
In the embodiment of the invention, before judging whether the interactive message is an avatar update message, the method further comprises a step of generating and storing the avatar update keywords. Avatar update keywords are words that trigger updating of the avatar; they are words used frequently in daily life rather than uncommon ones. For example, words such as "happy birthday", "legal holiday", "angry", "autumn clothing", "lovely", "hug", "black person" and the like can be used as avatar update keywords. That is, the avatar update keywords may include words or phrases of various parts of speech (e.g., nouns, adjectives, verbs, etc.). Avatar update keywords may be collected by the relevant technicians to generate an avatar update keyword library.
When judging whether the interactive message is the avatar update message, the following method can be specifically adopted:
firstly, performing word segmentation processing on the interactive message to obtain a plurality of words.
For the first step, an interactive message may include words in various grammatical roles such as subject, predicate and object. Words such as nouns and adjectives have a large influence on the semantics of the interactive message, while function words and adverbs have a small influence. In order to understand the semantics of the interactive message correctly, the embodiment of the invention analyses a complete interactive message by performing word segmentation processing on it.
Taking the interactive message "when are you coming, everyone is waiting for you" as an example, the word segmentation processing divides the message into seven segments, among them "everyone" and "waiting for you". The specific implementation of the word segmentation processing may refer to existing word segmentation techniques, which are not described again here.
And secondly, judging whether each participle in the plurality of participles is contained in the stored head portrait updating key words.
For the second step, the avatar update keyword library has already been generated and stored in advance. Therefore, after the segments are obtained in the first step, each segment can be looked up in the avatar update keyword library to determine whether it is an avatar update keyword. Continuing with the example in the first step, word segmentation of the interactive message "when are you coming, everyone is waiting for you" yields the above seven segments. Each of the seven segments is compared with every avatar update keyword in the avatar update keyword library. If a segment matches one of the avatar update keywords in the library, it is determined that the segment is contained in the stored avatar update keywords; if a segment does not match any avatar update keyword in the library, it is determined that the segment is not contained in the stored avatar update keywords.
And thirdly, when at least one word segmentation in the interactive message is contained in the stored head portrait updating key words, determining that the interactive message is the head portrait updating message.
For the third step, when at least one segment of the interactive message is contained in the stored avatar update keywords, the user's avatar can be updated according to that segment, so the interactive message including the at least one segment is determined to be an avatar update message.
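As one possible illustration of these three steps, the sketch below uses the open-source jieba segmenter for the word segmentation; jieba is only an example of existing word segmentation techniques and is not prescribed by the patent, and the three keywords in the toy keyword library are likewise made up.

    import jieba   # third-party Chinese word segmentation library (pip install jieba)

    # Toy stand-in for the avatar update keyword library.
    AVATAR_UPDATE_KEYWORDS = {"僵尸", "生日快乐", "拥抱"}   # zombie, happy birthday, hug

    def is_avatar_update_message(message_text: str) -> bool:
        # Step 1: word segmentation of the interactive message.
        segments = jieba.lcut(message_text)
        # Steps 2-3: the message is an avatar update message as soon as at least one
        # segment is contained in the stored avatar update keywords.
        return any(segment in AVATAR_UPDATE_KEYWORDS for segment in segments)

    print(is_avatar_update_message("小心，僵尸来了"))   # True: the segment 僵尸 ("zombie") matches
    print(is_avatar_update_message("明天见"))           # False: no segment is a stored keyword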
303. And when the interactive message is the head portrait updating message, acquiring an image matched with the interactive message.
In the embodiment of the present invention, the image may be stored on the server or on the user terminal, and may be a static image or a dynamic image; neither is specifically limited in the embodiment of the present invention. Before the user terminal acquires the image matched with the interactive message, the method further comprises a step of setting the correspondence between the avatar update keywords and the images. When the correspondence between the avatar update keywords and the images is set, a correspondence table such as Table 1 below may be generated.
TABLE 1
(Table 1, provided as an image in the original publication, lists avatar update keywords and the images corresponding to them.)
It should be noted that the step of setting the correspondence between the avatar update keyword and the image is only required to be performed when the method provided in this embodiment is performed for the first time. When the method provided by the embodiment of the invention is executed again subsequently, the corresponding relation table can be directly applied. And if and only if the corresponding relation is changed, the server updates the corresponding relation table.
When acquiring the image matched with the interactive message, the following method can be specifically adopted:
determining a specified updating keyword included in the interactive message; and according to the specified updating key words, traversing and searching in the corresponding relation between the stored head portrait updating key words and the images, and determining the images corresponding to the specified updating key words as the images matched with the interactive messages.
Of course, besides the above manner of acquiring the image matched with the interactive message, other acquisition manners may also be adopted, which is not specifically limited in the embodiment of the present invention. It should be noted that, if an interactive message contains several segments, more than one of which is contained in the stored avatar update keywords and the images corresponding to those segments differ, the weight value corresponding to each segment may be consulted when determining the image matched with the interactive message: the image corresponding to the segment with the largest weight value may be determined as the matching image. The weight value of each segment may be preset; for example, when the avatar update keyword library is generated, a weight is set for each avatar update keyword, with nouns and adjectives given slightly larger weights and words of other parts of speech given slightly smaller weights.
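A sketch of this lookup, assuming the correspondence of Table 1 is kept in memory as a dict and that a preset weight accompanies each avatar update keyword; the names, weights and file names are illustrative assumptions only.

    from typing import Dict, List, Optional

    KEYWORD_TO_IMAGE: Dict[str, str] = {"zombie": "zombie.png", "hug": "hug.gif"}   # toy Table 1
    KEYWORD_WEIGHT: Dict[str, float] = {"zombie": 1.0, "hug": 0.6}                  # nouns weighted higher

    def matching_image(segments: List[str]) -> Optional[str]:
        # Keep only the segments that are stored avatar update keywords.
        hits = [s for s in segments if s in KEYWORD_TO_IMAGE]
        if not hits:
            return None
        # If several keywords with different images match, the keyword with the
        # largest preset weight decides which image matches the interactive message.
        best = max(hits, key=lambda s: KEYWORD_WEIGHT.get(s, 0.0))
        return KEYWORD_TO_IMAGE[best]

    print(matching_image(["zombie", "hug"]))   # -> zombie.png (larger weight)
    print(matching_image(["hello"]))           # -> None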
304. The original avatar of the user is replaced with an image.
Referring to fig. 4a, the session between the user and the friend is initiated by the friend. After the user terminal receives the interactive message "zombie" sent by the friend to the user, this interactive message is determined as the current interactive message between the user and the friend. Since the interactive message includes the segment "zombie", which is contained in the stored avatar update keywords, the image corresponding to the segment "zombie" is determined according to Table 1 above, and the user's avatar (a female facial avatar) in fig. 4a is replaced with the avatar shown in fig. 4b. Then, if the user sends an interactive message to the friend, the user's avatar is displayed as the zombie avatar shown in fig. 4b.
Referring to fig. 5a, the session between the user and the friend is initiated by the user. After the user terminal displays the interactive message "when are you coming, everyone is waiting for you" sent by the user to the friend, this interactive message is determined as the current interactive message between the user and the friend. Since the interactive message includes the segment "waiting for you", which is contained in the stored avatar update keywords, the image corresponding to "waiting for you" is determined according to Table 1 above, and the user's avatar (a female facial avatar) in fig. 5a is replaced with the crying expression avatar. Then, if the friend immediately sends an interactive message to the user, the user's avatar still keeps the updated crying expression avatar, as shown in fig. 5b.
Referring to fig. 6a and 6b, after the association between the user and the friend has been established, suppose the user and the friend each send an interactive message to the other at almost the same time (within the specified time threshold). The interactive message sent by the friend to the user is "zombie", and the interactive message sent by the user to the friend is "when are you coming, everyone is waiting for you". According to step 301, the interactive message "when are you coming, everyone is waiting for you" is determined as the current interactive message between the user and the friend. Since this interactive message includes the segment "waiting for you", which is contained in the stored avatar update keywords, the image corresponding to "waiting for you" is determined according to Table 1 above, and the user's avatar (a female facial avatar) in fig. 6a is replaced with the crying expression avatar shown in fig. 6b. Then, if the user sends another interactive message to the friend, the user's avatar still shows the crying expression avatar, as shown in fig. 6b.
305. And after the head portrait is updated, replacing the image with the original head portrait of the user after a specified time length.
The specified duration may be 30 s or 60 s, for example; its value is not specifically limited in the embodiment of the present invention. As shown in fig. 6a and 6b, taking a specified duration of 60 s as an example, 60 s after the user's avatar has been updated from the female facial avatar to the crying expression avatar, the crying expression avatar is restored to the original female facial avatar.
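One possible way to realise the timed restoration is a one-shot timer, as in the sketch below. threading.Timer and the 60 s constant are assumptions for illustration; the patent does not specify the timing mechanism.

    import threading

    SPECIFIED_DURATION_S = 60   # example value taken from the 60 s case above

    class ConversationAvatar:
        """Avatar shown for the user on one message interaction interface."""

        def __init__(self, original_avatar: str):
            self.original_avatar = original_avatar
            self.current_avatar = original_avatar

        def update(self, image: str):
            self.current_avatar = image                     # step 304: show the matched image
            timer = threading.Timer(SPECIFIED_DURATION_S, self.restore)
            timer.daemon = True
            timer.start()                                   # step 305: schedule the restoration

        def restore(self):
            self.current_avatar = self.original_avatar      # put the original avatar back

    avatar = ConversationAvatar("female_face.png")
    avatar.update("crying_expression.png")
    print(avatar.current_avatar)   # crying_expression.png now; restored after 60 s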
It should be noted that, after the avatar of the user is updated on the message interaction interface of the user terminal, in order to enable the friend to see the updated image of the user, the server may also send the updated avatar of the user to the friend terminal, and display the updated avatar on the message interaction interface of the friend terminal, thereby increasing the interest of interaction. In addition, the user can also update the head portrait of the friend by sending the interactive message to the friend, and the specific implementation mode is consistent with the mode of updating the head portrait of the user by sending the interactive message to the friend, which is not described herein again.
In addition, if the user interacts with several friends at the same time, for example chats separately with friends A, B and C, the avatar of the user may be different on the message interaction interface a used for chatting with A, on the message interaction interface b used for chatting with B, and on the message interaction interface c used for chatting with C. That is, the same user may correspond to different avatars on different message interaction interfaces: the original avatar of the user is changed flexibly according to the interactive messages on the current message interaction interface and is not affected by the interactive messages on other message interaction interfaces, and only interactive messages appearing on the current message interaction interface trigger updating of the user's avatar. In addition, the user's avatar on the main panel or main interface may keep the original avatar unchanged; this is not specifically limited in the embodiment of the present invention. That is, when the user's avatar is updated, the updated avatar is only displayed on the corresponding message interaction interface, which increases the interest of the communication between the two interacting parties.
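The per-interface behaviour just described can be modelled as a small mapping from conversation to avatar, with the main-panel avatar as the fallback; this layout is only an illustrative assumption.

    main_panel_avatar = "female_face.png"        # original avatar, kept on the main panel

    # Avatar currently shown for the user, keyed by the friend the conversation is with.
    conversation_avatar = {
        "friend_A": "zombie.png",                # triggered by a message in the chat with A
        "friend_B": "crying_expression.png",     # triggered by a message in the chat with B
    }

    def avatar_shown(friend_id: str) -> str:
        # Interfaces where no avatar update message has appeared keep the original avatar.
        return conversation_avatar.get(friend_id, main_panel_avatar)

    print(avatar_shown("friend_A"))   # zombie.png
    print(avatar_shown("friend_C"))   # female_face.png (unchanged)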
According to the method provided by the embodiment of the invention, after the current interactive message between the user and the friend is obtained on the message interaction interface, if the interactive message is an avatar update message, an image matched with the interactive message is obtained and the user's avatar is updated according to the image, so the avatar is updated automatically and in time during message interaction, which improves flexibility and makes the interaction more interesting.
Fig. 7 is a schematic structural diagram of an avatar updating apparatus according to an embodiment of the present invention. Referring to fig. 7, the apparatus includes: an interactive message acquisition module 701, an update message judgment module 702, an image acquisition module 703 and an avatar update module 704.
The interactive message acquiring module 701 is configured to acquire, on a message interactive interface, an interactive message between a user and a friend currently; the update message determining module 702 is connected to the interactive message obtaining module 701, and is configured to determine whether the interactive message is an avatar update message; the image obtaining module 703 is connected to the update message determining module 702, and is configured to obtain an image matching the interactive message when the interactive message is the avatar update message; the avatar updating module 704 is connected to the image obtaining module 703, and is configured to update the user avatar according to the image.
Optionally, the interactive message obtaining module includes:
the first interactive message acquisition unit is used for acquiring the interactive message currently sent to the user by the friend;
the second interactive message acquiring unit is used for acquiring the interactive message currently sent to the friend by the user;
and the third interactive message acquiring unit is used for determining the interactive message sent to the friend by the user as the current interactive message between the user and the friend when the interval duration between the first time and the second time is less than the specified time threshold, wherein the first time is the time when the friend sends the interactive message to the user, and the second time is the time when the user sends the interactive message to the friend.
Optionally, the update message judging module is configured to perform word segmentation processing on the interactive message to obtain a plurality of segments; for each of the plurality of segments, judge whether the segment is contained in the stored avatar update keywords; and when at least one segment of the interactive message is contained in the stored avatar update keywords, determine that the interactive message is the avatar update message.
Optionally, the image obtaining module is configured to determine a specified update keyword included in the interactive message; and according to the specified updating key words, traversing and searching in the corresponding relation between the stored head portrait updating key words and the images, and determining the images corresponding to the specified updating key words as the images matched with the interactive messages.
Optionally, the avatar updating module is configured to replace the original avatar of the user with the image; and after the head portrait is updated, replacing the image with the original head portrait of the user after a specified time length.
According to the device provided by the embodiment of the invention, after the current interactive message between the user and the friend is acquired on the message interaction interface, if the interactive message is an avatar update message, an image matched with the interactive message is acquired and the user's avatar is updated according to the image, so the avatar is updated automatically without manual operation by the user.
It should be noted that: in the embodiment, when the avatar updating apparatus updates the avatar, only the division of the functional modules is taken as an example, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the functions described above. In addition, the avatar updating apparatus provided in the above embodiments and the avatar updating method embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
Fig. 8 is a terminal according to an embodiment of the present invention, where the terminal may be configured to execute the avatar updating method provided in the above embodiment. Referring to fig. 8, the terminal 800 includes:
RF (Radio Frequency) circuitry 110, memory 120 including one or more computer-readable storage media, input unit 130, display unit 140, sensor 150, audio circuitry 160, WiFi (wireless fidelity) module 170, processor 180 including one or more processing cores, and power supply 190. Those skilled in the art will appreciate that the terminal structure shown in fig. 8 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information from a base station and then sends the received downlink information to the one or more processors 180 for processing; in addition, data relating to uplink is transmitted to the base station. In general, the RF circuitry 110 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (short messaging Service), etc.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing by operating the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal 800, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 131 (e.g., operations by a user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 131 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. Additionally, the touch-sensitive surface 131 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. In particular, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to a user and various graphical user interfaces of the terminal 800, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141, and when a touch operation is detected on or near the touch-sensitive surface 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 8, touch-sensitive surface 131 and display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 131 may be integrated with display panel 141 to implement input and output functions.
The terminal 800 can also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 141 and/or a backlight when the terminal 800 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the terminal 800, further description is omitted here.
Audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between a user and terminal 800. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161; on the other hand, the microphone 162 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 160, and then outputs the audio data to the processor 180 for processing, and then to the RF circuit 110 to be transmitted to, for example, another terminal, or outputs the audio data to the memory 120 for further processing. The audio circuitry 160 may also include an earbud jack to provide communication of peripheral headphones with the terminal 800.
WiFi belongs to a short-distance wireless transmission technology, and the terminal 800 can help a user send and receive e-mails, browse web pages, access streaming media, and the like through the WiFi module 170, and provides wireless broadband internet access for the user.
The processor 180 is a control center of the terminal 800, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal 800 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the mobile phone. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal 800 further includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 180 via a power management system to manage charging, discharging, and power consumption management functions via the power management system. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal 800 may further include a camera, a bluetooth module, etc., which will not be described herein. Specifically, in this embodiment, the display unit of the terminal is a touch screen display, the terminal further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
acquiring the interactive message of the current user and the friend on a message interactive interface;
judging whether the interactive message is a head portrait updating message;
when the interactive message is a head portrait updating message, acquiring an image matched with the interactive message;
and updating the head portrait of the user according to the image.
Assuming that the above is the first possible implementation manner, in a second possible implementation manner provided on the basis of the first possible implementation manner, the memory of the terminal further includes instructions for performing the following operations:
acquiring the interactive message currently sent by the friend to the user; or,
acquiring the interactive message currently sent by the user to the friend; or,
when the interval duration of the first time and the second time is less than a specified time threshold, determining the interactive message sent by the user to the friend as the interactive message of the user and the friend currently, wherein the first time is the time when the friend sends the interactive message to the user, and the second time is the time when the user sends the interactive message to the friend.
In a third possible implementation manner provided as a basis for the first possible implementation manner, the memory of the terminal further includes instructions for performing the following operations:
performing word segmentation processing on the interactive message to obtain a plurality of segments;
for each of the plurality of segments, judging whether the segment is contained in the stored avatar update keywords;
and when at least one segment of the interactive message is contained in the stored avatar update keywords, determining that the interactive message is the avatar update message.
In a fourth possible implementation manner provided on the basis of the first or third possible implementation manner, the memory of the terminal further includes instructions for performing the following operations:
determining a specified updating keyword included in the interactive message;
and according to the specified updating key words, traversing and searching in the corresponding relation between the stored head portrait updating key words and the images, and determining the images corresponding to the specified updating key words as the images matched with the interactive messages.
In a fifth possible implementation manner provided as a basis for the first possible implementation manner, the memory of the terminal further includes instructions for performing the following operations:
replacing the original head portrait of the user with an image;
and after the head portrait is updated, replacing the image with the original head portrait of the user after a specified time length.
The terminal provided by the embodiment of the invention acquires the current interactive message between the user and the friend on the message interaction interface, acquires an image matched with the interactive message if the interactive message is an avatar update message, and updates the user's avatar according to the image, so the avatar is updated automatically during message interaction without manual operation.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. An avatar updating method, the method comprising:
acquiring the current interactive message between a user and a friend on a message interactive interface, wherein the message interactive interface refers to an interface for displaying the interactive message between the user and the friend, and the message interactive interface is displayed on a screen of a terminal after the user executes corresponding trigger operation;
judging whether the interactive message is an avatar updating message;
when the interactive message is the head portrait updating message, acquiring an image matched with the interactive message;
updating the user head portrait displayed on the message interaction interface according to the image;
the obtaining of the current interaction message between the user and the friend includes:
if the conversation on the message interaction interface is initiated by the friend and the user does not send the interaction message, acquiring the interaction message currently sent to the user by the friend;
if the conversation on the message interaction interface is initiated by the user and the friend does not send the interaction message, acquiring the interaction message currently sent to the friend by the user;
if the user and the friend both send the interactive messages on the message interactive interface, when the interval duration between a first time and a second time is less than a specified time threshold, determining the interactive message sent by the user to the friend to be the interactive message between the user and the friend currently, wherein the first time is the time when the friend sends the interactive message to the user, and the second time is the time when the user sends the interactive message to the friend.
2. The method of claim 1, wherein the determining whether the interactive message is an avatar update message comprises:
performing word segmentation processing on the interactive message to obtain a plurality of words;
for each word segmentation in the plurality of word segmentation, judging whether the word segmentation is contained in the stored head portrait updating key words or not;
and when at least one word segmentation in the interactive message is contained in the stored avatar update keyword, determining that the interactive message is the avatar update message.
3. The method of claim 1 or 2, wherein the obtaining the image matching the interactive message comprises:
determining a specified updating keyword included in the interactive message;
and according to the specified updating keywords, traversing and searching in the corresponding relation between the stored head portrait updating keywords and the images, and determining the images corresponding to the specified updating keywords as the images matched with the interactive messages.
4. The method of claim 1, wherein the updating the user avatar presented on the messaging interface according to the image comprises:
replacing the original head portrait of the user shown on the message interaction interface with the image;
and after the head portrait is updated, replacing the image displayed on the message interaction interface with the original head portrait of the user after a specified time length.
5. An avatar update apparatus, comprising:
the interactive message acquiring module is used for acquiring the current interactive messages between the user and the friends on a message interactive interface, wherein the message interactive interface refers to an interface for displaying the interactive messages between the user and the friends, and the message interactive interface is displayed on a screen of a terminal after the user executes corresponding triggering operation;
the updating message judging module is used for judging whether the interactive message is an avatar updating message;
the image acquisition module is used for acquiring an image matched with the interactive message when the interactive message is the head portrait updating message;
the head portrait updating module is used for updating the user head portrait displayed on the message interaction interface according to the image;
wherein the interactive message acquisition module is configured to:
if the conversation on the message interaction interface is initiated by the friend and the user does not send the interaction message, acquiring the interaction message currently sent to the user by the friend;
if the conversation on the message interaction interface is initiated by the user and the friend does not send the interaction message, acquiring the interaction message currently sent to the friend by the user;
if the user and the friend both send the interactive messages on the message interactive interface, when the interval duration between a first time and a second time is less than a specified time threshold, determining the interactive message sent by the user to the friend to be the interactive message between the user and the friend currently, wherein the first time is the time when the friend sends the interactive message to the user, and the second time is the time when the user sends the interactive message to the friend.
6. The apparatus of claim 5, wherein the update message determining module is configured to perform word segmentation on the interactive message to obtain a plurality of words; for each word segmentation in the plurality of word segmentation, judging whether the word segmentation is contained in the stored head portrait updating key words or not; and when at least one word segmentation in the interactive message is contained in the stored avatar update keyword, determining that the interactive message is the avatar update message.
7. The apparatus according to claim 5 or 6, wherein the image obtaining module is configured to determine a specified update keyword included in the interactive message; and according to the specified updating keywords, traversing and searching in the corresponding relation between the stored head portrait updating keywords and the images, and determining the images corresponding to the specified updating keywords as the images matched with the interactive messages.
8. The apparatus of claim 5, wherein the avatar updating module is configured to replace an original avatar of the user presented on the message interaction interface with the image; and after the head portrait is updated, replacing the image displayed on the message interaction interface with the original head portrait of the user after a specified time length.
9. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the avatar updating method of any of claims 1-4.
CN201410664670.3A 2014-11-19 2014-11-19 Head portrait updating method and device Active CN105630846B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410664670.3A CN105630846B (en) 2014-11-19 2014-11-19 Head portrait updating method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410664670.3A CN105630846B (en) 2014-11-19 2014-11-19 Head portrait updating method and device

Publications (2)

Publication Number Publication Date
CN105630846A CN105630846A (en) 2016-06-01
CN105630846B true CN105630846B (en) 2020-09-15

Family

ID=56045787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410664670.3A Active CN105630846B (en) 2014-11-19 2014-11-19 Head portrait updating method and device

Country Status (1)

Country Link
CN (1) CN105630846B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10155168B2 (en) 2012-05-08 2018-12-18 Snap Inc. System and method for adaptable avatars
CN106155485B (en) * 2016-07-01 2020-03-17 北京小米移动软件有限公司 Display method and device of interactive interface
CN106411695A (en) * 2016-08-29 2017-02-15 广州华多网络科技有限公司 User characteristic information area pendant dynamic updating method and device and smart terminal
US10432559B2 (en) 2016-10-24 2019-10-01 Snap Inc. Generating and displaying customized avatars in electronic messages
CN106789552B (en) * 2016-11-24 2020-09-29 青岛海信移动通信技术股份有限公司 Friend head portrait updating method and device of social account
CN107181673A (en) * 2017-06-08 2017-09-19 腾讯科技(深圳)有限公司 Instant communicating method and device, computer equipment and storage medium
CN107707453B (en) * 2017-09-18 2021-11-30 北京小米移动软件有限公司 Reminding method and device
CN110955787B (en) * 2019-11-12 2024-03-12 上海连尚网络科技有限公司 User head portrait setting method, computer equipment and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101917512A (en) * 2010-07-26 2010-12-15 宇龙计算机通信科技(深圳)有限公司 Method and system for displaying head picture of contact person and mobile terminal

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7080034B1 (en) * 2000-05-04 2006-07-18 Reams John M Interactive specialty commodities information and exchange system and method
CN102238096A (en) * 2010-05-06 2011-11-09 蒋斌 Control method for updating friend information of chatting tool according to characteristics of login user
CN102377762A (en) * 2010-08-27 2012-03-14 ***通信有限公司 Information processing method, device and system in message interaction process
CN104076944A (en) * 2014-06-06 2014-10-01 北京搜狗科技发展有限公司 Chat emoticon input method and device

Also Published As

Publication number Publication date
CN105630846A (en) 2016-06-01

Similar Documents

Publication Publication Date Title
CN109388297B (en) Expression display method and device, computer readable storage medium and terminal
CN105630846B (en) Head portrait updating method and device
WO2017206916A1 (en) Method for determining kernel running configuration in processor and related product
KR101978590B1 (en) Message updating method, device and terminal
WO2016184302A1 (en) Message forwarding method and electronic device
WO2016110182A1 (en) Method, apparatus and terminal for matching expression image
JP6492184B2 (en) Method, device, and system for managing information recommendations
JP6910300B2 (en) A method for displaying chat history records and a device for displaying chat history records
WO2018196588A1 (en) Information sharing method, apparatus and system
CN107734170B (en) Notification message processing method, mobile terminal and wearable device
CN106293738B (en) Expression image updating method and device
CN105094501B (en) Method, device and system for displaying messages in mobile terminal
CN108600089B (en) Expression image display method and terminal equipment
CN109189303B (en) Text editing method and mobile terminal
CN110750198A (en) Expression sending method and mobile terminal
CN106302101B (en) Message reminding method, terminal and server
WO2015135457A1 (en) Method, apparatus, and system for sending and playing multimedia information
CN105320532B (en) Method, device and terminal for displaying interactive interface
CN105159655B (en) Behavior event playing method and device
CN110888572A (en) Message display method and terminal equipment
KR101939925B1 (en) Video-based check-in method, terminal, server and system
CN112311652B (en) Message sending method, device, terminal and storage medium
CN107346347B (en) Webpage table display method and device
CN106803916B (en) Information display method and device
CN113542206B (en) Image processing method, device and computer readable storage medium

Legal Events

Code Title
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant