WO2017177770A1 - Control method, apparatus and storage medium for expression classification identifiers - Google Patents

Control method, apparatus and storage medium for expression classification identifiers

Info

Publication number
WO2017177770A1
Authority
WO
WIPO (PCT)
Prior art keywords
expression classification
classification identifier
expression
area
identifier
Prior art date
Application number
PCT/CN2017/074795
Other languages
English (en)
French (fr)
Inventor
栗绍峰
朱明浩
杨晓明
林焕彬
张仁寿
梁志杰
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2017177770A1
Priority to US16/112,473 (published as US20180365527A1)

Classifications

    • G06V40/172 Human faces: classification, e.g. identification
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F18/24 Pattern recognition: classification techniques
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0485 Scrolling or panning
    • G06F3/0486 Drag-and-drop
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06Q10/107 Computer-aided management of electronic mailing [e-mailing]
    • G06V10/764 Image or video recognition or understanding using classification, e.g. of video objects
    • G06V40/174 Facial expression recognition
    • G06V40/175 Facial expression recognition: static expression
    • G06V40/176 Facial expression recognition: dynamic expression
    • H04L51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046 Interoperability with other network applications or services
    • H04L51/10 Messaging characterised by the inclusion of specific contents: multimedia information
    • G06F3/0236 Character input methods using selection techniques to select from displayed items

Definitions

  • the present invention relates to the field of multimedia, and in particular to a method, an apparatus, and a storage medium for controlling an expression classification identifier.
  • the identifier 1 and the identifier 2 are expression identifiers, and each of the expression identifiers includes a plurality of expression icons.
  • The expression icons corresponding to identifier 1 are the "nerve frog and happy horse" expressions, and the expression icons corresponding to identifier 2 are the "Wild Meng" icons.
  • If the user wants to send an expression under identifier 2, area 1 must be slid to the left or right until the expressions under identifier 2 are displayed on the terminal device, as shown in FIG. 2.
  • This way of selecting an expression wastes a large amount of the user's time, and the user cannot send an emoticon icon as quickly as possible.
  • the embodiment of the invention provides a method, a device and a storage medium for controlling the expression classification identifier, so as to at least solve the technical problem that the flexibility of controlling the expression classification identifier in the prior art is poor.
  • A method for controlling an expression classification identifier is provided, including: acquiring a movement instruction, wherein the movement instruction is used to move a first expression classification identifier located on an expression panel to a target position, the expression panel displays one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier includes one or more expression icons; acquiring a first operation indicated by a first area where the target position is located; and performing the first operation on the first expression classification identifier.
  • A control apparatus for an expression classification identifier is provided, including: a first acquiring unit, configured to acquire a movement instruction, wherein the movement instruction is used to move a first expression classification identifier located on an expression panel to a target position, the expression panel displays one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier includes one or more expression icons; a second acquiring unit, configured to acquire a first operation indicated by a first area where the target position is located; and an execution unit, configured to perform the first operation on the first expression classification identifier.
  • A storage medium is provided, configured to store program code for performing the following steps: acquiring a movement instruction, wherein the movement instruction is used to move a first expression classification identifier located on the expression panel to a target position, the expression panel displays one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier includes one or more expression icons; acquiring a first operation indicated by a first area where the target position is located; and performing the first operation on the first expression classification identifier.
  • By performing the corresponding operation on the first expression classification identifier that has been moved to the target position in the expression panel, the first expression classification identifier can be controlled flexibly, for example displayed, deleted or moved. This overcomes the disadvantage that expressions in the emoticon panel cannot be controlled flexibly and achieves the purpose of flexibly controlling the expression classification identifiers in the emoticon panel.
  • The technical effect of improving the flexibility of controlling the expression classification identifier is thus achieved, which solves the technical problem in the prior art that the flexibility of controlling the expression classification identifier is poor.
  • FIG. 1 is an architectural diagram of a hardware structure in accordance with an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a display interface of an expression panel in the related art
  • FIG. 3 is a flowchart of a method for controlling an expression classification identifier according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a display interface of an optional expression panel according to an embodiment of the invention.
  • FIG. 5 is a schematic diagram of another display interface of an optional expression panel according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of another display interface of an optional expression panel according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of another display interface of an optional expression panel according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of another display interface of an optional expression panel according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of another display interface of an optional expression panel according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a control apparatus for an expression classification identifier according to an embodiment of the present invention.
  • FIG. 11 is a hardware configuration diagram of a terminal according to an embodiment of the present invention.
  • An embodiment of a method that can be performed by an embodiment of the apparatus of the present application is provided. It should be noted that the steps illustrated in the flowcharts of the accompanying drawings may be executed in a computer system such as a set of computer-executable instructions, and, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one described herein.
  • a method for controlling an expression classification identifier is provided.
  • the method for controlling the expression classification identifier may be applied to a hardware environment formed by the mobile terminal 102 and the server 104 as shown in FIG. 1.
  • The mobile terminal 102 is connected to the server 104 through a network, including but not limited to a mobile communication network, a wide area network, a metropolitan area network or a local area network, and the mobile terminal 102 may be a mobile phone terminal, a PC terminal, a notebook terminal or a tablet terminal.
  • FIG. 3 is a flowchart of a method for controlling an expression classification identifier according to an embodiment of the present invention.
  • The control method of the expression classification identifier provided by the embodiment of the present invention is specifically introduced in conjunction with FIG. 3. As shown in FIG. 3, the control method of the expression classification identifier mainly includes the following steps S302 to S306:
  • Step S302 acquiring a movement instruction, wherein the movement instruction is used to move the first expression classification identifier located on the expression panel to the target position, and the expression panel displays one or more expression classification identifiers including the first expression classification identifier And the first expression classification identifier includes one or more emoticons.
  • As shown in FIG. 4, a plurality of expression classification identifiers are included in the expression panel 1, namely expression classification identifier 1 to expression classification identifier 7, wherein each expression classification identifier includes one or more expression icons.
  • When the user selects the expression classification identifier 2 in the expression panel 1, a plurality of expression icons (such as the expression icons denoted by symbols 21 to 28 in FIG. 4) are displayed in the expression panel, where each expression icon may be a dynamic icon or a static icon.
  • The expression icons denoted by symbols 21 to 28 are only part of the expression icons included in the expression classification identifier 2; the other expression icons of identifier 2 that are not displayed, or not all displayed, can be brought into view by swiping up and down or left and right.
  • the left side of the expression classification identifier 7 may further include one or more expression classification identifiers that are not completely displayed.
  • The above movement instruction may be a long-press instruction; for example, the user long-presses the expression classification identifier 2 on the terminal device. After the terminal device obtains the long-press instruction, the expression classification identifier 2 can start moving from its current position until it reaches the target position.
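  • As an illustration only, the following minimal Python sketch models an expression panel and the handling of such a long-press movement instruction; the class and method names (ExpressionClassification, ExpressionPanel, on_long_press) are assumptions made for this sketch and are not taken from the patent.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class ExpressionClassification:
        # One expression classification identifier (a tab) and its expression icons.
        ident: int
        icons: List[str] = field(default_factory=list)  # icon names, static or dynamic

    @dataclass
    class ExpressionPanel:
        # The expression panel holding the expression classification identifier list.
        classifications: List[ExpressionClassification]
        dragging: Optional[ExpressionClassification] = None
        drag_position: Optional[Tuple[float, float]] = None

        def on_long_press(self, index: int) -> None:
            # A long-press instruction picks up the identifier so it can start moving.
            self.dragging = self.classifications[index]

        def on_move(self, position: Tuple[float, float]) -> None:
            # Track the target position while the identifier is being dragged.
            if self.dragging is not None:
                self.drag_position = position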
  • Step S304 Acquire a first operation indicated by the first area where the target location is located.
  • The interface shown in FIG. 5 is an optional display interface of an instant messaging service (e.g., QQ or WeChat) in the terminal device, and the first operations indicated by the different regions of the interface shown in FIG. 5 are different.
  • If the expression classification identifier 2 is moved to the area 2 indicated by the dashed-line box in FIG. 5, the first operation indicated by area 2 may be an operation of moving the expression classification identifier 2 to another position in the expression classification identifier list; that is, within this area, the expression classification identifier 2 can be moved from its current initial area to the area where another expression classification identifier is located, for example, the area where the expression classification identifier 7 is located.
  • The first operation indicated by area 3 may be an operation of deleting the expression classification identifier 2, and the first operation indicated by area 4 may likewise be an operation of deleting the expression classification identifier 2.
  • The above-mentioned area 3 and area 4 are areas other than area 2 and are not shown in FIG. 5.
  • The first operations indicated by areas 2 to 4 may be the same operation or may be different operations.
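  • One possible way to realise the mapping from the target position to the first operation indicated by an area is a simple hit-test followed by a lookup table, as in the Python sketch below; the rectangle coordinates and operation names are illustrative assumptions rather than values taken from the patent.

    from typing import Dict, List, Tuple

    Rect = Tuple[float, float, float, float]  # left, top, right, bottom
    Point = Tuple[float, float]

    def contains(rect: Rect, point: Point) -> bool:
        left, top, right, bottom = rect
        x, y = point
        return left <= x <= right and top <= y <= bottom

    # Illustrative regions: area 2 is the identifier list; areas 3 and 4 lie outside it.
    REGIONS: List[Tuple[str, Rect]] = [
        ("area2", (0.0, 500.0, 360.0, 560.0)),
        ("area3", (0.0, 0.0, 360.0, 250.0)),
        ("area4", (0.0, 250.0, 360.0, 500.0)),
    ]

    # Each area indicates a first operation; here areas 3 and 4 both indicate deletion.
    OPERATIONS: Dict[str, str] = {
        "area2": "move_within_list",
        "area3": "delete",
        "area4": "delete",
    }

    def first_operation_for(target: Point) -> str:
        # Return the operation indicated by the area containing the target position.
        for name, bounds in REGIONS:
            if contains(bounds, target):
                return OPERATIONS[name]
        return "none"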
  • Step S306: Perform the first operation on the first expression classification identifier.
  • If the area where the expression classification identifier 2 is located is the area between any two adjacent expression classification identifiers (for example, the expression classification identifier 6 and the expression classification identifier 4) within area 2, then when the user releases the expression classification identifier 2, that is, when the long-press instruction on the expression classification identifier 2 stops, the corresponding first operation can be performed on the expression classification identifier 2, for example, moving the expression classification identifier 2 to the position of the expression classification identifier 7 in the expression classification identifier list.
  • If the target position after the movement of the expression classification identifier 2 is in area 3 or area 4, then when the user releases the expression classification identifier 2, that is, when the long-press instruction on the expression classification identifier 2 stops, the first operation is performed on the expression classification identifier 2, for example, deleting the expression classification identifier 2 from the expression classification identifier list.
  • The first expression classification identifier is thus controlled by performing the corresponding operation, for example display, deletion or movement, on the first expression classification identifier that has been moved to the target position in the expression panel. This overcomes the disadvantage that expressions in an instant messaging service cannot be flexibly controlled and achieves the purpose of flexibly controlling the expression classification identifiers in the expression panel, thereby improving the flexibility of controlling the expression classification identifier and solving the technical problem in the prior art that this flexibility is poor.
  • After the movement instruction is acquired, the first expression classification identifier may further be controlled to move from its initial region in the expression classification identifier list to the first region, and the expression classification identifiers other than the first expression classification identifier in the expression classification identifier list are controlled to move sequentially in the direction pointing to the initial region.
  • The area where the expression classification identifier 1 to the expression classification identifier 7 are located is the expression classification identifier list.
  • The terminal device may control the first expression classification identifier to move from its initial region to the target position in the first region; after the movement, the initial region of the first expression classification identifier is in an idle state and no expression classification identifier is displayed there.
  • The expression classification identifiers other than the first expression classification identifier in the expression classification identifier list may then each move one position in the direction pointing to the initial region.
  • FIG. 4 shows the initial region in which each expression classification identifier in the expression classification identifier list is located before the first expression classification identifier is moved.
  • After the expression classification identifier 2 is moved, its initial area no longer displays any expression classification identifier.
  • The expression classification identifier 3 to the expression classification identifier 7, which are on the right side of the expression classification identifier 2, may be sequentially moved to the left by one position. For example, as shown in FIG. 6, the expression classification identifier 3 is moved into the initial area of the expression classification identifier 2, the expression classification identifier 4 is moved into the initial area of the expression classification identifier 3, and so on. Alternatively, the expression classification identifier 1 on the left side of the expression classification identifier 2 can be moved to the right by one position, that is, the expression classification identifier 1 is moved into the initial area of the expression classification identifier 2 (not shown in FIG. 6).
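  • The shifting described above can be sketched in a few lines of Python; the list-of-integers model is an assumption used only to show how identifiers 3 to 7 each slide one position toward the vacated initial region.

    from typing import List

    def shift_toward_initial(identifiers: List[int], initial_index: int) -> List[int]:
        # Remove the picked-up identifier; everything after it slides one position
        # toward the now-empty initial region, everything before it keeps its place.
        return identifiers[:initial_index] + identifiers[initial_index + 1:]

    # Picking up identifier 2 (index 1) from [1, 2, 3, 4, 5, 6, 7] leaves
    # [1, 3, 4, 5, 6, 7]: identifiers 3 to 7 have each moved left by one position.
    print(shift_toward_initial([1, 2, 3, 4, 5, 6, 7], 1))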
  • After the expression classification identifiers other than the first expression classification identifier in the expression classification identifier list are controlled to move sequentially in the direction pointing to the initial region, it may further be determined whether the number of these other expression classification identifiers is smaller than the number of regions in the expression classification identifier list used for displaying expression classification identifiers.
  • If the number of other expression classification identifiers is smaller than the number of regions used for displaying expression classification identifiers, the area in the expression classification identifier list in which no expression classification identifier is displayed is shown as a free area; if the number of other expression classification identifiers is greater than or equal to the number of such regions, a second expression classification identifier is displayed in the expression classification identifier list, where the second expression classification identifier is one that was not displayed in the expression classification identifier list before the first expression classification identifier was moved.
  • After the expression classification identifier 3 to the expression classification identifier 7 have been sequentially moved to the left by one position, it can be determined whether the expression classification identifier list includes expression classification identifiers other than the above-mentioned expression classification identifier 1 and expression classification identifiers 3 to 7. If it does, for example if an expression classification identifier 8 and an expression classification identifier 9 are further included, the expression classification identifier 8 may be displayed in the expression classification identifier list, where the expression classification identifier 8 is adjacent to the expression classification identifier 7; if it does not, the area in which no expression classification identifier is displayed may be shown as a free area, that is, the initial area of the expression classification identifier 7 is displayed as a free area.
  • Specifically, as shown in FIG. 6, after the expression classification identifier 3 to the expression classification identifier 7 are sequentially moved to the left by one position, the expression classification identifier 8 can be moved into the initial area of the expression classification identifier 7 and displayed in the expression classification identifier list.
  • Alternatively, an expression classification identifier may be displayed in the initial region of the expression classification identifier 1, where the expression classification identifier displayed there is one that was not displayed in the expression classification identifier list before the expression classification identifier 2 was moved; if no other expression classification identifier is included, the initial region of the expression classification identifier 1 may be displayed as a free area.
  • After the initial position of the expression classification identifier 1 is displayed as a free position, the expression classification identifier 1 and the expression classification identifiers 3 to 7 may be controlled to move to the left in turn, so that the initial region of the expression classification identifier 7 becomes a free region.
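  • The count comparison and the resulting layout can be sketched as follows; the slot count and the idea of a hidden-identifier queue are illustrative assumptions used to show when a free area appears and when a previously hidden second expression classification identifier scrolls into view.

    from typing import List, Optional

    def fill_visible_slots(visible_remaining: List[int], hidden: List[int],
                           visible_slots: int) -> List[Optional[int]]:
        # Every identifier except the dragged one, in display order.
        others = visible_remaining + hidden
        # If there are at least as many other identifiers as display regions,
        # hidden identifiers (e.g. identifier 8) scroll in to fill the list.
        slots: List[Optional[int]] = others[:visible_slots]
        # Otherwise the trailing region is displayed as a free area (None).
        while len(slots) < visible_slots:
            slots.append(None)
        return slots

    # Seven display regions: picking up identifier 2 leaves [1, 3, 4, 5, 6, 7].
    print(fill_visible_slots([1, 3, 4, 5, 6, 7], [8, 9], 7))  # [1, 3, 4, 5, 6, 7, 8]
    print(fill_visible_slots([1, 3, 4, 5, 6, 7], [], 7))      # [1, 3, 4, 5, 6, 7, None]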
  • Performing the first operation on the first expression classification identifier may include: determining whether the first area includes a second area where any two adjacent expression classification identifiers in the expression classification identifier list are located; if it is determined that the first area includes the second area in the expression classification identifier list, displaying a free area between the two adjacent expression classification identifiers when the first expression classification identifier partially or completely overlaps the second area; and moving the first expression classification identifier to the free area.
  • the second area may be an area where any two adjacent expression classification identifiers in the expression classification identifier list are located.
  • For example, the area where the expression classification identifier 7 and the adjacent expression classification identifier 8 are located is a second area.
  • For area 2, it can be determined that area 2 (i.e., the first area described above) includes the area where the expression classification identifier 7 and the expression classification identifier 8 are located (i.e., the second area described above).
  • When the expression classification identifier 2 is about to fit into or overlap the second area (for example, the area where the expression classification identifier 2 is located in FIG. 6), a free area is displayed between the expression classification identifier 7 and the expression classification identifier 8.
  • When released, the expression classification identifier 2 will move into the free area indicated by reference numeral 7'.
  • The area displayed as a free area is not limited to the initial area of the expression classification identifier 8 in FIG. 6; the initial area of the expression classification identifier 7 in FIG. 6 may also be displayed as a free area. Specifically, if 1/2 of the expression classification identifier 2 overlaps the expression classification identifier 7 and 1/2 overlaps the expression classification identifier 8, the initial area of the expression classification identifier 8 in FIG. 6 may be displayed as the free area; if the part of the expression classification identifier 2 that overlaps the expression classification identifier 7 is larger than 1/2 of the expression classification identifier 2, the initial area of the expression classification identifier 7 in FIG. 6 may be displayed as the free area; and if the part of the expression classification identifier 2 that overlaps the expression classification identifier 8 is larger than 1/2 of the expression classification identifier 2, the initial area of the expression classification identifier 8 in FIG. 6 may be displayed as the free area.
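  • The overlap rule above can be expressed with one-dimensional intervals for the slots; the interval helper below is an assumption made purely for illustration.

    from typing import Tuple

    Interval = Tuple[float, float]  # left and right edge of a slot or of the dragged identifier

    def overlap(a: Interval, b: Interval) -> float:
        return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

    def free_area_side(dragged: Interval, left_slot: Interval, right_slot: Interval) -> str:
        # If more than half of the dragged identifier lies over the left slot, the
        # left slot becomes the free area; otherwise (including the exact half/half
        # case) the right slot does, matching the behaviour described above for
        # identifiers 7 and 8.
        width = dragged[1] - dragged[0]
        if overlap(dragged, left_slot) > width / 2:
            return "left"
        return "right"

    # Dragged identifier 2 sits mostly over identifier 8's slot, so that slot becomes free.
    print(free_area_side((130.0, 170.0), (100.0, 140.0), (140.0, 180.0)))  # "right"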
  • a free area may be displayed between two adjacent expression classification identifiers by the following three methods:
  • First, the expression classification identifier on a first side of the first expression classification identifier at the target position is controlled to move in a first direction, where the first direction is the direction from the first expression classification identifier toward the expression classification identifier on the first side. For example, the expression classification identifier on the first side of the expression classification identifier 2 may be the expression classification identifier on the right side of the expression classification identifier 2, that is, the expression classification identifier 8. The expression classification identifier 8 can be controlled to move one position to the right, obtaining the free area indicated by reference numeral 7' shown in FIG. 7, where the first direction points toward the right side of the expression classification identifier 2 at the target position.
  • Second, the expression classification identifiers on a second side of the expression classification identifier 2 may be the expression classification identifiers on the left side of the expression classification identifier 2, that is, the expression classification identifier 1 and the expression classification identifiers 3 to 7. The expression classification identifier 7 can be controlled to move one position to the left, obtaining the area indicated by reference numeral 7' shown in FIG. 7, where the second direction points toward the left side of the expression classification identifier 2 at the target position.
  • Third, the expression classification identifier 8 can also be controlled to move half a position to the right while the expression classification identifier 1 and the expression classification identifiers 3 to 7 are moved half a position to the left, obtaining the area indicated by reference numeral 7' shown in FIG. 7, where the first direction is the rightward direction and the second direction is the leftward direction.
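  • The three ways of opening the free area amount to applying simple offsets to the neighbouring slots, as in the sketch below; the slot width and positions are illustrative assumptions.

    from typing import Dict

    SLOT_WIDTH = 40.0

    def open_gap(slot_x: Dict[int, float], gap_after: int, method: str) -> Dict[int, float]:
        # 'shift_right': identifiers to the right of the gap move one position right.
        # 'shift_left' : identifiers to the left of the gap move one position left.
        # 'split'      : right side moves half a position right, left side half left.
        moved = dict(slot_x)
        for idx, x in slot_x.items():
            if method == "shift_right" and idx > gap_after:
                moved[idx] = x + SLOT_WIDTH
            elif method == "shift_left" and idx <= gap_after:
                moved[idx] = x - SLOT_WIDTH
            elif method == "split":
                moved[idx] = x + SLOT_WIDTH / 2 if idx > gap_after else x - SLOT_WIDTH / 2
        return moved

    # Identifiers 7 and 8 sit at x = 240 and x = 280; open the free area between them.
    positions = {7: 240.0, 8: 280.0}
    print(open_gap(positions, gap_after=7, method="shift_right"))  # {7: 240.0, 8: 320.0}
    print(open_gap(positions, gap_after=7, method="split"))        # {7: 220.0, 8: 300.0}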
  • When the first operation is to move the first expression classification identifier to the free area, if the first expression classification identifier instead moves to an area other than the first area, the first expression classification identifier is deleted.
  • For example, if the first expression classification identifier is moved from area 2 (i.e., the first area) to an area other than area 2, then after the user releases the first expression classification identifier, the first expression classification identifier may be deleted, and at the same time the plurality of expression icons included in the first expression classification identifier may be deleted.
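  • A release handler along the lines of the sketch below would implement this behaviour; the parameter names and the rectangle representation of the first area are assumptions for illustration.

    from typing import List, Optional, Tuple

    Rect = Tuple[float, float, float, float]  # left, top, right, bottom

    def on_release(identifiers: List[int], dragged: int,
                   release_point: Tuple[float, float],
                   first_area: Rect,
                   free_slot: Optional[int]) -> List[int]:
        # If released inside the first area, the dragged identifier is inserted at the
        # free slot; otherwise the identifier (and, in a fuller model, all of its
        # expression icons) is deleted from the expression classification identifier list.
        x, y = release_point
        left, top, right, bottom = first_area
        inside = left <= x <= right and top <= y <= bottom
        result = [i for i in identifiers if i != dragged]
        if inside and free_slot is not None:
            result.insert(free_slot, dragged)
        return result

    # Released outside area 2: identifier 2 is removed from the list.
    print(on_release([1, 2, 3, 4, 5, 6, 7], 2, (180.0, 300.0), (0.0, 500.0, 360.0, 560.0), None))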
  • After the movement instruction is acquired, the first expression classification identifier may be enlarged to obtain an enlarged first expression classification identifier, and the enlarged first expression classification identifier is displayed.
  • For example, the expression classification identifier 2 may be enlarged and the enlarged expression classification identifier 2 displayed in the expression panel, to prompt the user that related operations are being performed on the expression classification identifier 2.
  • Performing the first operation on the first expression classification identifier may further include: deleting the first expression classification identifier in the first area, where, when the first expression classification identifier is deleted, the plurality of expression icons included in the first expression classification identifier are also deleted from the expression classification identifier list.
  • A delete icon is displayed in the first area, and deleting the first expression classification identifier in the first area may include: when the first expression classification identifier moves toward the delete icon, controlling the display color of the delete icon to deepen gradually, and when the first expression classification identifier is in the preset area where the delete icon is located, controlling the delete icon to change from a first icon to a second icon.
  • The first expression classification identifier may be deleted, where the manner of deleting the first expression classification identifier may be as described below in this embodiment of the present invention.
  • The first expression classification identifier may be moved into area 3 or area 4 (i.e., the first area), because the first operation indicated by area 3 or area 4 is an operation of deleting the first expression classification identifier.
  • A delete icon may be displayed in area 4, where the delete icon may have the shape of a trash can.
  • When the first expression classification identifier moves toward the delete icon, the color of the delete icon gradually deepens, and the delete icon is controlled to change from the first icon to the second icon when the first expression classification identifier is in the area where the delete icon is located (i.e., the preset area).
  • The first expression classification identifier (i.e., the expression classification identifier 2) in FIG. 9 is closer to the delete icon than the first expression classification identifier (i.e., the expression classification identifier 2) in FIG. 8; therefore, the display color of the delete icon in FIG. 9 is darker than that in FIG. 8, and the form of the delete icon also differs between FIG. 9 and FIG. 8: the opening angle of the trash-can (i.e., delete-icon) cover in FIG. 8 is small (i.e., the first icon described above), while the opening angle of the trash-can cover in FIG. 9 is large (i.e., the second icon described above).
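  • The proximity feedback for the delete icon could be sketched as follows; the distance-to-darkness mapping and the two icon names are assumptions, since the description only requires that the colour deepens as the identifier approaches and that the icon switches once the identifier is inside the preset area.

    from typing import Tuple

    def delete_icon_state(identifier_pos: Tuple[float, float],
                          icon_pos: Tuple[float, float],
                          preset_radius: float,
                          max_distance: float) -> Tuple[float, str]:
        # darkness grows from 0.0 to 1.0 as the identifier approaches the delete icon,
        # so the displayed colour gradually deepens; the icon switches from the closed
        # trash can (first icon) to the open one (second icon) inside the preset area.
        dx = identifier_pos[0] - icon_pos[0]
        dy = identifier_pos[1] - icon_pos[1]
        distance = (dx * dx + dy * dy) ** 0.5
        darkness = max(0.0, min(1.0, 1.0 - distance / max_distance))
        form = "trash_open" if distance <= preset_radius else "trash_closed"
        return darkness, form

    print(delete_icon_state((100.0, 100.0), (180.0, 300.0), 40.0, 400.0))  # far: light, closed
    print(delete_icon_state((170.0, 290.0), (180.0, 300.0), 40.0, 400.0))  # near: dark, open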
  • In this way, the order of the expression classification identifiers in the expression classification identifier list of the expression panel is adjusted by long-pressing, dragging and releasing, or an expression classification identifier in the expression classification identifier list is deleted. Therefore, the control method of the expression classification identifier provided by the foregoing embodiment of the present invention allows the user to adjust the order of the expressions at will, so that the user can quickly call up the corresponding expression, which saves the time needed to send a message and improves the user experience.
  • An embodiment of the present invention further provides a control apparatus for the expression classification identifier for implementing the above control method, where the control apparatus is mainly used to execute the control method of the expression classification identifier provided by the foregoing content of the embodiments of the present invention. The control apparatus for the expression classification identifier provided by the embodiment of the present invention is described below:
  • FIG. 10 is a schematic diagram of a control apparatus for an expression classification identifier according to an embodiment of the present invention.
  • the control apparatus of the expression classification identifier mainly includes:
  • A first acquiring unit 101, configured to acquire a movement instruction, where the movement instruction is used to move the first expression classification identifier located on the expression panel to the target position, the expression panel displays one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier includes one or more expression icons.
  • As shown in FIG. 4, a plurality of expression classification identifiers are included in the expression panel 1, namely expression classification identifier 1 to expression classification identifier 7, wherein each expression classification identifier includes one or more expression icons.
  • When the user selects the expression classification identifier 2 in the expression panel 1, a plurality of expression icons (such as the expression icons denoted by symbols 21 to 28 in FIG. 4) are displayed in the expression panel, where each expression icon may be a dynamic icon or a static icon.
  • The expression icons denoted by symbols 21 to 28 are only part of the expression icons included in the expression classification identifier 2; the other expression icons of identifier 2 that are not displayed, or not all displayed, can be brought into view by swiping up and down or left and right.
  • The left side of the expression classification identifier 7 may also include one or more expression classification identifiers that are not fully displayed.
  • The above movement instruction may be a long-press instruction; for example, the user long-presses the expression classification identifier 2 on the terminal device. After the terminal device obtains the long-press instruction, the expression classification identifier 2 can start moving from its current position until it reaches the target position.
  • the second obtaining unit 103 is configured to acquire a first operation indicated by the first area where the target location is located.
  • The interface shown in FIG. 5 is an optional display interface of an instant messaging service (e.g., QQ or WeChat) in the terminal device, and the first operations indicated by the different regions of the interface shown in FIG. 5 are different.
  • the first operation indicated by the area 2 may be an operation of moving the expression classification mark 2 to another position in the expression classification identification list. That is, in this area, the expression classification mark 2 can be moved from the current initial area to the area where the other expression classification mark is located, for example, the area where the expression classification mark 7 is located.
  • The first operation indicated by area 3 may be an operation of deleting the expression classification identifier 2, and the first operation indicated by area 4 may likewise be an operation of deleting the expression classification identifier 2.
  • The above-mentioned area 3 and area 4 are areas other than area 2 and are not shown in FIG. 5.
  • The first operations indicated by areas 2 to 4 may be the same operation or may be different operations.
  • the executing unit 105 is configured to perform a first operation on the first expression classification identifier.
  • If the area where the expression classification identifier 2 is located is the area between any two adjacent expression classification identifiers (for example, the expression classification identifier 6 and the expression classification identifier 4) within area 2, then when the user releases the expression classification identifier 2, that is, when the long-press instruction on the expression classification identifier 2 stops, the corresponding first operation can be performed on the expression classification identifier 2, for example, moving the expression classification identifier 2 to the position of the expression classification identifier 7 in the expression classification identifier list.
  • If the target position after the movement of the expression classification identifier 2 is in area 3 or area 4, then when the user releases the expression classification identifier 2, that is, when the long-press instruction on the expression classification identifier 2 stops, the first operation is performed on the expression classification identifier 2, for example, deleting the expression classification identifier 2 from the expression classification identifier list.
  • The first expression classification identifier is thus controlled by performing the corresponding operation, for example display, deletion or movement, on the first expression classification identifier that has been moved to the target position in the expression panel. This overcomes the disadvantage that expressions in an instant messaging service cannot be flexibly controlled and achieves the purpose of flexibly controlling the expression classification identifiers in the expression panel, thereby improving the flexibility of controlling the expression classification identifier and solving the technical problem in the prior art that this flexibility is poor.
  • The executing unit includes: a determining module, configured to determine whether the first area includes a second area where any two adjacent expression classification identifiers in the expression classification identifier list are located; a display module, configured to, when it is determined that the first area includes the second area in the expression classification identifier list, display a free area between the two adjacent expression classification identifiers when the first expression classification identifier partially or completely overlaps the second area; and a moving module, configured to move the first expression classification identifier to the free area.
  • The display module includes: a first control sub-module, configured to control the expression classification identifier on the first side of the first expression classification identifier at the target position to move in a first direction, where the first direction is the direction from the first expression classification identifier toward the expression classification identifier on the first side; and/or a second control sub-module, configured to control the expression classification identifier on the second side of the first expression classification identifier at the target position to move in a second direction, where the second direction is the direction from the first expression classification identifier toward the expression classification identifier on the second side, and the first direction is opposite to the second direction.
  • The apparatus further includes: a deleting unit, configured to, when the first operation is to move the first expression classification identifier to the free area, delete the first expression classification identifier when the first expression classification identifier moves to an area other than the first area.
  • The apparatus further includes: a first control unit, configured to, after the movement instruction is acquired, control the first expression classification identifier to move from the initial region in the expression classification identifier list to the first region; and a second control unit, configured to control the expression classification identifiers other than the first expression classification identifier in the expression classification identifier list to move sequentially in the direction pointing to the initial region.
  • The apparatus further includes: a determining unit, configured to, after the expression classification identifiers other than the first expression classification identifier in the expression classification identifier list are controlled to move sequentially in the direction pointing to the initial region, determine whether the number of the other expression classification identifiers is smaller than the number of regions in the expression classification identifier list used for displaying expression classification identifiers; a first display unit, configured to, when the number of the other expression classification identifiers is smaller than the number of regions used for displaying expression classification identifiers, display the area in the expression classification identifier list in which no expression classification identifier is displayed as a free area; and a second display unit, configured to, when the number of the other expression classification identifiers is greater than or equal to the number of regions used for displaying expression classification identifiers, display the second expression classification identifier in the expression classification identifier list, where the second expression classification identifier was not displayed in the expression classification identifier list before the first expression classification identifier was moved.
  • The apparatus further includes: an amplifying unit, configured to, after the movement instruction is acquired, enlarge the first expression classification identifier to obtain the enlarged first expression classification identifier; and a third display unit, configured to display the enlarged first expression classification identifier.
  • The executing unit includes: a deleting module, configured to delete the first expression classification identifier in the first area, where, when the first expression classification identifier is deleted, the plurality of expression icons included in the first expression classification identifier are also deleted from the expression classification identifier list.
  • A delete icon is displayed in the first area, and the deleting module includes: a third control sub-module, configured to control the display color of the delete icon to deepen gradually when the first expression classification identifier moves toward the delete icon, and to control the delete icon to change from the first icon to the second icon when the first expression classification identifier is in the preset area where the delete icon is located.
  • A mobile terminal for implementing the foregoing control method of the expression classification identifier is further provided. The mobile terminal mainly includes a processor 401, a display 402, a data interface 403, a memory 404 and a network interface 405, where:
  • the display 402 is mainly used to display an expression panel, wherein the expression panel includes an expression classification identifier.
  • the data interface 403 transmits the expression classification identifier selected by the user to the processor 401 mainly by means of data transmission.
  • the memory 404 is mainly used to store related records for moving or deleting the expression classification identifier.
  • the network interface 405 is mainly used for network communication with the server, and provides data support for the control of the expression classification identifier.
  • the processor 401 is mainly used to perform the following operations:
  • Acquiring a movement instruction, where the movement instruction is used to move the first expression classification identifier located on the expression panel to the target position, the expression panel displays one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier includes one or more expression icons; acquiring a first operation indicated by the first region where the target position is located; and performing the first operation on the first expression classification identifier.
  • the processor 401 is further configured to determine whether the first area includes a second area where any two adjacent expression classification identifiers in the expression classification identifier list are located; if it is determined that the first area includes the second area in the expression classification identifier list, When the first expression classification identifier partially or completely overlaps with the second region, a free area is displayed between the adjacent two expression classification identifiers; and the first expression classification identifier is moved to the free area.
  • The processor 401 is further configured to control the expression classification identifier on the first side of the first expression classification identifier at the target position to move in the first direction, where the first direction is the direction from the first expression classification identifier toward the expression classification identifier on the first side; and/or to control the expression classification identifier on the second side of the first expression classification identifier at the target position to move in the second direction, where the second direction is the direction from the first expression classification identifier toward the expression classification identifier on the second side, and the first direction is opposite to the second direction.
  • The processor 401 is further configured to, when the first operation is to move the first expression classification identifier to the free area, delete the first expression classification identifier when the first expression classification identifier moves to an area other than the first area.
  • The processor 401 is further configured to, after the movement instruction is acquired, control the first expression classification identifier to move from the initial region in the expression classification identifier list to the first region, and control the expression classification identifiers other than the first expression classification identifier in the expression classification identifier list to move sequentially in the direction pointing to the initial region.
  • The processor 401 is further configured to, after the expression classification identifiers other than the first expression classification identifier in the expression classification identifier list are controlled to move sequentially in the direction pointing to the initial region, determine whether the number of the other expression classification identifiers is smaller than the number of regions in the expression classification identifier list used for displaying expression classification identifiers.
  • the processor 401 is further configured to: after acquiring the movement instruction, enlarge the first expression classification identifier to obtain the enlarged first expression classification identifier; and display the enlarged first expression classification identifier.
  • the processor 401 is further configured to delete the first expression classification identifier in the first area, where, when the first expression classification identifier is deleted, the plurality of expression icons included in the first expression classification identifier are deleted from the expression classification identifier list.
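A sketch of the delete case above, with assumed data shapes: removing an identifier also drops the group of expression icons it contains.

```typescript
// Sketch: deleting an expression classification identifier also removes the
// expression icons grouped under it. Field names are illustrative assumptions.
interface ExpressionPanel {
  identifiers: string[];
  iconsByIdentifier: Record<string, string[]>;
}

function deleteIdentifier(panel: ExpressionPanel, id: string): ExpressionPanel {
  const { [id]: _removedIcons, ...iconsByIdentifier } = panel.iconsByIdentifier;
  return { identifiers: panel.identifiers.filter((x) => x !== id), iconsByIdentifier };
}

console.log(
  deleteIdentifier(
    { identifiers: ["1", "2", "3"], iconsByIdentifier: { "2": ["smile", "wave"] } },
    "2"
  )
);
```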
  • the processor 401 is further configured to: when the first expression classification identifier moves toward the delete icon, control the display color of the delete icon to gradually deepen; and when the first expression classification identifier is within the preset area where the delete icon is located, control the delete icon to change from the first icon to the second icon.
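The delete-icon feedback above (colour deepening with proximity, the icon changing once the identifier enters the preset area) might be modelled as below; the distance thresholds, alpha range, and icon names are illustrative assumptions.

```typescript
// Sketch (assumed geometry and colours): compute the delete icon's appearance
// from the distance between the dragged identifier and the icon.
interface DeleteIconState {
  alpha: number;                       // 0..1, deeper colour as the identifier gets closer
  icon: "trash-closed" | "trash-open"; // switches once inside the preset area
}

function deleteIconState(distancePx: number, presetRadiusPx: number, maxRangePx: number): DeleteIconState {
  const clamped = Math.min(Math.max(distancePx, 0), maxRangePx);
  return {
    alpha: 1 - 0.6 * (clamped / maxRangePx), // gradually deepen as distance shrinks
    icon: distancePx <= presetRadiusPx ? "trash-open" : "trash-closed",
  };
}

console.log(deleteIconState(120, 30, 200)); // still far: lighter colour, closed lid
console.log(deleteIconState(20, 30, 200));  // inside the preset area: darker, open lid
```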
  • Embodiments of the present invention also provide a storage medium.
  • the foregoing storage medium may be used to store program code of a control method of the expression classification identifier of the embodiment of the present invention.
  • the foregoing storage medium may be located in at least one of a plurality of network devices in a network of a mobile communication network, a wide area network, a metropolitan area network, or a local area network.
  • the storage medium is arranged to store program code for performing the following steps: obtaining a movement instruction, where the movement instruction is used to move a first expression classification identifier located on an expression panel to a target position, the expression panel displaying one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier including one or more expression icons; obtaining a first operation indicated by a first region where the target position is located; and performing the first operation on the first expression classification identifier.
  • the foregoing storage medium may include, but is not limited to, various media that can store program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a mobile hard disk, a magnetic disk, or an optical disc.
  • the integrated unit in the above embodiments, if implemented in the form of a software operating unit and sold or used as a stand-alone product, may be stored in the above computer-readable storage medium.
  • based on such an understanding, the technical solution of the present invention, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
  • the software product includes a number of instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention.
  • the disclosed client may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of the units is only a logical division; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, unit or module, and may be electrical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each operation unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software operating unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Human Resources & Organizations (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Quality & Reliability (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Economics (AREA)
  • Operations Research (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention discloses a method, a device, and a storage medium for controlling an expression classification identifier. The method includes: obtaining a movement instruction, where the movement instruction is used to move a first expression classification identifier located on an expression panel to a target position, the expression panel displays one or more expression classification identifiers including the first expression classification identifier, and the first expression classification identifier includes one or more expression icons; obtaining a first operation indicated by a first region where the target position is located; and performing the first operation on the first expression classification identifier. The present invention solves the technical problem in the prior art that the flexibility of controlling expression classification identifiers is poor.

Description

表情分类标识的控制方法、装置和存储介质
本申请要求于2016年04月15日提交中国专利局、申请号为201610235335.0、发明名称为“表情分类标识的控制方法、装置和存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本发明涉及多媒体领域,具体而言,涉及一种表情分类标识的控制方法、装置和存储介质。
背景技术
现有社会中随着即时通讯业务的快速发展,大家也越来越多地通过该即时通信业务进行交流和沟通,同时人们也正在享受着即时通讯业务给大家带来的通讯上的便利。例如,大家可以互相发送表情来表达当前的心情和状态等,相对于直接发送文字等消息,能够更加生动并直接地表达出用户此时的心情,并且随着即时通讯业务的扩展,大家可以发送的表情消息也越来越多样。但是,表情消息的变化给大家在选择要发送的表情时带来了一定的困扰。具体地,如图2所示,标识1和标识2为表情标识,每个表情标识包括多个表情图标,例如,标识1对应的表情图标为“神经蛙与欢乐马”的表情,标识2对应的表情图标为“野萌君”的图标。当用户想要发送的表情标识2中的表情时,需要向左或者向右滑动区域1,直到在终端设备中显示出标识2中的表情。但是,上述选取表情的方法浪费了用户大量的时间,导致用户不能最快的发送表情图标。同时,如图2所示,如果用户经常使用标识2中的表情,而不常用标识7中的表情,则将标识7放在标识2中的前面也会浪费用户选取表情的时间,因此,相关技术中在即时通讯业务中选取表情的方法不利于提高用户的体验。
针对上述的问题,目前尚未提出有效的解决方案。
发明内容
本发明实施例提供了一种表情分类标识的控制方法、装置和存储介质,以至少解决现有技术中控制表情分类标识的灵活度较差的技术问题。
根据本发明实施例的一个方面,提供了一种表情分类标识的控制方法,包括:获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;获取所述目标位置所在的第一区域所指示的第一操作;对所述第一表情分类标识执行所述第一操作。
根据本发明实施例的另一方面,还提供了一种表情分类标识的控制装置,包括:第一获取单元,用于获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;第二获取单元,用于获取所述目标位置所在的第一区域所指示的第一操作;执行单元,用于对所述第一表情分类标识执行所述第一操作。
根据本发明实施例的另一方面,还提供了一种存储介质,所述存储介质被设置为存储用于执行以下步骤的程序代码,包括:
获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;
获取所述目标位置所在的第一区域所指示的第一操作;
对所述第一表情分类标识执行所述第一操作。
在本发明实施例中,采用获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显 示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;获取所述目标位置所在的第一区域所指示的第一操作;对所述第一表情分类标识执行所述第一操作的方式,通过对表情面板中的移动至目标位置的第一表情分类标识执行相应的操作,来实现对第一表情分类标识的控制,例如,显示,删除或者移动等控制,相对于现有技术中不能对即时通讯业务中的表情进行灵活控制的缺点,达到了灵活控制表情面板中的表情分类标识的目的,从而实现了提高控制表情分类标识的灵活度的技术效果,进而解决了现有技术中控制表情分类标识的灵活度较差的技术问题。
附图说明
此处所说明的附图用来提供对本发明的进一步理解,构成本申请的一部分,本发明的示意性实施例及其说明用于解释本发明,并不构成对本发明的不当限定。在附图中:
图1是根据本发明实施例的硬件结构的架构图;
图2是相关技术中表情面板的显示界面的示意图;
图3是根据本发明实施例的一种表情分类标识的控制方法的流程图图;
图4是根据本发明实施例的一种可选地表情面板的显示界面的示意图;
图5是根据本发明实施例另一种可选地表情面板的显示界面的示意图;
图6是根据本发明实施例另一种可选地表情面板的显示界面的示意图;
图7是根据本发明实施例另一种可选地表情面板的显示界面的示意图;
图8是根据本发明实施例另一种可选地表情面板的显示界面的示意图;
图9是根据本发明实施例另一种可选地表情面板的显示界面的示意图;
图10是根据本发明实施例的一种表情分类标识的控制装置的示意图;以及
图11是根据本发明实施例的终端的硬件结构图。
具体实施方式
为了使本技术领域的人员更好地理解本发明方案,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本发明一部分的实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都应当属于本发明保护的范围。
需要说明的是,本发明的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的本发明的实施例能够以除了在这里图示或描述的那些以外的顺序实施。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、***、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
实施例1
根据本发明实施例,提供了一种可以通过本申请装置实施例执行的方法实施例,需要说明的是,在附图的流程图示出的步骤可以在诸如一组计算机可执行指令的计算机***中执行,并且,虽然在流程图中示出了逻辑顺序,但是在某些情况下,可以以不同于此处的顺序执行所示出或描述的步骤。
根据本发明实施例,提供了一种表情分类标识的控制方法。
可选地,在本实施例中,上述表情分类标识的控制方法可以应用于如图1所示的移动终端102和服务器104所构成的硬件环境中。如图1所示,移动终端102通过网络与服务器104进行连接,上述网络包括但不限于:移动通信网络、广域网、城域网或局域网,移动终端102可以是手机终端, 也可以是PC终端、笔记本终端或平板电脑终端。
图3是根据本发明实施例的一种表情分类标识的控制方法的流程图,以下结合图3对本发明实施例所提供的表情分类标识的控制方法做具体介绍,如图3所示,该表情分类标识的控制方法主要包括如下步骤S302至步骤S306:
步骤S302,获取移动指令,其中,移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,表情面板上显示有包括第一表情分类标识在内的一个或多个表情分类标识,并且第一表情分类标识包括一个或多个表情图标。
如图4所示,在表情面板1中包括多个表情分类标识,分别为表情分类标识1至表情分类标识7,其中,每个表情分类标识包括一个或者多个表情图标。例如,如图4所示,当用户在表情面板1中选中表情分类标识2时,在表情面板中显示出多个表情图标(如图4中符号21至符号28所显示的表情图标),其中,表情图标可以为动态图标,还可以为静态图标。在图4中,上述符号21至符号28所显示的表情图标仅为表情分类标识2所包括的部分表情图标,表情分类标识2中包括的其他未显示或者未能全部显示的表情图标,可以通过向上、向下滑动,或者向左、向右滑动的方式将未显示或者未能全部显示的表情图标显示出来。
需要说明的是,在如图4所示的表情分类标识列表中,表情分类标识7的左侧可能还包括一个或者多个未完全显示的表情分类标识。
上述移动指令可以为长按指令,例如,用户在终端设备中长按表情分类标识2的指令。在终端设备获取到该长按指令之后,表情分类标识2即可以由当前所处的位置开始移动,直到移动至目标位置。
步骤S304,获取目标位置所在的第一区域所指示的第一操作。
图5显示的界面为即时通讯业务(例如,QQ或者微信)在终端设备中的一种可选地显示界面,在该如图5所显示的界面中不同的区域所指示 的第一操作不同。例如,如果将表情分类标识2移动至图5中虚线框所表示的区域2,则该区域2所指示的第一操作可以为将表情分类标识2移动至表情分类标识列表中的其他位置的操作,也即,在该区域内,可以将表情分类标识2由当前的初始区域移动至其他表情分类标识所在的区域,例如,表情分类标识7所在的区域。又例如,如果将表情分类标识2移动至区域3或者区域4,则该区域3所指示的第一操作可以为将表情分类标识2删除的操作,则该区域4所指示的第一操作同样可以为将表情分类标识2删除的操作,其中,上述区域3和区域4为区域2之外的区域,在图5中并未示出。上述区域2至区域4中每个区域所指示的第一操作可以为相同的操作,还可以不相同的操作。
需要说明的是,在本发明实施例中,仅举例说明区域2至区域4,以及区域2至区域4所指示的第一操作。在如图5所显示的界面中,还包括其他的第一区域,以及该第一区域所指示的第一操作。
步骤S306,对第一表情分类标识执行第一操作。
如果表情分类标识2所在的区域为区域2中任意两个相邻的表情分类标识(例如,表情分类标识6和表情分类标识4)之间的区域时,则当用户松开表情分类标识2时,即长按表情分类标识2的指令停止时,即可对表情分类标识2执行相应的第一操作,例如,将表情分类标识2移动至表情分类标识列表中表情分类标识7所在的位置。
如果表情分类标识2移动之后的目标位置处于区域3或者区域4中,当用户松开表情分类标识2时,即长按表情分类标识2的指令停止时,对表情分类标识2执行相应的第一操作,例如,将表情分类标识2在表情分类标识列表中删除。
在本发明实施例中,通过对表情面板中的移动至目标位置的第一表情分类标识执行相应的操作,来实现对第一表情分类标识的控制,例如,显示,删除或者移动等控制,相对于现有技术中不能对即时通讯业务中的表情进行灵活控制的缺点,达到了灵活控制表情面板中的表情分类标识的目 的,从而实现了提高控制表情分类标识的灵活度的技术效果,进而解决了现有技术中控制表情分类标识的灵活度较差的技术问题。
在本发明实施例中,在获取移动指令之后,还可以控制第一表情分类标识从表情分类标识列表中的初始区域移动至第一区域;控制表情分类标识列表中除第一表情分类标识以外的其他表情分类标识按照指向初始区域的方向依次移动。
如图4、图5和图6所示,表情分类标识1至表情分类标识7所在的区域即为表情分类标识列表。
终端设备在获取到用户长按第一表情分类标识的指令之后,可以控制第一表情分类标识由第一表情分类标识的初始区域移动至第一区域的目标位置,移动之后的第一表情分类标识的初始区域处于空闲状态,未显示任何的表情分类标识,此时,表情分类标识列表中除第一表情分类标识以外的其他表情分类标识可以按照指向初始区域的方向依次移动一个位置。
例如,图4所示的为在移动第一表情分类标识之前,表情分类标识类表中每个表情分类标识所处的初始区域。在将表情分类标识2移动出该初始区域的时刻,表情分类标识2所处的初始区域不显示任何的表情分类标识。随后,可以将表情分类标识2右侧的表情分类标识3至表情分类标识7可以依次向左移动一个位置。例如,如图6所示,将表情分类标识3移动至表情分类标识2的初始区域,将表情分类标识4移动至表情分类标识3的初始区域等;还可以将表情分类标识2左侧的表情分类标识1向右移动一个位置,即,表情分类标识1移动至表情分类标识2的初始区域(在图6中并未示出)。
可选地,在控制表情分类标识列表中除第一表情分类标识以外的其他表情分类标识按照指向初始区域的方向依次移动之后,还可以判断其他表情分类标识的数量是否小于表情分类标识列表中用于显示表情分类标识的区域的数量;如果其他表情分类标识的数量小于表情分类标识列表中用于显示表情分类标识的区域的数量,则将表情分类标识列表中未显示表情 分类标识的区域显示为空闲区域;如果其他表情分类标识的数量大于或等于表情分类标识列表中用于显示表情分类标识的区域的数量,则将第二表情分类标识显示在表情分类标识列表中,其中,第二表情分类标识在移动第一表情分类标识之前未显示在表情分类标识列表中。
在将表情分类标识3至表情分类标识7依次向左移动一个位置之后,可以判断在表情分类标识列表除上述表情分类标识1、表情分类标识3至表情分类标识7之外,是否还包括其他的表情分类标识;如果判断出还包括其他的表情分类标识。例如,还包括表情分类标识8和表情分类标识9,此时,可以将表情分类标识8显示在表情分类标识列表中,其中,表情分类标识8与表情分类标识7相邻;如果判断出不包括其他的表情分类标识,则可以将此时表情分类标识列表中未显示表情分类标识的区域显示为空闲区域,即将表情分类标识7的初始区域显示为空闲区域。具体地,如图6所示,在将表情分类标识3移出表情分类标识列表之后,将表情分类标识3至表情分类标识7依次向左移动一个位置之后,并可以将上述表情分类标识8移动至表情分类标识7的初始区域,并显示在表情分类标识列表中。
又或者,将表情分类标识1移动至表情分类标识2的初始区域之后,如果判断出在表情分类标识列表除上述表情分类标识1、表情分类标识3至表情分类标识7之外,还包括其他的表情分类标识;此时还可以将与表情分类标识1相邻的表情分类标识显示在表情分类标识1的初始区域,其中,此时显示在表情分类标识1的初始区域的表情分类标识在移动表情分类标识2之前未显示在表情分类标识列表中;如果判断出不包括其他的表情分类标识,则可以将表情分类标识1的初始区域显示为空闲区域。
需要说明的是,由于大家的正常习惯为将未显示表情分类标识的区域显示设置在表情分类标识列表中的右侧位置,因此,当其他表情分类标识的数量小于表情分类标识列表中用于显示表情分类标识的区域的数量时,可以在将表情分类标识1的初始位置显示为空闲位置之后,控制表情分类 标识1、表情分类标识3至表情分类标识7依次向左移动,使得表情分类标识7的初始区域为空闲区域。
可选地,对第一表情分类标识执行第一操作具体可以为:判断第一区域是否包括表情分类标识列表中任意相邻的两个表情分类标识所在的第二区域;若判断出第一区域包括表情分类标识列表中的第二区域,则在第一表情分类标识与第二区域部分或者全部重叠时,在相邻的两个表情分类标识之间显示空闲区域;将第一表情分类标识移动至空闲区域。
第二区域可以为表情分类标识列表中任意相邻的两个表情分类标识所在的区域,例如,如图6所示,表情分类标识7和与其相邻的表情分类标识8所在的区域即为第二区域。
如果表情分类标识2移动之后所在的第一区域为区域2,可以判断出区域2(即,上述第一区域)包括表情分类标识7和表情分类标识8所在的区域(即,上述第二区域)。此时,如果表情分类标识2与第二区域即将贴合或者出现重叠(例如,图6中表情分类标识2的所在的区域),将会在表情分类标识7和表情分类标识8之间显示出如图7中标号7’所示的空闲区域,其中,标号7’所示的区域为表情分类标识7所在的初始区域,空闲区域表示在该区域中未显示任何的表情分类标识。当用户松开表情分类标识2时,表情分类标识2将被释放至标号7’所示的空闲区域。
需要说明的是,在本发明实施例中,并不局限于将图6中表情分类标识8所在的初始区域显示为空闲区域,还可以将图6中表情分类标识7所在的初始区域显示为空闲区域。具体地,如果表情分类标识2的1/2与表情分类标识7重叠,表情分类标识2的1/2与表情分类标识8重叠,则可以将图6中表情分类标识8所在的初始区域显示为空闲区域;如果表情分类标识2中与表情分类标识7重叠的部分大于表情分类标识2的1/2,则可以将图6中表情分类标识7所在的初始区域显示为空闲区域;如果表情分类标识2中与表情分类标识8重叠的部分大于表情分类标识2的1/2,则可以将图6中表情分类标识8所在的初始区域显示为空闲区域。
在本发明实施例中,可以通过下述三种方式在相邻的两个表情分类标识之间显示空闲区域:
方式一:
控制处于目标位置的第一表情分类标识的第一侧的表情分类标识向第一方向移动,其中,第一方向为第一表情分类标识至第一侧的表情分类标识的方向。
假设,第一表情分类标识为表情分类标识2,当表情分类标识2的目标位置为如图6中表情分类标识2所示的位置时,表情分类标识2的第一侧的表情分类标识可以为表情分类标识2右侧的表情分类标识,即,表情分类标识8。此时,可以控制表情分类标识8向右侧移动一个位置,得到如图7所示的标号7’所示的空闲区域,其中,第一方向即为处于目标位置的表情分类标识2的右侧位置。
方式二:
控制处于目标位置的第一表情分类标识的第二侧的表情分类标识向第二方向移动,其中,第二方向为第一表情分类标识至第二侧的表情分类标识的方向,其中,第一方向与第二方向相反。
假设,第一表情分类标识为表情分类标识2,当表情分类标识2的目标位置为如图6中表情分类标识2所示的位置时,表情分类标识2的第一侧的表情分类标识可以为表情分类标识2左侧的表情分类标识,即,表情分类标识1、表情分类标识3至表情分类标识7。此时,可以控制表情分类标识7向左侧移动一个位置,得到如图7所示的标号7’所示的区域,其中,第二方向即为处于目标位置的表情分类标识2的左侧位置。
方式三:
控制处于目标位置的第一表情分类标识的第一侧的表情分类标识向第一方向移动,其中,第一方向为第一表情分类标识至第一侧的表情分类标识的方向;以及
控制处于目标位置的第一表情分类标识的第二侧的表情分类标识向第二方向移动,其中,第二方向为第一表情分类标识至第二侧的表情分类标识的方向,其中,第一方向与第二方向相反。
在本发明实施例中,还可以控制表情分类标识8向右侧移动半个位置,并控制表情分类标识1、表情分类标识3至表情分类标识7向左侧移动半个位置,得到如图7所示的标号7’所示的区域,其中,第一方向即为向右的方向,第二方向即为向左的方向。
可选地,在第一操作为将第一表情分类标识移动至空闲区域时,在第一表情分类标识移动至第一区域以外的区域时,删除第一表情分类标识。
如果第一表情分类标识由区域2所示的区域(即,第一区域)移动至区域2之外的区域时,在用户松开第一表情分类之后,即可删除第一表情分类标识,同时可以删除第一表情分类包括的多个表情图标。
可选地,在获取移动指令之后,还可以放大第一表情分类标识,得到放大后的第一表情分类标识;显示放大后的第一表情分类标识。
如图6和图7所示,当用户长按表情分类标识2之后,可以放大表情分类标识2,并将放大之后的表情分类标识2显示在表情面板中,以提示用户正在对表情分类标识2执行相关操作。
可选地,对第一表情分类标识执行第一操作还可以为:在第一区域中删除第一表情分类标识,其中,在删除第一表情分类标识的时候,在表情分类标识列表中删除第一表情分类标识包括的多个表情图标。
其中,在第一区域中显示删除图标,在第一区域中删除第一表情分类标识可以为:在第一表情分类标识向删除图标移动时,控制删除图标的显示颜色逐渐加深,并在第一表情分类标识处于删除图标所在的预设区域时控制删除图标从第一图标变为第二图标。
除上述对第一表情分类标识进行移动之外,还可以对第一表情分类标识进行删除,其中,删除第一表情分类标识的方法可以有很多中,在本发 明实施例中,可以将第一表情分类标识移动至区域3或者区域4(即,第一区域)中,因为,在区域3或者区域4中执行的第一操作为删除第一表情分类标识的操作。
其中,如果将第一表情分类标识移动至区域4中时,可以在区域4中显示一个删除图标,其中,该删除图标的形状可以为垃圾箱的形状。当第一表情分类标识移动至区域4,并向删除图标靠近时,删除图标的颜色逐渐加深,并且在第一表情分类标识处于删除图标所在的区域(即,上述预设区域)时控制删除图标从第一图标变为第二图标。如图8和图9所示,图9中的第一表情分类标识(即,表情分类标识2)相对于图8中的第一表情分类标识(即,表情分类标识2)更靠近删除图标,因此,在图9中删除图标的显示颜色比图8中删除图标的显示颜色深,并且图9中和图8中删除图标的形态也不一样,图8中垃圾箱(即,删除图标)盖子打开的角度较小(即,上述第一图标),图9中垃圾箱(即,删除图标)盖子打开的角度较大(即,上述第二图标)。
在本发明实施例中,通过长按、拖拽和释放等方式,调整表情面板的表情分类标识列表中表情分类标识的顺序,或者删除表情分类标识列表中的表情分类标识。因此,采用本发明上述实施例提供的表情分类标识的控制方法,可以让用户随意调整表情的顺序,使得用户能够快速调用相应地表情,相应地,节省了用户发送消息的时间,提高了用户的体验。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本发明并不受所描述的动作顺序的限制,因为依据本发明,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本发明所必须的。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到根据上述实施例的方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理 解,本发明的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,或者网络设备等)执行本发明各个实施例所述的方法。
实施例2
根据本发明实施例,还提供了一种用于实施上述表情分类标识的控制方法的表情分类标识的控制装置,该表情分类标识的控制装置主要用于执行本发明实施例上述内容所提供的表情分类标识的控制方法,以下对本发明实施例所提供的表情分类标识的控制装置做具体介绍:
图10是根据本发明实施例的表情分类标识的控制装置的示意图,如图10所示,该表情分类标识的控制装置主要包括:
第一获取单元101,用于获取移动指令,其中,移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,表情面板上显示有包括第一表情分类标识在内的一个或多个表情分类标识,并且第一表情分类标识包括一个或多个表情图标。
如图4所示,在表情面板1中包括多个表情分类标识,分别为表情分类标识1至表情分类标识7,其中,每个表情分类标识包括一个或者多个表情图标。例如,如图4所示,当用户在表情面板1中选中表情分类标识2时,在表情面板中显示出多个表情图标(如图4中符号21至符号28所显示的表情图标),其中,表情图标可以为动态图标,还可以为静态图标。在图4中,上述符号21至符号28所显示的表情图标仅为表情分类标识2所包括的部分表情图标,表情分类标识2中包括的其他未显示或者未能全部显示的表情图标,可以通过向上、向下滑动,或者向左、向右滑动的方式将未显示或者未能全部显示的表情图标显示出来。
需要说明的是,在如图4所示的表情分类标识列表中,表情分类标识 7的左侧可能还包括一个或者多个未完全显示的表情分类标识。
上述移动指令可以为长按指令,例如,用户在终端设备中长按表情分类标识2的指令。在终端设备获取到该长按指令之后,表情分类标识2即可以由当前所处的位置开始移动,直到移动至目标位置。
第二获取单元103,用于获取目标位置所在的第一区域所指示的第一操作。
图5显示的界面为即时通讯业务(例如,QQ或者微信)在终端设备中的一种可选地显示界面,在该如图5所显示的界面中不同的区域所指示的第一操作不同。例如,如果将表情分类标识2移动至图5中虚线框所表示的区域2,则该区域2所指示的第一操作可以为将表情分类标识2移动至表情分类标识列表中的其他位置的操作,也即,在该区域内,可以将表情分类标识2由当前的初始区域移动至其他表情分类标识所在的区域,例如,表情分类标识7所在的区域。又例如,如果将表情分类标识2移动至区域3或者区域4,则该区域3所指示的第一操作可以为将表情分类标识2删除的操作,则该区域4所指示的第一操作同样可以为将表情分类标识2删除的操作,其中,上述区域3和区域4为区域2之外的区域,在图5中并未示出。上述区域2至区域4中每个区域所指示的第一操作可以为相同的操作,还可以不相同的操作。
需要说明的是,在本发明实施例中,仅举例说明区域2至区域4,以及区域2至区域4所指示的第一操作。在如图5所显示的界面中,还包括其他的第一区域,以及该第一区域所指示的第一操作。
执行单元105,用于对第一表情分类标识执行第一操作。
如果表情分类标识2所在的区域为区域2中任意两个相邻的表情分类标识(例如,表情分类标识6和表情分类标识4)之间的区域时,则当用户松开表情分类标识2时,即长按表情分类标识2的指令停止时,即可对表情分类标识2执行相应的第一操作,例如,将表情分类标识2移动至表 情分类标识列表中表情分类标识7所在的位置。
如果表情分类标识2移动之后的目标位置处于区域3或者区域4中,当用户松开表情分类标识2时,即长按表情分类标识2的指令停止时,对表情分类标识2执行相应的第一操作,例如,将表情分类标识2在表情分类标识列表中删除。
如果表情分类标识2移动之后的目标位置处于区域3或者区域4中,当用户松开表情分类标识2时,即长按表情分类标识2的指令停止时,对表情分类标识2执行相应的第一操作,例如,将表情分类标识2在表情分类标识列表中删除。
在本发明实施例中,通过对表情面板中的移动至目标位置的第一表情分类标识执行相应的操作,来实现对第一表情分类标识的控制,例如,显示,删除或者移动等控制,相对于现有技术中不能对即时通讯业务中的表情进行灵活控制的缺点,达到了灵活控制表情面板中的表情分类标识的目的,从而实现了提高控制表情分类标识的灵活度的技术效果,进而解决了现有技术中控制表情分类标识的灵活度较差的技术问题。
可选地,执行单元包括:判断模块,用于判断第一区域是否包括表情分类标识列表中任意相邻的两个表情分类标识所在的第二区域;显示模块,用于在判断出第一区域包括表情分类标识列表中的第二区域的情况下,在第一表情分类标识与第二区域部分或者全部重叠时,在相邻的两个表情分类标识之间显示空闲区域;移动模块,用于将第一表情分类标识移动至空闲区域。
可选地,显示模块包括:第一控制子模块,用于控制处于目标位置的第一表情分类标识的第一侧的表情分类标识向第一方向移动,其中,第一方向为第一表情分类标识至第一侧的表情分类标识的方向;和/或第二控制子模块,用于控制处于目标位置的第一表情分类标识的第二侧的表情分类标识向第二方向移动,其中,第二方向为第一表情分类标识至第二侧的表情分类标识的方向,其中,第一方向与第二方向相反。
可选地,该装置还包括:删除单元,用于在第一操作为将第一表情分类标识移动至空闲区域时,在第一表情分类标识移动至第一区域以外的区域时,删除第一表情分类标识。
可选地,该装置还包括:第一控制单元,用于在获取移动指令之后,控制第一表情分类标识从表情分类标识列表中的初始区域移动至第一区域;第二控制单元,用于控制表情分类标识列表中除第一表情分类标识以外的其他表情分类标识按照指向初始区域的方向依次移动。
可选地,该装置还包括:判断单元,用于在控制表情分类标识列表中除第一表情分类标识以外的其他表情分类标识按照指向初始区域的方向依次移动之后,判断其他表情分类标识的数量是否小于表情分类标识列表中用于显示表情分类标识的区域的数量;第一显示单元,用于在其他表情分类标识的数量小于表情分类标识列表中用于显示表情分类标识的区域的数量的情况下,将表情分类标识列表中未显示表情分类标识的区域显示为空闲区域;第二显示单元,用于在其他表情分类标识的数量大于或等于表情分类标识列表中用于显示表情分类标识的区域的数量的情况下,将第二表情分类标识显示在表情分类标识列表中,其中,第二表情分类标识在移动第一表情分类标识之前未显示在表情分类标识列表中。
可选地,该装置还包括:放大单元,用于在获取移动指令之后,放大第一表情分类标识,得到放大后的第一表情分类标识;第三显示单元,用于显示放大后的第一表情分类标识。
可选地,执行单元包括:删除模块,用于在第一区域中删除第一表情分类标识,其中,在删除第一表情分类标识的时候,在表情分类标识列表中删除第一表情分类标识包括的多个表情图标。
可选地,在第一区域中显示删除图标,删除模块包括:第三控制子模块,用于在第一表情分类标识向删除图标移动时,控制删除图标的显示颜色逐渐加深,并在第一表情分类标识处于删除图标所在的预设区域时控制删除图标从第一图标变为第二图标。
实施例3
根据本发明实施例,还提供了一种用于实施上述表情分类标识的控制方法的移动终端,如图11所示,该移动终端主要包括处理器401、显示器402、数据接口403、存储器404和网络接口405,其中:
显示器402主要用于显示表情面板,其中,表情面板中包括表情分类标识。
数据接口403则主要通过数据传输的方式将用户选取的表情分类标识传输给处理器401。
存储器404主要用于存储移动或者删除表情分类标识的相关记录。
网络接口405主要用于与服务器进行网络通信,为表情分类标识的控制提供数据支持。
处理器401主要用于执行如下操作:
获取移动指令,其中,移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,表情面板上显示有包括第一表情分类标识在内的一个或多个表情分类标识,并且第一表情分类标识包括一个或多个表情图标;获取目标位置所在的第一区域所指示的第一操作;对第一表情分类标识执行第一操作。
处理器401还用于判断第一区域是否包括表情分类标识列表中任意相邻的两个表情分类标识所在的第二区域;若判断出第一区域包括表情分类标识列表中的第二区域,则在第一表情分类标识与第二区域部分或者全部重叠时,在相邻的两个表情分类标识之间显示空闲区域;将第一表情分类标识移动至空闲区域。
处理器401还用于控制处于目标位置的第一表情分类标识的第一侧的表情分类标识向第一方向移动,其中,第一方向为第一表情分类标识至第 一侧的表情分类标识的方向;和/或控制处于目标位置的第一表情分类标识的第二侧的表情分类标识向第二方向移动,其中,第二方向为第一表情分类标识至第二侧的表情分类标识的方向,其中,第一方向与第二方向相反。
处理器401还用于在第一操作为将第一表情分类标识移动至空闲区域时,在第一表情分类标识移动至第一区域以外的区域时,删除第一表情分类标识。
处理器401还用于在获取移动指令之后,控制第一表情分类标识从表情分类标识列表中的初始区域移动至第一区域;控制表情分类标识列表中除第一表情分类标识以外的其他表情分类标识按照指向初始区域的方向依次移动。
处理器401还用于在控制表情分类标识列表中除第一表情分类标识以外的其他表情分类标识按照指向初始区域的方向依次移动之后,判断其他表情分类标识的数量是否小于表情分类标识列表中用于显示表情分类标识的区域的数量;如果其他表情分类标识的数量小于表情分类标识列表中用于显示表情分类标识的区域的数量,则将表情分类标识列表中未显示表情分类标识的区域显示为空闲区域;如果其他表情分类标识的数量大于或等于表情分类标识列表中用于显示表情分类标识的区域的数量,则将第二表情分类标识显示在表情分类标识列表中,其中,第二表情分类标识在移动第一表情分类标识之前未显示在表情分类标识列表中。
处理器401还用于在获取移动指令之后,放大第一表情分类标识,得到放大后的第一表情分类标识;显示放大后的第一表情分类标识。
处理器401还用于在第一区域中删除第一表情分类标识,其中,在删除第一表情分类标识的时候,在表情分类标识列表中删除第一表情分类标识包括的多个表情图标。
处理器401还用于在第一表情分类标识向删除图标移动时,控制删除图标的显示颜色逐渐加深,并在第一表情分类标识处于删除图标所在的预 设区域时控制删除图标从第一图标变为第二图标。
可选地,本实施例中的具体示例可以参考上述实施例1和实施例2中所描述的示例,本实施例在此不再赘述。
实施例4
本发明的实施例还提供了一种存储介质。可选地,在本实施例中,上述存储介质可以用于存储本发明实施例的表情分类标识的控制方法的程序代码。
可选地,在本实施例中,上述存储介质可以位于移动通信网络、广域网、城域网或局域网的网络中的多个网络设备中的至少一个网络设备。
可选地,在本实施例中,存储介质被设置为存储用于执行以下步骤的程序代码:
S1,获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;
S2,获取所述目标位置所在的第一区域所指示的第一操作;
S3,对所述第一表情分类标识执行所述第一操作。
可选地,在本实施例中,上述存储介质可以包括但不限于:U盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、移动硬盘、磁碟或者光盘等各种可以存储程序代码的介质。
可选地,本实施例中的具体示例可以参考上述实施例1和实施例2中所描述的示例,本实施例在此不再赘述。
上述本发明实施例序号仅仅为了描述,不代表实施例的优劣。
上述实施例中的集成的单元如果以软件操作单元的形式实现并作为独立的产品销售或使用时,可以存储在上述计算机可读取的存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在存储介质中,包括若干指令用以使得一台或多台计算机设备(可为个人计算机、服务器或者网络设备等)执行本发明各个实施例所述方法的全部或部分步骤。
在本发明的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的几个实施例中,应该理解到,所揭露的客户端,可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如所述单元的划分,仅仅为一种逻辑操作划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个***,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本发明各个实施例中的各操作单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件操作单元的形式实现。
以上所述仅是本发明的优选实施方式,应当指出,对于本技术领域的 普通技术人员来说,在不脱离本发明原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本发明的保护范围。

Claims (19)

  1. 一种表情分类标识的控制方法,包括:
    获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;
    获取所述目标位置所在的第一区域所指示的第一操作;
    对所述第一表情分类标识执行所述第一操作。
  2. 根据权利要求1所述的方法,其中,对所述第一表情分类标识执行所述第一操作包括:
    判断所述第一区域是否包括所述表情分类标识列表中任意相邻的两个表情分类标识所在的第二区域;
    若判断出所述第一区域包括所述表情分类标识列表中的所述第二区域,则在所述第一表情分类标识与所述第二区域部分或者全部重叠时,在所述相邻的两个表情分类标识之间显示空闲区域;
    将所述第一表情分类标识移动至所述空闲区域。
  3. 根据权利要求2所述的方法,其中,在所述相邻的两个表情分类标识之间显示空闲区域包括:
    控制处于所述目标位置的所述第一表情分类标识的第一侧的表情分类标识向第一方向移动,其中,所述第一方向为所述第一表情分类标识至所述第一侧的表情分类标识的方向;和/或
    控制处于所述目标位置的所述第一表情分类标识的第二侧的表情分类标识向第二方向移动,其中,所述第二方向为所述第一表情分类标识至所述第二侧的表情分类标识的方向,其中,所述第一方向与所述第二方向相反。
  4. 根据权利要求2所述的方法,其中,在所述第一操作为将所述第一表情分类标识移动至所述空闲区域时,在所述第一表情分类标识移动至所述第一区域以外的区域时,删除所述第一表情分类标识。
  5. 根据权利要求1所述的方法,其中,在获取移动指令之后,所述方法还包括:
    控制所述第一表情分类标识从所述表情分类标识列表中的初始区域移动至所述第一区域;
    控制所述表情分类标识列表中除所述第一表情分类标识以外的其他表情分类标识按照指向所述初始区域的方向依次移动。
  6. 根据权利要求5所述的方法,其中,在控制所述表情分类标识列表中除所述第一表情分类标识以外的其他表情分类标识按照指向所述初始区域的方向依次移动之后,所述方法还包括:
    判断所述其他表情分类标识的数量是否小于所述表情分类标识列表中用于显示表情分类标识的区域的数量;
    如果所述其他表情分类标识的数量小于所述表情分类标识列表中用于显示表情分类标识的区域的数量,则将所述表情分类标识列表中未显示表情分类标识的区域显示为空闲区域;
    如果所述其他表情分类标识的数量大于或等于所述表情分类标识列表中用于显示表情分类标识的区域的数量,则将第二表情分类标识显示在所述表情分类标识列表中,其中,所述第二表情分类标识在移动所述第一表情分类标识之前未显示在所述表情分类标识列表中。
  7. 根据权利要求1所述的方法,其中,在获取移动指令之后,所述方法还包括:
    放大所述第一表情分类标识,得到放大后的第一表情分类标识;
    显示所述放大后的第一表情分类标识。
  8. 根据权利要求1所述的方法,其中,对所述第一表情分类标识执行所述第一操作包括:
    在所述第一区域中删除所述第一表情分类标识,其中,在删除所述第一表情分类标识的时候,在所述表情分类标识列表中删除所述第一表情分类标识包括的所述多个表情图标。
  9. 根据权利要求8所述的方法,其中,在所述第一区域中显示删除图标,在所述第一区域中删除所述第一表情分类标识包括:
    在所述第一表情分类标识向所述删除图标移动时,控制所述删除图标的显示颜色逐渐加深,并在所述第一表情分类标识处于所述删除图标所在的预设区域时控制所述删除图标从第一图标变为第二图标。
  10. 一种表情分类标识的控制装置,包括:
    第一获取单元,用于获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;
    第二获取单元,用于获取所述目标位置所在的第一区域所指示的第一操作;
    执行单元,用于对所述第一表情分类标识执行所述第一操作。
  11. 根据权利要求10所述的装置,其中,所述执行单元包括:
    判断模块,用于判断所述第一区域是否包括所述表情分类标识列表中任意相邻的两个表情分类标识所在的第二区域;
    显示模块,用于在判断出所述第一区域包括所述表情分类标识列表中的所述第二区域的情况下,在所述第一表情分类标识与所述第二区域部分或者全部重叠时,在所述相邻的两个表情分类标识之间显示空闲区域;
    移动模块,用于将所述第一表情分类标识移动至所述空闲区域。
  12. 根据权利要求11所述的装置,其中,所述显示模块包括:
    第一控制子模块,用于控制处于所述目标位置的所述第一表情分 类标识的第一侧的表情分类标识向第一方向移动,其中,所述第一方向为所述第一表情分类标识至所述第一侧的表情分类标识的方向;和/或
    第二控制子模块,用于控制处于所述目标位置的所述第一表情分类标识的第二侧的表情分类标识向第二方向移动,其中,所述第二方向为所述第一表情分类标识至所述第二侧的表情分类标识的方向,其中,所述第一方向与所述第二方向相反。
  13. 根据权利要求11所述的装置,其中,所述装置还包括:
    删除单元,用于在所述第一操作为将所述第一表情分类标识移动至所述空闲区域时,在所述第一表情分类标识移动至所述第一区域以外的区域时,删除所述第一表情分类标识。
  14. 根据权利要求10所述的装置,其中,所述装置还包括:
    第一控制单元,用于在获取移动指令之后,控制所述第一表情分类标识从所述表情分类标识列表中的初始区域移动至所述第一区域;
    第二控制单元,用于控制所述表情分类标识列表中除所述第一表情分类标识以外的其他表情分类标识按照指向所述初始区域的方向依次移动。
  15. 根据权利要求14所述的装置,其中,所述装置还包括:
    判断单元,用于在控制所述表情分类标识列表中除所述第一表情分类标识以外的其他表情分类标识按照指向所述初始区域的方向依次移动之后,判断所述其他表情分类标识的数量是否小于所述表情分类标识列表中用于显示表情分类标识的区域的数量;
    第一显示单元,用于在所述其他表情分类标识的数量小于所述表情分类标识列表中用于显示表情分类标识的区域的数量的情况下,将所述表情分类标识列表中未显示表情分类标识的区域显示为空闲区域;
    第二显示单元,用于在所述其他表情分类标识的数量大于或等于 所述表情分类标识列表中用于显示表情分类标识的区域的数量的情况下,将第二表情分类标识显示在所述表情分类标识列表中,其中,所述第二表情分类标识在移动所述第一表情分类标识之前未显示在所述表情分类标识列表中。
  16. 根据权利要求10所述的装置,其中,所述装置还包括:
    放大单元,用于在获取移动指令之后,放大所述第一表情分类标识,得到放大后的第一表情分类标识;
    第三显示单元,用于显示所述放大后的第一表情分类标识。
  17. 根据权利要求10所述的装置,其中,所述执行单元包括:
    删除模块,用于在所述第一区域中删除所述第一表情分类标识,其中,在删除所述第一表情分类标识的时候,在所述表情分类标识列表中删除所述第一表情分类标识包括的所述多个表情图标。
  18. 根据权利要求17所述的装置,其中,在所述第一区域中显示删除图标,所述删除模块包括:
    第三控制子模块,用于在所述第一表情分类标识向所述删除图标移动时,控制所述删除图标的显示颜色逐渐加深,并在所述第一表情分类标识处于所述删除图标所在的预设区域时控制所述删除图标从第一图标变为第二图标。
  19. 一种存储介质,所述存储介质被设置为存储用于执行以下步骤的程序代码,包括:
    获取移动指令,其中,所述移动指令用于将位于表情面板上的第一表情分类标识移动至目标位置,所述表情面板上显示有包括所述第一表情分类标识在内的一个或多个表情分类标识,并且所述第一表情分类标识包括一个或多个表情图标;
    获取所述目标位置所在的第一区域所指示的第一操作;
    对所述第一表情分类标识执行所述第一操作。
PCT/CN2017/074795 2016-04-15 2017-02-24 表情分类标识的控制方法、装置和存储介质 WO2017177770A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/112,473 US20180365527A1 (en) 2016-04-15 2018-08-24 Method and device for controlling expression classification identifier, and a storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610235335.0 2016-04-15
CN201610235335 2016-04-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/112,473 Continuation US20180365527A1 (en) 2016-04-15 2018-08-24 Method and device for controlling expression classification identifier, and a storage medium

Publications (1)

Publication Number Publication Date
WO2017177770A1 true WO2017177770A1 (zh) 2017-10-19

Family

ID=56841066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/074795 WO2017177770A1 (zh) 2016-04-15 2017-02-24 表情分类标识的控制方法、装置和存储介质

Country Status (3)

Country Link
US (1) US20180365527A1 (zh)
CN (1) CN105930828B (zh)
WO (1) WO2017177770A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276406A (zh) * 2019-06-26 2019-09-24 腾讯科技(深圳)有限公司 表情分类方法、装置、计算机设备及存储介质

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107479784B (zh) 2017-07-31 2022-01-25 腾讯科技(深圳)有限公司 表情展示方法、装置及计算机可读存储介质
CN110134452B (zh) * 2018-02-09 2022-10-25 阿里巴巴集团控股有限公司 对象处理方法及装置
CN109510897B (zh) * 2018-10-25 2021-04-27 维沃移动通信有限公司 一种表情图片管理方法及移动终端
CN109947321A (zh) * 2019-03-15 2019-06-28 努比亚技术有限公司 界面显示方法、可穿戴设备及计算机可读存储介质
WO2020211958A1 (en) * 2019-04-19 2020-10-22 Toyota Motor Europe Neural menu navigator and navigation methods
KR20210135683A (ko) 2020-05-06 2021-11-16 라인플러스 주식회사 인터넷 전화 기반 통화 중 리액션을 표시하는 방법, 시스템, 및 컴퓨터 프로그램
CN114553810A (zh) * 2022-02-22 2022-05-27 广州博冠信息科技有限公司 表情图片合成方法及装置、电子设备
CN114840117A (zh) * 2022-05-10 2022-08-02 北京字跳网络技术有限公司 信息输入页面的元素控制方法、装置、设备、介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101252549A (zh) * 2008-03-27 2008-08-27 腾讯科技(深圳)有限公司 表情图片缩略图的位置调整方法及***
CN102265586A (zh) * 2008-11-19 2011-11-30 苹果公司 使用表情字符的便携式触摸屏设备、方法和图形用户界面
CN103226473A (zh) * 2013-04-08 2013-07-31 北京小米科技有限责任公司 一种整理图标的方法、装置和设备
CN105446620A (zh) * 2015-11-17 2016-03-30 厦门飞信网络科技有限公司 一种图标整理方法及装置

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7853868B2 (en) * 2005-09-02 2010-12-14 Microsoft Corporation Button for adding a new tabbed sheet
US8059100B2 (en) * 2005-11-17 2011-11-15 Lg Electronics Inc. Method for allocating/arranging keys on touch-screen, and mobile terminal for use of the same
TWI351638B (en) * 2007-04-27 2011-11-01 Htc Corp Touch-based tab navigation method and related devi
US8631340B2 (en) * 2008-06-25 2014-01-14 Microsoft Corporation Tab management in a user interface window
JP2011058816A (ja) * 2009-09-07 2011-03-24 Yokogawa Electric Corp 測定装置
KR101682710B1 (ko) * 2009-11-17 2016-12-05 엘지전자 주식회사 네트워크에 접속 가능한 tv를 이용한 광고 방법
US20130024781A1 (en) * 2011-07-22 2013-01-24 Sony Corporation Multi-Modal and Updating Interface for Messaging
US10102567B2 (en) * 2012-06-07 2018-10-16 Google Llc User curated collections for an online application environment
US10354004B2 (en) * 2012-06-07 2019-07-16 Apple Inc. Intelligent presentation of documents
TWI483174B (zh) * 2012-12-12 2015-05-01 Acer Inc 網頁的群組管理方法
CN104424221B (zh) * 2013-08-23 2019-02-05 联想(北京)有限公司 一种信息处理方法及电子设备
KR20150057341A (ko) * 2013-11-19 2015-05-28 엘지전자 주식회사 이동 단말기 및 이의 제어 방법
CN104935491B (zh) * 2014-03-17 2018-08-07 腾讯科技(深圳)有限公司 一种发送表情图像的方法及装置
JP6413391B2 (ja) * 2014-06-27 2018-10-31 富士通株式会社 変換装置、変換プログラム、及び変換方法
US10203843B2 (en) * 2015-09-21 2019-02-12 Microsoft Technology Licensing, Llc Facilitating selection of attribute values for graphical elements
US10225602B1 (en) * 2016-12-30 2019-03-05 Jamdeo Canada Ltd. System and method for digital television operation and control-contextual interface

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101252549A (zh) * 2008-03-27 2008-08-27 腾讯科技(深圳)有限公司 表情图片缩略图的位置调整方法及***
CN102265586A (zh) * 2008-11-19 2011-11-30 苹果公司 使用表情字符的便携式触摸屏设备、方法和图形用户界面
CN103226473A (zh) * 2013-04-08 2013-07-31 北京小米科技有限责任公司 一种整理图标的方法、装置和设备
CN105446620A (zh) * 2015-11-17 2016-03-30 厦门飞信网络科技有限公司 一种图标整理方法及装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276406A (zh) * 2019-06-26 2019-09-24 腾讯科技(深圳)有限公司 表情分类方法、装置、计算机设备及存储介质
CN110276406B (zh) * 2019-06-26 2023-09-01 腾讯科技(深圳)有限公司 表情分类方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
CN105930828A (zh) 2016-09-07
US20180365527A1 (en) 2018-12-20
CN105930828B (zh) 2021-05-14

Similar Documents

Publication Publication Date Title
WO2017177770A1 (zh) 表情分类标识的控制方法、装置和存储介质
US20230088677A1 (en) Information sending method and apparatus and electronic device
EP3675460B1 (en) File sharing method and terminal
US20180088784A1 (en) Method and Device for Sharing Content
CN105975201B (zh) 移动终端及其分屏处理方法
US10942616B2 (en) Multimedia resource management method and apparatus, and storage medium
CA3053980A1 (en) Task management based on instant communication message
CN111381739B (zh) 应用图标显示方法、装置、电子设备及存储介质
WO2019015582A1 (zh) 信息提示方法及移动终端
CN110019058B (zh) 文件操作的共享方法和装置
WO2022156668A1 (zh) 信息处理方法和电子设备
WO2023131055A1 (zh) 消息发送方法、装置和电子设备
CN105487752A (zh) 一种应用控制方法及应用该方法的终端
JP2021528706A (ja) ファイル伝送方法、装置、およびコンピュータ読み取り可能な記憶媒体
WO2016155145A1 (zh) 信息共享方法和装置
CN106302932A (zh) 在通信终端中查看通讯记录的方法和设备
CN115357158A (zh) 消息处理方法、装置、电子设备及存储介质
WO2024109731A1 (zh) 内容分享方法、装置、电子设备和可读存储介质
WO2017045652A2 (zh) 一种信息内容的显示方法及相应的终端设备
WO2024093815A1 (zh) 数据共享方法、装置、电子设备及介质
WO2024037419A1 (zh) 显示控制方法、装置、电子设备及可读存储介质
CN111324262B (zh) 一种应用界面控制方法、装置、终端及介质
CN113051018A (zh) 会议***的控制方法、装置、设备和介质
CN115361354A (zh) 消息处理方法、装置、电子设备及可读存储介质
CN112581102A (zh) 任务管理方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17781746

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17781746

Country of ref document: EP

Kind code of ref document: A1