CN110851040A - Information processing method and electronic equipment

Info

Publication number
CN110851040A
Authority
CN
China
Prior art keywords
information
control
input
target
hover
Legal status
Granted
Application number
CN201911031661.XA
Other languages
Chinese (zh)
Other versions
CN110851040B (en)
Inventor
严超
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911031661.XA
Publication of CN110851040A
Priority to PCT/CN2020/113010 (WO2021082716A1)
Application granted
Publication of CN110851040B
Legal status: Active

Classifications

    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with interactive means for internal management of messages
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector


Abstract

The invention relates to the technical field of communications and provides an information processing method and an electronic device. The method includes: receiving a first input for first information; in response to the first input, displaying a first hover control corresponding to the first information; receiving a second input for a target hover control; and in response to the second input, sending target information corresponding to the target hover control, where the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control. Embodiments of the invention can simplify the information-sending process.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an information processing method and an electronic device.
Background
With the development of electronic devices, their functions have multiplied and they have become indispensable tools. Various types of information, such as documents, pictures, and videos, are typically stored on electronic devices. At present, to send information, an electronic device must enter the directory where each piece of information is located and execute a separate sending operation for that piece of information, which makes the information-sending process cumbersome.
Disclosure of Invention
Embodiments of the invention provide an information processing method and an electronic device, aiming to solve the problem in the prior art that the information-sending process is cumbersome.
To solve this technical problem, the invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an information processing method applied to an electronic device, including:
receiving a first input for first information;
in response to the first input, displaying a first hover control corresponding to the first information;
receiving a second input for the target hover control;
in response to the second input, sending target information corresponding to the target hover control;
wherein the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control.
In a second aspect, an embodiment of the present invention provides an electronic device, including:
a first receiving module, configured to receive a first input for first information;
a first display module, configured to display, in response to the first input, a first hover control corresponding to the first information;
a second receiving module, configured to receive a second input for a target hover control;
a sending module, configured to send, in response to the second input, target information corresponding to the target hover control;
wherein the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the information processing method according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the information processing method according to the first aspect.
In an embodiment of the present invention, a first input for first information is received; in response to the first input, a first hover control corresponding to the first information is displayed; a second input for a target hover control is received; and in response to the second input, target information corresponding to the target hover control is sent, where the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control. Because the target information is sent through the target hover control, the information-sending process is simplified.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of an information processing method according to an embodiment of the present invention;
Fig. 2 is a first schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 3 is a second schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 4 is a third schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 5 is a fourth schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 6 is a fifth schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 7 is a sixth schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 8 is a seventh schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 9 is an eighth schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 10 is a ninth schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 11 is a tenth schematic view of a display interface of an electronic device according to an embodiment of the present invention;
Fig. 12 is a first schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 13 is a second schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 14 is a third schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 15 is a fourth schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 16 is a fifth schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 17 is a sixth schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings of those embodiments. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments that a person skilled in the art can derive from them without creative effort fall within the protection scope of the present invention.
In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
Referring to fig. 1, fig. 1 is a flowchart of an information processing method provided by an embodiment of the present invention, where the method is applied to an electronic device, and as shown in fig. 1, the method includes the following steps:
step 101, receiving a first input for first information.
The first information may include various types of files, such as text documents, pictures, videos, audio, and compressed packages, and may further include a web link; the first information is not limited in the embodiments of the present invention. The first input may be an operation of pressing the first information, an operation of clicking it, an operation of moving it along a preset track, or an operation of selecting the first information and confirming display of a hover control corresponding to it. The specific form of the first input is not limited in the embodiments of the present invention.
It should be noted that, in practical applications, when the first information is a file, a selection control may be displayed after an input pressing the file is received; the selection control is used to choose whether to generate a hover control corresponding to the file, and after a confirmation operation is received, the hover control corresponding to the file may be displayed. For example, a hover control corresponding to a picture may be displayed, or, as shown in fig. 2, a hover control corresponding to a selected segment of text may be displayed. For example, as shown in fig. 3, when the hover control is a hover ball, a prompt box asking whether to convert the file into a hover ball is displayed, and when a confirmation operation is received, the hover ball corresponding to the file is displayed.
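The flow above (press the information, confirm, then show a hover control linked to the information's storage address) can be modelled with a short Kotlin sketch. This is only an illustrative assumption of one possible data model; the Info, HoverControl, and onFirstInput names are hypothetical and do not come from the patent.

```kotlin
// Hypothetical sketch, not the patent's actual implementation: model the
// "first input -> confirm -> show hover control" flow described above.
data class Info(val id: String, val type: String, val storagePath: String)

data class HoverControl(
    val linkedStoragePath: String,   // the control is associated with the info's storage address
    var x: Float = 0f,
    var y: Float = 0f,
    var visible: Boolean = false
)

// `confirm` stands in for the prompt box ("convert this file into a hover ball?").
fun onFirstInput(info: Info, confirm: () -> Boolean): HoverControl? {
    if (!confirm()) return null                 // user declined; nothing is displayed
    return HoverControl(linkedStoragePath = info.storagePath, visible = true)
}

fun main() {
    val picture = Info("img_001", "picture", "/storage/pictures/img_001.jpg")
    val ball = onFirstInput(picture) { true }   // simulate the user confirming
    println("Hover ball shown: ${ball != null}, linked to ${ball?.linkedStoragePath}")
}
```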
Step 102, in response to the first input, displaying a first hover control corresponding to the first information.
The first hover control may have any shape, for example a circle, and may be displayed at a preset transparency. The first hover control is associated with the first information and may be associated with the storage address of the first information; displaying the first hover control does not change the storage location or the content of the first information. Operations such as copying and sending the first information can be performed through the first hover control.
In addition, the first hover control may be moved to a certain position and stay there; alternatively, after being moved, it may automatically snap to an edge area of the display interface of the electronic device. For example, as shown in fig. 4, when the electronic device is a mobile phone, the first hover control may automatically snap to whichever of the two side edges of the display interface is nearer to it, so that it does not interfere with the arrangement or daily use of the application icons on the phone. If the first hover control is not moved, it may remain at an initial position, which may be a preset position.
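A minimal sketch of the edge-snapping behaviour, assuming the control simply snaps horizontally to the nearer side edge while keeping its vertical position; the screen width and margin values are illustrative, not taken from the patent.

```kotlin
// Hypothetical sketch of the edge-snapping behaviour described above: after the
// user releases a hover control, it drifts to the nearest side edge of the screen.
data class Point(val x: Float, val y: Float)

fun snapToNearestEdge(release: Point, screenWidth: Float, margin: Float = 16f): Point {
    val nearLeft = release.x <= screenWidth / 2
    val snappedX = if (nearLeft) margin else screenWidth - margin
    return Point(snappedX, release.y)   // keep the vertical position, snap horizontally
}

fun main() {
    val dropped = Point(x = 250f, y = 900f)
    println(snapToNearestEdge(dropped, screenWidth = 1080f))  // -> Point(x=16.0, y=900.0)
}
```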
Step 103, receiving a second input for a target hover control, where the target hover control is the first hover control or a second hover control displayed in response to a third input for the first hover control.
The second input may be an operation of pressing the target hover control or of moving it along a preset trajectory; alternatively, a send button may be arranged on the target hover control, and the second input is a click on that button. The specific form of the second input is not limited in the embodiments of the present invention. The second hover control may have any shape, for example a circle, and may be displayed at a preset transparency. The second hover control is displayed in response to a third input to the first hover control. The third input may be an input that merges the first hover control with other hover controls, for example moving the first hover control until it touches another hover control and then merging them, in which case the second hover control is the merged hover control; or it may be an input that splits a hover control, for example double-clicking the first hover control to split it into several hover controls, in which case the second hover controls are the controls obtained after splitting. The third input is not limited in the embodiments of the present invention.
Step 104, in response to the second input, sending target information corresponding to the target hover control.
If the target hover control is the first hover control, the target information may be the first information; if the target hover control is a second hover control, the target information is the information corresponding to the second hover control. The target information may be sent through the target hover control. For example, in an application on the electronic device that contains multiple contacts, the target hover control may be moved onto a contact's name or avatar to send the target information to that contact; or, when communicating with a target object through an application, an input moving the target hover control into the chat input box for that target object may be received, and the target information is then sent to the target object through the application.
It should be noted that, in practical applications, a configuration interface for hover controls may be provided, on which the display size, transparency, color, and so on of the hover controls can be set. For example, when the hover control is a hover ball, the hover ball can be enabled on a settings interface as shown in fig. 5, and, as shown in fig. 6, the display size, transparency, and color of the hover balls corresponding to different types of information can be set, so that different styles of hover control are used for different types of information. If the user makes no custom settings, the hover control uses default settings. The first hover control can be deleted, for example by double-clicking it, and all hover controls, including the first hover control, can also be cleared. The hover controls may be cleared when an operation of pressing the screen along a preset track is received, for example a three-finger downward swipe, and a one-tap clearing mode may be provided on the configuration interface of the hover controls. Hover controls can also be hidden or shown: for example, a hide button with two states, hidden and shown, may be provided, and setting the state of that button selects whether the hover controls are hidden or displayed.
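A small sketch of how per-type hover-ball settings (size, transparency, color) and a global hide switch could be represented, assuming a simple map keyed by information type; all field names and default values are assumptions made for illustration.

```kotlin
// Hypothetical sketch of the per-information-type configuration described above
// (size, transparency, color, plus a global hide switch).
data class HoverBallStyle(
    val sizeDp: Int = 48,
    val transparency: Float = 0.3f,   // 0.0 = opaque, 1.0 = fully transparent
    val color: String = "#2196F3"
)

class HoverBallSettings {
    private val styles = mutableMapOf<String, HoverBallStyle>()
    var hidden: Boolean = false

    fun styleFor(infoType: String): HoverBallStyle =
        styles[infoType] ?: HoverBallStyle()          // fall back to the default style

    fun setStyle(infoType: String, style: HoverBallStyle) {
        styles[infoType] = style
    }
}

fun main() {
    val settings = HoverBallSettings()
    settings.setStyle("picture", HoverBallStyle(sizeDp = 56, color = "#FF5722"))
    println(settings.styleFor("picture"))
    println(settings.styleFor("document"))   // not customised, so defaults are used
}
```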
After the first hover control is displayed, preview information for the first information may be displayed once an input pressing or clicking the first hover control is received. The first information can thus be previewed through the first hover control, which makes the user's operation more convenient. The preview may show part or all of the first information: for example, when the first information is a segment of text, that text may be displayed; as shown in fig. 7, when the first information is a picture, the picture may be displayed; and when the first information is a document, the first page of the document may be displayed.
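The type-dependent preview just described can be sketched as a simple dispatch on the information type; the Info hierarchy and the returned strings below are illustrative assumptions rather than the patent's implementation.

```kotlin
// Hypothetical sketch of the preview behaviour described above: pressing or
// clicking the hover control shows part or all of the linked information,
// depending on its type.
sealed class Info {
    data class Text(val content: String) : Info()
    data class Picture(val path: String) : Info()
    data class Document(val path: String, val firstPage: String) : Info()
}

fun previewOf(info: Info): String = when (info) {
    is Info.Text     -> info.content            // show the whole text segment
    is Info.Picture  -> "show picture ${info.path}"
    is Info.Document -> info.firstPage          // show only the document's first page
}

fun main() {
    println(previewOf(Info.Text("Meet at 10:00 tomorrow")))
    println(previewOf(Info.Document("/docs/report.docx", firstPage = "Q3 Report - Summary")))
}
```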
In an embodiment of the present invention, a first input for first information is received; in response to the first input, a first hover control corresponding to the first information is displayed; a second input for a target hover control is received; and in response to the second input, target information corresponding to the target hover control is sent, where the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control. Because the target information is sent through the target hover control, the information-sending process is simplified.
Optionally, the method further includes:
receiving a fourth input for second information;
in response to the fourth input, displaying a third hover control corresponding to the second information;
receiving a fifth input of moving the third hover control to a first target position of a target file while editing the target file;
in response to the fifth input, adding the second information in the target file.
The second information may include various types of files, such as text documents, pictures, videos, audio, and compressed packages, and may further include a web link; the second information is not limited in the embodiments of the present invention. The fourth input may be an operation of pressing the second information, an operation of clicking it, an operation of moving it along a preset track, or an operation of selecting the second information and confirming display of a hover control corresponding to it. The specific form of the fourth input is not limited in the embodiments of the present invention. The third hover control may have any shape, for example a circle, and may be displayed at a preset transparency. The third hover control corresponds to the second information and may be associated with the storage address of the second information.
In addition, the first target position may be the position of the cursor in the target file, a position in the target file where the user wants to insert the second information, or the position of the target file's icon. For example, as shown in fig. 8, when a document is being edited, a hover control corresponding to a piece of text or a hover control corresponding to a picture may be moved to the cursor position. When the second information is added to the target file, if the first target position is inside the target file, the second information may be added at the first target position; if the first target position is the position of the target file's icon, the second information may be added at a preset position of the target file, for example at its end or at its beginning. For example, when the electronic device is a mobile phone, its small size makes editing text in a document inconvenient, so being able to insert information into the document quickly by moving the third hover control is convenient.
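A brief sketch of the two insertion cases above (drop inside the document versus drop on the file's icon), using a plain string buffer as a stand-in for the target file; the types and the choice of the file's end as the preset position are assumptions for illustration.

```kotlin
// Hypothetical sketch of "drop the hover control onto the file being edited":
// if the drop lands inside the document, insert at that offset; if it lands on
// the file's icon, append at a preset position (the end, in this sketch).
data class DropTarget(val insideDocument: Boolean, val cursorOffset: Int)

fun addToFile(document: StringBuilder, payload: String, target: DropTarget): StringBuilder {
    return if (target.insideDocument) {
        document.insert(target.cursorOffset, payload)       // insert at the cursor / drop offset
    } else {
        document.append(payload)                            // dropped on the icon: preset position = end
    }
}

fun main() {
    val doc = StringBuilder("Quarterly report: ")
    addToFile(doc, "[chart.png]", DropTarget(insideDocument = true, cursorOffset = doc.length))
    println(doc)   // Quarterly report: [chart.png]
}
```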
In this embodiment, a fourth input for second information is received; in response to the fourth input, a third hover control corresponding to the second information is displayed; while a target file is being edited, a fifth input moving the third hover control to a first target position of the target file is received; and in response to the fifth input, the second information is added to the target file. This simplifies the information processing flow, improves file-editing efficiency, and makes the user's operation more convenient.
Optionally, the second input is an input moving the target hover control to a second target position;
the sending of the target information corresponding to the target hover control includes:
sending the target information to a target object corresponding to the second target position.
The second target position may be an input box used for communicating with the target object, or the position of the target object itself, for example the target object's avatar or name; the second target position is not limited in the embodiments of the present invention. Communication with the target object may take place through an application, and there may be one or more target objects. When there is one target object, an input moving the target hover control into the input box used for communicating with that object may be received while communicating through the application, so that the target information is sent to the target object. For example, as shown in fig. 9, while chatting with a target object in chat software, an input moving the target hover control into the input box is received, and the target information can be sent to that target object. When there are several target objects, an input selecting the several target objects may be received first, and then an input moving the target hover control onto one of them, so that the target information is sent to all of them at the same time. After an input moving the target hover control into the second target position is received, the target information may be sent to the target object through the application.
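A sketch of resolving the second target position into recipients, assuming the drop either lands in a chat input box or on a contact, and that a prior multi-selection broadcasts to every selected contact; the DropZone model and the send stand-in are hypothetical.

```kotlin
// Hypothetical sketch of resolving the "second target position" described above:
// the drop may land on a chat input box or on a contact's avatar/name, and when
// several contacts were pre-selected the information goes to all of them.
sealed class DropZone {
    data class ChatInputBox(val contact: String) : DropZone()
    data class ContactAvatar(val contact: String) : DropZone()
}

fun resolveRecipients(zone: DropZone, preselected: List<String>): List<String> = when (zone) {
    is DropZone.ChatInputBox  -> listOf(zone.contact)
    is DropZone.ContactAvatar ->
        if (preselected.isNotEmpty()) preselected else listOf(zone.contact)
}

fun send(targetInfo: String, recipients: List<String>) {
    recipients.forEach { println("sending '$targetInfo' to $it") }   // stand-in for the app's send call
}

fun main() {
    // Drop onto one avatar after selecting three contacts: all three receive the information.
    send("holiday.jpg", resolveRecipients(DropZone.ContactAvatar("Alice"), listOf("Alice", "Bob", "Carol")))
}
```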
In this embodiment, the second input is an input moving the target hover control to a second target position, and the target information is sent to the target object corresponding to that position. Sending the target information by moving the target hover control simplifies the sending process and improves sending efficiency.
Optionally, before receiving the second input for the target hover control, the method further includes:
receiving a sixth input for third information;
in response to the sixth input, displaying a fourth hover control corresponding to the third information;
the third input is an input for the first hover control and the fourth hover control;
in response to the third input, ceasing to display the first hover control and the fourth hover control on a display interface and displaying a second hover control, where the second hover control corresponds to fourth information, the fourth information being obtained by combining the first information and the third information.
The third information may include various types of files, such as text documents, pictures, videos, audio, and compressed packages, and may further include a web link; the third information is not limited in the embodiments of the present invention. The sixth input may be an operation of pressing the third information, an operation of clicking it, an operation of moving it along a preset track, or an operation of selecting the third information and confirming display of a hover control corresponding to it. The specific form of the sixth input is not limited in the embodiments of the present invention.
The fourth hover control may have any shape, for example a circle, and may be displayed at a preset transparency. The fourth hover control is associated with the third information and may be associated with the storage address of the third information; displaying the fourth hover control does not change the storage location or the content of the third information. Operations such as copying and sending the third information can be performed through the fourth hover control. The fourth hover control may be located on the same display interface as the first hover control.
The third input may be moving the first hover control and/or the fourth hover control so that the two controls touch, so that they partially overlap, or so that they end up in the same area. The third input is not limited in the embodiments of the present invention.
The second hover control may have any shape, for example a circle, and may be displayed at a preset transparency. The first information and the third information can be processed simultaneously through the second hover control. The second hover control may be directly associated with both the first information and the third information; alternatively, the expansion interface corresponding to the second hover control may contain the first hover control and the fourth hover control, so that the second hover control is associated with the first information through the first hover control and with the third information through the fourth hover control. When the second hover control is operated, the first information and the third information can be processed at the same time. For example, when communicating with a target object through an application, receiving an input that moves the second hover control onto the target object or into its input box allows the first information and the third information to be sent to the target object through the application; or, when a file is being edited, receiving an input that moves the second hover control to a target position in the file allows the first information and the third information to be added to the file at the same time.
After the third input for the first hover control and the fourth hover control is received, the first hover control and the fourth hover control may cease to be displayed on the display interface and the second hover control is displayed, so that the first hover control and the fourth hover control are merged. For example, as shown in fig. 10, a hover control corresponding to one piece of text may be merged with a hover control corresponding to another piece of text, and a hover control corresponding to text may also be merged with a hover control corresponding to a picture. Multiple hover controls can be merged into one hover control in the same manner, and the merged second hover control can in turn be merged with further hover controls; this is not limited in the embodiments of the present invention.
The first information and the third information may be of the same type. For example, both may be text, and the fourth information is then the text obtained by combining them; concretely, if the first information is abc and the third information is def, the fourth information is abcdef. Likewise, both may be pictures, and the fourth information is the picture obtained by combining them. The first information and the third information may also be of different types; for example, the first information may be text and the third information a picture, in which case the text can be converted into a picture and the two combined as pictures.
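The combining rules just described (same-type text concatenated, text converted into a picture before being merged with a picture) can be sketched as follows; the Payload types and the textToPicture placeholder are assumptions, not the patent's data structures.

```kotlin
// Hypothetical sketch of combining two pieces of information into the "fourth
// information": same-type text is concatenated; when a picture is involved, any
// text is first converted into a picture and the pictures are merged.
sealed class Payload {
    data class Text(val value: String) : Payload()
    data class Picture(val pixels: List<String>) : Payload()   // stand-in for real image data
}

fun textToPicture(text: Payload.Text) = Payload.Picture(listOf("rendered:${text.value}"))

fun combine(first: Payload, third: Payload): Payload {
    // Same-type text: simple concatenation (abc + def -> abcdef).
    if (first is Payload.Text && third is Payload.Text) {
        return Payload.Text(first.value + third.value)
    }
    // Mixed types or two pictures: convert any text into a picture, then merge as pictures.
    val a = if (first is Payload.Text) textToPicture(first) else first as Payload.Picture
    val b = if (third is Payload.Text) textToPicture(third) else third as Payload.Picture
    return Payload.Picture(a.pixels + b.pixels)
}

fun main() {
    println(combine(Payload.Text("abc"), Payload.Text("def")))        // Text(value=abcdef)
    println(combine(Payload.Text("hi"), Payload.Picture(listOf("img0"))))
}
```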
Further, as shown in fig. 10, the expansion interface corresponding to the second hover control may contain several hover controls, and each hover control may carry a number displayed on it; the number indicates the order in which the hover controls were merged, with lower-numbered controls displayed before higher-numbered ones. As shown in fig. 11, a hover control in the expansion interface corresponding to the second hover control can also be moved out, and the moved-out hover control is removed from the merge.
It should be noted that combinations of information that cannot be merged may be preset; for example, it may be specified that a document and a picture cannot be merged, so as to avoid a poor result after merging. As shown in fig. 10, if it is detected that the user tries to merge the hover control corresponding to a document with the hover control corresponding to a picture, an "x" may be displayed to indicate that merging is not possible.
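A minimal sketch of such a preset rule table; the only blocked pair taken from the text is document plus picture, and the extra pair is purely an illustrative assumption.

```kotlin
// Hypothetical sketch of the preset "cannot be merged" rule mentioned above.
val blockedPairs = setOf(
    setOf("document", "picture"),
    setOf("document", "video")      // illustrative additional rule
)

fun canMerge(typeA: String, typeB: String): Boolean =
    setOf(typeA, typeB) !in blockedPairs

fun main() {
    println(canMerge("text", "picture"))      // true  -> merge proceeds
    println(canMerge("document", "picture"))  // false -> show the "x" prompt instead
}
```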
In this embodiment, a sixth input for third information is received; in response to the sixth input, a fourth hover control corresponding to the third information is displayed; the third input is an input for the first hover control and the fourth hover control; and in response to the third input, the first hover control and the fourth hover control cease to be displayed on the display interface and a second hover control is displayed, where the second hover control corresponds to fourth information obtained by combining the first information and the third information. The first hover control and the fourth hover control can thus be merged into the second hover control by operating on them, which makes the hover controls flexible to operate and improves the user experience.
Optionally, the expansion interface corresponding to the second hover control contains the first hover control and the fourth hover control, and after the target information corresponding to the target hover control is sent, the method further includes:
receiving a seventh input for the second hover control;
in response to the seventh input, ceasing to display the second hover control on the display interface and displaying the first hover control and the fourth hover control.
The seventh input may be an operation of pressing the second hover control or of moving it along a preset track; alternatively, a split button may be arranged on the second hover control, and the seventh input is a click on that button. The specific form of the seventh input is not limited in the embodiments of the present invention. Through the seventh input, all the hover controls that were merged into the second hover control are restored and displayed on the display interface. For example, after the split button is clicked, the first hover control and the fourth hover control that were merged into the second hover control are displayed on the display interface again, and the second hover control is no longer displayed.
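A short sketch of the splitting behaviour, assuming the merged control simply records the controls it absorbed and restores them to the display; the MergedHover and Display types are hypothetical.

```kotlin
// Hypothetical sketch of splitting a merged hover control back into its parts:
// the merged control is removed from the display and the controls recorded in
// its expansion interface are shown again.
data class Hover(val id: String)
data class MergedHover(val id: String, val parts: List<Hover>)

class Display {
    val visible = mutableListOf<Any>()
    fun split(merged: MergedHover) {
        visible.remove(merged)          // the second hover control disappears
        visible.addAll(merged.parts)    // the first and fourth hover controls reappear
    }
}

fun main() {
    val merged = MergedHover("second", listOf(Hover("first"), Hover("fourth")))
    val display = Display().apply { visible.add(merged) }
    display.split(merged)
    println(display.visible)            // [Hover(id=first), Hover(id=fourth)]
}
```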
In this embodiment, a seventh input for the second hover control is received, and in response to the seventh input the second hover control ceases to be displayed on the display interface while the first hover control and the fourth hover control are displayed. A merged hover control can thus be split quickly by operating on it, which makes the user's operation more convenient.
Optionally, the expansion interface corresponding to the second hover control contains the first hover control and the fourth hover control, and after the second hover control is displayed, the method further includes:
receiving an eighth input;
in response to the eighth input, adjusting the positional relationship of the first hover control and the fourth hover control, and adjusting the positions of the first information and the third information within the fourth information based on the adjusted positional relationship of the first hover control and the fourth hover control.
The eighth input may be an input moving the first hover control or an input moving the fourth hover control. The fourth information may depend on the positions of the first hover control and the fourth hover control: if the first hover control is displayed before the fourth hover control, the first information precedes the third information. For example, if the first hover control is displayed before the fourth hover control, the first information is abc, and the third information is def, the fourth information is abcdef; if the fourth hover control is displayed before the first hover control, the fourth information is defabc. Moving the first hover control and the fourth hover control relative to each other changes their displayed positional relationship and therefore changes the fourth information.
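The order-dependent merging above (abcdef versus defabc) can be sketched by sorting the component controls by their display position; the Component type is an illustrative assumption.

```kotlin
// Hypothetical sketch of the reordering behaviour described above: the merged
// result follows the display order of the component hover controls, so swapping
// them turns abcdef into defabc.
data class Component(val info: String, var displayIndex: Int)

fun mergedInfo(components: List<Component>): String =
    components.sortedBy { it.displayIndex }.joinToString("") { it.info }

fun main() {
    val first = Component("abc", displayIndex = 0)
    val third = Component("def", displayIndex = 1)
    println(mergedInfo(listOf(first, third)))   // abcdef

    // The eighth input moves the fourth hover control in front of the first one.
    first.displayIndex = 1
    third.displayIndex = 0
    println(mergedInfo(listOf(first, third)))   // defabc
}
```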
In this embodiment, adjusting the positional relationship of the first hover control and the fourth hover control adjusts the positions of the first information and the third information within the fourth information. The operation is convenient, the degree of intelligence is high, and the user experience is good.
Optionally, the first information is text information, the third information is first picture information, and the fourth information is second picture information obtained by combining the text information and the first picture information.
The text information may be converted into a picture and then combined with the first picture information to obtain the second picture information; alternatively, the text information and the first picture information may be merged directly to obtain the second picture information.
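A sketch of the first approach, converting the text into picture content and compositing it with the first picture. It uses the JVM's java.awt imaging classes purely for illustration; the phone's actual graphics stack, the layout (a text band above the picture), and all dimensions are assumptions.

```kotlin
// Hypothetical JVM sketch (java.awt, not the device's real graphics stack) of
// "convert the text into a picture, then combine it with the first picture":
// the text is rendered in a band above the picture on one output image.
import java.awt.Color
import java.awt.image.BufferedImage

fun combineTextAndPicture(text: String, picture: BufferedImage): BufferedImage {
    val textBandHeight = 40
    val out = BufferedImage(picture.width, picture.height + textBandHeight, BufferedImage.TYPE_INT_ARGB)
    val g = out.createGraphics()
    g.color = Color.WHITE
    g.fillRect(0, 0, out.width, textBandHeight)            // background band for the text
    g.color = Color.BLACK
    g.drawString(text, 10, textBandHeight - 15)            // render the text information
    g.drawImage(picture, 0, textBandHeight, null)          // place the first picture below it
    g.dispose()
    return out                                             // the "second picture information"
}

fun main() {
    val picture = BufferedImage(200, 100, BufferedImage.TYPE_INT_ARGB)
    val merged = combineTextAndPicture("abc", picture)
    println("merged picture size: ${merged.width} x ${merged.height}")   // 200 x 140
}
```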
In this embodiment, the first information is text information, the third information is first picture information, and the fourth information is second picture information obtained by combining the two. Text and picture information can therefore be combined, which suits more usage scenarios, offers a high degree of intelligence, and improves the user experience.
Referring to fig. 12, fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 12, the electronic device 200 includes:
a first receiving module 201, configured to receive a first input for first information;
a first display module 202, configured to display, in response to the first input, a first hover control corresponding to the first information;
a second receiving module 203, configured to receive a second input for the target hover control;
a sending module 204, configured to send, in response to the second input, target information corresponding to the target hover control;
wherein the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control.
In the electronic device of this embodiment of the invention, the first receiving module receives a first input for first information; the first display module displays, in response to the first input, a first hover control corresponding to the first information; the second receiving module receives a second input for a target hover control; and the sending module sends, in response to the second input, target information corresponding to the target hover control, where the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control. Because the target information is sent through the target hover control, the information-sending process is simplified.
Optionally, as shown in fig. 13, the electronic device 200 further includes:
a third receiving module 205, configured to receive a fourth input for the second information;
a second display module 206, configured to display, in response to the fourth input, a third hover control corresponding to the second information;
a fourth receiving module 207, configured to receive, while a target file is being edited, a fifth input moving the third hover control to a first target position of the target file;
an adding module 208, configured to add the second information in the target file in response to the fifth input.
Optionally, the second input is an input moving the target hover control to a second target position;
the sending module 204 is specifically configured to:
send the target information to a target object corresponding to the second target position.
Optionally, as shown in fig. 14, the electronic device 200 further includes:
a fifth receiving module 209, configured to receive a sixth input for the third information;
a third display module 210, configured to display, in response to the sixth input, a fourth hover control corresponding to the third information;
a fourth display module 211, configured to, in response to the third input, cease displaying the first hover control and the fourth hover control on a display interface and display a second hover control, where the second hover control corresponds to fourth information, the fourth information is obtained by combining the first information and the third information, and the third input is an input for the first hover control and the fourth hover control.
Optionally, the expansion interface corresponding to the second hover control contains the first hover control and the fourth hover control. As shown in fig. 15, the electronic device 200 further includes:
a sixth receiving module 212, configured to receive a seventh input for the second hover control;
a fifth display module 213, configured to, in response to the seventh input, cease displaying the second hover control on the display interface and display the first hover control and the fourth hover control.
Optionally, the expansion interface corresponding to the second hover control contains the first hover control and the fourth hover control. As shown in fig. 16, the electronic device 200 further includes:
a seventh receiving module 214, configured to receive an eighth input;
an adjusting module 215, configured to, in response to the eighth input, adjust the positional relationship of the first hover control and the fourth hover control, and adjust the positions of the first information and the third information within the fourth information based on the adjusted positional relationship of the first hover control and the fourth hover control.
Optionally, the first information is text information, the third information is first picture information, and the fourth information is second picture information obtained by combining the text information and the first picture information.
The electronic device can implement each process implemented by the electronic device in the method embodiment of fig. 1, and details are not described here to avoid repetition.
Fig. 17 is a schematic diagram of a hardware structure of an electronic device for implementing various embodiments of the present invention.
The electronic device 300 includes, but is not limited to: radio frequency unit 301, network module 302, audio output unit 303, input unit 304, sensor 305, display unit 306, user input unit 307, interface unit 308, memory 309, processor 310, and power supply 311. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 17 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a robot, a wearable device, a pedometer, and the like.
Wherein the user input unit 307 is configured to: receive a first input for first information;
the display unit 306 is configured to: in response to the first input, display a first hover control corresponding to the first information;
the user input unit 307 is further configured to: receive a second input for a target hover control;
the radio frequency unit 301 is configured to: in response to the second input, send target information corresponding to the target hover control;
wherein the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control.
The electronic device 300 can implement the processes implemented by the electronic device in the foregoing embodiments, and in order to avoid repetition, the detailed description is omitted here.
In the electronic device of this embodiment of the invention, the user input unit receives a first input for first information; the display unit displays, in response to the first input, a first hover control corresponding to the first information; the user input unit receives a second input for a target hover control; and the radio frequency unit sends, in response to the second input, target information corresponding to the target hover control, where the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control. Because the target information is sent through the target hover control, the information-sending process is simplified.
Optionally, the user input unit 307 is further configured to: receive a fourth input for second information;
the display unit 306 is further configured to: in response to the fourth input, display a third hover control corresponding to the second information;
the user input unit 307 is further configured to: receive, while a target file is being edited, a fifth input moving the third hover control to a first target position of the target file;
the processor 310 is configured to: in response to the fifth input, add the second information to the target file.
Optionally, the second input is an input moving the target hover control to a second target position;
the radio frequency unit 301 is further configured to:
send the target information to a target object corresponding to the second target position.
Optionally, the user input unit 307 is further configured to: receive a sixth input for third information;
the display unit 306 is further configured to: in response to the sixth input, display a fourth hover control corresponding to the third information;
the third input is an input for the first hover control and the fourth hover control;
the display unit 306 is further configured to: in response to the third input, cease displaying the first hover control and the fourth hover control on the display interface and display a second hover control, where the second hover control corresponds to fourth information, the fourth information being obtained by combining the first information and the third information.
Optionally, the expansion interface corresponding to the second hover control contains the first hover control and the fourth hover control, and the user input unit 307 is further configured to: receive a seventh input for the second hover control;
the display unit 306 is further configured to: in response to the seventh input, cease displaying the second hover control on the display interface and display the first hover control and the fourth hover control.
Optionally, the expansion interface corresponding to the second hover control contains the first hover control and the fourth hover control, and the user input unit 307 is further configured to: receive an eighth input;
the processor 310 is further configured to: in response to the eighth input, adjust the positional relationship of the first hover control and the fourth hover control, and adjust the positions of the first information and the third information within the fourth information based on the adjusted positional relationship of the first hover control and the fourth hover control.
Optionally, the first information is text information, the third information is first picture information, and the fourth information is second picture information obtained by combining the text information and the first picture information.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 301 may be used for receiving and sending signals during messaging or a call; specifically, it receives downlink data from a base station and forwards it to the processor 310 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 301 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 301 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 302, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 303 may convert audio data received by the radio frequency unit 301 or the network module 302 or stored in the memory 309 into an audio signal and output as sound. Also, the audio output unit 303 may also provide audio output related to a specific function performed by the electronic apparatus 300 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 303 includes a speaker, a buzzer, a receiver, and the like.
The input unit 304 is used to receive audio or video signals. The input unit 304 may include a graphics processing unit (GPU) 3041 and a microphone 3042. The graphics processor 3041 processes image data of still pictures or video obtained by an image capturing apparatus (for example, a camera) in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display unit 306. The image frames processed by the graphics processor 3041 may be stored in the memory 309 (or another storage medium) or transmitted via the radio frequency unit 301 or the network module 302. The microphone 3042 can receive sound and process it into audio data; in the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 301 and output.
The electronic device 300 also includes at least one sensor 305, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 3061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 3061 and/or the backlight when the electronic device 300 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 305 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 306 is used to display information input by the user or information provided to the user. The Display unit 306 may include a Display panel 3061, and the Display panel 3061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 307 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 307 includes a touch panel 3071 and other input devices 3072. The touch panel 3071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 3071 (e.g., operations by a user on or near the touch panel 3071 using a finger, a stylus, or any suitable object or attachment). The touch panel 3071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 310, and receives and executes commands sent by the processor 310. In addition, the touch panel 3071 may be implemented using various types, such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 307 may include other input devices 3072 in addition to the touch panel 3071. Specifically, the other input devices 3072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein.
Further, the touch panel 3071 may be overlaid on the display panel 3061. When the touch panel 3071 detects a touch operation on or near it, the operation is transmitted to the processor 310 to determine the type of the touch event, and the processor 310 then provides a corresponding visual output on the display panel 3061 according to the type of the touch event. Although in fig. 17 the touch panel 3071 and the display panel 3061 are shown as two separate components implementing the input and output functions of the electronic device, in some embodiments the touch panel 3071 and the display panel 3061 may be integrated to implement these input and output functions, which is not limited herein.
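The touch path described above (detection, conversion to coordinates, dispatch, visual response) can be sketched roughly as follows. This Kotlin snippet is a simplified model only: TouchEvent, TouchController, HoverControl, and the drag handling are names invented here for illustration and are not taken from the embodiment.

    // Simplified model: raw touch signals become coordinates and are dispatched,
    // and the handler moves a hover control and reports where it was dropped.
    data class TouchEvent(val x: Float, val y: Float, val action: Action) {
        enum class Action { DOWN, MOVE, UP }
    }

    class HoverControl(var x: Float = 0f, var y: Float = 0f)

    class TouchController(
        private val control: HoverControl,
        private val onDrop: (Float, Float) -> Unit
    ) {
        private var dragging = false

        fun dispatch(event: TouchEvent) {
            when (event.action) {
                TouchEvent.Action.DOWN -> dragging = true
                TouchEvent.Action.MOVE -> if (dragging) { control.x = event.x; control.y = event.y }
                TouchEvent.Action.UP -> { dragging = false; onDrop(event.x, event.y) }
            }
        }
    }

    fun main() {
        val control = HoverControl()
        val controller = TouchController(control) { x, y -> println("Dropped at ($x, $y)") }
        controller.dispatch(TouchEvent(10f, 20f, TouchEvent.Action.DOWN))
        controller.dispatch(TouchEvent(50f, 80f, TouchEvent.Action.MOVE))
        controller.dispatch(TouchEvent(55f, 85f, TouchEvent.Action.UP))
        println("Control position: (${control.x}, ${control.y})")
    }

The drop callback stands in for the step in which the processor 310 determines the type of the touch event and produces the corresponding visual output.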
The interface unit 308 is an interface for connecting an external device to the electronic device 300. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 308 may be used to receive input (e.g., data information or power) from an external device and transmit the received input to one or more elements within the electronic device 300, or may be used to transmit data between the electronic device 300 and the external device.
The memory 309 may be used to store software programs as well as various data. The memory 309 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data or a phonebook) created according to the use of the electronic device, and the like. Further, the memory 309 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 310 is the control center of the electronic device. It connects the various parts of the whole electronic device by using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 309 and invoking data stored in the memory 309, thereby monitoring the electronic device as a whole. The processor 310 may include one or more processing units; preferably, the processor 310 may integrate an application processor, which mainly handles the operating system, the user interface, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 310.
The electronic device 300 may further include a power supply 311 (such as a battery) for supplying power to various components, and preferably, the power supply 311 may be logically connected to the processor 310 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 300 includes some functional modules that are not shown, which are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the foregoing information processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the foregoing information processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, although in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to those embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.
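To make the overall interaction concrete, the following self-contained Kotlin sketch models the core flow: a hover control is created for a piece of selected information, two hover controls can be combined into a single control carrying the combined information, and dropping a control on a target sends the associated information. All class and function names are invented here for illustration only and do not describe the actual implementation of the embodiments.

    // Illustrative sketch only; names are invented, not from the embodiments.
    data class Info(val content: String)

    class HoverControl(val info: Info)

    // Combining two hover controls yields one control carrying the merged information.
    fun merge(a: HoverControl, b: HoverControl): HoverControl =
        HoverControl(Info(a.info.content + "\n" + b.info.content))

    // Sending is modelled as delivering the control's information to a named target.
    fun send(target: String, control: HoverControl) {
        println("Sending to $target: ${control.info.content}")
    }

    fun main() {
        val text = HoverControl(Info("Selected text"))    // control for text information
        val picture = HoverControl(Info("picture.png"))   // control for picture information
        val combined = merge(text, picture)               // combined control
        send("Contact A", combined)                       // drop on a target object
    }

Dropping the combined control on a contact corresponds to sending the combined information to the target object determined by the drop position.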

Claims (15)

1. An information processing method applied to an electronic device, the method comprising:
receiving a first input for first information;
in response to the first input, displaying a first hover control corresponding to the first information;
receiving a second input for the target hover control;
in response to the second input, sending target information corresponding to the target hover control;
wherein the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control.
2. The method of claim 1, further comprising:
receiving a fourth input for second information;
in response to the fourth input, displaying a third hover control corresponding to the second information;
receiving, while a target file is being edited, a fifth input of moving the third hover control to a first target position of the target file;
in response to the fifth input, adding the second information to the target file.
3. The method of claim 1, wherein the second input is an input to move the target hover control to a second target position;
and the sending of the target information corresponding to the target hover control comprises:
sending the target information to a target object corresponding to the second target position.
4. The method of claim 1, wherein prior to receiving the second input for the target hover control, the method further comprises:
receiving a sixth input for third information;
in response to the sixth input, displaying a fourth hover control corresponding to the third information;
wherein the third input is an input for the first hover control and the fourth hover control; and
in response to the third input, canceling display of the first hover control and the fourth hover control on a display interface, and displaying a second hover control, wherein the second hover control corresponds to fourth information, and the fourth information is information obtained by combining the first information and the third information.
5. The method of claim 4, wherein the first hover control and the fourth hover control are included in an expansion interface corresponding to the second hover control, and wherein after sending target information corresponding to the target hover control, the method further comprises:
receiving a seventh input for the second hover control;
in response to the seventh input, canceling display of the second hover control on the display interface, and displaying the first hover control and the fourth hover control.
6. The method of claim 4, wherein the first hover control and the fourth hover control are included in an expansion interface corresponding to the second hover control, and wherein after displaying the second hover control, the method further comprises:
receiving an eighth input;
in response to the eighth input, adjusting a positional relationship between the first hover control and the fourth hover control, and adjusting positions of the first information and the third information in the fourth information based on the adjusted positional relationship between the first hover control and the fourth hover control.
7. The method according to claim 4, wherein the first information is text information, the third information is first picture information, and the fourth information is second picture information obtained by combining the text information and the first picture information.
8. An electronic device, characterized in that the electronic device comprises:
a first receiving module, configured to receive a first input for first information;
a first display module, configured to display, in response to the first input, a first hover control corresponding to the first information;
a second receiving module, configured to receive a second input for the target hover control;
a sending module, configured to send, in response to the second input, target information corresponding to the target hover control;
wherein the target hover control is the first hover control or a second hover control displayed in response to a third input to the first hover control.
9. The electronic device of claim 8, further comprising:
a third receiving module, configured to receive a fourth input for second information;
a second display module, configured to display, in response to the fourth input, a third hover control corresponding to the second information;
a fourth receiving module, configured to receive, when a target file is being edited, a fifth input of moving the third hover control to a first target position of the target file;
an adding module, configured to add the second information to the target file in response to the fifth input.
10. The electronic device of claim 8, wherein the second input is an input to move the target hover control to a second target position;
the sending module is specifically configured to:
and sending the target information to a target object corresponding to the second target position.
11. The electronic device of claim 8, further comprising:
a fifth receiving module, configured to receive a sixth input for third information;
a third display module, configured to display, in response to the sixth input, a fourth hover control corresponding to the third information;
a fourth display module, configured to cancel, in response to the third input, display of the first hover control and the fourth hover control on a display interface, and display a second hover control, wherein the second hover control corresponds to fourth information, the fourth information is information obtained by combining the first information and the third information, and the third input is an input for the first hover control and the fourth hover control.
12. The electronic device of claim 11, wherein the first hover control and the fourth hover control are included in an expansion interface corresponding to the second hover control, and wherein the electronic device further comprises:
a sixth receiving module, configured to receive a seventh input for the second hover control;
a fifth display module, configured to cancel, in response to the seventh input, display of the second hover control on the display interface, and display the first hover control and the fourth hover control.
13. The electronic device of claim 11, wherein the first hover control and the fourth hover control are included in an expansion interface corresponding to the second hover control, and wherein the electronic device further comprises:
a seventh receiving module, configured to receive an eighth input;
an adjusting module, configured to adjust, in response to the eighth input, a positional relationship between the first hover control and the fourth hover control, and adjust positions of the first information and the third information in the fourth information based on the adjusted positional relationship between the first hover control and the fourth hover control.
14. The electronic device of claim 11, wherein the first information is text information, the third information is first picture information, and the fourth information is second picture information obtained by combining the text information and the first picture information.
15. An electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the information processing method according to any one of claims 1 to 7.
CN201911031661.XA 2019-10-28 2019-10-28 Information processing method and electronic equipment Active CN110851040B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911031661.XA CN110851040B (en) 2019-10-28 2019-10-28 Information processing method and electronic equipment
PCT/CN2020/113010 WO2021082716A1 (en) 2019-10-28 2020-09-02 Information processing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911031661.XA CN110851040B (en) 2019-10-28 2019-10-28 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110851040A true CN110851040A (en) 2020-02-28
CN110851040B CN110851040B (en) 2021-07-20

Family

ID=69598141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911031661.XA Active CN110851040B (en) 2019-10-28 2019-10-28 Information processing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN110851040B (en)
WO (1) WO2021082716A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111601012A (en) * 2020-05-28 2020-08-28 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN112346636A (en) * 2020-11-05 2021-02-09 网易(杭州)网络有限公司 In-game information processing method and device and terminal
WO2021082716A1 (en) * 2019-10-28 2021-05-06 维沃移动通信有限公司 Information processing method and electronic device
CN112764634A (en) * 2021-01-22 2021-05-07 维沃移动通信有限公司 Content processing method and device
CN114756148A (en) * 2022-04-27 2022-07-15 北京达佳互联信息技术有限公司 Control display method and device, electronic equipment and storage medium
CN114818625A (en) * 2022-06-29 2022-07-29 天津联想协同科技有限公司 Method and device for processing document area

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113364845B (en) * 2021-05-31 2023-08-18 维沃移动通信有限公司 File transmission method and device
CN114860127A (en) * 2022-05-27 2022-08-05 维沃移动通信有限公司 Information transmission method and information transmission device

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130132904A1 (en) * 2011-11-22 2013-05-23 Backplane, Inc. Content sharing application utilizing radially-distributed menus
CN104298558A (en) * 2014-09-23 2015-01-21 广州华多网络科技有限公司 Information processing method and device
CN104951186A (en) * 2015-06-12 2015-09-30 联想(北京)有限公司 Information processing method and electronic equipment
CN105988665A (en) * 2016-03-17 2016-10-05 广州阿里巴巴文学信息技术有限公司 Information copying system, information copying method and electronic device
CN106502527A (en) * 2016-09-29 2017-03-15 北京小米移动软件有限公司 Method, device and terminal that a kind of content is shared
CN107967093A (en) * 2017-12-21 2018-04-27 维沃移动通信有限公司 A kind of multistage text clone method and mobile terminal
CN108170338A (en) * 2018-01-09 2018-06-15 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN108174022A (en) * 2017-12-12 2018-06-15 上海爱优威软件开发有限公司 A kind of quick sending method of sectional drawing and terminal device
CN108228058A (en) * 2018-03-19 2018-06-29 网易(杭州)网络有限公司 Information stickup method and device, electronic equipment, storage medium
CN108536344A (en) * 2018-01-09 2018-09-14 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN108614725A (en) * 2018-05-11 2018-10-02 维沃移动通信有限公司 a kind of interface display method and terminal
CN108762954A (en) * 2018-05-29 2018-11-06 维沃移动通信有限公司 A kind of object sharing method and mobile terminal
US20190056859A1 (en) * 2016-02-04 2019-02-21 Huawei Technologies Co., Ltd. Information processing method and electronic device
CN110333814A (en) * 2019-05-31 2019-10-15 华为技术有限公司 A kind of method and electronic equipment of sharing contents

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140365944A1 (en) * 2013-06-09 2014-12-11 Apple Inc. Location-Based Application Recommendations
CN107229389A (en) * 2017-05-24 2017-10-03 努比亚技术有限公司 A kind of method of shared file, equipment and computer-readable recording medium
CN108228902B (en) * 2018-02-08 2020-06-26 维沃移动通信有限公司 File display method and mobile terminal
CN109343755B (en) * 2018-09-21 2020-09-29 维沃移动通信有限公司 File processing method and terminal equipment
CN109684110A (en) * 2018-12-28 2019-04-26 北京小米移动软件有限公司 Multimedia resource sharing method, device and storage medium
CN110851040B (en) * 2019-10-28 2021-07-20 维沃移动通信有限公司 Information processing method and electronic equipment
CN111601012B (en) * 2020-05-28 2022-10-25 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130132904A1 (en) * 2011-11-22 2013-05-23 Backplane, Inc. Content sharing application utilizing radially-distributed menus
CN104298558A (en) * 2014-09-23 2015-01-21 广州华多网络科技有限公司 Information processing method and device
CN104951186A (en) * 2015-06-12 2015-09-30 联想(北京)有限公司 Information processing method and electronic equipment
US20190056859A1 (en) * 2016-02-04 2019-02-21 Huawei Technologies Co., Ltd. Information processing method and electronic device
CN105988665A (en) * 2016-03-17 2016-10-05 广州阿里巴巴文学信息技术有限公司 Information copying system, information copying method and electronic device
CN106502527A (en) * 2016-09-29 2017-03-15 北京小米移动软件有限公司 Method, device and terminal that a kind of content is shared
CN108174022A (en) * 2017-12-12 2018-06-15 上海爱优威软件开发有限公司 A kind of quick sending method of sectional drawing and terminal device
CN107967093A (en) * 2017-12-21 2018-04-27 维沃移动通信有限公司 A kind of multistage text clone method and mobile terminal
CN108170338A (en) * 2018-01-09 2018-06-15 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN108536344A (en) * 2018-01-09 2018-09-14 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN108228058A (en) * 2018-03-19 2018-06-29 网易(杭州)网络有限公司 Information stickup method and device, electronic equipment, storage medium
CN108614725A (en) * 2018-05-11 2018-10-02 维沃移动通信有限公司 a kind of interface display method and terminal
CN108762954A (en) * 2018-05-29 2018-11-06 维沃移动通信有限公司 A kind of object sharing method and mobile terminal
CN110333814A (en) * 2019-05-31 2019-10-15 华为技术有限公司 A kind of method and electronic equipment of sharing contents

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
康丞信息: "《电脑技巧快易通》", 31 August 2005 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021082716A1 (en) * 2019-10-28 2021-05-06 维沃移动通信有限公司 Information processing method and electronic device
CN111601012A (en) * 2020-05-28 2020-08-28 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN111601012B (en) * 2020-05-28 2022-10-25 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN112346636A (en) * 2020-11-05 2021-02-09 网易(杭州)网络有限公司 In-game information processing method and device and terminal
CN112764634A (en) * 2021-01-22 2021-05-07 维沃移动通信有限公司 Content processing method and device
CN114756148A (en) * 2022-04-27 2022-07-15 北京达佳互联信息技术有限公司 Control display method and device, electronic equipment and storage medium
CN114756148B (en) * 2022-04-27 2023-10-31 北京达佳互联信息技术有限公司 Control display method and device, electronic equipment and storage medium
CN114818625A (en) * 2022-06-29 2022-07-29 天津联想协同科技有限公司 Method and device for processing document area

Also Published As

Publication number Publication date
WO2021082716A1 (en) 2021-05-06
CN110851040B (en) 2021-07-20

Similar Documents

Publication Publication Date Title
CN110851040B (en) Information processing method and electronic equipment
CN108762954B (en) Object sharing method and mobile terminal
CN108469898B (en) Image processing method and flexible screen terminal
CN110413168B (en) Icon management method and terminal
CN108737904B (en) Video data processing method and mobile terminal
CN110308839B (en) File management method and terminal equipment
CN108132752B (en) Text editing method and mobile terminal
CN109407932B (en) Icon moving method and mobile terminal
CN107943390B (en) Character copying method and mobile terminal
CN109491738B (en) Terminal device control method and terminal device
CN109782998B (en) Display screen control method and mobile terminal
CN109814786B (en) Image storage method and terminal equipment
CN109739407B (en) Information processing method and terminal equipment
CN110830363B (en) Information sharing method and electronic equipment
CN110442279B (en) Message sending method and mobile terminal
CN107728923B (en) Operation processing method and mobile terminal
CN110502309B (en) Display control method and electronic equipment
CN110673770B (en) Message display method and terminal equipment
CN109683802B (en) Icon moving method and terminal
CN108646960B (en) File processing method and flexible screen terminal
CN108763540B (en) File browsing method and terminal
CN110196668B (en) Information processing method and terminal equipment
CN110308834B (en) Setting method of application icon display mode and terminal
CN108228902B (en) File display method and mobile terminal
CN109271262B (en) Display method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant