CN109408171B - Display control method and terminal - Google Patents

Display control method and terminal

Publication number: CN109408171B
Authority: CN (China)
Prior art keywords: input, screen, sub, sliding, display
Legal status: Active
Application number: CN201811130507.3A
Other languages: Chinese (zh)
Other versions: CN109408171A
Inventor: 林雄周
Current Assignee: Vivo Mobile Communication Co Ltd
Original Assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811130507.3A
Publication of CN109408171A
Application granted; publication of CN109408171B


Classifications

    • G06F9/451: Arrangements for program control; execution arrangements for user interfaces (GPHYSICS; G06 Computing; G06F Electric digital data processing)
    • G06F3/0486: Interaction techniques based on graphical user interfaces [GUI]; drag-and-drop
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser; input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides a display control method and a terminal. The display control method comprises the following steps: receiving a first input from a user on a first object displayed on a first screen; in response to the first input, displaying a second object on the first screen, the second object being associated with a second screen; and displaying the first object on the second screen when the input end position of the first input corresponds to the second object. The embodiment of the invention designs a screen entry (namely, the second object) through which the content of one screen can be displayed on the other screen: with a simple operation on the first screen, a user can display the content of the first screen on the second screen, which is convenient and fast.

Description

Display control method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a display control method and a terminal.
Background
At present, when the user of a single-screen terminal pays for a purchase or adds a friend, a two-dimensional code or bar code interface needs to be loaded on the terminal. After the interface is displayed on the screen, the user has to turn the phone around so that the other party can scan it. This is not convenient enough, and the phone is easily dropped in crowded places.
Disclosure of Invention
The embodiment of the invention provides a display control method and a terminal, which solve the problem that displaying a two-dimensional code or bar code interface is inconvenient in scenarios such as shopping payment or adding friends.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a display control method, which is applied to a terminal, where the terminal includes a first screen and a second screen, and the method includes:
receiving a first input of a user to a first object displayed on the first screen;
in response to the first input, displaying a second object on the first screen, the second object being associated with a second screen;
displaying the first object on the second screen in a case where the input end position of the first input corresponds to the second object;
wherein the corresponding of the input end position of the first input to the second object comprises: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold.
In a second aspect, an embodiment of the present invention provides a terminal, where the terminal includes a first screen and a second screen, and the terminal further includes:
the receiving module is used for receiving a first input of a user to a first object displayed on the first screen;
a response module for displaying a second object on the first screen in response to the first input received by the receiving module, the second object being associated with a second screen;
a display module, configured to display the first object on the second screen when the input end position of the first input received by the receiving module corresponds to the second object;
wherein the corresponding of the input end position of the first input to the second object comprises: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold.
In a third aspect, an embodiment of the present invention provides a terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the display control method described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the display control method as described above.
In the embodiment of the invention, a screen entry (namely, a second object) is designed, contents in one screen can be displayed on the other screen through the screen entry, and a user can display the display contents in the first screen in the second screen through simple operation in the first screen, so that the operation is convenient and fast.
Drawings
Fig. 1 is a flowchart illustrating a display control method according to an embodiment of the present invention;
FIG. 2 shows one of the schematic diagrams of an example provided by an embodiment of the present invention;
FIG. 3 is a second schematic diagram of an example provided by an embodiment of the present invention;
FIG. 4 is a third schematic diagram of an example provided by an embodiment of the present invention;
FIG. 5 is a fourth schematic diagram of an example provided by an embodiment of the present invention;
FIG. 6 shows a fifth exemplary schematic diagram provided by an embodiment of the present invention;
FIG. 7 shows a sixth exemplary schematic representation provided by an embodiment of the present invention;
fig. 8 shows one of block diagrams of a terminal according to an embodiment of the present invention;
fig. 9 shows a second block diagram of the terminal according to the embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
According to an aspect of an embodiment of the present invention, a display control method is provided, which is applied to a terminal. The terminal comprises at least a first screen and a second screen. The two screens may be located on the same side of the terminal, for example both on the front side; or on different sides, for example the first screen on the front side and the second screen on the back side. The arrangement can be chosen according to actual requirements.
As shown in fig. 1, the display control method includes:
step 101: a first input of a user to a first object displayed on a first screen is received.
Wherein the first object described herein comprises at least one of: an identifier, a file (such as a picture or a document), a video playing window, or any interface content of any application program (such as a bar code display interface, where the bar code includes one-dimensional and two-dimensional bar codes and may specifically be a payment code).
In the embodiment of the present invention, when a user wants to display a first object displayed on a first screen on a second screen, the first object displayed on the first screen can be displayed on the second screen by performing an operation (i.e., a first input) conforming to a preset condition on the first object on the first screen.
The first input may be a first operation and comprises at least a touch operation on the screen, such as a single-click, double-click, long-press, or slide operation. Specifically, the first input may include one or more of these operations, for example a long-press operation followed by a slide operation.
Step 102: in response to the first input, a second object is displayed on the first screen.
After detecting a first input to the first object, a second object is displayed on the first screen. The second object is used to display the first object on the second screen, that is, the second object is an entrance where the first object is displayed on the second screen. The second object and the second screen are associated in advance, namely the second object is associated with the second screen.
Step 103: in a case where the input end position of the first input corresponds to the second object, the first object is displayed on the second screen.
In this step, when the first input is finished and corresponds to the second object, the first object is displayed on the second screen, so that the content displayed on one screen is displayed on the other screen. The user can display the content displayed in the first screen in the second screen through simple operation in the first screen, and the operation is convenient and fast.
Wherein, the input ending position of the first input corresponding to the second object includes: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold value. The first preset threshold may be a numerical value or a value range, and the specific condition may be selected according to actual requirements.
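As an illustration only (not part of the patent), the "corresponds to the second object" test above can be sketched as a point-in-rectangle check with an edge-distance tolerance. The rectangle layout and the threshold value of 20 px are assumptions:

```python
import math

def corresponds(end_pos, entry_rect, threshold=20.0):
    """Return True if the input end position lies on the second object,
    or if its distance to the second object is below the first preset
    threshold."""
    ex, ey = end_pos
    x, y, w, h = entry_rect          # second object's bounding box
    # Nearest point on the rectangle to the end position.
    nx = min(max(ex, x), x + w)
    ny = min(max(ey, y), y + h)
    # The distance is 0 when the end position is inside the rectangle.
    return math.hypot(ex - nx, ey - ny) <= threshold
```

For example, with an entry at `(300, 0, 60, 60)`, an end position of `(310, 30)` is inside the entry, `(295, 30)` is 5 px away and still corresponds, while `(100, 300)` does not.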
For example, when the first input is a sliding operation, the user drags the first object to move in the direction of the second object through the sliding operation, and after the user drags the first object to the second object and finishes the touch operation on the first object, the first object is displayed in the second screen; or when the first object is dragged to a preset distance range from the second object and the touch operation on the first object is finished, displaying the first object in the second screen.
When the first object is displayed on the second screen, the first screen can be controlled to continue to display the first object, and the first object can also be controlled to disappear from the first screen, and the specific situation can be selected according to actual requirements.
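The flow of steps 101 to 103 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation; the class and field names (`DualScreenTerminal`, `keep_on_first`) and the entry geometry are hypothetical:

```python
class DualScreenTerminal:
    """Minimal sketch of steps 101-103: a first input on an object of the
    first screen reveals the screen entry, and ending the input on the
    entry displays the object on the second screen."""
    def __init__(self, keep_on_first=False):
        self.first_screen = set()
        self.second_screen = set()
        self.entry_rect = None            # the "second object"
        self.keep_on_first = keep_on_first

    def on_input_start(self, obj):        # steps 101 and 102
        if obj in self.first_screen:
            self.entry_rect = (300, 0, 60, 60)   # show the screen entry

    def on_input_end(self, obj, end_x, end_y):   # step 103
        if self.entry_rect is None:
            return
        x, y, w, h = self.entry_rect
        if x <= end_x <= x + w and y <= end_y <= y + h:
            self.second_screen.add(obj)
            if not self.keep_on_first:    # either behaviour is allowed
                self.first_screen.discard(obj)
        self.entry_rect = None            # hide the entry afterwards
```

The `keep_on_first` flag mirrors the choice described above: the first screen may either keep displaying the first object or let it disappear.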
In order to more clearly understand the above method, an example is described below.
Suppose the first screen is the screen on the front side of the terminal, the second screen is the screen on the back side, and the first object is a two-dimensional code display interface whose two-dimensional code is a payment code. When the user pays with the payment code, the payment code is called up and displayed on the first screen. When the user wants to display the payment code on the second screen, the user performs an operation on it; if the terminal detects that this operation matches the operation for displaying the payment code on the second screen, a second object for that purpose is first displayed on the first screen. When the user's operation on the payment code ends on the second object, or within a range of 0 to 10 mm from the second object, the payment code is displayed on the second screen. The user therefore no longer needs to hold the first screen facing himself and then flip the terminal over for the payee to scan: the payee can scan the payment code directly on the second screen. The operation is convenient, and safety problems such as dropping the terminal during the flipping operation are avoided.
Further, in the embodiment of the present invention, when the first object is displayed on the second screen, specifically: and displaying the first object in a preset area of the second screen.
The preset area is an area of a preset range on the second screen, which may be the entire display area of the second screen or only part of it. That is, the first object may be displayed full screen on the second screen, which improves its display effect; or it may be displayed in a non-full-screen window, which reduces the display area the first object occupies, lets the second screen display other content at the same time, and improves the utilization of the second screen.
Further, in the embodiment of the present invention, the second object may be an identifier (hereinafter referred to as a first identifier) for indicating the second screen; since an identifier is small, it occupies little display area and is easy to show on the screen. Alternatively, the second object may be a floating frame containing N thumbnails (N being a natural number), where each thumbnail corresponds to one object displayed on the second screen. That is, the second screen contains N objects, each displayed in one sub-area of the second screen, and the floating frame shows a thumbnail of each of them, which helps the user know what the second screen is displaying. In the embodiment of the invention, the user can display the first object in any sub-area of the second screen.
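A hypothetical data model for the floating frame could look like the sketch below; the sub-area indexing and the thumbnail naming are assumptions, not taken from the patent:

```python
class FloatingFrame:
    """Sketch of the floating frame: one thumbnail per object currently
    shown on the second screen, keyed by the sub-area it occupies."""
    def __init__(self, second_screen_contents):
        # second_screen_contents: {sub_area_index: object_name}
        self.thumbnails = {area: f"thumb:{name}"
                           for area, name in second_screen_contents.items()}

    def __len__(self):
        # N thumbnails for the N objects displayed on the second screen.
        return len(self.thumbnails)
```

For instance, `FloatingFrame({0: "qr_207", 2: "picture"})` would show two thumbnails, one per occupied sub-area.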
To further understand the first identifier and the floating frame, examples are described below.
As shown in fig. 2, a two-dimensional code 203 is displayed on the first screen 201, and a first identifier 204 indicating the second screen 202 is displayed in the upper right corner of the first screen 201. When the user wants to display the two-dimensional code 203 on the second screen 202, the user presses and drags the two-dimensional code 203 on the first screen 201 toward the first identifier 204. When the user drags the two-dimensional code 203 onto the first identifier 204 and releases, or drags it to within a preset distance of the first identifier 204 and releases, the terminal controls the two-dimensional code 203 to be displayed on the second screen 202.
As shown in fig. 3, a two-dimensional code 203 is displayed on the first screen 201, and a floating frame 205 is displayed in the upper right corner of the first screen 201. When the user wants to display the two-dimensional code 203 on the second screen 202, the user presses and drags the two-dimensional code 203 toward the floating frame 205. When the user drags the two-dimensional code 203 onto the floating frame 205 and releases, or drags it to within a preset distance of the floating frame 205 and releases, the terminal controls the two-dimensional code 203 to be displayed on the second screen 202.
In order to enable the user to know the display position of the first object on the second screen, the display position of the first object can be further identified in the floating frame, that is, the floating frame further includes a second identifier, and the second identifier indicates a target sub-area for displaying the first object in the second screen. When the terminal detects that the input end position of the first input is located on the hover frame, the first object is displayed in the target sub-area in the second screen.
As shown in fig. 4, a dashed box 208 is provided in the floating frame 205 in the upper right corner of the first screen 201, and the dashed box 208 is the second identifier. When the first object (e.g., a two-dimensional code) is displayed on the second screen, it is displayed specifically in the target sub-area identified by the dashed box 208. It is understood that the second identifier is not limited to a dashed box; it may take other forms, such as a shaded area indicating the target sub-area for displaying the first object on the second screen. This can be designed according to actual requirements, which the embodiment of the present invention does not limit.
After the first object is displayed in the target sub-area of the second screen, the method further includes: in a case where a third object was displayed in the target sub-area before the first input was received, displaying the third object in a first sub-area, where the first sub-area is any sub-area of the second screen other than the target sub-area.
For example, when the second identifier indicates a first target sub-area, the first object is displayed in the first target sub-area after the first input is received, and a two-dimensional code previously displayed in the first target sub-area is moved to a second target sub-area for display. In this way, the user can display the first object in any sub-area of the second screen.
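This displacement behaviour can be sketched as follows; the sub-area indexing scheme and the fixed sub-area count are assumptions for illustration only:

```python
def display_in_target(second_screen, target_area, obj, num_areas=4):
    """Display `obj` in the target sub-area; if a third object already
    occupies it, move that object to any other free sub-area.
    `second_screen` maps sub-area index -> displayed object."""
    displaced = second_screen.get(target_area)
    second_screen[target_area] = obj
    if displaced is not None:
        free = [i for i in range(num_areas)
                if i != target_area and i not in second_screen]
        if free:
            second_screen[free[0]] = displaced
    return second_screen
```

For example, placing a new code into sub-area 0 while an old code is shown there keeps both on the second screen, the old one moved to a free sub-area.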
In order to improve the operation accuracy, the first object is displayed in the target sub-area in the second screen only when the input end position of the first input is located in the target sub-area, so that the occurrence of misoperation can be avoided.
Wherein, when other objects are also displayed on the second screen, thumbnails of those objects may be shown in the floating frame. As shown in fig. 4, if another two-dimensional code 207 (or other display content such as a picture) was already displayed on the second screen 202 before the two-dimensional code 203 on the first screen 201 is moved there, a thumbnail 210 of the two-dimensional code 207 is displayed in the floating frame 205 on the first screen 201.
Wherein, because the display area of a screen is limited, when the second screen does not have enough display area to display the first object, the first input may be invalidated, i.e., the second object is not displayed. If the second screen has sufficient display area to display the first object, i.e., no object was displayed in the target sub-area before the first input was received, the second object is displayed in response to the first input. Specifically: when a first input on the first object is detected and the second screen has a display area able to hold the first object at its current size, the second object is displayed on the first screen, indicating that the first object can be displayed on the second screen; if the second screen has no display area able to hold the first object at its current size, the second object is not displayed on the first screen, and the first object cannot be displayed on the second screen.
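The availability check described above can be sketched like this; modelling the free regions of the second screen as a list of rectangular sub-areas is an assumption:

```python
def should_show_entry(free_areas, obj_w, obj_h):
    """Show the screen entry (second object) only if some free sub-area
    of the second screen can hold the first object at its current size.
    `free_areas` is a list of (width, height) pairs."""
    return any(w >= obj_w and h >= obj_h for (w, h) in free_areas)
```

With this sketch, an 80 x 80 object fits a 100 x 100 free area, but not a screen whose only free area is 50 x 50.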
It is understood that, when the second screen has no display area able to hold the first object at its current size, the sizes of the first object and of the objects on the second screen may optionally be adjusted so that the first object can still be displayed on the second screen, allowing the second screen to display more content.
The above display method is not limited to the case where at least one object is displayed on the second screen, and is also applicable to the case where no object is displayed on the second screen.
It should be noted that the first identifier and the floating frame may each be used alone, for example using only the first identifier, or only the floating frame, as the entry through which the first object is displayed on the second screen; or the two may be combined, for example: when the first input is detected, if no object is displayed on the second screen, the first identifier is displayed on the first screen; if at least one object is displayed on the second screen, the floating frame is displayed on the first screen.
Preferably, the first input comprises: a press sub input and a first slide sub input. Wherein, the pressing sub-input described herein includes but is not limited to: a light touch operation, a heavy touch operation, a long touch operation, and the like. Specifically, a press sub-input of a first object displayed on a first screen by a user is received; displaying a second object in a first area of the first screen in response to the pressing sub-input; receiving a first sliding sub-input of a user (wherein a sliding start position of the first sliding sub-input is located on a first object); in a case where the slide end position of the first slide sub input is located on the second object, the first object is displayed on the second screen.
In the embodiment of the invention, the second object can be called out by pressing the first object, and the first object can then be dragged onto the second object; that is, during the first input, the first object moves toward the second object with the user's finger, and is thereby displayed on the second screen. Calling out the second object with one operation and displaying the first object on the second screen with another clarifies the user's intention and reduces misoperation.
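The press-then-slide sequence can be sketched as a small state machine; the state names and method signatures below are illustrative assumptions:

```python
class PressSlideRecognizer:
    """Sketch of the two-phase first input: a press sub-input on the
    first object reveals the second object, and a slide sub-input that
    starts on the first object and ends on the second object triggers
    display on the second screen."""
    def __init__(self):
        self.state = "idle"
        self.triggered = False

    def press(self, on_first_object):
        if self.state == "idle" and on_first_object:
            self.state = "entry_shown"   # second object now displayed

    def slide(self, start_on_first_object, end_on_second_object):
        if (self.state == "entry_shown"
                and start_on_first_object and end_on_second_object):
            self.triggered = True        # display first object on screen 2
        self.state = "idle"
```

A slide without the preceding press does nothing, which models how the two-operation sequence reduces misoperation.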
Further, as shown in fig. 5 to 7, in the embodiment of the present invention, the first object includes a third identifier 206, and the first input includes a second slide sub-input.
The third identifier is a movable identifier, and an association is established between the third identifier and the first object. The third identifier may be displayed when the terminal detects that the first object is displayed on the first screen.
In the embodiment of the invention, the first input can directly act on the third identifier, and the first object is displayed in the second screen by operating the third identifier. The specific implementation mode is as follows:
receiving a second sliding sub input of the user (wherein the sliding starting position of the second sliding sub input corresponds to a third identifier); displaying a second object in a second area of the first screen in response to the second slide sub-input; in a case where the slide end position of the second slide sub input is located on the second object, the first object is displayed on the second screen.
Wherein, the third identifier corresponding to the sliding start position of the second sliding sub input here includes: the sliding starting position of the second sliding sub input is located on the third identifier, or the distance between the sliding starting position of the second sliding sub input and the third identifier is smaller than a third preset threshold value.
The display position of the second area is associated with that of the third identifier: the second area can move as the display position of the third identifier changes, and it is generally located near the third identifier so that the third identifier can be conveniently dragged to the second object. Of course, the display position of the second area may instead be designed to be unrelated to that of the third identifier; the specific design can follow actual requirements.
Here, the second area may be the same as or different from the first area described above. In addition, the third identifier may optionally not be displayed when the first object is shown on the second screen.
In order to better understand the above technical solutions, some examples are described below.
As shown in fig. 5, a two-dimensional code 203 is displayed on the first screen 201, a first identifier 204 indicating the second screen 202 is displayed in the upper right corner of the first screen 201, and a third identifier 206 is displayed in the upper left corner of the two-dimensional code 203. When the user wants to display the two-dimensional code 203 on the second screen 202, the user presses and drags the third identifier 206 toward the first identifier 204; the two-dimensional code 203 can be set either to move with the third identifier 206 or not. When the user drags the third identifier 206 onto the first identifier 204 and releases, or drags it to within a preset distance of the first identifier 204 and releases, the terminal controls the two-dimensional code 203 to be displayed on the second screen 202.
As shown in fig. 6, a two-dimensional code 203 is displayed on the first screen 201, a floating frame 205 is displayed in the upper right corner of the first screen 201, and a third identifier 206 is displayed in the upper left corner of the two-dimensional code 203. When the user wants to display the two-dimensional code 203 on the second screen 202, the user presses and drags the third identifier 206 toward the floating frame 205; the two-dimensional code 203 can be set either to move with the third identifier 206 or not. When the user drags the third identifier 206 onto the floating frame 205 and releases, or drags it to within a preset distance of the floating frame 205 and releases, the terminal controls the two-dimensional code 203 to be displayed on the second screen 202.
As shown in fig. 7, a two-dimensional code 203 is displayed on the first screen 201, and a floating frame 205 is displayed in the upper right corner of the first screen 201; the floating frame 205 shows a thumbnail 210 of a two-dimensional code 207 on the second screen 202 and a dashed box 208 indicating the display position of the first object on the second screen. A third identifier 206 is displayed in the upper left corner of the two-dimensional code 203. When the user wants to display the two-dimensional code 203 on the second screen 202, the user presses and drags the third identifier 206 toward the floating frame 205; the two-dimensional code 203 can be set either to move with the third identifier 206 or not. When the user drags the third identifier 206 onto the floating frame 205 and releases, or drags it to within a preset distance of the floating frame 205 and releases, the terminal controls the two-dimensional code 203 to be displayed in the target sub-area corresponding to the dashed box 208 on the second screen 202.
Finally, it should be noted that, in the embodiment of the present invention, the values of the first preset threshold, the second preset threshold, and the third preset threshold may be the same or different, and may be selected according to actual requirements.
In summary, the embodiment of the present invention provides a screen entry (i.e., a second object) through which content in one screen can be displayed on the other screen; the user can display content from the first screen on the second screen with a simple operation on the first screen, which is convenient and fast.
According to another aspect of the embodiments of the present invention, a terminal 800 is provided. The terminal 800 includes at least a first screen and a second screen, can implement the details of the display control method described above, and can achieve the same effects.
As shown in fig. 8, the terminal 800 includes:
a receiving module 801, configured to receive a user's first input on a first object displayed on the first screen.
A response module 802, configured to display a second object on the first screen in response to the first input received by the receiving module 801.
Wherein the second object is associated with a second screen.
A display module 803, configured to display the first object on the second screen when the input end position of the first input received by the receiving module 801 corresponds to the second object.
Wherein the input end position of the first input corresponding to the second object includes: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold.
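A minimal sketch of this correspondence test, treating the second object as an axis-aligned rectangle (an assumption made for illustration; the patent does not fix a shape):

```python
import math

def corresponds_to_second_object(end_pos, obj_rect, first_threshold):
    """True if the input end position lies on the second object, or its
    distance to the object is smaller than the first preset threshold."""
    x, y = end_pos
    left, top, right, bottom = obj_rect
    if left <= x <= right and top <= y <= bottom:
        return True  # end position is located on the second object
    # Clamp-based distance from the point to the rectangle.
    dx = max(left - x, 0, x - right)
    dy = max(top - y, 0, y - bottom)
    return math.hypot(dx, dy) < first_threshold
```

The same pattern covers the second and third preset thresholds used later for the slide start positions.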
Further, the second object is a first identifier, and the first identifier is used for indicating a second screen.
Further, the display module 803 includes:
a first display unit, configured to display the first object in a preset area of the second screen.
The preset area is an area within a preset range of the second screen.
Further, the second screen comprises N objects, and each object is displayed in a sub-area on the second screen.
The second object is a floating frame which comprises N thumbnails, wherein each thumbnail corresponds to one object displayed in the second screen; n is a natural number.
Further, a second identifier is further included in the floating frame, and the second identifier indicates a target sub-area in the second screen.
The display module 803 includes:
a second display unit, configured to display the first object in the target sub-area in the second screen.
Further, the target sub-region does not display an object before the receiving module 801 receives the first input.
Wherein the input end position of the first input is located in the target sub-region.
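This floating-frame model can be sketched with the second screen represented as a list of sub-areas, each holding an object or `None`; the second identifier marks an empty target sub-area. All names here are illustrative assumptions:

```python
def build_floating_frame(sub_areas):
    """Return one thumbnail per object displayed on the second screen, plus
    the index of the target sub-area (the first empty one, which the second
    identifier would mark)."""
    thumbnails = [f"thumb:{obj}" for obj in sub_areas if obj is not None]
    target = next((i for i, obj in enumerate(sub_areas) if obj is None), None)
    return thumbnails, target

def drop_into_target(sub_areas, first_object, target):
    """On a first input ending in the target sub-area, display the first
    object there."""
    updated = list(sub_areas)
    updated[target] = first_object
    return updated
```

Choosing the first empty sub-area is one possible policy; the patent only requires that the target sub-area displays no object before the first input is received.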
Further, the first input includes a press sub-input and a first slide sub-input.
The receiving module 801 includes:
a first receiving unit, configured to receive a user's press sub-input on a first object displayed on the first screen.
The response module 802 includes:
a first response unit, configured to display a second object in a first area of the first screen in response to the press sub-input received by the first receiving unit.
a second receiving unit, configured to receive a first sliding sub-input of the user.
Wherein the sliding start position of the first sliding sub-input corresponds to the first object, which includes: the sliding start position of the first sliding sub-input is located on the first object, or the distance between the sliding start position of the first sliding sub-input and the first object is smaller than a second preset threshold.
The display module 803 includes:
a third display unit, configured to display the first object on the second screen in a case where the sliding end position of the first sliding sub-input received by the second receiving unit corresponds to the second object displayed by the first response unit in response to the press sub-input.
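The press-then-slide variant above can be sketched as a small state machine; the class and method names are assumptions for illustration only:

```python
class PressSlideGesture:
    """Press on the first object shows the second object in the first area;
    a slide that starts on or near the first object and ends on the second
    object moves the first object to the second screen."""

    def __init__(self):
        self.second_object_visible = False

    def on_press(self, on_first_object):
        if on_first_object:
            self.second_object_visible = True  # second object shown in first area

    def on_slide_end(self, starts_near_first_object, ends_on_second_object):
        # The slide completes the transfer only after the press sub-input
        # has made the second object visible.
        return (self.second_object_visible
                and starts_near_first_object
                and ends_on_second_object)
```

The two position predicates would be implemented with the second and first preset thresholds described in the text.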
Further, the first input comprises a second slide sub-input; the first object includes a third identifier.
The receiving module 801 includes:
a third receiving unit, configured to receive a second sliding sub-input of the user, where the sliding start position of the second sliding sub-input corresponds to the third identifier.
Wherein, the sliding start position of the second sliding sub-input corresponding to the third identifier comprises: the sliding starting position of the second sliding sub input is located on the third identifier, or the distance between the sliding starting position of the second sliding sub input and the third identifier is smaller than a third preset threshold value.
The response module 802 includes:
a second response unit, configured to display a second object in a second area of the first screen in response to the second sliding sub-input received by the third receiving unit.
Wherein the display position of the second area is associated with the display position of the third identifier.
The display module 803 includes:
a fourth display unit, configured to display the first object on the second screen in a case where the sliding end position of the second sliding sub-input received by the third receiving unit corresponds to the second object displayed by the second response unit in response to the second sliding sub-input.
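The slide-from-identifier variant can be sketched as below. The fixed offset used to place the second area relative to the third identifier is purely an assumption; the patent only says the two display positions are associated:

```python
import math

SECOND_AREA_OFFSET = (40, -40)  # assumed placement relative to the identifier

def handle_second_slide(start_pos, identifier_pos, third_threshold):
    """If the slide starts on the third identifier or within the third preset
    threshold of it, return the position where the second area (and thus the
    second object) is displayed; otherwise return None."""
    if math.dist(start_pos, identifier_pos) < third_threshold:
        ix, iy = identifier_pos
        return (ix + SECOND_AREA_OFFSET[0], iy + SECOND_AREA_OFFSET[1])
    return None
```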
Further, during the input process of the first input, the first object moves along with the user's finger performing the first input.
Further, the first object includes at least one of: an identifier, a file, or any interface content of any application.
Further, the first object includes a payment code.
In summary, the embodiment of the present invention provides a screen entry (i.e., a second object) through which the content in one screen can be displayed on the other screen; a user can display the content shown in the first screen on the second screen by performing a simple operation on the first screen, which is convenient and fast.
Fig. 9 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
The terminal 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. Those skilled in the art will appreciate that the terminal configuration shown in fig. 9 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A processor 910, configured to, after the user input unit 907 receives a first input to a first object displayed on the first screen by a user, control, through the display unit 906, to display a second object on the first screen in response to the first input, and display the first object on the second screen in a case where an input end position of the first input corresponds to the second object.
Wherein the second object is associated with a second screen.
Wherein the input end position of the first input corresponding to the second object includes: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold.
The embodiment of the invention provides a screen entry (namely, a second object) through which the content in one screen can be displayed on the other screen; a user can display the content shown in the first screen on the second screen through a simple operation on the first screen, which is convenient and fast.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 901 may be used for receiving and sending signals during message transmission and reception or during a call. Specifically, downlink data received from a base station is forwarded to the processor 910 for processing, and uplink data is transmitted to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 902, such as helping the user send and receive e-mails, browse web pages, access streaming media, and the like.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output as sound. Also, the audio output unit 903 may also provide audio output related to a specific function performed by the terminal 900 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042; the graphics processor 9041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sounds and process them into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901, and then output.
Terminal 900 can also include at least one sensor 905, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 9061 and/or backlight when the terminal 900 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 906 is used to display information input by the user or information provided to the user. The Display unit 906 may include a Display panel 9061, and the Display panel 9061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, can collect touch operations by a user on or near it (for example, operations performed on or near the touch panel 9071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 910, and receives and executes commands from the processor 910. In addition, the touch panel 9071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 9071, the user input unit 907 may include other input devices 9072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a switch key), a track ball, a mouse, and a joystick; details are not repeated here.
Further, the touch panel 9071 may be overlaid on the display panel 9061. When the touch panel 9071 detects a touch operation on or near it, the touch operation is transmitted to the processor 910 to determine the type of the touch event, and the processor 910 then provides a corresponding visual output on the display panel 9061 according to the type of the touch event. Although in fig. 9 the touch panel 9071 and the display panel 9061 are two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 9071 and the display panel 9061 may be integrated to implement these functions; this is not limited herein.
The interface unit 908 is an interface through which an external device is connected to the terminal 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 908 can be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within terminal 900 or can be used to transmit data between terminal 900 and external devices.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 910 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, thereby integrally monitoring the terminal. Processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 910.
The terminal 900 can also include a power supply 911 (e.g., a battery) for powering the various components, and preferably, the power supply 911 can be logically connected to the processor 910 via a power management system such that the functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the terminal 900 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal including a processor 910, a memory 909, and a computer program stored in the memory 909 and executable on the processor 910. When executed by the processor 910, the computer program implements each process of the above display control method embodiment and achieves the same technical effects; details are not repeated here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the display control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (22)

1. A display control method is applied to a terminal, the terminal comprises a first screen and a second screen, and the method is characterized by comprising the following steps:
receiving a first input of a user to a first object displayed on the first screen;
in response to the first input, displaying a second object on the first screen, the second object being associated with a second screen;
displaying the first object on the second screen in a case where the input end position of the first input corresponds to the second object;
wherein the corresponding of the input end position of the first input to the second object comprises: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold;
the second screen comprises N objects, and each object is displayed in a sub-area on the second screen;
the second object is a floating frame which comprises N thumbnails, wherein each thumbnail corresponds to one object displayed in the second screen; n is a natural number.
2. The method of claim 1, wherein the second object is a first indicator, and wherein the first indicator indicates a second screen.
3. The method of claim 2, wherein said displaying the first object on the second screen comprises:
displaying the first object in a preset area of the second screen;
and the preset area is an area within a preset range in the second screen.
4. The method of claim 1, further comprising a second identifier in the floating frame, the second identifier indicating a target sub-area in the second screen;
the displaying, on the second screen, the first object includes:
displaying the first object in the target sub-region in the second screen.
5. The method of claim 4, wherein prior to receiving the first input, the target sub-region does not display an object;
the input end position of the first input is located in the target sub-region.
6. The method of claim 1 or 2, wherein the first input comprises a press sub-input and a first slide sub-input;
the receiving a first input of a user to a first object displayed on the first screen comprises:
receiving a press sub-input of a first object displayed on the first screen by a user;
the displaying a second object on the first screen in response to the first input, comprising:
displaying a second object in a first area of the first screen in response to the press sub-input;
receiving a first sliding sub-input of a user, wherein the sliding starting position of the first sliding sub-input corresponds to the first object;
wherein, the sliding start position of the first sliding sub-input corresponding to the first object comprises: the sliding starting position of the first sliding sub input is located on the first object, or the distance between the sliding starting position of the first sliding sub input and the first object is smaller than a second preset threshold;
the displaying, on the second screen, the first object in a case where the input end position of the first input corresponds to the second object, includes:
and in the case that the sliding end position of the first sliding sub-input corresponds to the second object, displaying the first object on the second screen.
7. The method of claim 1 or 2, wherein the first input comprises a second slide sub-input; the first object comprises a third identifier;
the receiving a first input of a user to a first object displayed on the first screen comprises:
receiving a second sliding sub-input of a user, wherein the sliding starting position of the second sliding sub-input corresponds to the third identifier;
wherein, the sliding start position of the second sliding sub-input corresponding to the third identifier comprises: the sliding starting position of the second sliding sub input is located on the third identifier, or the distance between the sliding starting position of the second sliding sub input and the third identifier is smaller than a third preset threshold;
the displaying a second object on the first screen in response to the first input, comprising:
displaying a second object in a second area of the first screen in response to the second slide sub-input;
wherein the display position of the second area is associated with the display position of the third identifier;
the displaying, on the second screen, the first object in a case where the input end position of the first input corresponds to the second object, includes:
and displaying the first object on the second screen under the condition that the sliding end position of the second sliding sub input corresponds to the second object.
8. The method of claim 1, wherein the first object follows a user finger movement corresponding to the first input during the input of the first input.
9. The method of claim 1, wherein the first object comprises at least one of: identification, file, any interface content of any application.
10. The method of claim 1, wherein the first object comprises a payment code.
11. A terminal comprising a first screen and a second screen, characterized in that the terminal further comprises:
the receiving module is used for receiving a first input of a user to a first object displayed on the first screen;
a response module for displaying a second object on the first screen in response to the first input received by the receiving module, the second object being associated with a second screen;
a display module, configured to display the first object on the second screen when the input end position of the first input received by the receiving module corresponds to the second object;
wherein the corresponding of the input end position of the first input to the second object comprises: the input end position of the first input is located on the second object, or the distance between the input end position of the first input and the second object is smaller than a first preset threshold;
the second screen comprises N objects, and each object is displayed in a sub-area on the second screen;
the second object is a floating frame which comprises N thumbnails, wherein each thumbnail corresponds to one object displayed in the second screen; n is a natural number.
12. The terminal of claim 11, wherein the second object is a first indicator, and wherein the first indicator is used to indicate a second screen.
13. The terminal of claim 12, wherein the display module comprises:
the first display unit is used for displaying the first object in a preset area of the second screen;
and the preset area is an area within a preset range in the second screen.
14. The terminal of claim 11, wherein the floating frame further comprises a second identifier, and the second identifier indicates a target sub-area in the second screen;
the display module includes:
a second display unit, configured to display the first object in the target sub-area in the second screen.
15. The terminal of claim 14, wherein the target sub-region does not display an object before the receiving module receives the first input;
the input end position of the first input is located in the target sub-region.
16. The terminal according to claim 11 or 12, wherein the first input comprises a press sub-input and a first slide sub-input;
the receiving module includes:
a first receiving unit for receiving a press sub-input of a first object displayed on the first screen by a user;
the response module includes:
a first response unit, configured to display a second object in a first area of the first screen in response to the press sub-input received by the first receiving unit;
the second receiving unit is used for receiving a first sliding sub-input of a user, and the sliding starting position of the first sliding sub-input corresponds to the first object;
wherein, the sliding start position of the first sliding sub-input corresponding to the first object comprises: the sliding starting position of the first sliding sub input is located on the first object, or the distance between the sliding starting position of the first sliding sub input and the first object is smaller than a second preset threshold;
the display module includes:
and the third display unit is used for displaying the first object on the second screen under the condition that the sliding end position of the first sliding sub-input received by the second receiving unit corresponds to the second object displayed by the first response unit after responding to the pressing sub-input.
17. A terminal as claimed in claim 11 or 12, characterised in that the first input comprises a second slide sub-input; the first object comprises a third identifier;
the receiving module includes:
a third receiving unit, configured to receive a second sliding sub-input of the user, where a sliding start position of the second sliding sub-input corresponds to the third identifier;
wherein, the sliding start position of the second sliding sub-input corresponding to the third identifier comprises: the sliding starting position of the second sliding sub input is located on the third identifier, or the distance between the sliding starting position of the second sliding sub input and the third identifier is smaller than a third preset threshold;
the response module includes:
a second response unit, configured to display a second object in a second area of the first screen in response to the second slide sub-input received by the third receiving unit;
wherein the display position of the second area is associated with the display position of the third identifier;
the display module includes:
and the fourth display unit is used for displaying the first object on the second screen under the condition that the sliding end position of the second sliding sub input received by the third receiving unit corresponds to the second object displayed by the second response unit after responding to the second sliding sub input.
18. The terminal of claim 11, wherein the first object follows a user finger movement corresponding to the first input during the input of the first input.
19. The terminal of claim 11, wherein the first object comprises at least one of: identification, file, any interface content of any application.
20. The terminal of claim 11, wherein the first object comprises a payment code.
21. A terminal comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the display control method according to any one of claims 1 to 10.
22. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the display control method according to any one of claims 1 to 10.
CN201811130507.3A 2018-09-27 2018-09-27 Display control method and terminal Active CN109408171B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811130507.3A CN109408171B (en) 2018-09-27 2018-09-27 Display control method and terminal


Publications (2)

Publication Number Publication Date
CN109408171A CN109408171A (en) 2019-03-01
CN109408171B true CN109408171B (en) 2021-06-22

Family

ID=65466484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811130507.3A Active CN109408171B (en) 2018-09-27 2018-09-27 Display control method and terminal

Country Status (1)

Country Link
CN (1) CN109408171B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110007841B (en) * 2019-03-29 2021-05-18 联想(北京)有限公司 Control method and electronic equipment
CN113330407A (en) * 2019-04-11 2021-08-31 深圳市柔宇科技股份有限公司 Interaction method, flexible electronic device and readable storage medium
CN110196702A (en) * 2019-05-06 2019-09-03 珠海格力电器股份有限公司 A kind of file content inspection method, device, terminal and storage medium
CN110489029B (en) * 2019-07-22 2021-07-13 维沃移动通信有限公司 Icon display method and terminal equipment
CN115116189A (en) * 2022-06-14 2022-09-27 武汉同创元盛科技有限公司 Management terminal device and system for ensuring payment security

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2712165A1 (en) * 2012-09-25 2014-03-26 Samsung Electronics Co., Ltd Method and electronic device for transmitting images during a messaging session
CN106170808A (en) * 2016-06-22 2016-11-30 北京小米移动软件有限公司 Payment switching method and device
CN106210307A (en) * 2016-07-08 2016-12-07 努比亚技术有限公司 Mobile terminal and screen switching method
CN106228364A (en) * 2016-07-15 2016-12-14 珠海市魅族科技有限公司 Information presentation method and presentation device
CN107508974A (en) * 2017-08-10 2017-12-22 努比亚技术有限公司 Interface display method, mobile terminal and computer-readable storage medium
CN107645611A (en) * 2017-10-17 2018-01-30 维沃移动通信有限公司 Payment method and mobile terminal
CN107835316A (en) * 2017-11-09 2018-03-23 青岛海信移动通信技术股份有限公司 Image display method and device, and storage medium, for a mobile terminal
CN108037901A (en) * 2017-10-31 2018-05-15 努比亚技术有限公司 Display content switching control method, terminal and computer-readable storage medium
CN108170354A (en) * 2016-12-07 2018-06-15 中兴通讯股份有限公司 Display method and device, and mobile terminal
CN108205430A (en) * 2017-11-01 2018-06-26 中兴通讯股份有限公司 Dual-screen mobile terminal, corresponding control method and storage medium
CN108323197A (en) * 2016-12-27 2018-07-24 华为技术有限公司 Multi-screen display method and apparatus
CN108510390A (en) * 2018-03-30 2018-09-07 深圳市富途网络科技有限公司 Floating information display device and method for a securities system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101548958B1 (en) * 2008-09-18 2015-09-01 삼성전자주식회사 Method and apparatus for operation control in a mobile terminal with a touch screen
CN108132747A (en) * 2017-01-03 2018-06-08 中兴通讯股份有限公司 Screen content switching method and dual-screen mobile terminal

Also Published As

Publication number Publication date
CN109408171A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
CN108668083B (en) Photographing method and terminal
CN108536365B (en) Image sharing method and terminal
CN108471498B (en) Shooting preview method and terminal
CN109408171B (en) Display control method and terminal
CN108491123B (en) Method for adjusting application program icon and mobile terminal
CN109032445B (en) Screen display control method and terminal equipment
CN108132752B (en) Text editing method and mobile terminal
CN110069178B (en) Interface control method and terminal equipment
CN110830363B (en) Information sharing method and electronic equipment
CN111142723B (en) Icon moving method and electronic equipment
CN108228902B (en) File display method and mobile terminal
CN110531915B (en) Screen operation method and terminal equipment
CN109407949B (en) Display control method and terminal
CN109683802B (en) Icon moving method and terminal
CN108062194B (en) Display method and device and mobile terminal
CN107741814B (en) Display control method and mobile terminal
CN110944139B (en) Display control method and electronic equipment
CN109388324B (en) Display control method and terminal
CN110795189A (en) Application starting method and electronic equipment
CN108132749B (en) Image editing method and mobile terminal
CN108469940B (en) Screenshot method and terminal
CN108108113B (en) Webpage switching method and device
CN110795002A (en) Screenshot method and terminal equipment
CN111061446A (en) Display method and electronic equipment
CN110941469A Method for creating an application clone and terminal device thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant