WO2022213831A1 - Control display method and related device

Control display method and related device

Info

Publication number
WO2022213831A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
electronic device
target
control
controls
Application number
PCT/CN2022/083215
Other languages
English (en)
Chinese (zh)
Inventor
周雪怡
徐杰
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022213831A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on GUIs for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486 Drag-and-drop
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Definitions

  • the present application relates to the field of communication technologies, and in particular, to a control display method and related equipment.
  • In daily use of an electronic device, a drag operation is often involved to move an object such as a window or a file displayed in the device interface to a certain position. The move may be a short-distance or long-distance move within the current interface, or a move across devices or across screens.
  • In order to realize cross-device or cross-screen movement, the electronic device often triggers the display of corresponding functional controls based on the user's drag operation on a window. The user then needs to keep dragging the window, without releasing it, until a certain functional control is selected. In other words, throughout the drag the user must keep holding the selected window and release it only after reaching the desired functional control. Therefore, in the case of a long drag path, especially in an environment where a touchpad is used for operation, the user's operation burden is greatly increased, and operation efficiency and operation experience are greatly reduced. In addition, dragging a window or file within the current interface is usually far more frequent than moving it across devices or screens; if the corresponding functional controls are displayed every time a window is dragged, they cause visual interference with the user's actual operation and reduce the user experience.
  • Embodiments of the present application provide a control display method and related devices, so as to improve the user's operation experience.
  • an embodiment of the present application provides a control display method, which is applied to a first electronic device, and may include:
  • displaying a first interface, wherein the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface; receiving and responding to a first drag operation on the first target point of the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface; if it is detected that the second position is within the first area and the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation on the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, the fourth position being within a second area where the third position is located, and the target frame including N controls, where N is an integer greater than or equal to 1; and receiving and responding to a click operation on one of the N controls, displaying a second interface.
  • In this way, the electronic device can determine the user's operation intention by detecting the duration and movement range of the user's drag operation. If the user quickly drags the window or file (that is, the target object) away from its original position within a short time, it can be assumed that the user only wants to move the object within the current interface, and the control for sharing or cross-screen display (that is, the target control, or "portal") need not be displayed. Conversely, if after a period of time the user has not dragged the window or file out of a certain range around its original position (that is, the first area), it can be assumed that the user wants to share the object across devices or display it across screens, as sketched below.
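  • As a minimal, non-limiting sketch of this intent-detection logic (the class name, the first-area radius, and the duration threshold below are illustrative assumptions, not values from the embodiments), the decision could look as follows in Java:

```java
// Illustrative sketch of the drag-intent decision described above.
// All names and threshold values are assumptions, not the patent's code.
public final class DragTracker {
    // Assumed radius (px) of the "first area" around the drag start position.
    private static final double FIRST_AREA_RADIUS = 120.0;
    // Assumed "first duration threshold" (ms): how long the target point may
    // linger inside the first area before the portal is shown.
    private static final long FIRST_DURATION_MS = 500;

    private final double startX, startY;   // first position
    private final long startTimeMs;

    public DragTracker(double startX, double startY, long startTimeMs) {
        this.startX = startX;
        this.startY = startY;
        this.startTimeMs = startTimeMs;
    }

    /** True if (x, y), the second position, is still inside the first area. */
    public boolean inFirstArea(double x, double y) {
        return Math.hypot(x - startX, y - startY) <= FIRST_AREA_RADIUS;
    }

    /**
     * Called on every drag-move event. Returns true when the target control
     * (the "portal") should be displayed at the third position: the pointer
     * has stayed inside the first area longer than the first duration
     * threshold, suggesting a cross-screen or cross-device intent.
     */
    public boolean shouldShowPortal(double x, double y, long nowMs) {
        return inFirstArea(x, y) && (nowMs - startTimeMs) > FIRST_DURATION_MS;
    }
}
```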
  • the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls include M switch-screen display controls corresponding one-to-one to the M second display screens, where M is an integer greater than or equal to 1; the receiving and responding to a click operation on one of the N controls to display the second interface specifically includes: receiving and responding to a click operation on the i-th switch-screen display control among the M switch-screen display controls, displaying the second interface.
  • the second interface does not include the target object;
  • the third interface displayed on the i-th second display screen corresponding to the i-th switch-screen display control includes the target object; i is an integer greater than or equal to 1 and less than or equal to M.
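  • For illustration only, a click handler for the i-th switch-screen display control might behave as sketched below; Window and Display are assumed stand-ins for the embodiment's target-object and display abstractions, not an actual API:

```java
import java.util.List;

// Illustrative stand-ins for the embodiment's abstractions (assumptions).
interface Window {}
interface Display {
    void show(Window w);
    void remove(Window w);
}

class SwitchScreenHandler {
    /**
     * Click on the i-th switch-screen display control (0-based index here,
     * matching Java list indexing).
     */
    void onSwitchScreenControlClicked(int i, Window targetObject,
                                      Display firstDisplay,
                                      List<Display> secondDisplays) {
        firstDisplay.remove(targetObject);         // second interface no longer shows the object
        secondDisplays.get(i).show(targetObject);  // i-th third interface now shows it
    }
}
```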
  • the N controls include a full-screen display control; the receiving and responding to a click operation on one of the N controls to display the second interface specifically includes: receiving and responding to a click operation on the full-screen display control, displaying a second interface; the second interface includes the target object, and the size of the target object in the second interface is larger than the size of the target object in the first interface.
  • the N controls include K sharing controls corresponding one-to-one to K second electronic devices; the receiving and responding to a click operation on one of the N controls to display the second interface specifically includes: receiving and responding to a click operation on the j-th sharing control among the K sharing controls, sending the target object to the j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
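  • As a hedged sketch of how clicks on the full-screen and sharing controls might dispatch (ControlType, Screen.showFullScreen and Device.send are assumed names, not the embodiment's API):

```java
// Illustrative dispatch over controls in the target frame. All types and
// method names below are assumptions made for the sketch.
interface Obj {}                                  // target object: window, file, link ...
interface Screen { void showFullScreen(Obj o); }  // first display screen
interface Device { void send(Obj o); }            // a second electronic device

enum ControlType { SWITCH_SCREEN, FULL_SCREEN, SHARE, MULTI_SHARE }

class TargetFrameDispatcher {
    void onControlClicked(ControlType type, Obj targetObject,
                          Screen firstScreen, Device peerDevice) {
        switch (type) {
            case FULL_SCREEN:
                // Second interface: same screen, target object enlarged.
                firstScreen.showFullScreen(targetObject);
                break;
            case SHARE:
                // Second interface stays identical to the first interface;
                // the object is sent to the j-th second electronic device.
                peerDevice.send(targetObject);
                break;
            default:
                // SWITCH_SCREEN and MULTI_SHARE are covered in the other sketches.
                break;
        }
    }
}
```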
  • the method further includes:
  • receiving and responding to a third drag operation on the x-th sharing control among the K sharing controls, forming a snapshot corresponding to the x-th sharing control; and, when it is detected that the snapshot corresponding to the x-th sharing control is located in a third area where the y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; wherein the N controls further include the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
  • the receiving and responding to a click operation on one of the N controls to display the second interface specifically includes: receiving and responding to a click operation on the multi-device sharing control, sending the target object respectively to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
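  • The snapshot-merge gesture described above could be detected roughly as follows; the hit radius and the control model are assumptions made for illustration:

```java
// Illustrative: dragging the x-th share control's snapshot onto the y-th
// share control's area merges them into one multi-device share control.
class ShareControl {
    final int deviceIndex;      // which second electronic device this targets
    final double cx, cy;        // control centre on screen
    static final double THIRD_AREA_RADIUS = 60.0;  // assumed hit radius (px)

    ShareControl(int deviceIndex, double cx, double cy) {
        this.deviceIndex = deviceIndex; this.cx = cx; this.cy = cy;
    }

    /** True if the dragged snapshot centre lies in this control's "third area". */
    boolean containsSnapshot(double snapX, double snapY) {
        return Math.hypot(snapX - cx, snapY - cy) <= THIRD_AREA_RADIUS;
    }
}

class MultiShareControl {
    final int[] deviceIndices;
    MultiShareControl(int x, int y) { this.deviceIndices = new int[] { x, y }; }
}

class ShareMerger {
    /** Called when the snapshot of shareX is released at (snapX, snapY). */
    MultiShareControl onSnapshotDropped(ShareControl shareX, ShareControl shareY,
                                        double snapX, double snapY) {
        if (shareY.containsSnapshot(snapX, snapY)) {
            // Generate the multi-device sharing control for devices x and y.
            return new MultiShareControl(shareX.deviceIndex, shareY.deviceIndex);
        }
        return null;  // no merge; the snapshot snaps back
    }
}
```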
  • the receiving and responding to a second drag operation on the first target point, determining that the first target point is located at a fourth position on the first interface, and displaying a target frame at the third position specifically includes: receiving and responding to the second drag operation on the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at the fourth position of the first interface, and displaying a target frame at the third position; wherein, when it is detected that the first target point is located in the second area where the third position is located, the color of the target control is changed to a target color; the target color is used to prompt the user that the first target point is located in the second area where the third position is located, so that the second drag operation can be completed.
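  • One possible reading of this visual feedback, sketched with linear interpolation (the scale factors and colors below are assumed values, not the embodiment's):

```java
// Illustrative feedback during the second drag: shrink the target object,
// grow the portal, and recolor the portal once the drag point enters the
// second area. All constants are assumptions.
class DragFeedback {
    static final double OBJECT_MIN_SCALE = 0.5;   // assumed final object scale
    static final double PORTAL_MAX_SCALE = 1.5;   // assumed final portal scale
    static final int IDLE_COLOR   = 0xFFCCCCCC;   // assumed idle color (ARGB)
    static final int TARGET_COLOR = 0xFF3366FF;   // assumed target color (ARGB)

    double objectScale = 1.0;
    double portalScale = 1.0;
    int portalColor = IDLE_COLOR;

    /** progress in [0, 1]: how far the second drag has advanced toward the portal. */
    void onDragProgress(double progress, boolean inSecondArea) {
        objectScale = 1.0 + (OBJECT_MIN_SCALE - 1.0) * progress;  // 1.0 -> 0.5
        portalScale = 1.0 + (PORTAL_MAX_SCALE - 1.0) * progress;  // 1.0 -> 1.5
        // The target color prompts the user that releasing now completes
        // the second drag operation.
        portalColor = inSecondArea ? TARGET_COLOR : IDLE_COLOR;
    }
}
```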
  • the direction from the first position to the third position is different from the direction from the first position to the second position.
  • the target object is any one of a window, a file, a business card and a link.
  • an embodiment of the present application provides an electronic device, the electronic device being a first electronic device and including a first display screen, a memory, and one or more processors; the first display screen and the memory are coupled to the one or more processors; the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute: displaying a first interface; wherein the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface; receiving and responding to a first drag operation on the first target point of the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface; if it is detected that the second position is within the first area and the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation on the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is within a second area where the third position is located; the target frame includes N controls, where N is an integer greater than or equal to 1; and receiving and responding to a click operation on one of the N controls, displaying a second interface.
  • the electronic device further includes M second display screens; the first display screen, the M second display screens, and the memory are coupled to the one or more processors; the first interface and the second interface are interfaces displayed on the first display screen; the N controls include M switch-screen display controls corresponding one-to-one to the M second display screens, where M is an integer greater than or equal to 1; the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute:
  • receiving and responding to a click operation on the i-th switch-screen display control among the M switch-screen display controls, displaying the second interface; the second interface does not include the target object; the third interface displayed on the i-th second display screen corresponding to the i-th switch-screen display control includes the target object; i is an integer greater than or equal to 1 and less than or equal to M.
  • the N controls include a full-screen display control; the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute: receiving and responding to a click operation on the full-screen display control, displaying a second interface; the second interface includes the target object, and the size of the target object in the second interface is larger than the size of the target object in the first interface.
  • the N controls include K sharing controls corresponding one-to-one to K second electronic devices; the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute: receiving and responding to a click operation on the j-th sharing control among the K sharing controls, sending the target object to the j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
  • the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute: receiving and responding to a third drag operation on the x-th sharing control among the K sharing controls, forming a snapshot corresponding to the x-th sharing control; and, when it is detected that the snapshot corresponding to the x-th sharing control is located in a third area where the y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device.
  • the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute: receiving and responding to a click operation on the multi-device sharing control, sending the target object respectively to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
  • the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute: receiving and responding to a second drag operation on the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at the fourth position of the first interface, and displaying a target frame at the third position; wherein, when it is detected that the first target point is located in the second area where the third position is located, the color of the target control is changed to a target color; the target color is used to prompt the user that the first target point is located in the second area where the third position is located, so that the second drag operation can be completed.
  • the direction from the first position to the third position is different from the direction from the first position to the second position.
  • the target object is any one of a window, a file, a business card and a link.
  • an embodiment of the present application provides a control display device, which is applied to a first electronic device and may include:
  • a first display unit configured to display a first interface; wherein the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface;
  • a first determining unit, configured to receive and respond to a first drag operation on the first target point of the target object, determine a first area where the first position is located, and determine that the first target point is located at a second position of the first interface;
  • a second display unit, configured to, if it is detected that the second position is within the first area and the duration of the first drag operation is greater than the first duration threshold, display the target control at a third position of the first interface;
  • a third display unit, configured to receive and respond to a second drag operation on the first target point, determine that the first target point is located at a fourth position of the first interface, and display a target frame at the third position; the fourth position is within a second area where the third position is located; the target frame includes N controls, and N is an integer greater than or equal to 1;
  • the fourth display unit is configured to receive and respond to a click operation on one of the N controls to display a second interface.
  • the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen
  • the N controls include M switch screen display controls corresponding to the M second display screens one-to-one, and M is an integer greater than or equal to 1; the fourth display unit is specifically used for:
  • receiving and responding to a click operation on the i-th switch-screen display control among the M switch-screen display controls, displaying the second interface; the second interface does not include the target object; the third interface displayed on the i-th second display screen corresponding to the i-th switch-screen display control includes the target object; i is an integer greater than or equal to 1 and less than or equal to M.
  • the N controls include a full-screen display control; the fourth display unit is specifically used for:
  • receiving and responding to a click operation on the full-screen display control, displaying a second interface; the second interface includes the target object, and the size of the target object in the second interface is larger than the size of the target object in the first interface.
  • the N controls include K sharing controls corresponding one-to-one to K second electronic devices; the fourth display unit is specifically configured to: receive and respond to a click operation on the j-th sharing control among the K sharing controls, send the target object to the j-th second electronic device corresponding to the j-th sharing control, and display a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
  • the device further includes:
  • a generating unit, configured to receive and respond to a third drag operation on the x-th sharing control among the K sharing controls, form a snapshot corresponding to the x-th sharing control, and, when it is detected that the snapshot corresponding to the x-th sharing control is located in a third area where the y-th sharing control is located, generate a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; wherein the N controls further include the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
  • the fourth display unit is specifically used for:
  • receive and respond to a click operation on the multi-device sharing control, send the target object respectively to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, and display a second interface; wherein the second interface is the same as the first interface.
  • the third display unit is specifically configured to: receive and respond to a second drag operation on the first target point, gradually reduce the size of the target object and gradually increase the size of the target control; determine that the first target point is located at the fourth position of the first interface, and display a target frame at the third position; wherein, when it is detected that the first target point is located in the second area where the third position is located, the color of the target control is changed to a target color.
  • the direction from the first position to the third position is different from the direction from the first position to the second position.
  • the target object is any one of a window, a file, a business card and a link.
  • an embodiment of the present application provides a computer storage medium for storing computer software instructions used by the control display device provided in the third aspect, including a program designed to execute the above aspects.
  • an embodiment of the present application provides a computer program, the computer program includes instructions, when the computer program is executed by a computer, the computer can execute the process performed by the control display device in the third aspect.
  • FIG. 1 is a schematic diagram of a cross-screen switching display.
  • FIGS. 2a-2d are a set of schematic diagrams of user interfaces.
  • FIG. 3 is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • FIG. 4 is a software structural block diagram of an electronic device 100 provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a control display method provided by an embodiment of the present application.
  • FIGS. 6a-6c are schematic diagrams of a group of user interfaces provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a change process of a portal provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another portal change process provided by an embodiment of the present application.
  • FIG. 9a is a schematic diagram of a portal and a button backplane provided by an embodiment of the present application.
  • FIG. 9b is a schematic diagram of a button backplane provided by an embodiment of the present application.
  • FIGS. 10a-10h are schematic diagrams of a group of user interfaces provided by embodiments of the present application.
  • FIG. 11 is a schematic diagram of a portal provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another portal provided by an embodiment of the present application.
  • FIGS. 13a-13e are schematic diagrams of a group of user interfaces provided by an embodiment of the present application.
  • FIGS. 14a-14c are schematic diagrams of a group of user interfaces provided by embodiments of the present application.
  • FIGS. 15a-15e are schematic diagrams of a group of user interfaces provided by embodiments of the present application.
  • FIGS. 16a-16d are schematic diagrams of a group of user interfaces provided by embodiments of the present application.
  • FIGS. 17a-17d are schematic diagrams of a group of user interfaces provided by embodiments of the present application.
  • FIG. 18 is a schematic structural diagram of a control display device provided by an embodiment of the present application.
  • The electronic device may be a portable electronic device that also includes other functions such as cross-screen display and cross-device sharing, for example, a mobile phone, a tablet computer, or a wearable electronic device with wireless communication functions (such as a smart watch), etc.
  • Portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • the above-mentioned portable electronic device may also be other portable electronic devices, such as a laptop or the like with a touch-sensitive surface or touch panel.
  • the above-mentioned electronic device may not be a portable electronic device, but a desktop computer, a vehicle-mounted computer, etc. having a touch-sensitive surface or a touch panel.
  • The embodiments of the present application take a smartphone as an example for introduction, but are not limited to smartphones; the electronic device can also be another smart device with communication functions, such as a smart watch, a smart bracelet, or virtual reality (VR) glasses.
  • The term "user interface (UI)" in the description, claims, and drawings of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user; it realizes conversion between an internal form of information and a form acceptable to the user.
  • The user interface of an application is source code written in a specific computer language, such as Java and extensible markup language (XML).
  • the interface source code is parsed and rendered on the terminal device, and finally presented as content that the user can recognize.
  • Controls, also known as widgets, are the basic elements of a user interface. Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
  • the attributes and content of controls in the interface are defined by tags or nodes.
  • XML specifies the controls contained in the interface through nodes such as <TextView>, <ImgView>, and <VideoView>.
  • a node corresponds to a control or property in the interface, and the node is rendered as user-visible content after parsing and rendering.
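  • For illustration only, such an XML interface description can be walked with a standard XML parser, mapping each element node to a control; the file name layout.xml below is hypothetical:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import java.io.File;

// Illustrative only: parse a hypothetical "layout.xml" and list the control
// nodes (e.g. <TextView>, <ImgView>, <VideoView>) it declares.
class LayoutNodeLister {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("layout.xml"));   // hypothetical file name
        NodeList nodes = doc.getDocumentElement().getChildNodes();
        for (int i = 0; i < nodes.getLength(); i++) {
            Node n = nodes.item(i);
            if (n.getNodeType() == Node.ELEMENT_NODE) {
                // Each element node corresponds to one control in the interface.
                System.out.println("control node: " + n.getNodeName());
            }
        }
    }
}
```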
  • applications such as hybrid applications, often contain web pages in their interface.
  • A web page, also known as a page, can be understood as a special control embedded in an application program interface.
  • A web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS).
  • the source code of the web page can be loaded and displayed as user-identifiable content by a browser or a web page display component similar in function to a browser.
  • The specific content contained in a web page is also defined by tags or nodes in the source code of the web page. For example, HTML defines the elements and attributes of web pages through <p>, <img>, <video>, and <canvas>.
  • A GUI refers to a user interface, related to computer operations, that is displayed graphically. It may be an icon, a window, a control, or another interface element displayed on the display screen of the electronic device, wherein controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • Dragging, also known as drag and drop, refers to the entire operation process of, on a laptop, desktop computer, tablet computer, or smartphone, using a touchpad, mouse, or finger to press and hold an object without releasing it, move it within the plane, and then release it.
  • the drag operation is usually used to move objects such as windows or files to a certain position, which can be a short-distance or long-distance movement within the current interface, or a cross-device or cross-screen movement.
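  • Viewed as an input sequence, a drag is press, then move, then release; a minimal state machine capturing this (state and event names are illustrative assumptions) might be:

```java
// Minimal drag state machine for the press -> move -> release sequence
// described above. Names are illustrative, not an actual platform API.
enum DragState { IDLE, DRAGGING }

class DragStateMachine {
    DragState state = DragState.IDLE;

    void onPress() {                       // click and hold on the object
        state = DragState.DRAGGING;
    }

    void onMove(double x, double y) {      // object follows the pointer
        if (state == DragState.DRAGGING) {
            // update the object's position to (x, y) while the press is held
        }
    }

    void onRelease() {                     // drop the object at the target
        state = DragState.IDLE;
    }
}
```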
  • FIG. 1 is a schematic diagram of a cross-screen switching display. As shown in FIG. 1, the electronic device may include two display screens, a first display screen and a second display screen, for example, two display screens connected to a computer host, or a notebook computer or tablet computer that comes with two display screens, which is not specifically limited in this embodiment of the present application. A corresponding drag operation can then be performed on the window shown in FIG. 1, so that the window can be switched between the first display screen and the second display screen.
  • Fig. 2a-Fig. 2d are a set of schematic diagrams of user interfaces.
  • In FIGS. 2a-2d, a notebook computer is taken as an example of the electronic device; its screen may specifically include a first display screen 01 and a second display screen 02.
  • the interface displayed on the first display screen 01 of the electronic device may include a window 03.
  • When the user uses a mouse, a touchpad, or a finger (the finger is taken as an example in the figures; the dots in the figures indicate the positions touched by the finger, and the electronic device may be one with a touch-screen function) to drag the window 03 by its title bar (that is, input operation 04a shown in FIG. 2b), the electronic device displays a control combination 05 (also called a button combination, that is, the above-mentioned app switcher) in the moving direction of the drag operation. As shown in FIG. 2b, the control combination 05 may include two controls 06a and 06b, where the control 06a may be a control for switching the window to the second display screen 02, and the control 06b may be a control for switching to continuous full-screen display across the two screens.
  • The electronic device receives the user's input operation 04b (for example, on the basis of FIG. 2b, the user keeps the finger pressed and continues to drag the window until the finger touches the control 06a), and in response to the input operation 04b, the electronic device displays, below the control combination 05, a secondary menu 07 related to the function of switching to the second display screen 02; at this time, the control combination 05 and the secondary menu 07 may be in a floating state. As shown in FIG. 2c, the secondary menu 07 may include a control 07a (which can be used to switch the window to the left half of the second display screen 02), a control 07b (which can be used to switch to the right half of the second display screen 02), a control 07c (which can be used to switch to the left two-thirds of the second display screen 02), and a control 07d (which can be used to switch to the right two-thirds of the second display screen 02).
  • The electronic device receives an input operation 04c from the user (for example, on the basis of FIG. 2c, the user keeps the finger pressed and continues to drag the window until the finger touches the control 07a), and in response, switches the window 03 to the left half of the second display screen 02 for display.
  • However, the app switcher appears in the same direction as the window moves, which makes it easy for the user to touch it by mistake. When the user drags the window, the moving path of the window inevitably passes through the control combination 05; the floating state of the control combination 05 is then inevitably triggered and the secondary menu 07 is displayed (as shown in FIG. 2c), resulting in greater visual disturbance to the user.
  • In addition, as shown in FIG. 2b, due to the large size of the control combination 05, it easily blocks the user's actual operation target even when the user drags the window without intending to use the app switcher. For example, when the user wants to drag the window to the screen edge in Windows (the "Windows" operating system) to trigger the split-screen function, the electronic device triggers the display of the above-mentioned control combination 05 during the drag toward the edge; to avoid the control combination 05, the user then has to deviate from the original path before performing the split-screen operation, which greatly increases the user's operation burden and reduces the user's operation efficiency.
  • The actual technical problems to be solved by this application may include the following aspects: avoiding the excessive operation burden and low operation efficiency of long-path dragging (especially in environments where a touchpad is used for operation); and distinguishing whether the user expects to move the object within the current interface or across devices or screens, so as to provide a more effective and faster solution.
  • The embodiments of the present application identify the user's dragging intention by detecting the duration and movement range of the user's drag operation, and display the portal of the desired target (that is, the corresponding controls) at an appropriate position and time, shortening the user's operation path and optimizing the user's visual and interactive experience during dragging.
  • FIG. 3 is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • The wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and radiate it as electromagnetic waves through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the display screen 194 is used to display exemplary user interfaces provided by subsequent embodiments of the present application.
  • For the specific description of the user interface, refer to the subsequent embodiments.
  • the display screen 194 can be used to display target objects such as windows and files, and also used to display related controls for cross-device and cross-screen display, etc., which will not be described in detail here.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • When shooting, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs.
  • The electronic device 100 can play or record videos in various encoding formats, for example, moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • The NPU is a neural-network processing unit. By borrowing the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it can quickly process input information and can also continuously self-learn. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also referred to as "earpiece" is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. The user can make a sound with the mouth close to the microphone 170C, thereby inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 can also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • The electronic device 100 can also calculate, according to the detection signal of the pressure sensor 180A, the distance between two positions, the duration of a move operation, and the like.
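  • A trivial sketch of the two quantities mentioned here, the distance between two touch positions and the duration of a move operation (the types and field names are assumptions):

```java
// Illustrative computation of the quantities the detection signal supports:
// distance between two touch positions and the duration of a move operation.
final class TouchSample {
    final double x, y;   // touch position
    final long timeMs;   // time the sample was taken
    TouchSample(double x, double y, long timeMs) {
        this.x = x; this.y = y; this.timeMs = timeMs;
    }
}

final class TouchMath {
    /** Euclidean distance between two touch positions. */
    static double distance(TouchSample a, TouchSample b) {
        return Math.hypot(b.x - a.x, b.y - a.y);
    }

    /** Duration of a move operation, from its first to its last sample. */
    static long durationMs(TouchSample first, TouchSample last) {
        return last.timeMs - first.timeMs;
    }
}
```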
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • Distance sensor 180F for measuring distance.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy.
  • The touch sensor 180K is also called a "touch panel".
  • The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is commonly called a touchscreen.
  • the touch sensor 180K is used to detect touch operations on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided via display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the touch screen can detect a user operation on a file, a window, or a link, etc., and the user operation can be a drag operation on the file.
  • the touch screen can detect a user operation on the control, and the user operation can be a click operation on the control, etc.
  • the above-mentioned user operation can also have other implementation forms, which are not limited in this embodiment of the present application. For the specific implementation of the above-mentioned user operation, reference may be made to the detailed description of the subsequent method embodiments, and details are not described here.
  • the processor 110 may trigger the display of target controls on the display screen 194 according to certain rules in response to user operations on files, windows, or links.
  • the bone conduction sensor 180M can acquire vibration signals.
  • The bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part.
  • The bone conduction sensor 180M can also contact the human pulse and receive a blood-pressure beating signal.
  • The bone conduction sensor 180M can also be disposed in an earphone to form a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • The indicator 192 can be an indicator light, which can be used to indicate the charging state and battery level changes, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • A SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or removing it from the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • The electronic device 100 may employ an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the electronic device 100 may be a smart wearable device, a smart phone, a tablet computer, a notebook computer, a desktop computer, etc. with the above functions, which are not specifically limited in this embodiment of the present application.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 4 is a software structural block diagram of an electronic device 100 provided by an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • This includes the management of call status (connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • The media library can support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • The software system shown in FIG. 4 involves, at the application layer, applications that use the sharing capability (such as Gallery and File Manager), an instant sharing module that provides the sharing capability, and a print service and print spooler that provide the printing capability; the application framework layer provides a printing framework, a WLAN service, and a Bluetooth service; and the kernel and bottom layers provide WLAN and Bluetooth capabilities and basic communication protocols.
  • FIG. 5 is a schematic flowchart of a control display method provided by an embodiment of the present application. The method can be applied to the electronic device 100 described in FIG. 3 and FIG. 4 , and the electronic device 100 can be used to support and execute steps S101 to S105 of the method flow shown in FIG. 5 .
  • a control display method provided by an embodiment of the present application may include:
  • Step S101 displaying a first interface; wherein, the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface.
  • the electronic device 100 displays a first interface
  • the first interface may include a target object.
  • The target object may be a window, a file (which may include a document, a picture, a folder, etc.), a business card, a link (such as a web page link), or the like, which is not specifically limited in this embodiment of the present application.
  • the target object may include a plurality of target points, which may include a first target point, and the first target point may be located at a first position of the first interface.
  • FIG. 6a-FIG. 6c are schematic diagrams of a group of user interfaces provided by an embodiment of the present application.
  • the position of the finger/cursor (or mouse) in the target object may be the first position of the first target point in the target object.
  • Step S102 Receive and respond to a first drag operation for a first target point on the target object, determine a first area where the first position is located, and determine that the first target point is located at a second position on the first interface.
  • Specifically, the electronic device 100 receives and responds to the first drag operation for the first target point on the target object, determines the first area (e.g., the dotted circle in FIG. 6b) where the first position (for example, the original position of the finger/cursor in FIG. 6b) is located, and determines that the first target point is located at the second position of the first interface. As shown in FIG. 6b, the user drags the window downward and to the right.
  • Step S103 if it is detected that the second position is within the first area and the duration of the first drag operation is longer than the first duration threshold, display the target control at the third position of the first interface.
  • The electronic device 100 may determine the user's operation intention according to the speed, duration, and scope of the user's drag operation. If the electronic device 100 detects that the second position corresponding to the first drag operation is within the first area, and the duration of the first drag operation is greater than the first duration threshold (for example, 1 second, 2 seconds, 3 seconds, etc.), the target control is displayed at the third position of the first interface.
  • The target control (that is, the portal) can be displayed away from the finger/cursor movement track; in other words, the direction in which the first position points to the third position differs from the direction in which the first position points to the second position (for example, as shown in FIG. 6b, the target control can be displayed in the direction opposite to the finger/cursor movement trajectory). In this way, if the user does not intend to share across devices or display across screens but still triggers the display of the target control, accidental touches on the target control along the forward direction of the movement track are avoided, as are occlusion of and visual interference with the user's actual dragging path. A minimal code sketch of this detection logic follows below.
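  • To make the detection in steps S102 and S103 concrete, the following is a minimal Kotlin sketch of one possible implementation. The names (DragTracker, firstAreaRadius, firstDurationThresholdMs) and the mirrored-offset rule for placing the third position are hypothetical illustrations, not part of the claimed method.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

class DragTracker(
    private val firstAreaRadius: Float = 48f,          // radius of the first area (dotted circle)
    private val firstDurationThresholdMs: Long = 1_000 // e.g. 1 s, 2 s, 3 s ...
) {
    private lateinit var firstPosition: Point           // where the first target point started (S101)
    private var dragStartMs = 0L
    var portalShown = false
        private set

    // Step S102: record the first position and determine the first area around it.
    fun onDragStart(first: Point, nowMs: Long) {
        firstPosition = first
        dragStartMs = nowMs
        portalShown = false
    }

    // Step S103: `second` is the current (second) position. If it is still inside
    // the first area and the drag has lasted longer than the threshold, return the
    // third position at which to display the target control; otherwise return null.
    fun onDragMove(second: Point, nowMs: Long): Point? {
        if (portalShown) return null
        val stillInFirstArea =
            hypot(second.x - firstPosition.x, second.y - firstPosition.y) <= firstAreaRadius
        if (stillInFirstArea && nowMs - dragStartMs > firstDurationThresholdMs) {
            portalShown = true
            // Mirror the drag vector so the portal appears away from the movement
            // track, avoiding accidental touches and occlusion of the drag path.
            return Point(
                firstPosition.x - (second.x - firstPosition.x) * 4f,
                firstPosition.y - (second.y - firstPosition.y) * 4f
            )
        }
        return null
    }
}
```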
  • the shape of the first region may be an ellipse, a square, a triangle, etc., in addition to the circle shown in FIG. 6b and FIG. 6c , which is not specifically limited in this embodiment of the present application.
  • The embodiment of the present application reduces these to a circular target control, representing all functions that may be triggered in this dragging scenario.
  • The user can also release the finger or the mouse, and the electronic device receives and, in response to the user releasing the finger or the mouse, cancels the display of the target control.
  • the target control may continue to be displayed.
  • the target control can also be canceled by clicking on a blank space on the first interface, and so on, which is not specifically limited in this embodiment of the present application.
  • Step S104: Receive and respond to a second drag operation for the first target point, determine that the first target point is located at a fourth position of the first interface, and display the target frame at the third position; the fourth position is in the second area where the third position is located.
  • The target frame includes N controls.
  • Specifically, the electronic device 100 receives and responds to the second drag operation for the first target point, determines that the first target point is located at the fourth position of the first interface, and displays the target frame at the third position; the fourth position is in the second area where the third position is located; the target frame includes N controls, where N is an integer greater than or equal to 1. A minimal sketch of this drop detection is given below.
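  • A correspondingly minimal sketch of the drop handling in step S104; the PortalDropTarget type and the list of control labels standing in for the N controls are illustrative placeholders, not part of the claimed apparatus.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

class PortalDropTarget(
    private val portalCenter: Point,  // the third position
    private val hotAreaRadius: Float  // the second area; may exceed the portal's drawn radius
) {
    // Is the given position within the second area?
    fun contains(p: Point): Boolean =
        hypot(p.x - portalCenter.x, p.y - portalCenter.y) <= hotAreaRadius

    // Called when the user releases the finger/mouse at the fourth position.
    // Returns the N controls to show in the target frame, or null to do nothing.
    fun onRelease(fourth: Point, controls: List<String>): List<String>? =
        if (contains(fourth)) controls else null
}
```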
  • FIG. 7 is a schematic diagram of a change process of a transmission gate provided by an embodiment of the present application.
  • The actual hot area of the target control (that is, the portal in FIG. 7) may be larger than the portal as displayed.
  • When the user drags the window toward the portal, the electronic device 100 can gradually enlarge the portal and gradually shrink the window, so that the visual feedback creates the effect of dragging the target object into the portal, improving the user's operating experience and ensuring that the user can operate easily even on a seemingly small object.
  • The size of the portal can be enlarged up to the size of the actual hot area, and it needs to be larger than the size of the finger/cursor, to ensure that the user can recognize the target control, to improve the visual experience, and to avoid user operation errors. A sketch of this scaling feedback is given below.
  • parameters such as the shape, transparency and size of the transmission gate can be customized by the user through the electronic device 100 .
  • FIG. 8 is a schematic diagram of another transmission gate change process provided by an embodiment of the present application. As shown in Figure 8, when the user continues to drag the window until the finger/cursor touches the portal, the portal can change color to indicate that the user can release the finger or mouse at this time to trigger subsequent functions, and so on.
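  • The grow/shrink feedback of FIG. 7 and the contact highlight of FIG. 8 could be computed as sketched below. This assumes a linear scaling law in the normalized distance to the portal, which the embodiment does not prescribe; maxPortalScale, minSnapshotScale, and the contact radius are illustrative values.

```kotlin
import kotlin.math.hypot
import kotlin.math.max
import kotlin.math.min

data class Feedback(val portalScale: Float, val snapshotScale: Float, val highlighted: Boolean)

// `startDistance` is the drag-point-to-portal distance when the portal appeared
// (assumed > 0). As the drag point approaches the portal, the portal grows toward
// its hot-area size and the dragged snapshot shrinks; on contact the portal is
// highlighted (changes color) to invite the user to release.
fun portalFeedback(
    dragX: Float, dragY: Float,
    portalX: Float, portalY: Float,
    startDistance: Float,
    maxPortalScale: Float = 2f,
    minSnapshotScale: Float = 0.5f,
    contactRadius: Float = 24f
): Feedback {
    val d = hypot(dragX - portalX, dragY - portalY)
    val t = 1f - min(max(d / startDistance, 0f), 1f)  // 0 = far away, 1 = touching
    return Feedback(
        portalScale = 1f + (maxPortalScale - 1f) * t,
        snapshotScale = 1f - (1f - minSnapshotScale) * t,
        highlighted = d <= contactRadius
    )
}
```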
  • FIG. 9a is a schematic diagram of a transmission gate and a button back panel provided by an embodiment of the present application.
  • the user can release the finger or the mouse (that is, release the dragged target object) to complete the second drag operation.
  • The electronic device 100 can cancel the display of the portal and display the button backplane where the portal was.
  • That is, the portal is transformed from a circle, as shown in FIG. 9a, into a button backplane (i.e., a target frame).
  • The button backplane may include a plurality of controls, for example, a control for switching display to the left half of the second display screen, a control for switching display to the right half of the second display screen, a full-screen display control, and the like.
  • In this way, the user does not bear the burden of a long drag; the corresponding function can be realized by clicking the control that is called up after the user releases the hand.
  • Moreover, the embodiment of the present application can accommodate more functional controls; please refer to FIG. 9b.
  • FIG. 9b is a schematic diagram of a button backplane provided by an embodiment of the present application.
  • the number of portals can be two or more, and different portals can correspond to different button backplanes.
  • For example, the user can customize two portals through the electronic device 100: one portal can correspond to a button backplane containing multiple share buttons, which can share the dragged content to other devices, and the other portal can correspond to a button backplane containing multiple screen-switching buttons, which can switch the dragged content to another display for display, etc. This is not specifically limited in this embodiment of the present application.
  • Step S105 receiving and responding to a click operation on one of the N controls, displaying a second interface.
  • Specifically, the electronic device 100 receives and, in response to a click operation on one of the N controls, displays the second interface.
  • The electronic device 100 may include a first display screen and M second display screens, where M is an integer greater than or equal to 1, and the N controls may include M screen-switching controls in one-to-one correspondence with the M second display screens.
  • Considering the different display schemes possible for each of the above-mentioned second display screens, even more controls can be included, which greatly enriches the user's choices and meets different user needs.
  • The electronic device 100 receives and responds to the user's click operation on such a control, and can switch the target object to the corresponding second display screen for display, and so on; one possible wiring of these controls is sketched below.
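  • On Android, one way to wire M screen-switching controls to M second display screens is sketched below, using the platform's DisplayManager and ActivityOptions.setLaunchDisplayId (available since API level 26). It assumes the target object is hosted in an Activity and is only an illustration of the one-control-per-display mapping; the embodiment does not mandate this API.

```kotlin
import android.app.ActivityOptions
import android.content.Context
import android.content.Intent
import android.hardware.display.DisplayManager
import android.view.Display

// Builds one click action per second display screen: displayId -> action that
// relaunches the target object's Activity on that display. When called from a
// non-Activity context, `intent` must carry Intent.FLAG_ACTIVITY_NEW_TASK.
fun buildSwitchControls(context: Context, intent: Intent): Map<Int, () -> Unit> {
    val dm = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    return dm.displays
        .filter { it.displayId != Display.DEFAULT_DISPLAY }  // the M second displays
        .associate { display ->
            display.displayId to {
                val options = ActivityOptions.makeBasic()
                    .setLaunchDisplayId(display.displayId)
                context.startActivity(intent, options.toBundle())
            }
        }
}
```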
  • The first display screen and the second display screen in the figures of the embodiments of the present application are only examples; the size of the second display screen may be smaller than that of the first display screen, as shown in the figures, or may be equal to or larger than that of the first display screen, which is not specifically limited in this embodiment of the present application.
  • The terms "first" and "second" in the first display screen and the second display screen described in the embodiments of the present application do not constitute limitations on display screen specifications, primary/secondary roles, or priority levels; they are used only to distinguish the display screen on which the target object currently is from the other display screens.
  • FIGS. 10a-10h are a set of schematic diagrams of user interfaces provided by the embodiments of the present application.
  • the electronic device 100 may include two display screens, namely a first display screen and a second display screen.
  • the electronic device 100 displays a first interface through the first display screen, and the first interface includes a target object (window), and the window may be a common browser window or a chat window, etc., which is not specifically limited in this embodiment of the present application .
  • The electronic device 100, in response to the user releasing the mouse/touchpad, displays the target frame at the position where the target control was originally located and cancels the display of the target control; alternatively, the target frame can be displayed at any other suitable position in the interface, which is not specifically limited in this embodiment of the present application.
  • the target frame may include multiple controls, which may implement various corresponding functions.
  • the electronic device 100 receives and responds to the user's click operation on the control used to switch to the display of the second display screen in the left half of FIG. 10g, and displays the second interface through the first display screen.
  • the second interface does not include the target object
  • the third interface is displayed through the second display screen, and the third interface includes the target object. That is, the target object (window) is switched from the first display screen to the left half of the second display screen for display.
  • FIG. 11 is a schematic diagram of a transmission gate provided by an embodiment of the present application.
  • In FIG. 11, the brightness of the interface background is used as the judgment criterion: when the portal (i.e., the target control) appears, the background color is sampled, and the color of the portal is adjusted accordingly, to ensure that the portal remains recognizable and to improve the user's operating experience. A minimal sketch of such an adjustment is given below.
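  • A minimal sketch of such a brightness-based adjustment, using the standard Rec. 709 relative-luminance weights; the 0.5 threshold and the two portal colors are assumptions for illustration.

```kotlin
// Picks a dark portal color on bright backgrounds and a light one on dark
// backgrounds, so the portal stays recognizable wherever it appears.
fun portalColorFor(backgroundArgb: Int): Int {
    val r = (backgroundArgb shr 16 and 0xFF) / 255.0
    val g = (backgroundArgb shr 8 and 0xFF) / 255.0
    val b = (backgroundArgb and 0xFF) / 255.0
    val luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b  // Rec. 709 weights
    return if (luminance > 0.5) 0xCC222222.toInt() else 0xCCEEEEEE.toInt()
}
```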
  • FIG. 12 is a schematic diagram of another transmission gate provided by an embodiment of the present application.
  • As shown in FIG. 12, a capsule-shaped prompt can be used, and the capsule can be marked with a specific function prompt; when the cursor/finger moves to touch the portal (when the portal changes color), a corner mark at the cursor prompts the user to release the mouse/trackpad/finger to open the corresponding functional controls, etc. This ensures that the user can still recognize the portal and know its function even when seeing only an abstract dot that conveys no specific intention.
  • FIG. 13a-FIG. 13e are schematic diagrams of a set of user interfaces provided by the embodiments of the present application.
  • Taking the electronic device 100 being a notebook computer as an example, the electronic device 100 displays a user interface 21, which may include multiple files (e.g., file 11, file 12, file 13, and file 14).
  • The electronic device 100 receives the user's drag operation on the file 16 and, in response to the drag operation, determines the first area 22 where the original position of the file 16 is located (the first area 22 may be as shown in FIG. 13b), and forms a snapshot of the file 16, which is dragged out along with the user's drag operation.
  • parameters such as the transparency and size of the snapshot are not specifically limited in this embodiment of the present application, and may be system default values, or may be customized by the user through the electronic device 100 .
  • When the electronic device 100 detects that the snapshot (or cursor) of the file 16 is still located in the first area 22 within a certain period of time, the target control 23 is displayed in a direction different from that of the drag operation.
  • The electronic device 100 then receives the user's drag operation on the snapshot of the file 16 and, in response to the drag operation, gradually enlarges the target control 23 and gradually reduces the snapshot of the file 16. When the electronic device detects that the snapshot (or cursor) is located on the target control 23, it controls the target control 23 to change color, to remind the user that the mouse/touchpad can be released at this time to end the drag operation and trigger more functions.
  • When the electronic device 100 detects that the user has released the mouse/touchpad, it can display the target frame 24 in the center of the interface (or at another suitable position) and cancel the display of the target control 23 (in the user's visual field, the target control 23 transforms into the target frame 24).
  • the target frame 24 may be a suspended window, a suspended backboard, or the like, which is not specifically limited in this embodiment of the present application.
  • The target frame 24 may include a plurality of sharing controls, specifically a sharing control 25, a sharing control 26, a sharing control 27, a sharing control 28, a sharing control 29, and a sharing control 30; that is, the above-mentioned N controls may include a plurality of sharing controls in one-to-one correspondence with a plurality of other devices (i.e., second electronic devices).
  • The electronic device 100 receives the user's click operation on the sharing control 28 and, in response to the click operation, sends the file 16 to the other device "my phone"; the electronic device 100 can then display the same interface as the user interface 21. Alternatively, the electronic device 100 cancels the display of the target frame 24 by receiving and responding to the user's click operation on a blank space of the interface, thereby displaying the same interface as the user interface 21.
  • The other device may receive the file 16, as shown in FIG. 13e.
  • the electronic device 100 and other devices may establish connections with each other through wired or wireless networks (eg, Wireless-Fidelity (WiFi), Bluetooth, and mobile networks), which are not specifically limited in this embodiment of the present application.
  • Figures 13a-13e demonstrate the process of synchronizing files from the computer side to the mobile phone side.
  • the sharing operation between all electronic devices can be designed in this way.
  • Besides files, the shared content can also include multimedia such as photos/videos, and content such as business cards/calendars/memos, and so on.
  • Optionally, the target frame 24 may further include a shortcut control for sending the dragged content to a printer for printing, a shortcut control for moving the dragged content to a certain folder, a shortcut control for sending the dragged content via mail/messaging/various third-party applications (such as communication applications, short-video applications, etc.), and the like, which is not specifically limited in this embodiment of the present application.
  • FIG. 14a-FIG. 14c are a set of schematic diagrams of user interfaces provided by the embodiments of the present application.
  • the electronic device 100 receives the user's drag operation on the share control 28, and in response to the drag operation, forms a snapshot 28' of the share control 28, and drags it out along with the user's drag operation.
  • The electronic device 100 receives the user's drag operation on the snapshot 28' and, in response to the drag operation, detects that the snapshot 28' is located above the sharing control 30; when the user releases the mouse/trackpad, the electronic device creates and displays a multi-device sharing control 31 corresponding to the sharing control 28 and the sharing control 30 (that is, the device group including "my mobile phone" and "mobile phone xxx"). In other words, the above-mentioned N controls may also include a multi-device sharing control for synchronously sharing to multiple other devices (i.e., second electronic devices).
  • the electronic device 100 receives the user's click operation on the multi-device sharing control 31, and in response to the click operation, the file 16 can be respectively sent to other devices "my phone” and other devices "mobile phone xxx” .
  • the other device "my mobile phone” and the other device “mobile phone xxx” can receive the file 16, thereby greatly improving the efficiency of file sharing and bringing convenience to the user's operation.
  • The user can also continue to drag and drop other sharing controls, for example, drag the sharing control 25 onto the multi-device sharing control 31 and release it, thereby creating a control that can share with three devices at a time (that is, the device group including "My Computer", "My Mobile Phone", and "Mobile Phone xxx"), to further improve sharing efficiency.
  • This function is mainly aimed at sharing among the commonly used devices of two or more people, such as quick sharing among colleagues in the same department, classmates, or friends traveling together, bringing convenience to user operations.
  • the multi-device sharing control 31 (that is, the device group including "my mobile phone” and “mobile phone xxx”) can be saved in the background.
  • The electronic device 100 can always display the multi-device sharing control 31 in the target frame 24; alternatively, when the electronic device 100 recognizes any device in the device group through Bluetooth or the like, it can display the multi-device sharing control 31 in the target frame 24. Optionally, unrecognized devices in the device group can be grayed out to prompt the user. A sketch of this device-group behavior follows below.
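  • The device-group behavior of FIGS. 14a-14c can be modeled as sketched below; the ShareTarget type, the send callback, and mergeControls are hypothetical placeholders for whatever discovery and transport mechanism the devices actually use.

```kotlin
data class ShareTarget(val name: String, val reachable: Boolean = true)

// A multi-device sharing control: clicking it sends the target object to every
// device in its group in turn.
class MultiDeviceControl(val devices: List<ShareTarget>) {
    val label: String get() = devices.joinToString(" + ") { it.name }

    fun onClick(targetObject: String, send: (ShareTarget, String) -> Unit) {
        devices.forEach { send(it, targetObject) }
    }
}

// Dropping the snapshot of share control `x` onto share control `y` (or onto an
// existing group) merges their device lists into one group control.
fun mergeControls(x: List<ShareTarget>, y: List<ShareTarget>): MultiDeviceControl =
    MultiDeviceControl((x + y).distinctBy { it.name })

fun main() {
    val group = mergeControls(
        listOf(ShareTarget("my phone")),
        listOf(ShareTarget("mobile phone xxx"))
    )
    group.onClick("file 16") { device, file -> println("sending $file to ${device.name}") }
}
```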
  • FIGS. 15a-15e are schematic diagrams of a group of user interfaces provided by the embodiments of the present application.
  • As shown in FIG. 15a, the electronic device 100 displays a user interface 32; the user interface 32 may include a search window, and the search window may include multiple pieces of search content. The multiple pieces of search content may be obtained through the global search function of the electronic device 100 by searching according to keywords, and may be, for example, a file, a business card, a browser link, and the like.
  • the electronic device 100 receives the user's drag operation on the search content 1, and in response to the drag operation, forms a snapshot of the search content 1, and drags out the search content 1 along with the user's drag operation.
  • When the electronic device 100 detects that the snapshot (or cursor) of the search content 1 is still located, within a certain period of time, in a certain area around the original position of the search content 1, the target control 23 is displayed in a direction different from that of the drag operation. As shown in FIG. 15d, the electronic device 100 receives the user's drag operation on the snapshot of the search content 1 and, in response to the drag operation, gradually enlarges the target control 23 and gradually reduces the snapshot of the search content 1.
  • When the device detects that the snapshot of the search content 1 is located on the target control 23 (or that the cursor is located on the target control 23), the device controls the target control 23 to change color, to remind the user that the mouse/touchpad can be released at this time to end the drag and trigger more functions.
  • When the electronic device 100 detects that the user has released the mouse/touchpad, the electronic device 100 can display the target frame 24 in the center of the interface (or at another suitable position) and cancel the display of the target control 23.
  • the target frame 24 may include a plurality of sharing controls, which will not be repeated here.
  • the electronic device 100 receives the user's click operation on the sharing control 28, and in response to the click operation, sends the search content 1 to the other device "my mobile phone".
  • In the figure, the file "graduation album" is taken as an example of the search content 1.
  • FIG. 16a-FIG. 16d are schematic diagrams of a set of user interfaces provided by the embodiments of the present application.
  • Taking the electronic device 100 being a tablet computer as an example, the electronic device 100 displays a user interface 33, and the user interface 33 may include application programs (such as weather, music, video, application store, mail, and gallery).
  • The electronic device 100 receives the user's click operation 34 on the application Gallery and, in response to the click operation 34, displays a user interface 35, which may include a plurality of pictures.
  • The electronic device 100 receives the user's drag operation 36 on the picture 8. In response to the drag operation 36, when the electronic device 100 detects that, within a certain time range, the picture 8 (or the finger) is still located in the certain area where the original position of the picture 8 (or the finger) was when the drag operation 36 started, the target control 23 is displayed in a direction different from that of the drag operation 36.
  • The electronic device 100 receives the user's drag operation 37 on the picture 8 and, in response to the drag operation 37, gradually enlarges the target control 23 and gradually reduces the picture 8. When the electronic device 100 detects that the picture 8 is positioned on the target control 23 (or that the finger is positioned on the target control 23), the target control 23 is controlled to change color to prompt the user to release the finger and end the drag operation 37.
  • The electronic device 100 can then display the target frame 24 at the position where the target control 23 is located (or at another suitable position) and cancel the display of the target control 23.
  • the target frame 24 may include a plurality of sharing controls, which will not be repeated here.
  • the electronic device 100 receives the user's click operation 38 on the sharing control 26 , and in response to the click operation 38 , sends the picture 8 to the other device "My Computer", and "My Computer” can receive the picture 8 .
  • FIG. 17a-FIG. 17d are schematic diagrams of a set of user interfaces provided by the embodiments of the present application.
  • Taking the electronic device 100 being a smartphone as an example, the electronic device 100 displays a user interface 39, which may be a resource manager interface and may include multiple files, documents, etc. (for example, file 1 and file 2, and document 1, document 2, and document 3).
  • The electronic device 100 receives the user's drag operation 40 on document 1. In response to the drag operation 40, when the electronic device 100 detects that, within a certain period of time, the document 1 (or the finger) is still located in the certain area where the original position of the document 1 (or the finger) was when the drag operation 40 started, the target control 23 is displayed in a direction different from that of the drag operation 40.
  • The electronic device 100 receives the user's drag operation 41 on document 1 and, in response to the drag operation 41, gradually enlarges the target control 23 and gradually reduces the document 1. When the electronic device 100 detects that the document 1 is positioned on the target control 23 (or that the finger is positioned on the target control 23), the target control 23 is controlled to change color to prompt the user to release the finger and end the drag operation 41.
  • When the electronic device 100 detects that the user releases the finger, the electronic device 100 can display the target frame 24 in the center of the interface (or at another suitable position) and cancel the display of the target control 23.
  • the target frame 24 may include a plurality of sharing controls, which will not be repeated here.
  • The electronic device 100 receives the user's click operation 42 on the sharing control 27 and, in response to the click operation 42, sends the document 1 to the other device "My Tablet", and "My Tablet" can receive the document 1.
  • For some details of application scenario 4, reference may be made to the descriptions of the corresponding embodiments in FIGS. 13a to 13e, which are not repeated here.
  • the embodiment of the present application proposes a control display method, which can identify the user's operation intention by detecting the duration and movement range of the user's drag operation, display the control in a timely manner, and reduce the visual interference to the user's operation.
  • the path for the user to perform the dragging operation is shortened, the operation burden of the user is reduced, and the operation efficiency of the user is improved.
  • the embodiments of the present application explore more visual expressions and interaction methods, and explore more draggable content and specific embodiment scenarios.
  • FIG. 18 is a schematic structural diagram of a control display device provided by an embodiment of the present application.
  • The control display device 20 may include a first display unit 201, a first determination unit 202, a second display unit 203, a third display unit 204, and a fourth display unit 205, and may further include a generating unit 206.
  • the detailed description of each unit is as follows.
  • the first display unit 201 is configured to display a first interface; wherein, the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface ;
  • a first determining unit 202 configured to receive and respond to a first drag operation for the first target point on the target object, determine a first area where the first position is located, and determine the first the target point is located at the second position of the first interface;
  • The second display unit 203 is configured to: if it is detected that the second position is within the first area and the duration of the first drag operation is greater than the first duration threshold, display the target control at the third position of the first interface;
  • The third display unit 204 is configured to receive and respond to the second drag operation for the first target point, determine that the first target point is located at the fourth position of the first interface, and display a target frame at the third position; the fourth position is in the second area where the third position is located; the target frame includes N controls, and N is an integer greater than or equal to 1;
  • the fourth display unit 205 is configured to receive and display a second interface in response to a click operation on one of the N controls.
  • the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen
  • the N controls include M switch screen display controls corresponding to the M second display screens one-to-one, where M is an integer greater than or equal to 1; the fourth display unit 205 is specifically used for:
  • Receive and respond to a click operation on the i-th screen-switching control among the M screen-switching controls, and display the second interface, where the second interface does not include the target object; the third interface displayed on the i-th second display screen corresponding to the i-th screen-switching control includes the target object; i is an integer greater than or equal to 1 and less than or equal to M.
  • the N controls include a full-screen display control; the fourth display unit 205 is specifically used for:
  • Display the second interface, where the second interface includes the target object, and the size of the target object in the second interface is larger than the size of the target object in the first interface.
  • the N controls include K share controls corresponding to the K second electronic devices one-to-one; the fourth display unit 205 is specifically used for:
  • the device further includes:
  • the generating unit 206 is configured to receive and respond to the third drag operation for the x th sharing control in the K sharing controls, and form a snapshot corresponding to the x th sharing control.
  • If the snapshot corresponding to the x-th sharing control is located in the third area where the y-th sharing control is located, generate a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; wherein the N controls further include the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
  • the fourth display unit 205 is specifically used for:
  • Receive and respond to the click operation on the multi-device sharing control, respectively send the target object to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, and display the second interface; wherein the second interface is the same as the first interface.
  • the third display unit 204 is specifically used for:
  • the direction in which the first position points to the third position is different from the direction in which the first position points to the second position.
  • the target object is any one of a window, a file, a business card and a link.
  • the disclosed apparatus may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the above-mentioned units is only a logical function division.
  • Multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical or other forms.
  • the units described above as separate components may or may not be physically separated, and components shown as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • If the integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • the technical solutions of the present application can be embodied in the form of software products in essence, or the parts that contribute to the prior art, or all or part of the technical solutions, and the computer software products are stored in a storage medium , including several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc., specifically a processor in the computer device) to execute all or part of the steps of the above methods in various embodiments of the present application.
  • The aforementioned storage medium may include: a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or another medium that can store program code.

Abstract

Embodiments of the present application provide a control display method and a related device. The method comprises: displaying a first interface, the first interface comprising a target object, the target object comprising a first target point, and the first target point being located at a first position in the first interface; receiving a first drag operation for the first target point, and in response to the first drag operation, determining a first area in which the first position is located, and determining that the first target point is located at a second position in the first interface; if it is detected that the second position is within the first area and that the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position in the first interface; receiving a second drag operation for the first target point, and in response to the second drag operation, determining that the first target point is located at a fourth position in the first interface, and displaying a target frame at the third position, the target frame comprising N controls; and receiving a click operation for one of the N controls, and in response to the click operation, displaying a second interface. By implementing the embodiments of the present application, the operating experience of a user can be improved.