CN117519861A - Interface display method and related device

Interface display method and related device

Info

Publication number
CN117519861A
Authority
CN
China
Prior art keywords
page
view object
view
application
electronic device
Legal status
Pending
Application number
CN202210915485.1A
Other languages
Chinese (zh)
Inventor
胡怡洁 (Hu Yijie)
谭文宇 (Tan Wenyu)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210915485.1A
Priority to PCT/CN2023/109733 (published as WO2024027570A1)
Publication of CN117519861A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interface display method and a related device, applied to an electronic device. The method includes: displaying a first page of a first application, the first page including one or more view objects; when a first operation acting on a first view object of the first page is detected, executing a first function of the first view object, the first view object being any one of the one or more view objects of the first page; when a second operation acting on the first view object of the first page is detected, adding the first view object to a second page; and when a third operation acting on the first view object of the second page is detected, executing the first function of the first view object in response to the third operation. This improves the operational convenience of commonly used functions, meets users' personalized needs, and effectively improves the user experience.

Description

Interface display method and related device
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to an interface display method and a related device.
Background
With the continuous development and innovation of intelligent terminals (such as mobile phones), the functions of terminal applications are becoming more and more abundant. As functions grow richer and more varied, the difficulty and inconvenience of using a mobile phone also increase. Some functions are buried too deep: the user must operate multiple times and search layer by layer to trigger the mobile phone to display the target function.
Because personal usage habits differ, some deep-level functions may be frequently used by a user, making the repeated search operations cumbersome; meanwhile, some shallower functions may rarely be used. Therefore, current application function layouts cannot meet the personalized needs of different users.
Disclosure of Invention
Embodiments of this application provide an interface display method and a related device, which improve the operational convenience of commonly used functions, meet users' personalized needs, and effectively improve the user experience.
In a first aspect, the present application provides an interface display method, applied to an electronic device, the method including: displaying a first page of a first application, the first page including one or more view objects; when a first operation acting on a first view object of the first page is detected, executing a first function of the first view object, the first view object being any one of the one or more view objects of the first page; when a second operation acting on the first view object of the first page is detected, adding the first view object to a second page; and when a third operation acting on the first view object of the second page is detected, executing the first function of the first view object in response to the third operation.
By implementing this embodiment of the application, the first view object extracted from the first page of the first application can be added to the second page; the first view object on the first page retains its original function (e.g., the first function), and on the second page the first view object also has the original function. Therefore, through the second page, the user can quickly find and use the view objects corresponding to the commonly used functions of the first application, which improves the operational convenience of those functions, enables full user customization, and effectively improves the user experience.
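For concreteness, the following Java sketch shows one way such a "second page" could pin an extracted view object while preserving its original click behavior. This is a minimal illustration under assumed names (PinnedPage, pin); the patent does not disclose an implementation.

    import android.view.View;
    import android.view.ViewGroup;

    // Minimal sketch: a container backing the "second page" that pins a
    // stand-in for a view object and reuses its original click behavior
    // (the "first function"). All names here are hypothetical.
    public class PinnedPage {
        private final ViewGroup container; // layout rendering the second page

        public PinnedPage(ViewGroup container) {
            this.container = container;
        }

        // Pin a view object extracted from the first page; firstFunction is
        // the behavior the object already had there.
        public void pin(View viewObject, View.OnClickListener firstFunction) {
            View pinned = new View(viewObject.getContext());
            pinned.setBackground(viewObject.getBackground()); // visual stand-in
            pinned.setOnClickListener(firstFunction);         // same first function
            container.addView(pinned, new ViewGroup.LayoutParams(
                    viewObject.getWidth(), viewObject.getHeight()));
        }
    }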
In one implementation, the method further includes: detecting a fourth operation acting on the first view object of the second page, and in response to the fourth operation, executing a second function of the first view object, where the first view object on the first page does not have the second function. By implementing this embodiment, a view object added to the second page can gain additional functions, effectively improving the user experience.
In one implementation, the method further includes: displaying a third page of a second application, the third page including one or more view objects, a second view object being any one of the one or more view objects of the third page; and detecting a fifth operation acting on the second view object of the third page, and in response to the fifth operation, adding the second view object to the second page and displaying it there. By implementing this embodiment, view objects for the commonly used functions of multiple applications can be gathered on the second page; through the second page, the user can quickly find and use the view object corresponding to each application's commonly used function, which improves operational convenience, enables full user customization, and effectively improves the user experience.
In one implementation, the first function includes jumping to a fourth page associated with the first view object; the third operation includes a first gesture for zooming in on the first view object, where the first gesture is a two-finger slide on the display screen with the distance between the fingers increasing, and a second gesture is a two-finger slide on the display screen with the distance between the fingers decreasing. Detecting the third operation acting on the first view object of the second page and executing the first function of the first view object in response includes: detecting the first gesture acting on the first view object of the second page; in response to the first gesture, zooming in on the first view object; and displaying the fourth page when the area of the first view object, enlarged along with the first gesture, reaches a preset value. The method further includes: displaying the second page when the second gesture acting on the fourth page is detected. By implementing this embodiment, a view object added to the second page supports a zoom-in gesture that triggers a jump to the associated page and a zoom-out gesture that triggers a return to the second page, effectively improving the user experience. A minimal sketch of such gesture handling is given below.
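As an illustration only, the following sketch uses Android's ScaleGestureDetector to enlarge a pinned view object with a two-finger spread and to trigger a page jump once a threshold is reached; the threshold value and the openFourthPage callback are assumptions, not part of the disclosure.

    import android.content.Context;
    import android.view.ScaleGestureDetector;
    import android.view.View;

    // Sketch: zoom a view object with a two-finger gesture and jump to its
    // associated page once the accumulated scale passes a preset value.
    public class ZoomJumpController {
        private static final float JUMP_SCALE = 1.5f; // assumed preset value
        private final ScaleGestureDetector detector;
        private float accumulated = 1f;

        public ZoomJumpController(Context ctx, View viewObject, Runnable openFourthPage) {
            detector = new ScaleGestureDetector(ctx,
                    new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                @Override
                public boolean onScale(ScaleGestureDetector d) {
                    accumulated *= d.getScaleFactor();
                    viewObject.setScaleX(accumulated); // enlarge with the gesture
                    viewObject.setScaleY(accumulated);
                    if (accumulated >= JUMP_SCALE) {
                        openFourthPage.run();          // area reached the preset value
                        accumulated = 1f;
                    }
                    return true;
                }
            });
            viewObject.setOnTouchListener((v, e) -> detector.onTouchEvent(e));
        }
    }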
In one implementation, the first function includes jumping to a fourth page associated with the first view object, and detecting the third operation acting on the first view object of the second page and executing the first function in response includes: detecting a sixth operation acting on the first view object of the second page; in response to the sixth operation, displaying a first popup window, where the first popup window displays part or all of the content of the fourth page; detecting a seventh operation acting on the first popup window; and displaying the fourth page in response to the seventh operation; the third operation includes the sixth operation and the seventh operation. By implementing this embodiment, the fourth page may be a next-level page of the first view object; the first view object added to the second page can trigger the electronic device to display a popup window on the second page, through which part or all of the content of that next-level page (for example, the view objects the user commonly uses there) can be shown without jumping to the next-level page.
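A minimal sketch of this popup behavior using Android's PopupWindow; fourthPagePreview stands for a view rendering part of the fourth page's content, and the fixed size is an arbitrary assumption.

    import android.view.View;
    import android.widget.PopupWindow;

    // Sketch: show the "first popup window" over the second page, anchored
    // to the pinned view object.
    public final class PreviewPopup {
        public static void show(View anchorViewObject, View fourthPagePreview) {
            PopupWindow popup = new PopupWindow(
                    fourthPagePreview, 600, 400, /* focusable = */ true);
            popup.showAsDropDown(anchorViewObject);
        }
    }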
In one implementation, the first function includes jumping to a fourth page associated with the first view object, and detecting the third operation acting on the first view object of the second page and executing the first function in response includes: detecting an eighth operation acting on the first view object of the second page; in response to the eighth operation, displaying, within the display area of the first view object on the second page, a third view object corresponding to a commonly used function of the fourth page; detecting a ninth operation acting on the third view object; and displaying the fourth page in response to the ninth operation; the third operation includes the eighth operation and the ninth operation. By implementing this embodiment, the fourth page may be a next-level page of the first view object; the first view object added to the second page can trigger the electronic device to replace the displayed content of the first view object with related content of the next-level page, for example, the view object corresponding to the function the user commonly uses on that page.
In one implementation, the third operation includes a click operation acting on the first view object of the second page; when the click operation is detected, the electronic device first displays a focus effect of the first view object on the second page and then responds to the click operation. The focus effect of the first view object includes one or more of the following: the first view object is scaled down and then scaled back up; and the view objects on the second page other than the first view object change uniformly from a first display form to a second display form. By implementing this embodiment, when the user clicks a view object on the second page, the view object presents a focus effect, which makes the operation more engaging and effectively improves the user experience.
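One plausible rendering of the focus effect, sketched with Android's view animation API; the scale factor and durations are assumptions.

    import android.view.View;

    // Sketch: briefly scale the clicked view object down, restore it, then
    // run the click response.
    public final class FocusEffect {
        public static void play(View viewObject, Runnable clickResponse) {
            viewObject.animate().scaleX(0.9f).scaleY(0.9f).setDuration(100)
                    .withEndAction(() ->
                            viewObject.animate().scaleX(1f).scaleY(1f).setDuration(100)
                                    .withEndAction(clickResponse));
        }
    }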
In one implementation, the first view object includes one or more controls, the one or more controls include a first control, and the second function includes adjusting the display position and/or size of the first control within the first view object. Detecting the fourth operation acting on the first view object of the second page and executing the second function in response includes: detecting a tenth operation acting on the first view object of the second page; in response to the tenth operation, displaying indication information indicating entry into an editing mode of the first view object, the editing mode being used to adjust the display position and/or size of any control in the first view object; detecting a drag operation acting on the first control in the editing mode; and in response to the drag operation, moving the first control within the display area of the first view object along the track of the drag operation; the fourth operation includes the tenth operation and the drag operation. By implementing this embodiment, the user can adjust the internal layout of a view object on the second page, effectively improving the user experience. A sketch of such drag handling follows.
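The drag behavior of the editing mode might look like the following sketch, which moves a control along the drag track while clamping it to the view object's display area; the helper name is hypothetical.

    import android.annotation.SuppressLint;
    import android.view.MotionEvent;
    import android.view.View;

    // Sketch: in editing mode, let the first control follow the finger but
    // stay inside the first view object's display area.
    public final class EditModeDrag {
        @SuppressLint("ClickableViewAccessibility")
        public static void enable(View control, View viewObjectArea) {
            final float[] grab = new float[2]; // finger offset inside the control
            control.setOnTouchListener((v, e) -> {
                switch (e.getActionMasked()) {
                    case MotionEvent.ACTION_DOWN:
                        grab[0] = e.getRawX() - v.getX();
                        grab[1] = e.getRawY() - v.getY();
                        return true;
                    case MotionEvent.ACTION_MOVE:
                        float x = Math.max(0, Math.min(e.getRawX() - grab[0],
                                viewObjectArea.getWidth() - v.getWidth()));
                        float y = Math.max(0, Math.min(e.getRawY() - grab[1],
                                viewObjectArea.getHeight() - v.getHeight()));
                        v.setX(x); // move along the drag's operation track
                        v.setY(y);
                        return true;
                    default:
                        return false;
                }
            });
        }
    }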
In one implementation, detecting the fourth operation acting on the first view object of the second page and executing the second function in response includes: detecting an eleventh operation acting on the first view object of the second page; in response to the eleventh operation, displaying function options corresponding to a plurality of functions of the first view object, including the function option corresponding to the second function; detecting a twelfth operation acting on the function option corresponding to the second function; and executing the second function in response to the twelfth operation; the fourth operation includes the eleventh operation and the twelfth operation. By implementing this embodiment, a view object on the second page can trigger the electronic device to display multiple function options related to that view object, and the user can select any option to invoke the corresponding function, effectively improving the user experience.
In one implementation, after multiple view objects of the first application are added to the second page, the second page displays an application identifier of the first application. The method further includes: detecting a thirteenth operation acting on the application identifier of the first application; and in response to the thirteenth operation, displaying the multiple view objects of the first application on the second page. By implementing this embodiment, the view objects added to the second page can be managed by category, for example, grouped by the application to which they belong.
In one implementation, before the first page of the first application is displayed, the method further includes: displaying the second page, the second page including a third control; detecting a fourteenth operation acting on the third control; in response to the fourteenth operation, displaying application identifiers of a plurality of applications, the plurality of applications including the first application; and detecting a fifteenth operation acting on the application identifier of the first application. Displaying the first page of the first application includes: in response to the fifteenth operation, displaying the first page of the first application and entering an extraction mode for view objects; in the extraction mode, the second operation is used to add a view object to the second page.
In one implementation, before the first page of the first application is displayed, the method further includes: displaying a fifth page, the fifth page including a switch control for the extraction mode of view objects; detecting a sixteenth operation acting on the switch control; and turning on the extraction mode in response to the sixteenth operation; in the extraction mode, the second operation is used to add a view object to the second page.
In one implementation, in the case that the second operation acting on the first view object of the first page is detected, before the first view object is added to the second page, the method further includes: detecting a seventeenth operation acting on the first page, where the touch position of the seventeenth operation falls within both the first view object and a fourth view object; and in response to the seventeenth operation, displaying an indication frame of the first view object and an indication frame of the fourth view object.
In one implementation, in the case that the second operation acting on the first view object of the first page is detected, before the first view object is added to the second page, the method further includes: detecting an eighteenth operation; and in response to the eighteenth operation, displaying the indication frames corresponding to all view objects of the first page.
In one implementation, in the case that the second operation acting on the first view object of the first page is detected, adding the first view object to the second page includes: detecting a nineteenth operation acting on the first view object of the first page; in response to the nineteenth operation, displaying indication information of the first view object and a second control, the indication information indicating that the first view object is selected; detecting a twentieth operation acting on the second control; and adding the first view object to the second page in response to the twentieth operation.
In one implementation, before the twentieth operation acting on the second control is detected, the method further includes: detecting a twenty-first operation acting on a fifth view object of the first page; and in response to the twenty-first operation, displaying indication information of the fifth view object, indicating that the fifth view object is selected. Adding the first view object to the second page in response to the twentieth operation then includes: in response to the twentieth operation, adding the first view object and the fifth view object to the second page.
In a second aspect, the present application provides an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the interface display method in any of the possible implementations of the first aspect described above.
In a third aspect, embodiments of the present application provide a computer storage medium, including computer instructions that, when executed on an electronic device, cause the electronic device to perform the interface display method in any one of the possible implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product, which when run on a computer causes the computer to perform the interface display method in any one of the possible implementation manners of the first aspect.
Drawings
Fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 1B is a block diagram of a software architecture provided in an embodiment of the present application;
Figs. 2A to 2F are exemplary user interfaces provided by embodiments of the present application;
Figs. 3A to 3B are schematic diagrams of a call management page according to an embodiment of the present application;
Figs. 4A to 4G are schematic diagrams of adding a view object provided in an embodiment of the present application;
Figs. 5A to 5D are schematic diagrams of adding a view object provided in an embodiment of the present application;
Figs. 6A to 6K are schematic diagrams of adding a view object provided in an embodiment of the present application;
Figs. 7A to 7J are schematic diagrams of adding a view object provided in an embodiment of the present application;
Figs. 8A to 8D are schematic diagrams of adding a view object provided in an embodiment of the present application;
Figs. 9A and 9B are schematic diagrams of obtaining a copy of a view object according to embodiments of the present application;
Figs. 10A to 10F are schematic diagrams of a management page provided in an embodiment of the present application;
Figs. 11A to 11L are schematic diagrams illustrating a layout manner of a management page according to an embodiment of the present application;
Figs. 12A and 12B are schematic diagrams illustrating functions of a view object provided in an embodiment of the present application;
Figs. 13A to 13D are schematic diagrams illustrating functions of a view object provided in an embodiment of the present application;
Figs. 14A to 14E are schematic diagrams illustrating functions of a view object provided in an embodiment of the present application;
Figs. 15A to 15D are schematic diagrams illustrating functions of a view object provided in an embodiment of the present application;
Figs. 16A to 16D are schematic diagrams illustrating functions of a view object provided in an embodiment of the present application;
Fig. 17 is a schematic diagram illustrating functions of a view object provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and thoroughly described below with reference to the accompanying drawings. In the description of the embodiments of this application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: A exists alone, both A and B exist, and B exists alone. In addition, in the description of the embodiments of this application, "a plurality of" means two or more.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
Fig. 1A shows a schematic configuration of an electronic device 100.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular telephone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The electronic device may run iOS, Android, Microsoft, or another operating system; the specific type of the electronic device is not particularly limited in the embodiments of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and a peripheral device, or to connect a headset and play audio through it. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG)1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may include a static random-access memory (SRAM), a dynamic random-access memory (dynamic random access memory, DRAM), a synchronous dynamic random-access memory (synchronous dynamic random access memory, SDRAM), a double data rate synchronous dynamic random-access memory (double data rate synchronous dynamic random access memory, DDR SDRAM, such as fifth generation DDR SDRAM is commonly referred to as DDR5 SDRAM), etc.; the nonvolatile memory may include a disk storage device, a flash memory (flash memory).
According to the operation principle, the flash memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc.; according to the level of the memory cell, it may include single-level cells (SLC), multi-level cells (MLC), triple-level cells (TLC), quad-level cells (QLC), etc.; according to the storage specification, it may include universal flash storage (UFS), embedded multimedia memory card (eMMC), etc.
The random access memory may be read directly from and written to by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other on-the-fly programs, may also be used to store data for users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal to the microphone 170C.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor, and can detect opening and closing of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes).
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, bone conduction sensor 180M may acquire a vibration signal and a blood pressure pulsation signal of a human vocal tract vibration bone mass.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 1B is a software block diagram of an electronic device 100 according to an embodiment of the invention.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 1B, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 1B, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
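As a small illustration of this, the notification-icon example could be assembled programmatically roughly as follows (a sketch; the resource and text are placeholders):

    import android.content.Context;
    import android.widget.ImageView;
    import android.widget.LinearLayout;
    import android.widget.TextView;

    // Sketch: a display interface composed of a view displaying a picture
    // and a view displaying text.
    public final class MessageRow {
        public static LinearLayout build(Context ctx) {
            LinearLayout row = new LinearLayout(ctx); // horizontal by default
            ImageView picture = new ImageView(ctx);
            picture.setImageResource(android.R.drawable.ic_dialog_email);
            TextView text = new TextView(ctx);
            text.setText("1 new message");
            row.addView(picture);
            row.addView(text);
            return row;
        }
    }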
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
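For instance, the "download is complete" case above could be posted through the notification manager roughly as follows (a sketch for API 26+; the channel id and texts are assumptions):

    import android.app.Notification;
    import android.app.NotificationChannel;
    import android.app.NotificationManager;
    import android.content.Context;

    // Sketch: post a status-bar notification announcing a finished download.
    public final class DownloadNotifier {
        public static void notifyDone(Context ctx) {
            NotificationManager nm = ctx.getSystemService(NotificationManager.class);
            nm.createNotificationChannel(new NotificationChannel(
                    "downloads", "Downloads", NotificationManager.IMPORTANCE_DEFAULT));
            Notification n = new Notification.Builder(ctx, "downloads")
                    .setSmallIcon(android.R.drawable.stat_sys_download_done)
                    .setContentTitle("Download complete")
                    .build();
            nm.notify(1, n);
        }
    }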
Android Runtime includes a core library and virtual machines, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), and 2D graphics engines (e.g., SGL).
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as touch coordinates and the time stamp of the touch operation). The original input event is stored at the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a touch click operation and the control corresponding to the click operation being the control of the camera application icon as an example: the camera application calls an interface of the application framework layer to start the camera application, the camera application in turn starts the camera driver by calling the kernel layer, and a still image or video is captured through the camera 193.
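The application-layer end of this workflow can be sketched as follows (a hedged illustration in Java assuming the standard Android SDK; the CameraIconHandler helper and its attachment point are illustrative assumptions): the framework delivers the processed input event to the camera icon's click callback, which starts the capture flow.

```java
import android.content.Intent;
import android.provider.MediaStore;
import android.view.View;

public final class CameraIconHandler {
    // The framework layer maps the raw input event to a click on the camera
    // icon control; this callback then starts the capture flow, which in
    // turn reaches the camera driver through the kernel layer.
    public static void attach(View cameraIcon) {
        cameraIcon.setOnClickListener(v ->
                v.getContext().startActivity(
                        new Intent(MediaStore.ACTION_IMAGE_CAPTURE)));
    }
}
```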
The following describes concepts of views, view groups, view controls, and view objects related to embodiments of the present invention.
View (View): the basic element of the user interface displayed by the electronic device; a view can define its layout (i.e., size, position, and appearance) and interaction behavior. The interaction behavior of a view may include: responding to touch, gesture, and other events; drawing custom content; supporting drag interaction; responding to focus changes; and animating the view's size, position, and appearance attributes. For example, a view may be a control, a window, or another element visible to the user.
View control (View control): or simply referred to as a control. Controls inherit from the View class (e.g., UIView) and are all subclasses of the View class. A control may be an encapsulation of data and methods; it may have its own properties and methods, the properties being simple accessors to the control's data, and the methods being some simple visible functionality of the control. For example, the types of controls may include, but are not limited to: user interface controls (controls for developing and constructing a user interface, such as controls for interface elements like windows, text boxes, buttons, and drop-down menus), chart controls (controls for developing charts, which can realize data visualization and the like), report controls (controls for developing reports, which realize functions such as browsing, designing, editing, and printing reports), and form controls (e.g., CELL, controls for developing forms, which realize data processing and operations in grids). The types of controls in the embodiment of the application may further include: composite controls (combining various existing controls into a new control that concentrates the capabilities of multiple controls), extension controls (deriving a new control from an existing control, adding new capabilities to it or modifying its existing ones), custom controls, and the like.
View Group (View Group): including one or more views and possibly one or more other view groups.
It will be appreciated that all visual interface elements inherit from the View class, and that users build user interfaces through View, ViewGroup, or classes extended from them (i.e., View and ViewGroup). The view objects to which embodiments of the present application relate may include one or more of View, ViewGroup, and classes extended from them, and one view object is typically used to implement a specified function.
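To make the View/ViewGroup relationship concrete, here is a minimal Java sketch (assuming the standard Android SDK; the MessageCardFactory class and the card's contents are illustrative, not part of the embodiment) of a view object built from a ViewGroup that combines a picture view and a text view to implement one specified function:

```java
import android.content.Context;
import android.widget.ImageView;
import android.widget.LinearLayout;
import android.widget.TextView;

public final class MessageCardFactory {
    // A minimal view object: a ViewGroup combining a picture view and a
    // text view, each of which inherits from View.
    public static LinearLayout createMessageCard(Context context) {
        LinearLayout card = new LinearLayout(context);   // a ViewGroup
        card.setOrientation(LinearLayout.HORIZONTAL);

        ImageView icon = new ImageView(context);         // a View showing a picture
        TextView text = new TextView(context);           // a View showing text
        text.setText("New message");

        card.addView(icon);
        card.addView(text);
        return card;
    }
}
```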
User interface: the term "user interface" (UI) in the description, claims, and drawings of the present application may also be referred to as a "page"; it is a media interface for interaction and information exchange between an application program or an operating system and a user, and it enables conversion between an internal form of information and a form acceptable to the user. The user interface of an application program is source code written in a specific computer language such as Java or extensible markup language (extensible markup language, XML); the interface source code is parsed and rendered on the electronic device, and is finally presented as content that the user can recognize, such as pictures, text, and buttons. The properties and content of the controls in the interface are defined by tags or nodes; for example, XML specifies the controls contained in the interface by nodes such as <TextView>, <ImgView>, and <VideoView>. One node corresponds to one control or attribute in the interface, and the node is rendered into user-visible content after being parsed and rendered. In addition, the interfaces of many applications, such as hybrid applications (hybrid application), typically include web pages. A web page may be understood as a special control embedded in an application interface; a web page is source code written in a specific computer language, such as hypertext markup language (hyper text markup language, HTML), cascading style sheets (cascading style sheets, CSS), or JavaScript (JS), and the web page source code may be loaded and displayed as user-recognizable content by a browser or a web page display component with functionality similar to a browser. The specific content contained in a web page is also defined by tags or nodes in the web page source code; for example, HTML defines the elements and attributes of a web page by <p>, <img>, <video>, and <canvas>.
Some exemplary graphical user interfaces implemented on the electronic device 100 provided by embodiments of the present application are described below.
Fig. 2A illustrates a first screen 11 for presenting a main interface of an application installed by the electronic device 100. The first screen 11 of the main interface may include: status bar 201, calendar indicator, weather indicator, tray with commonly used application icons, and display area 202 of other application icons. Wherein:
the display area 202 of other application icons may show: an icon 202A of a video application, an icon of an instant messaging application, an icon of an album, an icon of a gallery, an icon of music, an icon of a memo, and a settings icon.
In some embodiments, the first screen 11 may also include a navigation bar. The navigation bar may include: a return key, a home screen key, a multitasking key, and the like. When detecting that the user clicks the return key, the electronic device 100 may display the page one level above the current page. When detecting that the user clicks the home screen key, the electronic device 100 may display the home interface. When detecting that the user clicks the multitasking key, the electronic device 100 may display, on the history task interface, the application identification of each application that the user recently used and that is still running in the background.
In the embodiment of the present application, the first screen 11 may not include a navigation bar, and each navigation key in the navigation bar may be implemented as a physical key or a gesture of a user. In some embodiments, the electronic device 100 implements the functionality of each navigation key in the navigation bar through user gestures. For example, the return gesture corresponding to the return key includes a gesture that slides up from the left side of the lower edge of the display screen 194 of the electronic device 100.
The first screen 11 may also include a page indicator 203. Other application icons in the main interface may be distributed across multiple pages, and page indicator 203 may be used to indicate which page the user is currently viewing the application in. The user may slide the area of the other application icons left and right to view the application icons in the other pages.
By way of example, as shown in fig. 2A and 2B, a user may view a second screen 12 of the main interface by sliding the first screen 11 of the main interface to the left, the second screen 12 presenting icons of more other applications.
Illustratively, as shown in FIGS. 2C and 2D, the user may also slide the first screen 11 of the main interface to the right to view the negative one screen 13 of the electronic device 100. The negative one screen 13 is typically an aggregated portal for functions such as user center, search, application recommendation, news information recommendation, and contextual intelligent services. In this embodiment of the present application, a negative two screen may be added, and the user may further slide the negative one screen 13 rightward to view the negative two screen of the electronic device 100.
In some embodiments, the user may preset the operation call-out notification screen and the control screen.
Illustratively, when the electronic device 100 displays the main interface, the user may view the control screen 14 shown in fig. 2E by sliding down starting from the left half of the status bar. When the electronic device 100 displays the main interface, the user slides down with the right half of the status bar as the starting point, and can view the notification screen 15 shown in fig. 2F.
The control screen 14 may include a display area 204 of switch icons for shortcut functions. The display area 204 may show switch icons for controlling various shortcut functions of the electronic device 100, such as switch icons for turning on/off the flight mode, Wi-Fi, mobile data, Bluetooth, screenshots, etc. Optionally, the control screen 14 may further include one or more control cards of smart home devices, such as control cards of smart home devices including air purifiers, smart lights, smart speakers, and the like. The notification screen 15 may be used to display notification messages to notify the user of the relevant activity of the electronic device. For example, the notification messages may include: push messages of third-party applications, reminder messages of incoming telephone calls, calendar reminder messages, etc.
In different implementations, the user may call out the control screen or the notification screen in other call-out manners, which is not specifically limited in the embodiments of the present application. It should be noted that the control screen or the notification screen is not limited to being called out when the main interface is displayed, and may be called out when other interfaces (for example, a negative screen, a user interface of other applications, etc.) are displayed.
The application provides an interface display method in which view objects corresponding to common functions, extracted by the user from pages of various applications (for example, view object 1 extracted from page 1 of application 2), are uniformly added to a common object management page for use and management. In page 1, view object 1 continues to retain its original functions; in the multi-application common object management page, view object 1 not only has its original functions but may also gain multiple new functions. Thus, in the common object management page, the user can quickly find and use the view objects corresponding to the common functions of each application, achieving sufficient user customization, simplifying usage, and effectively improving the user experience. In this embodiment of the present application, for convenience of description, the common object management page of multiple applications may be simply referred to as the management page, the application 2 may be referred to as the native application of the view object 1 in the management page, and the page 1 may be referred to as the native page of the view object 1 in the management page.
1. Exemplary descriptions of user interfaces and related implementations of add view objects provided by embodiments of the present application are presented below.
Illustratively, the management page includes a page of the independent application 1 for implementing view object management. As shown in fig. 3A, the first screen 11 of the main interface may further include an icon 202D of application 1; upon detecting that the user clicks the icon 202D, the electronic device 100 displays the management page 21 of the view object shown in fig. 3B.
It should be noted that, in the embodiment of the present application, the common object management page is not limited to the management page 21, and may be implemented as other pages, and an exemplary description is given below of how to add a view object by taking the management page 21 as an example.
In some embodiments, when the user has not manually added view objects to the management page 21, the management page 21 may not include view objects of other applications. In other embodiments, when the user has not manually added view objects to the management page 21, the electronic device 100 may also obtain the user's usage rate of each view object in each application, and automatically add a preset number of view objects with high usage rates to the management page 21 according to the usage rates.
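A minimal sketch of this automatic-addition policy in plain Java follows (the UsageStat type and its fields are assumptions; the patent does not specify how usage rates are represented or collected):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public final class AutoAddPolicy {
    // Hypothetical pairing of a view object with its usage count.
    public static class UsageStat {
        final String viewObjectId;
        final long useCount;
        UsageStat(String viewObjectId, long useCount) {
            this.viewObjectId = viewObjectId;
            this.useCount = useCount;
        }
    }

    // Return the identifiers of the presetCount most-used view objects,
    // to be added to the management page when the user has added none.
    public static List<String> pickMostUsed(List<UsageStat> stats, int presetCount) {
        List<UsageStat> sorted = new ArrayList<>(stats);
        sorted.sort(Comparator.comparingLong((UsageStat s) -> s.useCount).reversed());
        List<String> result = new ArrayList<>();
        for (int i = 0; i < Math.min(presetCount, sorted.size()); i++) {
            result.add(sorted.get(i).viewObjectId);
        }
        return result;
    }
}
```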
The following description proceeds with reference to view object 301 in a video application.
Illustratively, as shown in fig. 4A and 4B, after a user operates (e.g., clicks on) an icon 202A of a video application in the first screen 11, the electronic device 100 displays a page 31 of the video application. The page 31 may include a plurality of extractable view objects, for example, view object 301 and view object 306. View object 301 may be used to play video 1 and view object 301 may include a number of controls such as a cover, a name (i.e., "sun exposure" as shown in fig. 4B), a video type (i.e., "television show" as shown in fig. 4B), an update schedule (i.e., "update to 8 sets" as shown in fig. 4B), and so on for video 1.
It should be noted that the view object 301 of the video application referred to in subsequent embodiments is merely exemplary and should not limit the embodiments of the present application. A view object of the video application that the user commonly uses may be buried deep in the application, and the user may need to find it through multiple operations such as switching the menu 302 and/or the menu 303 in the page 31, sliding operations, page-turning operations, and the like.
In embodiments of the present application, view objects may include one or more of View, ViewGroup, and classes that extend from them (i.e., View and ViewGroup). A page may include one or more extractable view objects, two view objects may overlap, and one view object may include another view object. The division of the extractable view objects may be determined by the electronic device 100 in advance according to the display content in each page of each application, or may be determined according to the display content of the currently displayed page after entering the view object extraction mode; the division manner of the view object in the embodiment of the present application is not particularly limited.
In this embodiment, a view object extracted from a page by a user includes one or more interface elements of the page. In some embodiments, the electronic device 100 decomposes a page into a plurality of extractable view objects based on the tree structure of the interface layout of the page and/or the functionality implemented by the views in the page. In one implementation, the electronic device 100 determines the view groups in the page based on the tree structure and treats each view group as an extractable view object. In some embodiments, the electronic device 100 determines the functionality implemented by each view in the page, dividing the views whose functionality relates to a particular target into one view object. Illustratively, as shown in FIG. 4B, view object 301 and view object 306 are both view groups (ViewGroup), and the view group corresponding to view object 306 includes the view group corresponding to view object 301. The view object 306 includes view objects corresponding to a plurality of trending videos, these view objects include the view object 301, the view object 301 is used for playing the television play "sun exposure", and the functions implemented by the views in the view object 301 are all related to that television play. In some embodiments, the electronic device 100 may treat any interface element (e.g., control, icon, etc.) in one page as one view object, or treat multiple adjacent interface elements (e.g., controls, icons, etc.) in one page as one view object. The division manner of the view object in the embodiment of the present application is not specifically limited.
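A hedged Java sketch of the tree-structure-based decomposition (assuming the standard Android SDK; the ExtractableScanner class is illustrative) treats every view group in the page as one extractable view object, so one candidate may contain another, as in the view object 306 containing the view object 301:

```java
import android.view.View;
import android.view.ViewGroup;
import java.util.ArrayList;
import java.util.List;

public final class ExtractableScanner {
    // Walk the page's view tree and treat each ViewGroup as one
    // extractable view object; nested groups yield nested candidates.
    public static List<View> collectExtractable(View root) {
        List<View> result = new ArrayList<>();
        if (root instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) root;
            result.add(group); // a view group is one candidate view object
            for (int i = 0; i < group.getChildCount(); i++) {
                result.addAll(collectExtractable(group.getChildAt(i)));
            }
        }
        return result;
    }
}
```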
(1) Fig. 4B to 4D show one example of adding view objects to the management page 21.
In some embodiments, when the electronic device 100 displays the page 31 of the video application, the user may select the extraction object as the view object 301 through the input operation 1 acting on the view object 301, and trigger the electronic device 100 to enter the extraction mode of the view object. The user, again through input operation 2, may trigger the electronic device 100 to add the user-selected extraction object to the management page 21. The view object 301 in the management page 21 may be regarded as a copy of the view object 301 in the page 31.
For example, as shown in fig. 4B, the input operation 1 may be an operation of pressing the view object 301 with two fingers, and in response to the input operation 1, the electronic device 100 determines that the extraction object is the view object 301, displays an indication frame corresponding to the view object 301 shown in fig. 4B, where the indication frame may be used to indicate a display range and display content of the extractable view object 301; as shown in fig. 4C, the input operation 2 may be an operation of clicking on the above-described instruction box, and in response to the input operation 2, the electronic device 100 determines to add the view object 301 to the management page 21 and displays the view object 301 in the management page 21.
The above-described input operation 1 is not limited to the two-finger long press on the view object 301; it may be another finger touch operation acting on the view object 301 (e.g., a two-finger tap operation, a lateral sliding operation, etc.), or an input operation of another device (e.g., a stylus, a mouse, etc.) connected to the electronic device 100. The input operation 1 is not specifically limited here, as long as the input operation 1 predefined by the electronic device 100 does not conflict with the original business logic of each view object. Similarly, the input operation 2 is not limited to clicking the indication box; it may be another finger touch operation or an input operation of another device.
In some embodiments, when the view object 301 is added to the management page 21, the page identifier of the original page (i.e., the page 31) of the view object and/or the application identifier of the original application (i.e., the video application) may also be displayed on the management page 21, so that the user may intuitively determine the original application and the original page of the view object.
In the embodiment of the present application, in response to the input operation 1 of the user, the electronic device 100 may enter the view object extraction mode. In one implementation, upon entering the view object extraction mode, the original business logic of the view object 301 may be temporarily disabled; for example, after the user clicks the video cover in the view object 301, the electronic device 100 does not execute the original business logic, that is, does not display the video playing interface corresponding to the view object 301. Thus, in one example, the above-described input operation 2 may also be an operation of clicking on the view object 301. In another implementation, after entering the extraction mode, the view object 301 maintains its original business logic, and the input operation 2 predefined by the electronic device 100 does not conflict with the original business logic of the view object 301.
The embodiment of the present application is not limited to indicating the display range of the view object 301 by the indication frame; the view object 301 may also be indicated by other means, for example, by highlighting the view object 301 as a whole, changing the tone of the view object 301 as a whole, or highlighting the edge of the view object 301, which is not particularly limited herein.
In some embodiments, referring to fig. 4B and 4D, without input operation 2, in response to input operation 1, electronic device 100 may also directly add view object 301 to management page 21.
In some embodiments, referring to fig. 4B and 4E, in response to user input operation 1, when the electronic device 100 determines that the extraction object selected by the user is the view object 301, a prompt box 304 may also be displayed in the page 31, where the prompt box may be used to prompt the user whether to determine to add the selected extraction object to the management page, and may also be used to prompt the electronic device 100 to enter the view object extraction mode. As shown in FIG. 4E, the prompt box 304 may include a cancel control 304A and a confirm control 304B. The confirmation control 304B is used to determine to add the view object selected by the user, and the input operation 2 may be an input operation (such as a clicking operation) applied to the confirmation control 304B shown in fig. 4E; the cancel control 304A is used to trigger the electronic device 100 to exit the view object extraction mode.
In some embodiments, in response to user input operation 1, the user may also switch the selected extraction object from view object 301 to another view object after the electronic device 100 enters the view object extraction mode. Illustratively, taking FIG. 4E as an example, page 31 also includes a view object 305; as shown in fig. 4E and 4F, after detecting an input operation (e.g., a touch operation) acting on the view object 305, the electronic device 100 switches the extraction object to the view object 305, displays a pointing frame corresponding to the view object 305, and stops displaying a pointing frame corresponding to the view object 301. As shown in fig. 4F and 4G, upon detecting an input operation to the confirmation control 304B, the electronic device 100 displays the view object 305 in the management page 21.
In some embodiments, in response to user input operation 2, after adding view object 301 to management page 21, electronic device 100 may continue to display page 31 of the video application. Subsequent users may reenter the management page 21 to view the view object 301 when they intend to reuse the view object 301. For example, the user may enter the management page 21 through the icon 202D shown in fig. 3A.
In some embodiments, the input operation 1 may be divided into two operations, that is, the user triggers the electronic device 100 to enter the view object extraction mode through one operation, and then selects the extraction object as the view object 301 through one operation, and displays an indication frame corresponding to the view object 301.
It will be appreciated that in the examples shown in fig. 4A to 4G, the electronic device 100 displays only the indication frame corresponding to one extraction object selected by the user, and the user can switch the extraction objects. However, in some embodiments, there may be the following: two extractable view objects in a page overlap, and one view object may also include another view object. Illustratively, as shown in FIG. 4B, view object 306 includes view object 301.
In the examples of fig. 4A to 4G, when the touch position on the display screen of a touch operation to select a view object (e.g., input operation 1, input operation 2, or the input operation to switch the extraction object) falls within a plurality of extractable view objects (e.g., view object 1 and view object 2), the electronic device 100 may determine, according to the touch position, the view object that the user intends to select from the plurality of extractable view objects. In one implementation, the electronic device 100 obtains the touch position of the user's touch operation and determines, of the two view objects (i.e., view object 1 and view object 2), the one whose center position is closer to the touch position as the view object the user intends to select.
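The center-distance disambiguation in this last implementation can be sketched as follows (Java, assuming the standard Android SDK; the HitDisambiguator helper is illustrative):

```java
import android.view.View;
import java.util.List;

public final class HitDisambiguator {
    // Among the extractable view objects containing the touch position,
    // return the one whose center is closest to the touch point.
    public static View pickIntended(List<View> candidates, int touchX, int touchY) {
        View best = null;
        long bestDist = Long.MAX_VALUE;
        int[] origin = new int[2];
        for (View v : candidates) {
            v.getLocationOnScreen(origin); // top-left corner on screen
            long dx = origin[0] + v.getWidth() / 2 - touchX;
            long dy = origin[1] + v.getHeight() / 2 - touchY;
            long dist = dx * dx + dy * dy; // squared distance suffices for comparison
            if (dist < bestDist) {
                bestDist = dist;
                best = v;
            }
        }
        return best;
    }
}
```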
(2) Fig. 5A to 5C show another example of adding view objects to the management page 21.
In some embodiments, when the electronic device 100 displays the page 31 of the video application, the user performs the input operation 1 on the view object 301, and when the touch position of the input operation 1 on the display screen is included in a plurality of extractable view objects, the electronic device 100 may display the indication frames respectively corresponding to the plurality of view objects. Then, the user selects an extraction object from the plurality of extractable view objects and adds the extraction object to the management page 21.
Illustratively, as shown in FIG. 5A, page 31 includes a view object 301 and a view object 306, with view object 306 including view object 301; the input operation 1 of the user is detected, and the touch position of the input operation 1 on the display screen is located within the view object 301 and the view object 306. As shown in fig. 5B, in response to the input operation 1, the electronic device 100 enters the view object extraction mode, and displays a pointing frame corresponding to the view object 301 and a pointing frame corresponding to the view object 306. As shown in fig. 5B and 5D, an input operation 3 of selecting the view object 306 by the user (e.g., clicking on the indication box corresponding to the view object 306) is detected, in response to which the electronic device 100 determines to add the view object 306 to the management page 21, and displays the view object 306 in the management page 21.
The input operation 3 of selecting the view object 306 may be other finger touch operations or other input operations of other devices, which are not limited to clicking the indication frame corresponding to the view object 306.
In some embodiments, after the user selects the view object 306 as the extraction object, the electronic device 100 may indicate that the selected one of the two view objects is the view object 306 through a change in the display form of the view object 306 and/or the indication frame corresponding to the view object 306, and display the prompt box 307 shown in fig. 5C, for which reference may be made to the related description of the prompt box 304. After the user operates (e.g., clicks) the confirmation control in the prompt box 307, the electronic device 100 adds the view object 306 to the management page 21 and displays the management page 21 shown in fig. 5D. The above-described change in display form includes a change in one or more of hue, transparency, and brightness; for example, referring to fig. 5C, the electronic device 100 indicates that the user has selected the view object 306 through a color change of the indication frame.
For the examples of fig. 4A to 4G and the examples of fig. 5A to 5D, the input operation 1 triggering the view object extraction mode acts on the page 31 of the video application. In one implementation, the view object extraction mode is only for the page 31 of the video application, and the user may exit the view object extraction mode by exiting the page 31 (e.g., switching the page 31 to be the main interface, other pages of the video application, or pages of other applications), or may exit the view object extraction mode by a cancel control in the prompt box (e.g., cancel control 304A). In another implementation, the view object extraction mode may be specific to all pages of the video application, that is, the user may switch the page 31 to be another page of the video application, and switch the extracted object to be a view object in the other page; the user may exit the view object extraction mode by exiting the video application (e.g., switching the video application to the background) or a cancel control in the prompt box.
(3) Fig. 6A to 6C show another example of adding view objects to the management page 21.
In some embodiments, the user triggers the electronic device 100 to enter a view object extraction mode through input operation 4; after entering the view object extraction mode, the electronic device 100 may display an indication frame of all extractable view objects in the page 31; the user may select one or more view objects in the page 31 by entering operation 5, adding to the management page 21.
The input operation 4 may be a touch operation, a voice command, a hover gesture, or an input operation of other devices, and is not specifically limited herein. In one implementation, electronic device 100 may be provided with a switch for view object extraction mode in one or more of the notification screen, the control screen, the negative one screen, and the system settings. The user can turn on/off the view object extraction mode through the switch, and the above-described input operation 4 may be a touch operation acting on the switch.
By way of example, the following description will be given taking "a switch for setting a view object extraction mode on a control screen" as an example.
As shown in fig. 6A and 6B, when the electronic device 100 displays the page 31 of the video application, an input operation of the user calling up the control screen 14 is detected, and the electronic device 100 displays the control screen 14; the control screen 14 also includes a switch icon 401 for the view object extraction mode. The switch icon 401 has two display states, namely an on state and an off state; the switch icon 401 shown in fig. 6B is currently in an off state, indicating that the current view object extraction mode has been turned off. As shown in fig. 6B and 6C, after the user clicks the switch icon 401, the electronic device 100 switches the switch icon 401 to an on state, and turns on the view object extraction mode. As shown in fig. 6D, after exiting the control screen 14, the electronic device 100 continues to display the page 31, and displays the indication frames corresponding to all the extractable view objects in the page 31, for example, the indication frames corresponding to the view object 301, the view object 305, and the view object 306, respectively.
As shown in fig. 6D, 6E, and 6F, the user sequentially selects the view object 301 and the view object 305 through the input operation 5, and the electronic device 100 indicates that a view object is selected through a change in the display form of the view object or its corresponding indication frame, for example, by covering the selected view object with a translucent mask. Upon the user selecting a view object, the electronic device 100 displays a prompt box 402 on the page 31, for which reference may be made to the description of the prompt box 304. As shown in fig. 6F, after the user selects the view object 301 and the view object 305, the electronic device 100 detects that the user clicks the confirmation control in the prompt box 402, adds the view object 301 and the view object 305 to the management page 21, and displays them in the management page 21. For the input operation 5 of selecting a view object, reference may be made to the foregoing input operation 3, which is not repeated here.
In some embodiments, as shown in fig. 6H, after the user triggers the electronic device 100 to enter the view object extraction mode through the input operation 4, the electronic device 100 may display a selection control on each view object (e.g., a selection control 403 corresponding to the view object 301 and a selection control 404 corresponding to the view object 305), and the input operation 5 may be clicking a selection control. As shown in fig. 6H, 6I, and 6J, the user sequentially selects the view object 301 and the view object 305 by clicking the selection control 403 and the selection control 404; before the user clicks a selection control, the selection control presents an unselected state, and after the user clicks it, the selection control presents a selected state. Then, similarly, as shown in fig. 6J and 6K, the view object 301 and the view object 305 can be added to the management page 21 by clicking the confirmation control in the prompt box 402.
For the examples of fig. 6A to 6H, in one implementation, when the electronic device 100 displays the page 31 of the video application, the electronic device 100 is triggered by the input operation 4 to enter the view object extraction mode, where the view object extraction mode is only for the page 31 (or the video application) of the video application, and the user may exit the view object extraction mode by exiting the page 31 (or the video application), or may exit the view object extraction mode by canceling the control, the switch icon 401, the preset voice command, or the preset gesture. In another implementation manner, when the electronic device 100 displays any page, the electronic device 100 is triggered to enter a view object extraction mode through the input operation 4, and the view object extraction mode can be specific to any application supporting the mode, namely, when the page of any application supporting the mode is displayed, a user can select an extraction object in the page and add the extraction object to the management page 21; the user may exit the view object extraction mode by cancelling the control, switch icon 401, preset voice command, or preset gesture.
(4) Fig. 7A to 7D show another example of adding view objects to the management page 21.
In some embodiments, as shown in FIG. 7A, the management page 21 is provided with an add control 501 for view objects, and a user can add view objects of a specified application to the management page 21 through the add control 501.
As shown in fig. 7A and 7B, upon detecting an input operation (e.g., a click operation) on the add control 501, the electronic device 100 can display an application identification 502 for each application that supports the view object extraction mode. Taking the application identifier 502A of the video application as an example, after the user clicks the application identifier 502A, the electronic device 100 may enter a view object extraction mode for the video application, and display the page 31 of the video application and the prompt box shown in fig. 7C or fig. 7D. For the prompt box, reference may be made to the prompt box 304 described above.
In one implementation, upon detecting that the user clicks on the application identifier 502A, the electronic device 100 displays the page 31 shown in fig. 7C, and the user may select one view object in the page 31 to add to the management page 21 through one or more of the aforementioned input operation 1, input operation 2, and input operation 3. In particular, reference may be made to the implementations provided in fig. 4A to 4G, or to the implementations provided in fig. 5A to 5D. And will not be described in detail herein.
In one implementation, when detecting that the user clicks the application identifier 502A, the electronic device 100 displays the page 31 shown in fig. 7D and the indication frames corresponding to all extractable view objects in the page 31, and the user may select one or more view objects in the page 31 to add to the management page 21 through the foregoing input operation 5. In particular, reference may be made to the relevant descriptions of fig. 6A to 6J, which are not repeated here.
(5) Fig. 7E to 7F show another example of adding view objects to the management page 21.
In some embodiments, after entering the extraction mode, the electronic device 100 may receive an input operation of a user defining a specified area on the page 31, combine interface elements in the specified area into one view object, and add the view object to the management page.
For example, as shown in fig. 7E, after the electronic device 100 enters the extraction mode, the user's finger slides on the page 31; the electronic device 100 determines a designated area based on the start point and the end point of the slide of the user's finger, and regards the interface elements in the designated area as the view object 308. The sliding start point is the upper left corner of the designated area, and the sliding end point is the lower right corner of the designated area. As shown in fig. 7F, the electronic device 100 displays an indication box of the view object 308; upon detecting an input operation by the user to add the view object 308 to the management page (e.g., clicking a confirmation control in the prompt box), the view object 308 is added to the management page 21.
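The gesture-to-region mapping can be sketched as follows (Java, assuming the standard Android SDK; the RegionSelector class is illustrative, and the sketch assumes a top-left to bottom-right slide as described above):

```java
import android.graphics.Rect;
import android.view.MotionEvent;

public final class RegionSelector {
    private int startX, startY;
    private Rect selected;

    // The slide's start point becomes the top-left corner and its end
    // point the bottom-right corner of the designated area.
    public boolean onTouch(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                startX = (int) event.getRawX();
                startY = (int) event.getRawY();
                return true;
            case MotionEvent.ACTION_UP:
                selected = new Rect(startX, startY,
                        (int) event.getRawX(), (int) event.getRawY());
                return true;
        }
        return false;
    }

    public Rect getSelectedRegion() {
        return selected;
    }
}
```

Interface elements whose bounds fall within the returned rectangle (for example, tested with Rect.contains) would then be combined into the view object 308.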
(6) Fig. 7G to 7H show another example of adding view objects to the management page 21.
In some embodiments, one or more interface elements (e.g., controls, icons, etc.) in a page may be combined into one view object and added to the management page; the one or more interface elements may or may not be contiguous.
Illustratively, as shown in FIG. 7G, after the user clicks a chat entry of a particular contact in the page 34 of the instant messaging application, the electronic device 100 displays the chat page 38 of that contact. As shown in fig. 7G and 7H, after the user clicks the control 511 in the chat page 38, the electronic device 100 displays a chat toolbar 512, and the chat toolbar 512 includes an icon 513 for video calls. Upon detecting an input operation of the user to add the icon 513 of the video call to the management page, the electronic device 100 adds the view object 309 to the management page 21. The view object 309 may include not only the icon 513 for the video call, but also the avatar and/or nickname of the particular contact. It will be appreciated that composing the icon 513 for the video call together with the avatar and/or nickname of the particular contact into the view object 309 and adding it to the management page helps the user determine with which contact the video call icon is used to make a video call.
Fig. 7I and 7J illustrate the management page 21 after the electronic device 100 has added, in addition to the view object 301 of the video application, the view object 505 of the instant messaging application, the view object 504 of the music application, and the view object 310 of the office application.
The home page 32 of the music application shown in fig. 7I includes a plurality of menu options, such as a user center option 503, and after the user clicks on the user center option 503, the electronic device 100 displays the user center page 33 of the music application; the user center page 33 includes a view object 504, the view object 504 being used for adding and playing songs liked by the user, the view object 504 displaying the number of songs added and the number of songs downloaded in the added songs; the user can add the view object 504 to the management page 21 through a preset operation.
The instant messaging application page 34 shown in fig. 7I includes chat entries of one or more contacts, and the page 34 includes a view object 505. The view object 505 is one of the chat entries of the plurality of contacts, and may include an avatar 505A of the contact, a nickname 505B of the contact, recent chat information 505C, and a receiving/sending time 505D of the recent chat information. The user can also add the view object 505 to the management page 21 through a preset operation.
The page 39 of the office application shown in fig. 7J includes a menu bar with a message option, an address book option, a workbench option 514, and a local option. After the user clicks the workbench option 514, the electronic device 100 displays the page 40, which includes icons of various office items, such as the icon 515 of an attendance punch item. After the user clicks the icon 515 of the punch item, the electronic device 100 displays the punch page 41, which can be used for punching in at work and punching out after work. The punch page 41 includes a view object 310, and the view object 310 includes a punch control 310A and indication information 310B. The punch control 310A displays the current time; when the current time is in the punch-in time period, the punch control 310A also displays "punch in at work", and when the current time is in the punch-out time period, the punch control 310A also displays "punch out after work". The indication information 310B is used to indicate the current location of the user and whether that location is within the attendance range. The user can add the view object 310 to the management page 21 through a preset operation.
Specific implementations of adding view objects 309, 504, 505, 310 to the management page 21 may refer to the relevant descriptions of fig. 4A-7F, which are not repeated here.
(7) Fig. 8A to 8D show another example of adding a cross-device view object to the management page 21.
Fig. 4A to 7J show view objects in an installed application of the electronic device 100 added to a management page of the electronic device 100. In some embodiments, the electronic device 100 is cooperatively connected with one or more other devices (e.g., a cell phone, tablet, computer, in-vehicle device, etc.). The management page in the electronic device 100 may add view objects extracted from installed applications of other devices (e.g., the electronic device 200) that have been coordinated in addition to view objects extracted from installed applications of the present device. In this way, the user may use the view object of the electronic device 200 through the management page of the electronic device 100, i.e. the view object of the electronic device 200 that has been coordinated may be streamed to the electronic device 100 for opening.
In some embodiments, the coordinated electronic devices (e.g., the electronic device 100 and the electronic device 200) are provided with the same management page (e.g., the management page 21), and each device can separately add view objects extracted from its own applications to the management page, with the management pages on the multiple electronic devices remaining synchronized. When the management page of one device is updated (e.g., a view object is added or deleted, or the view object layout is adjusted), the device may send the update-related information to the other coordinated devices to ensure that their management pages remain synchronized.
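One way to realize this synchronization is a small update message sent to every coordinated peer. The following Java sketch uses the org.json classes available on Android; the wire format, field names, and action values are assumptions, since the embodiment does not fix a protocol:

```java
import org.json.JSONException;
import org.json.JSONObject;

public final class ManagementPageSync {
    // Hypothetical wire format: when one device adds, deletes, or
    // re-lays-out a view object, it sends this message to each peer.
    public static JSONObject buildUpdateMessage(String action, String viewObjectId,
                                                String nativeApp, String nativePage)
            throws JSONException {
        JSONObject msg = new JSONObject();
        msg.put("action", action);            // "add" | "delete" | "relayout"
        msg.put("viewObjectId", viewObjectId);
        msg.put("nativeApplication", nativeApp);
        msg.put("nativePage", nativePage);
        return msg;
    }
}
```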
In this embodiment, the electronic device 100 and other devices may establish a cooperative connection through one or more modes of logging in the same account number, connecting to the same lan, and the like, which is not limited herein.
In some embodiments, through a view object of electronic device 200 displayed by a management page of electronic device 100, electronic device 100 may invoke applications and hardware (e.g., a camera, a microphone, a positioning module, etc.) in electronic device 200 that are associated with the view object.
As shown in fig. 8A, the electronic device 100 is an in-vehicle device, and the electronic device 200 is a mobile phone; the electronic device 200 displays the view object 301 of the video application, and after detecting an input operation of adding the view object 301 to the management page by the user, the electronic device 200 transmits related data of the view object 301 to the electronic device 100; the electronic device 100 adds the view object 301 to the management page 21 of the electronic device 100 based on the related data of the view object 301 transmitted by the electronic device 200. After the management page of the electronic device 100 adds the view object 301 of the video application of the electronic device 200, the electronic device 100 may call the video application of the electronic device 200 through the view object 301. As shown in fig. 8B, after the user clicks the view object 301 displayed on the management page of the electronic device 100, the electronic device 100 displays a video play page corresponding to the view object 301 in the video application of the electronic device 200. In one implementation, after detecting that a user clicks on a view object 301 displayed on a management page of the electronic device 100, the electronic device 100 sends information about the click event to the electronic device 200; the electronic device 200 responds to the click event, acquires page data of the video playing page corresponding to the view object 301, and sends the page data to the electronic device 100; the electronic device 100 displays a video play page based on the page data.
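The click-forwarding step in this implementation can be sketched as follows (plain Java; the PeerChannel transport and the message format are hypothetical, since the embodiment does not specify a protocol between the two devices):

```java
public final class RemoteViewObjectProxy {
    // Hypothetical transport to the source device (e.g., over the
    // coordinated connection established between the two devices).
    public interface PeerChannel {
        void send(String message);
    }

    private final PeerChannel channel;
    private final String viewObjectId;

    public RemoteViewObjectProxy(PeerChannel channel, String viewObjectId) {
        this.channel = channel;
        this.viewObjectId = viewObjectId;
    }

    // Forward a click on the locally displayed copy to the source device,
    // which resolves the corresponding page and streams page data back.
    public void onClicked(int x, int y) {
        channel.send("{\"event\":\"click\",\"viewObjectId\":\"" + viewObjectId
                + "\",\"x\":" + x + ",\"y\":" + y + "}");
    }
}
```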
As shown in fig. 8C, the electronic device 100 is an in-vehicle device, and the electronic device 200 is a mobile phone. The electronic device 200 displays a video call icon 513 in a chat page of the instant messaging application; after detecting an input operation of adding the video call icon 513 to the management page by the user, the electronic device 200 sends related data of the view object 309 to the electronic device 100, where the view object 309 includes the video call icon 513. The electronic device 100 adds the view object 309 to the management page 21 of the electronic device 100 based on the related data sent by the electronic device 200. After the view object 309 is added to the management page 21 of the electronic device 100, the electronic device 100 may call the instant messaging application, the camera, and the microphone of the electronic device 200 through the view object 309 to perform a video call. As shown in fig. 8D, after the user clicks the view object 309 displayed on the management page of the electronic device 100, the electronic device 100 displays a video call option 516, a voice call option, and a cancel option. After detecting that the user clicks the video call option 516, the electronic device 100 displays a video call page of the instant messaging application of the electronic device 200; the video call page includes an image collected by the camera of the electronic device 200, and the audio of the video call played by the electronic device 100 may be collected by the microphone of the electronic device 200.
(8) Two implementations of adding view objects to the management page 21 are described exemplarily below.
The implementation mode I is as follows: at the application layer, a selection view (SelectView) is newly added, and the electronic device 100 invokes the SelectView to add view objects of the respective applications to the management page 21. For example, referring to fig. 9A, the video application may call the public interface of the SelectView to draw the view object 301 and add the drawn view object 301 to the management page 21.
It should be noted that, when the management page is implemented as a page (e.g., management page 21) of the independent application 1, the SelectView may be implemented as a custom View of the application 1. When the management page is implemented as a newly added system page (e.g., a negative two-screen), an existing system page (e.g., a control screen), or a specific region in an existing system page, the SelectView may be implemented as a system component of a system application (e.g., a desktop application).
The SelectView inherits from the View class, and the electronic device 100 may customize subclasses of the SelectView; that is, the SelectView may provide multiple public interfaces for each application (e.g., application 2) to adapt, one public interface being used to implement one or more specific functions of a view object. When application 2 adapts to public interface 1 among the multiple public interfaces, application 2 can call public interface 1, so that the view object of application 2 in the management page 21 can realize the function corresponding to public interface 1. A public interface 2 among the multiple public interfaces that application 2 has not adapted cannot be called by application 2, and the view object of application 2 in the management page 21 naturally cannot realize the function corresponding to that interface. It will be appreciated that the business logic (e.g., refresh logic) of the view objects of application 2 in the management page 21 is implemented entirely through the public interfaces of the SelectView.
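A minimal Java sketch of such a selection view follows (assuming the standard Android SDK; the interface and method names are illustrative, since the embodiment does not publish the actual signatures):

```java
import android.content.Context;
import android.view.View;

// A sketch of the selection view: it inherits from View and exposes
// several public interfaces; an application adapts only the interfaces
// whose functions it wants its extracted view objects to support.
public class SelectView extends View {
    public interface RefreshCallback {
        void onRefresh(SelectView view);
    }

    private RefreshCallback refreshCallback; // adapted by the native application

    public SelectView(Context context) {
        super(context);
    }

    // Public interface 1: refresh logic for the extracted view object.
    public void setRefreshCallback(RefreshCallback callback) {
        this.refreshCallback = callback;
    }

    public void refresh() {
        if (refreshCallback != null) {
            refreshCallback.onRefresh(this);
        }
    }

    // Public interface 2: click listening for the extracted view object.
    public void setOnObjectClickListener(View.OnClickListener listener) {
        setOnClickListener(listener);
    }
}
```

Under this sketch, an application that adapts the refresh interface gains refresh behavior for its extracted view objects in the management page, while an interface the application has not adapted simply leaves the corresponding function unavailable, matching the adaptation rule described above.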
In some embodiments, application 2 may control the view object of application 2 in the management page 21 through the adapted public interfaces, so as to implement, in the management page 21, the original functionality that the view object has in its native application (i.e., application 2), such as refreshing the extracted view object and basic functions like the various click listeners of the extracted view object. Through the adapted public interfaces, personalized new functionality may also be provided for the view object in the management page 21. Illustratively, all of the functionality that the view object 301 possesses in the page 31 is collectively referred to as the original functionality of the view object 301. The following embodiments will describe the original functions and the added functions in detail, which are not repeated here.
In some embodiments, upon detecting an input operation (e.g., input operation 1 or input operation 4) that triggers the video application to enter a view object extraction mode or the user to select a view object 301 in the video application, the video application invokes the SelectView to draw the view object 301 and adds to the management page 21. In some embodiments, the video application invokes the SelectView draw view object 301 upon detecting an input operation (e.g., input operation 2, click confirm control, previously described) that determines to add the view object 301 to the management page 21.
The implementation mode II is as follows: view objects in the video application are added to the management page 21 using the native mechanism of the operating system (e.g., the Android system) of the electronic device 100.
For example, referring to fig. 9B, when detecting an input operation (e.g., input operation 1, input operation 3, or input operation 5 described above) of selecting the view object 301 by the user or determining an input operation (e.g., input operation 2 described above, clicking the confirm control) of adding the view object 301 to the management page 21, the video application acquires the page data of the page 31 using the native mechanism of the operating system, and sets the attribute of the other interface elements in the page 31 except for the view object 301 to be invisible, i.e., the interface elements except for the view object 301 are hidden; the video application then adds a page 31 to the management page 21 that hides the other interface elements. It is understood that the page 31 added to the management page 21 may be considered as a copy of the page 31 of the video application, the interface visible to the copy including only the view object 301. In one implementation, referring to fig. 9B, the electronic device 100 may reduce the size of the copy of the page 31 according to the size of the visible view object 301 and add the reduced copy of the page 31 to the management page 21; the size of the scaled-down copy of the page 31 is greater than or equal to the size of the view object 301.
The manner in which the interface element is not visible is not particularly limited in the embodiments of the present application. In one implementation, setting the transparency of an interface element to 100% may enable the interface element to be invisible.
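A hedged Java sketch of hiding every interface element except the selected view object follows (assuming the standard Android SDK; the PageMasker class is illustrative, full transparency is used as one of the possible invisibility mechanisms mentioned above, and the selected object's ancestors are kept visible so the object itself can still be displayed):

```java
import android.view.View;
import android.view.ViewGroup;

public final class PageMasker {
    // Hide every interface element of the page copy except the selected
    // view object and the ancestors that contain it.
    public static void hideAllExcept(View node, View selected) {
        if (node == selected || contains(node, selected)) {
            if (node instanceof ViewGroup && node != selected) {
                ViewGroup group = (ViewGroup) node;
                for (int i = 0; i < group.getChildCount(); i++) {
                    hideAllExcept(group.getChildAt(i), selected);
                }
            }
        } else {
            node.setAlpha(0f); // fully transparent, i.e., invisible
        }
    }

    private static boolean contains(View node, View target) {
        if (node == target) return true;
        if (!(node instanceof ViewGroup)) return false;
        ViewGroup group = (ViewGroup) node;
        for (int i = 0; i < group.getChildCount(); i++) {
            if (contains(group.getChildAt(i), target)) return true;
        }
        return false;
    }
}
```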
In the second implementation, the interface elements other than the view object 301 in the page 31 added to the management page 21 are invisible, so the added page 31 may be regarded as the view object 301. For a view object 301 generated in the management page 21 by the second implementation, all of its functions, refresh logic, and life cycle are consistent with those of the view object 301 in the native application; apart from the functions provided by the management page itself, the view object 301 only has its original functions in the video application, and no new functions are added.
In the embodiment of the present application, an application supporting the view object extraction mode can select and add view objects to the management page 21 whether or not it has adapted the SelectView. If the application has adapted the SelectView, the functions of the view object can be realized through the public interfaces of the SelectView; if not, the unselected interface elements in the page are hidden and only the functions of the selected view object are retained. An application may choose whether to adapt the SelectView based on whether it wants to add new functionality to its view objects.
It should be noted that the first and second implementations are described by taking the view object 301 of the video application as an example; they are also applicable to other view objects, which are not described here again.
2. The display position, display mode and layout mode of the management page provided in the embodiment of the application are described in detail below.
(1) The display positions of the management page according to the embodiment of the present application are exemplarily described below.
The management page according to the embodiment of the present application may be a newly added page (for example, the management page 21 of the application 1 for implementing view object management, a newly added negative two screen, or a page corresponding to a newly added widget), an existing page (for example, the first screen of the main interface, the second screen of the main interface, the control screen, or the notification screen), or a preset area (a widget, a card, or a small window) of a page (for example, the newly added page or the existing page); the embodiment of the present application is not specifically limited in this regard.
In some embodiments, referring to the related descriptions of fig. 3A to 8D, the management page may be the management page 21 of the application 1 for implementing view object management.
The exemplary description below proceeds by taking a management page to which the view object 301, the view object 504, and the view object 505 have been added as an example.
In some embodiments, the management page may be a newly added negative second screen. For example, as shown in figs. 10A and 10B, upon detecting that the user slides the negative first screen 13 to the left, the electronic device 100 displays the negative second screen 22, which may include the view objects added by the user from various applications.
In some embodiments, the management page may be a control screen that provides an interface for adding view objects of other applications. For example, as shown in fig. 10C, the control screen 14 may also include one or more view objects (e.g., the view object 301, the view object 504, and the view object 505) added by the user from various applications; each view object is combined into the control screen 14 in the form of a shortcut card/shortcut component, whose layout style follows the existing layout style of the control screen 14. As shown in fig. 10C, the control screen 14 may further include other components, such as one or more control cards of a smart home; the one or more control cards and the one or more view objects may be arranged together or separately, which is not specifically limited herein.
In some embodiments, the management page may be a widget in the control screen that provides an interface for adding view objects of other applications. For example, as shown in fig. 10D, a preset widget in the control screen 14 may include one or more view objects added by the user from various applications.
In some embodiments, the management page may be a page corresponding to a newly added widget in the second screen of the main interface. Referring to figs. 10E and 10F, the second screen 12 of the main interface includes a widget 506; after detecting an input operation (e.g., a long press operation) performed by the user on the widget 506, the electronic device 100 displays a page 23 corresponding to the widget 506, where the page 23 may include one or more view objects added by the user from various applications. The widget 506 in the second screen 12 may display one of the view objects added to the page 23, which may be regarded as the cover of the widget 506; this view object may be the most recently added view object in the page 23, or a view object set by the user (for example, the view object 301), which is not specifically limited herein. In some embodiments, taking the view object 301 as an example, the user may also directly control the view object 301 displayed by the widget 506 of the second screen 12, without first opening the page 23 through the widget 506 and then controlling the view object 301; that is, the view object displayed by the widget 506 also has the original function and some or all of the newly added functions of the view object.
In some embodiments, the main interface of the electronic device 100 displays a sidebar, and the user can bring up the management page by clicking or sliding the sidebar; the electronic device 100 here is, for example, an in-vehicle device or a tablet computer.
The following embodiment will proceed with an exemplary description taking the management page 21 as an example.
(2) The following describes the display manner of the management page.
In the embodiments of the present application, when the management page is an existing page, the display manner of the management page follows that of the existing page. When the management page is a newly added page or a page in a preset area (for example, a small window) of an existing page, the display manners of the management page include, but are not limited to, the following three.
Display mode 1: infinite page in any direction
In display mode 1, the management page is a single page that accommodates all the added view objects of other applications. Limited by the size of the display screen, when the display screen cannot show all view objects in the management page 21, the user may slide the management page 21 in any direction to view more of its content. In one implementation, the parent layout of the management page invokes a vertical scroll view (ScrollView) and a horizontal scroll view (HorizontalScrollView) so that the management page can be slid in any direction. The electronic device 100 does not limit the size of the page. In one implementation, the electronic device 100 may define an initial size of the page, which may expand as view objects are added.
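As a concrete illustration, the following Kotlin sketch nests a vertical ScrollView inside a HorizontalScrollView, as described above; the function name and the content container are assumptions for illustration. A production implementation would typically also need custom touch handling so the two scroll directions cooperate smoothly.

```kotlin
import android.content.Context
import android.widget.FrameLayout
import android.widget.HorizontalScrollView
import android.widget.ScrollView

// Hypothetical builder for an "infinite page in any direction":
// contentLayout holds the added view objects and may grow over time.
fun buildOmnidirectionalPage(context: Context, contentLayout: FrameLayout): HorizontalScrollView {
    val vertical = ScrollView(context).apply { addView(contentLayout) }
    return HorizontalScrollView(context).apply { addView(vertical) }
}
```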
Display mode 2: unidirectional infinite page
In display mode 2, the management page is a single page that accommodates all the added view objects of other applications; a unidirectional infinite page may be a horizontally infinite page or a vertically infinite page. In one implementation, the parent layout of the management page invokes a recycling scroll view (RecyclerView) or a list view (ListView) so that the management page can be slid in a single direction (e.g., horizontal or vertical).
When the management page is a vertically infinite page, the user can slide the management page 21 in the vertical direction to view more of its content. The electronic device 100 limits the width of the page but not its height. In one implementation, the electronic device 100 may define an initial height of the page, which may expand as view objects are added.
When the management page is a horizontally infinite page, the user can slide the management page 21 left and right in the horizontal direction to view more of its content. The electronic device 100 limits the height of the page but not its width. In one implementation, the electronic device 100 may define an initial width of the page, which may expand as view objects are added.
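As an illustration of display mode 2, the following Kotlin sketch builds a unidirectional page with a RecyclerView whose LinearLayoutManager orientation selects a horizontal or vertical infinite page; the adapter over the added view objects is assumed to exist.

```kotlin
import android.content.Context
import androidx.recyclerview.widget.LinearLayoutManager
import androidx.recyclerview.widget.RecyclerView

// Hypothetical builder: viewObjectAdapter binds the added view objects.
fun buildUnidirectionalPage(
    context: Context,
    viewObjectAdapter: RecyclerView.Adapter<*>,
    horizontal: Boolean
): RecyclerView = RecyclerView(context).apply {
    layoutManager = LinearLayoutManager(
        context,
        if (horizontal) RecyclerView.HORIZONTAL else RecyclerView.VERTICAL,
        false // reverseLayout
    )
    adapter = viewObjectAdapter
}
```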
Display mode 3: multi-page capable of turning pages
In display mode 3, the management page includes a plurality of single pages; limited by the size of the display screen, the management page 21 can display only some of them at a time (for example, one single page), and the user can view the undisplayed pages by turning pages left and right. The electronic device 100 limits the size of each single page, and all single pages have the same size. In one implementation, the parent layout of the management page invokes a page-flipping view (ViewPager) to enable turning among the multiple single pages of the management page.
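As an illustration of display mode 3, the following Kotlin sketch uses ViewPager2 (the successor of ViewPager) with one adapter item per fixed-size single page; the page adapter is an assumption for illustration.

```kotlin
import android.content.Context
import androidx.recyclerview.widget.RecyclerView
import androidx.viewpager2.widget.ViewPager2

// Hypothetical builder: pageAdapter supplies the single pages; new pages can
// be appended to its data set when the existing pages are full.
fun buildPagedManagementPage(
    context: Context,
    pageAdapter: RecyclerView.Adapter<*>
): ViewPager2 = ViewPager2(context).apply {
    adapter = pageAdapter
    orientation = ViewPager2.ORIENTATION_HORIZONTAL // left/right page turning
}
```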
In some embodiments, when a user adds a new view object to the management page and the existing pages cannot accommodate it, the electronic device 100 may automatically add a new page to the management page and add the view object to the new page. In some embodiments, the management page may include a control for adding pages, through which the user can add a new blank page to the management page. In some embodiments, the management page may include a control for deleting pages, through which the user can delete a specific page together with the view objects in that page.
(3) The layout of view objects within a management page is described below.
In some embodiments, the management page invokes two linear layouts (LinearLayout) to arrange the view objects; the two linear layouts include a horizontal layout (for example, android:orientation="horizontal") and a vertical layout (for example, android:orientation="vertical"), so that the internal layout of the management page can hold a plurality of view objects in the form of a table.
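As a concrete illustration, the following Kotlin sketch builds such a table-form internal layout from one vertical LinearLayout whose rows are horizontal LinearLayouts; the function name and the row grouping are assumptions for illustration.

```kotlin
import android.content.Context
import android.view.View
import android.widget.LinearLayout

// Hypothetical builder: each inner list of views becomes one horizontal row
// inside a vertical column, yielding a table of view objects.
fun buildTableLayout(context: Context, rows: List<List<View>>): LinearLayout {
    val column = LinearLayout(context).apply { orientation = LinearLayout.VERTICAL }
    for (rowViews in rows) {
        val row = LinearLayout(context).apply { orientation = LinearLayout.HORIZONTAL }
        rowViews.forEach { row.addView(it) }
        column.addView(row)
    }
    return column
}
```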
In some embodiments, when a user adds a new view object to the management page, the electronic device 100 automatically lays out the view object based on a default arrangement.
In some embodiments, view objects may be divided into categories based on their size specifications, such as square, vertical rectangle, and horizontal rectangle; arrangement forms of sub-layouts of various specifications may then be defined based on the size specifications and categories of the view objects. The electronic device 100 stores these arrangement forms, and the parent layout of the management page may add sub-layouts in any of them. Each arrangement form may accommodate view objects of one or more sizes. By way of example, fig. 11A shows four arrangement forms of sub-layouts, namely arrangement form 1 to arrangement form 4. In general, a preset arrangement form exploits the size specifications of the view objects so that the view objects in the arrangement fit closely together without large blank areas.
In display mode 1, the parent layout of the management page limits neither the size nor the number of sub-layouts; in display modes 2 and 3, the parent layout of the management page limits the size of the sub-layouts but not their number. For example, in display mode 3, the width of a sub-layout is less than or equal to the width of a single page of the management page.
In some embodiments, when a user adds a new view object to the management page, the electronic device 100 may automatically lay out the view object based on the preset arrangement forms of one or more sub-layouts. In some embodiments, the electronic device 100 may determine the category of the view object according to its size specification and select one of the above arrangement forms according to the size specification and/or category of the view object, thereby determining the display position of the view object on the management page. In some embodiments, when a view object is added to a specific arrangement form, the view object may be scaled proportionally to fit that arrangement form. For example, a designated area of arrangement form 1 may accommodate rectangular view objects, and the electronic device 100 may arrange a rectangular view object using arrangement form 1; when the size of the rectangular view object 1 added by the user does not match the designated area, the view object 1 may be scaled proportionally to fit it. It can be understood that, after proportional scaling, the view object 1 and the designated area are equal in width and/or height.
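The proportional scaling described here reduces to choosing one uniform scale factor; the following Kotlin sketch (with assumed names) computes it so that the scaled view object matches the designated area in width and/or height.

```kotlin
import kotlin.math.min

// Hypothetical helper: uniform scale factor that fits a view object of size
// objectW x objectH into a slot of size slotW x slotH, preserving aspect ratio.
fun fitScale(objectW: Int, objectH: Int, slotW: Int, slotH: Int): Float =
    min(slotW.toFloat() / objectW, slotH.toFloat() / objectH)

// Example: a 400x300 view object in a 200x200 slot scales by 0.5 to 200x150,
// matching the slot in width.
```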
In some embodiments, the user adds the view object 1 and the view object 2 in sequence. As shown in fig. 11B, after the user adds the view object 1, the electronic device 100 determines that arrangement form 4 can accommodate the size specification of the view object 1; the electronic device 100 then uses arrangement form 4 to lay out the view object 1 and determines its display position. As shown in fig. 11C, after the user adds the view object 2, if arrangement form 4 can also accommodate the size specification of the view object 2, the electronic device 100 continues to determine the display position of the view object 2 based on arrangement form 4. As shown in fig. 11D, after the user adds the view object 2, if arrangement form 4 cannot accommodate the size specification of the view object 2 but arrangement form 1 can, the electronic device 100 adopts the new arrangement form 1 to lay out the view object 2 and determines its display position. In fig. 11D, the sub-layouts corresponding to arrangement form 4 and arrangement form 1 are arranged laterally; not limited to the lateral arrangement, in the embodiments of the present application the sub-layouts corresponding to the two arrangement forms may also be arranged longitudinally.
In some embodiments, the layout may be performed without the preset sub-layout arrangement forms described above. When the user adds a new view object to the management page, the electronic device 100 places it in a blank area of the management page, continuing in the horizontal/vertical direction against the existing view objects. Illustratively, as shown in fig. 11E, the user adds the view object 1 and the view object 2 to the management page in sequence; when the user adds the view object 2 to the management page, the view object 2 is placed in a blank area of the management page 21, against the existing view object 1 in the vertical direction.
For the user's convenience, the system also provides a one-tap arrangement function; when the default arrangement is unsatisfactory, the user can refresh the arrangement with this function. In some embodiments, the management page may lay out all view objects as a whole based on a plurality of preset arrangement rules; the management page includes a one-tap arrangement control, and when the user clicks the control, the electronic device 100 rearranges all view objects in the management page based on one of the arrangement rules; if the user clicks the control again, the electronic device 100 switches to another arrangement rule and lays the page out again, until all the arrangement rules have been traversed once. In some embodiments, the management page includes controls corresponding to the respective arrangement rules, and upon detecting that the user clicks the control corresponding to one arrangement rule, the electronic device 100 rearranges all view objects in the management page according to that rule.
In some embodiments, the user may manually adjust the layout position and size of a view object (e.g., the view object 1) through a preset operation. In one implementation, the sub-layouts of the management page may register a long-press response; after a sub-layout is filled with the view object of another application, when a long-press operation by the user is detected, the electronic device 100 displays indication information on the management page to indicate that the user can adjust the position/size of the view object.
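A minimal Kotlin sketch of the long-press response described above, with assumed names; the callback that draws the indication information is left abstract.

```kotlin
import android.view.View

// Hypothetical helper: once a sub-layout holds a view object, a long press
// on it switches the management page into adjust mode.
fun enableAdjustOnLongPress(subLayout: View, showAdjustHint: () -> Unit) {
    subLayout.setOnLongClickListener {
        showAdjustHint() // e.g. draw indication frames around each view object
        true             // consume the long press
    }
}
```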
It can be appreciated that, in some implementations, when the user adds the view object 1, the electronic device 100 may scale the view object 1 proportionally according to the size of the corresponding area in the preset arrangement form, and the user may manually adjust the size of the view object 1 later.
For example, referring to fig. 11F, upon detecting that the user long-presses a blank area of the management page 21 or a view object in the management page 21, the electronic device 100 displays an indication frame for each view object in the management page 21 to indicate that the user can adjust the position/size of the view objects. As shown in fig. 11G, the user may drag the view object 505 to another location of the management page.
In some embodiments, the user may delete the view objects of other applications from the management page through a preset operation, either one by one or in batches.
For example, referring to fig. 11H, upon detecting that the user long-presses a blank area of the management page 21 or a view object in the management page 21, the electronic device may further display a delete control in the management page; as shown in figs. 11I and 11J, upon detecting that the user drags the view object 505 to the delete control, the electronic device 100 may delete the view object 505 from the management page 21. Optionally, as shown in fig. 11K, after the view object 505 is deleted, the other view objects in the management page 21 may automatically fill its display position, so that the layout of the view objects in the management page 21 remains compact, leaving as little blank area between view objects as possible.
In some embodiments, the electronic device 100 may automatically adjust the hue of the management page to be consistent with the system theme hue. Typically, the electronic device 100 is provided with a system theme, which may be used to specify the desktop layout style, icon style, and hue. In some embodiments, the initial hue of the management page is a default hue; the management page includes a control for adjusting the hue, and when the user clicks the control, the electronic device 100 may adjust the hue of the management page to the system theme hue; when the user clicks the control again, the default hue of the management page may be restored.
In some embodiments, the view objects in the management page are classified and managed according to the application to which they belong. After the user adds a plurality of view objects of the application 2 to the management page, these view objects are merged into an application identifier of the application 2, which is displayed on the management page; upon detecting that the user clicks the application identifier, the electronic device 100 displays the plurality of view objects from the application 2 on the management page.
Illustratively, taking an instant messaging application as an example, the user adds the view object 505, the view object 507, and the view object 508 of the instant messaging application to the management page 21; as shown in fig. 11L, the management page 21 displays an application identifier 509 of the instant messaging application; upon detecting that the user clicks the application identifier 509, the electronic device 100 displays the view object 505, the view object 507, and the view object 508 from the instant messaging application on the management page. Optionally, upon detecting that the user clicks the application identifier 509 again, the display of the view objects from the instant messaging application may be stopped.
In some embodiments, the view objects in the management page are classified and managed according to their original pages. After the user adds a plurality of view objects from the same page of an application to the management page, these view objects are merged into a page identifier of that page, which is displayed on the management page; upon detecting that the user clicks the page identifier, the electronic device 100 displays the plurality of view objects from that page on the management page.
Classified management is not limited to being performed by application or by original page; the embodiments of the present application may also classify and manage the managed view objects according to other rules, which is not particularly limited herein.
3. The original functions and the newly added functions of the view object in the management page are exemplarily described below.
In the embodiments of the present application, after the view object 1 of the application 2 is added to the management page 21, the original function of the view object 1 can be performed in the management page 21, thereby maintaining the original service logic of the view object 1. The original function of the view object 1 refers to the function that the view object 1 provides in the application 2.
Illustratively, as shown in FIG. 12A, the management page 21 adds a view object 301 of a video application, a view object 505 of an instant messaging application, and a view object 504 of a music application.
As shown in fig. 12A, the view object 301 has its original function both in the management page 21 and in the video application. The view object 301 includes the cover of the video 1; after detecting that the user clicks the cover of the view object 301 in the page 31 of the video application, the electronic device 100 displays the next-level page corresponding to the view object 301, that is, the video playing page 35, which is used to play the video 1. Upon detecting that the user clicks the cover of the view object 301 in the management page 21, the electronic device 100 runs the video application in the foreground and likewise jumps to the video playing page 35.
As shown in fig. 12B, the view object 504 has its original function both in the management page 21 and in the music application. Upon detecting that the user clicks the view object 504 in the user center page 33 of the music application, the electronic device 100 displays the next-level page corresponding to the view object 504, i.e., the song list page 36. Upon detecting that the user clicks the view object 504 in the management page 21, the electronic device 100 runs the music application in the foreground and likewise jumps to the song list page 36.
The view object 505 is a chat entry of a specific contact; after the user clicks the view object 505 in the management page 21 or in the instant messaging application, the electronic device 100 may jump to the chat interface of the specific contact.
It can be appreciated that adding the view object 1 of the application 2 to the management page 21 does not affect the normal use of the view object 1 in the application 2. After adding the view objects corresponding to the commonly used functions of various applications to the management page 21, the user can quickly find and use these view objects through the management page 21. Because users differ in their usage habits and commonly used functions, the view objects added to the management page 21 differ from one user's electronic device to another's, effectively achieving personalized customization.
In some embodiments, an application may modify the triggering manner of the original function of a view object (e.g., the view object 1) added to the management page 21, and may also provide more newly added functions for that view object. An original function whose triggering manner has been modified can also be regarded as a newly added function of the view object. The newly added functions are exemplarily described below.
(1) Gesture monitoring: zooming the management page based on preset gestures and implementing page jumps of view objects
In some embodiments, the management page supports gesture recognition and can be zoomed in and out based on preset gestures, with each view object in the management page zooming in/out accordingly. In some embodiments, when the management page is an infinite page in any direction, the management page may zoom in and out following the user's gestures.
In some embodiments, when the area of the view object 1 in the management page is enlarged to a preset value following the preset gesture, the electronic device 100 jumps to the next-level page of the view object 1 in its original application.
Illustratively, as shown in figs. 13A and 13B, the electronic device 100 detects a zoom-in gesture of the user on the management page; the zoom-in gesture includes the user's two fingers sliding on the management page with the distance between their touch points on the display screen increasing. In response to the zoom-in gesture, the electronic device 100 enlarges the management page as well as the view objects in it. Conversely, a zoom-out gesture includes the user's two fingers sliding on the management page with the distance between their touch points on the display screen decreasing, and may be used to zoom out the management page.
In some embodiments, referring to figs. 13A to 13C, taking the view object 301 as an example, the above zoom-in gesture acts on the view object 301, and the view object 301 is enlarged along with the gesture; when the area of the view object 301 is enlarged to a preset value, the interface displayed by the electronic device 100 jumps to the original application of the view object 301 (i.e., the video application). In one implementation, upon detecting that the area of the view object 301 is enlarged to the preset value, the display may jump to the next-level page of the view object 301 in the video application, that is, the video playing page 35 shown in fig. 13C. In one implementation, upon detecting that the area of the view object 301 is enlarged to the preset value, the display may jump to the original page of the view object 301 in the video application (i.e., the page 31).
In some embodiments, referring to figs. 13C and 13D, after jumping from the management page to the video application corresponding to the view object 301, in response to the user's zoom-out gesture, the electronic device 100 may close the video application or move it to the background and return to the management page 21.
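One possible way to recognize these zoom gestures is Android's ScaleGestureDetector; the following Kotlin sketch (the threshold value and callback are assumptions) accumulates the pinch scale and triggers the page jump once the view object is enlarged past a preset value.

```kotlin
import android.content.Context
import android.view.ScaleGestureDetector

// Hypothetical listener: scaleFactor > 1 while the two touch points move
// apart (zoom in); the jump fires when the accumulated scale crosses the
// assumed threshold.
class ZoomJumpListener(
    private val jumpToNextPage: () -> Unit,
    private val threshold: Float = 2.0f // assumed magnification threshold
) : ScaleGestureDetector.SimpleOnScaleGestureListener() {
    private var accumulated = 1f

    override fun onScale(detector: ScaleGestureDetector): Boolean {
        accumulated *= detector.scaleFactor
        if (accumulated >= threshold) {
            jumpToNextPage()
            accumulated = 1f
        }
        return true
    }
}

fun makeDetector(context: Context, listener: ZoomJumpListener): ScaleGestureDetector =
    ScaleGestureDetector(context, listener)
```

The management page would feed its touch events into the detector (e.g. from onTouchEvent) and apply the accumulated scale to the page while the gesture is in progress.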
(2) Click monitoring: a click event triggers the display of the related content of a view object on the management page
In some embodiments, when the user clicks a view object in the management page, the management page may not jump to the next-level page of the view object in its original application (that is, display the next-level page in full screen), but instead display a popup window in the management page, where the popup window shows part or all of the content of the next-level page; the user can trigger the electronic device to jump to the next-level page through the popup window. In one implementation, the popup window is used to display the view content of interest to the user in the next-level page. In the embodiments of the present application, the next-level page of a view object may be a page in the view object's original application or a page of another application, which is not specifically limited herein.
Illustratively, the next-level page of the view object 301 in the video application is the video playing page 35, and the view content that the user pays most attention to and uses most often in the video playing page 35 generally includes the video display area and the video progress bar; as shown in fig. 14A, upon detecting that the user clicks the view object 301 in the management page 21, the electronic device 100 displays a popup window 601, which may include the above video display area and video progress bar. In some embodiments, the popup window 601 may also include a zoom-in control 602 and/or a close control 603; the zoom-in control 602 is used to trigger the electronic device 100 to jump to the next-level page of the view object 301 in the video application (i.e., the video playing page 35) and display it in full screen, and the close control 603 is used to trigger the electronic device 100 to close the popup window 601.
Illustratively, the next-level page of the view object 504 in the music application is the song list page 36, and the view content that the user pays most attention to and uses most often in the song list page 36 typically includes the list of songs the user has liked. As shown in fig. 14B, upon detecting that the user clicks the view object 504 in the management page 21, the electronic device 100 displays a popup window 604, which may include the above list of liked songs. Similarly, the popup window 604 may also include a zoom-in control 605 and/or a close control 606.
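A minimal Kotlin sketch of such a lightweight popup using Android's PopupWindow; the content view and its enlarge/close controls are assumed to be supplied by the caller, and the jump to the full next-level page is left abstract.

```kotlin
import android.view.View
import android.view.ViewGroup
import android.widget.PopupWindow

// Hypothetical helper: show part of the next-level page in a popup anchored
// to the clicked view object instead of jumping to it full screen.
fun showLightweightPopup(
    anchor: View,
    contentView: View,
    enlargeControl: View,
    closeControl: View,
    onEnlarge: () -> Unit
) {
    val popup = PopupWindow(
        contentView,
        ViewGroup.LayoutParams.WRAP_CONTENT,
        ViewGroup.LayoutParams.WRAP_CONTENT,
        true // focusable, so a tap outside dismisses the popup
    )
    enlargeControl.setOnClickListener {
        popup.dismiss()
        onEnlarge() // jump to the full next-level page
    }
    closeControl.setOnClickListener { popup.dismiss() }
    popup.showAsDropDown(anchor)
}
```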
In some embodiments, when the user clicks a view object in the management page, the management page may not jump to the next-level page in the view object's original application (that is, display the next-level page in full screen), but instead display the related content of the commonly used functions of the next-level page in the display area of the view object in the management page.
Illustratively, referring to fig. 14C, the next-level page of the view object 504 of the music application is the song list page 36, and the user's commonly used function includes clicking a song in the song list to play it. As shown in fig. 14C, upon detecting that the user clicks the view object 504 in the management page 21, the electronic device 100 displays a music playing component in the display area of the view object 504 and plays song 1 in the song list; if the user then clicks the music playing component, the electronic device 100 jumps to the song list page 36. The music playing component may include information such as the cover, song name, artist, and playing progress of song 1, as well as controls for switching songs. In some embodiments, the electronic device 100 replaces the display content of the view object 504 with the music playing component.
It can be understood that the click effects illustrated in figs. 14A to 14C may be regarded as lightweight click effects of a view object: the related content of the next-level page that the user intends to view is shown directly on the management page, without jumping to the next-level page of the view object in its original application. The management page according to the embodiments of the present application is not limited to the lightweight click effects shown in figs. 14A to 14C and may also provide other lightweight click effects.
It can be appreciated that, in the examples of figs. 14A to 14C, the electronic device 100 modifies the service logic of the original function of the view object, including the triggering manner of the original function. In some embodiments of the present application, the service logic (e.g., the triggering manner) of the original function of the view object in the management page may instead be kept unchanged, with the service logic of the newly added functions not conflicting with that of the original function. For example, while retaining the original click effect, the user may trigger the electronic device 100, through other preset operations, to display the popup window corresponding to the view object on the management page, or to display the related content of the commonly used functions of the next-level page in the display area of the view object in the management page.
For example, the click effect of the original function is triggered by a single-click operation, and the other preset operations differ from the single-click operation, for example, a double-click operation, a triple-click operation, a two-finger click operation, a two-finger slide operation, a knuckle slide operation, and the like, which are not particularly limited in the embodiments of the present application. Illustratively, upon detecting a double-click operation on the view object 504, the electronic device 100 displays the popup window 604 corresponding to the view object 504 on the management page; upon detecting a two-finger slide operation on the view object 504, the electronic device 100 displays the music playing component in the display area of the view object 504.
In some embodiments, taking the view object 504 of the music application as an example, as shown in figs. 14D and 14E, after the management page 21 jumps to the next-level page corresponding to the view object 504 (i.e., the song list page 36), a return operation for going back to the previous page (such as the aforementioned return gesture) may trigger the electronic device 100 to return either to the user center page 33 of the music application or to the management page 21, which is not specifically limited herein. It can be appreciated that, in the music application, the user center page 33 is the upper-level page of the song list page 36. In one implementation, the view object 504 in the management page serves as an entry to the music application; after the music application is entered through this entry, the music application and the management page are decoupled, and the return operation triggers a return to the upper-level page of the song list page 36 in the music application, i.e., the user center page 33. In one implementation, after the management page jumps to the next-level page corresponding to the view object 504 (i.e., the song list page 36), the management page may also be regarded as the upper-level page of the song list page 36, and the return operation triggers a return to the management page.
In some embodiments, in the case that the view object 504 is added using the system native mechanism, the view object 504 retains its original service logic; after the management page jumps to the next-level page corresponding to the view object 504 according to the original service logic of the music application, whether the return operation returns to the management page or to the upper-level page in the music application can be configured. In some embodiments, with the view object added through the SelectView, the music application may implement a newly added click-jump response event (such as the aforementioned lightweight click effect) through the public interfaces adapted to the SelectView, and the page to return to after the jump may likewise be determined through the public interfaces, which is not specifically limited herein.
(3) Focusing effect of view objects
In some embodiments, when the user performs a preset operation (e.g., a click operation) on the view object 1 in the management page, the electronic device may display a focusing effect on the view object 1, such as a zooming effect. In some embodiments, when the user performs the preset operation on the view object 1 in the management page, the view object operated by the user may also be highlighted relative to the view objects not being operated by setting one or more of transparency, a cover layer, a dynamic effect, and the like.
Illustratively, as shown in figs. 15A, 15B and 15C, upon detecting that the user clicks the view object 504, the electronic device 100 displays the view object 504 zooming, i.e., shrinking and then returning to its original size, and displays semi-transparent cover layers on the other view objects (i.e., the view object 301 and the view object 505) during the zooming; as shown in fig. 15D, after displaying the focusing effect, the electronic device 100 further responds to the user's click operation and executes the corresponding response event, for example, displaying the popup window 604 corresponding to the view object 504.
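The zoom-and-dim focusing effect can be sketched with Android's view property animations; the names and durations below are assumptions for illustration.

```kotlin
import android.view.View

// Hypothetical helper: shrink the clicked view object, restore it, dim the
// other view objects meanwhile, then run the click's response event.
fun playFocusEffect(clicked: View, others: List<View>, onEnd: () -> Unit) {
    others.forEach { it.animate().alpha(0.5f).setDuration(150).start() }
    clicked.animate()
        .scaleX(0.9f).scaleY(0.9f)
        .setDuration(150)
        .withEndAction {
            clicked.animate()
                .scaleX(1f).scaleY(1f)
                .setDuration(150)
                .withEndAction {
                    others.forEach { it.animate().alpha(1f).setDuration(150).start() }
                    onEnd() // e.g. display the popup window 604
                }
                .start()
        }
        .start()
}
```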
(4) Adjusting internal layout of view objects
In some embodiments, the user may trigger the view object 1 to enter an editing mode of its internal layout through a preset operation. In the editing mode, the user may modify the internal layout of the view object 1, adjusting the position and/or size of the interface elements within the view object 1.
For example, taking the view object 505 as an example, as shown in figs. 16A and 16B, upon detecting a preset operation by the user on the view object 505 (such as a two-finger long-press operation), the electronic device 100 displays an indication frame for each control in the view object 505; the indication frames indicate that the editing mode has been entered, and the user may adjust the position and/or size of the controls in the view object 505. As shown in figs. 16B and 16C, the user may drag the control 505D within the display area of the view object 505, and the electronic device 100 displays the control 505D following the movement of the user's finger. The preset operation for triggering adjustment of the internal layout is not particularly limited in the embodiments of the present application. Not limited to the above indication frames, entry into the editing mode may also be indicated in other ways, for example by highlighting the edges of the controls within the view object 505.
In some embodiments, after the user adjusts the layout of the controls inside a view object, the electronic device 100 may adaptively adjust the size of the view object. For example, as shown in figs. 16C and 16D, after the user adjusts the display position of the control 505D in the view object 505, the right part of the view object 505 contains no controls, and the electronic device 100 may shrink the view object 505 based on the adjusted display positions of the controls.
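A minimal Kotlin sketch of dragging a control inside a view object's display area in the editing mode; the touch-following strategy is an assumption for illustration (a real implementation would also clamp the control to the view object's bounds and handle accessibility click events).

```kotlin
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View

// Hypothetical helper: move the control with the finger while in edit mode.
@SuppressLint("ClickableViewAccessibility")
fun enableDragInEditMode(control: View) {
    var lastX = 0f
    var lastY = 0f
    control.setOnTouchListener { v, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { lastX = event.rawX; lastY = event.rawY; true }
            MotionEvent.ACTION_MOVE -> {
                v.translationX += event.rawX - lastX
                v.translationY += event.rawY - lastY
                lastX = event.rawX; lastY = event.rawY
                true
            }
            else -> false
        }
    }
}
```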
(5) Interface function extension
In some embodiments, an application adapts to the various public interfaces of the SelectView, so that the view object 1 of the application can implement rich interface functions in the management page. Upon detecting a preset operation by the user on the view object 1, the electronic device 100 may display function options corresponding to the multiple functions that the view object 1 can implement, and the user may select any one of them to trigger the corresponding function. It should be noted that the function options implementable by different view objects may be the same or different. In this way, the user does not need to memorize the respective triggering manners of the multiple functions, which facilitates operation. The embodiments of the present application do not specifically limit the above preset operation. The view object 505 is taken as an example below; as shown in fig. 15A, the view object 505 is a chat entry of a specific contact and may be used to open the chat page of that contact.
In some embodiments, the preset operation includes a touch operation; as shown in fig. 17, upon detecting that the user touches the view object 505, the view object 505 displays a plurality of function options 607, for example: displaying a popup window, adjusting the internal layout, voice call, video call, sending a red packet, sending a location, sending a voice message, and the like. As shown in fig. 17, upon detecting that the user touches the view object 505 and, without lifting the finger, slides it directly to one of the function options 607, the electronic device 100 executes the response event corresponding to that function option; for example, when the function option corresponds to a voice call, the electronic device 100 displays the voice call page 37 of the specific contact. Upon detecting that the user lifts the finger right after touching the view object 505, the electronic device 100 may execute the response event corresponding to clicking the view object 505.
In some embodiments, click operations may be divided into light presses and heavy presses, and the electronic device 100 may distinguish between a light press and a heavy press entered by the user on the display screen; lightly pressing and heavily pressing a view object may trigger different response events. The preset operation may include a heavy-press operation: upon detecting that the user heavily presses the view object 505, the plurality of function options 607 that the view object 505 can implement are displayed; upon detecting that the user clicks one of the function options 607 (e.g., the function option corresponding to a voice call), the electronic device 100 displays the voice call page 37. Upon detecting that the user lightly presses the view object 505, the electronic device 100 may execute the response event corresponding to tapping the view object 505, for example, jumping to the next-level page corresponding to the view object, that is, the chat page of the specific contact.
The application provides an interface display method, which is applied to electronic equipment and comprises the following steps:
displaying a first page of a first application, the first page including one or more view objects; when a first operation acting on a first view object of the first page is detected, executing a first function of the first view object, the first view object being any one of the one or more view objects of the first page; when a second operation for the first view object of the first page is detected, adding the first view object to a second page; and when a third operation for the first view object of the second page is detected, executing the first function of the first view object in response to the third operation.
In this embodiment of the present application, the second page may be the aforementioned management page, and the first function may be the aforementioned original function.
Illustratively, the first page of the first application may be the aforementioned page 31 of the video application, and the first view object may be the view object 301 in the page 31; referring to the related descriptions of figs. 4A to 7H, the second operation may include an input operation for adding the view object 301 in the page 31 to the management page, such as one or more of the aforementioned input operation 1, input operation 2, input operation 3, input operation 4, and input operation 5; referring to fig. 12A, the first operation may include an input operation (e.g., a click operation) on the view object 301 in the page 31, the third operation may include an input operation (e.g., a click operation) on the view object 301 in the management page 21, and the first function of the view object 301 includes jumping to the video playing page 35.
Illustratively, the first page of the first application may be the aforementioned user center page 33 of the music application, and the first view object may be the view object 504 in the user center page 33; referring to fig. 12B, the first operation may include an input operation (e.g., a click operation) on the view object 504 in the user center page 33, the third operation may include an input operation (e.g., a click operation) on the view object 504 in the management page 21, and the first function of the view object 504 includes jumping to the song list page 36.
Illustratively, the first page of the first application may be the aforementioned page 34 of the instant messaging application, and the first view object may be the view object 505 in the page 34; the first operation may include an input operation (e.g., a click operation) on the view object 505 in the page 34, and the third operation may include an input operation (e.g., a click operation) on the view object 505 in the management page 21.
In some embodiments, the above method further comprises: a fourth operation is detected for the first view object of the second page, and in response to the fourth operation, a second function of the first view object is performed, the first view object of the first page not having the second function.
In the embodiments of the present application, the second function may be the aforementioned newly added function; referring to figs. 13A to 17, the newly added functions of a view object in the management page include one or more of the following: enlarging with a gesture to trigger a page jump, and shrinking with a gesture to return to the management page; triggering the electronic device to display a popup window including related content of the next-level page of the view object; triggering the electronic device to display related content of the next-level page of the view object in the display area of the view object; triggering the electronic device to display the focusing effect of the view object; triggering the electronic device to adjust the internal layout of the view object; and triggering the electronic device to display a plurality of function options associated with the view object.
In some embodiments, the above method further comprises: displaying a third page of the second application, the third page including one or more view objects, the second view object being any one of the one or more view objects of the third page; a fifth operation is detected for the second view object of the third page, and in response to the fifth operation, the second view object is added to the second page, and the second view object is displayed on the second page.
The second application is different from the first application, and the second view object of the third page of the second application may refer to the related description of the first view object of the first page of the first application, which is not described herein.
In some embodiments, the first function includes jumping to a fourth page associated with the first view object; the third operation includes a first gesture for enlarging the first view object, the first gesture including the user's two fingers sliding on the display screen with the distance between them increasing, and a second gesture including the user's two fingers sliding on the display screen with the distance between them decreasing. The detecting of the third operation for the first view object of the second page, and executing the first function of the first view object in response to the third operation, includes: detecting the first gesture acting on the first view object of the second page; in response to the first gesture, enlarging the first view object; and displaying the fourth page when the area of the first view object is enlarged to a preset value following the first gesture. The method further includes: displaying the second page when the second gesture for the fourth page is detected.
In this embodiment of the present application, the first gesture may be the aforementioned zoom-in gesture, the second gesture may be the aforementioned zoom-out gesture, and the fourth page may be a next-level page of the first view object. For example, referring to the related descriptions of fig. 13A to 13C, the first view object is the aforementioned view object 301, and the fourth page may be the video playing page 35 of the video application; the zoom-in gesture on view object 301 may trigger the electronic device to jump to video play page 35; a zoom-out gesture applied to the video playback page 35 may trigger the electronic device to return to the management page 21.
In some embodiments, the first function includes jumping to a fourth page associated with the first view object; the detecting of the third operation of the first view object for the second page, and the executing of the first function of the first view object in response to the third operation includes: detecting a sixth operation of the first view object acting on the second page; responding to a sixth operation, displaying a first popup window, wherein the first popup window is used for displaying part or all of the content of a fourth page; detecting a seventh operation on the first popup; displaying a fourth page in response to the seventh operation; the third operation includes a sixth operation and a seventh operation.
Illustratively, referring to the related description of fig. 14A, the first view object is the aforementioned view object 301, the fourth page may be the video playing page 35 of the video application, and the first popup may be the popup 601; the sixth operation includes an input operation (e.g., a click operation) on the view object 301 in the management page 21, and the seventh operation includes an input operation (e.g., a click operation) on the zoom-in control 602 in the pop-up window 601.
Illustratively, referring to the related description of FIG. 14B, the first view object is the aforementioned view object 504, the fourth page may be the song list page 36 of the music application, and the first popup window may be the popup window 604; the sixth operation includes an input operation (e.g., a click operation) on the view object 504 in the management page 21, and the seventh operation includes an input operation (e.g., a click operation) on the zoom-in control in the popup window 604.
In some embodiments, the first function includes jumping to a fourth page associated with the first view object; the detecting of the third operation of the first view object for the second page, and the executing of the first function of the first view object in response to the third operation includes: detecting an eighth operation of the first view object acting on the second page; responding to the eighth operation, and displaying a third view object corresponding to the common function in the fourth page in the display area of the first view object of the second page; detecting a ninth operation on the third view object; displaying a fourth page in response to the ninth operation; the third operation includes an eighth operation and a ninth operation.
Illustratively, referring to the related description of FIG. 14C, the first view object is the aforementioned view object 504, and the fourth page may be the song list page 36 of the music application; the eighth operation includes an input operation (e.g., a click operation) on the view object 504 in the management page 21, the third view object includes the aforementioned music playing component, and the ninth operation includes an input operation (e.g., a click operation) on a blank area within the music playing component.
In some embodiments, the third operation includes a click operation that acts on the first view object of the second page; when the clicking operation is detected, the electronic equipment responds to the clicking operation after displaying the focusing effect of the first view object on the second page; the focusing effect of the first view object includes one or more of: the first view object is zoomed out and then zoomed in, and view objects outside the first view object in the second page are uniformly changed from the first display form to the second display form.
For example, referring to the related descriptions of figs. 15A to 15D, the first view object is the aforementioned view object 504; the focusing effect of the view object 504 includes: the view object 504 zooms, and the view objects other than the view object 504 have a semi-transparent cover layer added. That is, the first display form may be the original display form of a view object, and the second display form may be the view object with a semi-transparent cover layer added.
In some embodiments, the first view object includes one or more controls including the first control, the second function includes adjusting a display position and/or size of the first control in the first view object, detecting a fourth operation of the first view object for the second page, and in response to the fourth operation, performing a second function of the first view object, including: detecting a tenth operation of the first view object acting on the second page; in response to a tenth operation, displaying indication information, wherein the indication information is used for indicating entering an editing mode of a first view object, and the editing mode of the first view object is used for adjusting the display position and/or the size of any control in the first view object; detecting a drag operation acting on a first control in an editing mode of the first view object; in response to the drag operation, moving the first control within the display area of the first view object based on an operation track of the drag operation; the fourth operation includes a tenth operation and a drag operation.
For example, referring to the associated descriptions of fig. 16A-16D, the first view object is the aforementioned view object 505, and the second function includes adjusting the internal layout of the view object; the tenth operation includes an input operation (e.g., a two-finger long press operation) for the view object 505 in the management page 21, and the indication information may include an indication box of each control inside the view object, and the first control may be the control 505D.
In some embodiments, the detecting the fourth operation of the first view object for the second page, and in response to the fourth operation, performing the second function of the first view object includes: detecting an eleventh operation of the first view object acting on the second page; in response to an eleventh operation, displaying function options corresponding to a plurality of functions of the first view object, the plurality of function options including function options corresponding to the second function; detecting a twelfth operation acting on the function option corresponding to the second function; in response to the twelfth operation, performing a second function; the fourth operation includes an eleventh operation and a twelfth operation.
Illustratively, referring to the related description of FIG. 17, the first view object is the aforementioned view object 505; the view object 505 triggers the electronic device to display its plurality of function options, and the second function may be, for example, the voice call function; the eleventh operation includes an input operation (e.g., a touch operation) on the view object 505 in the management page 21, and the twelfth operation may include sliding the finger to the function option of the second function after touching the view object 505.
In some embodiments, after adding the plurality of view objects of the first application to the second page, the second page displays an application identifier of the first application; the method further comprises the following steps: detecting a thirteenth operation of the application identification acting on the first application; in response to the thirteenth operation, a plurality of view objects of the first application are displayed on the second page.
Illustratively, referring to the associated description of FIG. 11L, the first view object of the first application may be the view object 505 of the instant messaging application, the management page includes the application identifier 509 of the instant messaging application, and the thirteenth operation may include an input operation (e.g., a click operation) for the application identifier 509.
In some embodiments, before displaying the first page of the first application, the method further includes: displaying the second page, the second page including a third control; detecting a fourteenth operation on the third control; in response to the fourteenth operation, displaying application identifiers of a plurality of applications, the plurality of applications including the first application; and detecting a fifteenth operation on the application identifier of the first application. The displaying of the first page of the first application includes: in response to the fifteenth operation, displaying the first page of the first application and entering the extraction mode of the view object; in the extraction mode, the second operation is used to add the view object to the second page.
For example, referring to the associated descriptions of fig. 7A-7D, the first view object is the view object 301 of the video application, the third control may include the add control 501 in the management page 21, the fourteenth operation includes an input operation (e.g., a click operation) for the add control 501, and the fifteenth operation includes an input operation (e.g., a click operation) for the application identifier 502A of the video application.
In some embodiments, before displaying the first page of the first application, the method further includes: displaying a fifth page, the fifth page including a switch control of the extraction mode of the view object; detecting a sixteenth operation on the switch control; and in response to the sixteenth operation, turning on the extraction mode; in the extraction mode, the second operation is used to add the view object to the second page.
For example, referring to the related description of fig. 6A to 6G, the first view object is the view object 301 of the video application, the fifth page includes the control screen 14, the switch control of the extraction mode includes the switch icon 401, and the sixteenth operation includes an input operation (e.g., a click operation) acting on the switch icon 401.
In some embodiments, in the case of detecting the second operation for the first view object of the first page, before adding the first view object to the second page, the method further includes: detecting a seventeenth operation on the first page, the touch positions of the seventeenth operation falling within the first view object and a fourth view object; and in response to the seventeenth operation, displaying the indication frame of the first view object and the indication frame of the fourth view object.
For example, referring to the related description of fig. 5A to 5D, the seventeenth operation may be the aforementioned input operation 1, such as a two-finger long press operation applied to the page 31, the first view object is the view object 301, and the fourth view object is the view object 306; the second operation may be the aforementioned input operation 2.
In some embodiments, in the case where the second operation on the first view object of the first page is detected, before adding the first view object to the second page, the method further includes: detecting an eighteenth operation; and in response to the eighteenth operation, displaying indication frames respectively corresponding to all view objects of the first page.
For example, referring to the related description of fig. 6A to 6D, the eighteenth operation may be the aforementioned input operation 4, which is used to turn on the extraction mode. For example, the eighteenth operation may include a click operation acting on the switch icon 401 in the control screen.
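By contrast with the hit-tested variant above, the eighteenth operation outlines everything at once. A sketch, assuming a page object that simply tracks which view objects are currently outlined:

```kotlin
// Turning extraction mode on outlines every view object of the first page;
// turning it off clears the outlines. Purely illustrative state handling.
class FirstPage(private val viewObjectIds: List<Int>) {
    val outlined = mutableSetOf<Int>()

    fun onExtractionModeChanged(enabled: Boolean) {
        outlined.clear()
        if (enabled) outlined += viewObjectIds  // frames for all view objects
    }
}

fun main() {
    val page = FirstPage(listOf(301, 302, 305))
    page.onExtractionModeChanged(true)
    println(page.outlined)  // [301, 302, 305]
    page.onExtractionModeChanged(false)
    println(page.outlined)  // []
}
```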
In some embodiments, in the case where the second operation on the first view object of the first page is detected, adding the first view object to the second page includes: detecting a nineteenth operation acting on the first view object of the first page; in response to the nineteenth operation, displaying indication information of the first view object and a second control, wherein the indication information of the first view object is used to indicate that the first view object is selected; detecting a twentieth operation acting on the second control; and in response to the twentieth operation, adding the first view object to the second page.
The nineteenth operation is used to select the first view object as the extraction object, the second control may include a confirmation control in the aforementioned prompt box, and the twentieth operation includes an input operation (e.g., a click operation) acting on the confirmation control. For example, referring to the associated descriptions of fig. 5A to 5D, the nineteenth operation acts on the view object 306, the indication information may include a color-changed indication frame of the view object 306, and the second control may include a confirmation control in the prompt box 307. For example, referring to the relevant descriptions of fig. 6D to 6G, the nineteenth operation includes an input operation (e.g., a click operation) acting on the view object 301, the indication information may include a translucent mask layer overlaid on the view object 301, and the second control may include a confirmation control in the prompt box 402.
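The select-then-confirm flow can be sketched as a small session object: the nineteenth operation marks a view object as selected (the UI would show its indication information, such as a mask or recolored frame), and the twentieth operation on the confirmation control commits the selection to the second page. ExtractionSession and its methods are assumptions for illustration, not anything defined by this application.

```kotlin
// Hypothetical sketch of the select-then-confirm flow.
class ExtractionSession(private val secondPage: MutableList<Int>) {
    private val selected = linkedSetOf<Int>()

    // Nineteenth operation: tap a view object to select it. The UI would now
    // display the object's indication information.
    fun onViewObjectTapped(id: Int) { selected += id }

    // Twentieth operation: tap the confirmation control in the prompt box,
    // committing every selected view object to the second page.
    fun onConfirmTapped() {
        secondPage += selected
        selected.clear()
    }
}
```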
In some embodiments, before the detecting the twentieth operation acting on the second control, the method further includes: detecting a twenty-first operation acting on a fifth view object of the first page; and in response to the twenty-first operation, displaying indication information of the fifth view object, the indication information being used to indicate that the fifth view object is selected. The adding the first view object to the second page in response to the twentieth operation includes: in response to the twentieth operation, adding the first view object and the fifth view object to the second page.
For example, referring to the related descriptions of fig. 6D to 6G, the first view object is the view object 301, the fifth view object is the view object 305, the nineteenth operation includes an input operation (e.g., a click operation) acting on the view object 301, the twenty-first operation includes an input operation (e.g., a click operation) acting on the view object 305, and the indication information may include a translucent mask layer overlaid on the selected view object.
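Continuing the hypothetical ExtractionSession sketch above, multi-selection needs no extra machinery: the nineteenth and twenty-first operations each select one view object, and a single twentieth operation adds both to the second page.

```kotlin
fun main() {
    val secondPage = mutableListOf<Int>()
    val session = ExtractionSession(secondPage)
    session.onViewObjectTapped(301)  // nineteenth operation selects view 301
    session.onViewObjectTapped(305)  // twenty-first operation selects view 305
    session.onConfirmTapped()        // twentieth operation confirms once
    println(secondPage)              // [301, 305]
}
```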
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the processes of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes any medium that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
In summary, the foregoing description provides only exemplary embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, or the like made according to the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (18)

1. An interface display method, applied to an electronic device, comprising:
displaying a first page of a first application, the first page comprising one or more view objects;
in a case where a first operation on a first view object of the first page is detected, executing a first function of the first view object, wherein the first view object is any one of the one or more view objects of the first page;
in a case where a second operation on the first view object of the first page is detected, adding the first view object to a second page; and
detecting a third operation on the first view object of the second page, and in response to the third operation, executing the first function of the first view object.
2. The method according to claim 1, wherein the method further comprises:
detecting a fourth operation on the first view object of the second page, and in response to the fourth operation, executing a second function of the first view object, wherein the first view object of the first page does not have the second function.
3. The method according to claim 1, wherein the method further comprises:
displaying a third page of a second application, the third page comprising one or more view objects, wherein a second view object is any one of the one or more view objects of the third page;
detecting a fifth operation on the second view object of the third page, and in response to the fifth operation, adding the second view object to the second page and displaying the second view object on the second page.
4. A method according to any of claims 1-3, wherein the first function comprises jumping to a fourth page associated with the first view object; the third operation comprises a first gesture for zooming in on the first view object, the first gesture comprising a two-finger slide of the user on a display screen with an increasing distance between the two fingers, and a second gesture comprises a two-finger slide of the user on the display screen with a decreasing distance between the two fingers;
the detecting a third operation on the first view object of the second page and, in response to the third operation, executing the first function of the first view object comprises:
detecting the first gesture acting on the first view object of the second page;
in response to the first gesture, zooming in on the first view object; and
displaying the fourth page when the area of the first view object is enlarged to a preset value along with the first gesture;
and the method further comprises:
displaying the second page when the second gesture acting on the fourth page is detected.
5. A method according to any of claims 1-3, wherein the first function comprises jumping to a fourth page associated with the first view object;
the detecting a third operation on the first view object of the second page and, in response to the third operation, executing the first function of the first view object comprises:
detecting a sixth operation acting on the first view object of the second page;
in response to the sixth operation, displaying a first popup window for displaying part or all of the content of the fourth page;
detecting a seventh operation acting on the first popup window; and
in response to the seventh operation, displaying the fourth page, wherein the third operation comprises the sixth operation and the seventh operation.
6. A method according to any of claims 1-3, wherein the first function comprises jumping to a fourth page associated with the first view object;
the detecting a third operation on the first view object of the second page and, in response to the third operation, executing the first function of the first view object comprises:
detecting an eighth operation acting on the first view object of the second page;
in response to the eighth operation, displaying, in a display area of the first view object of the second page, a third view object corresponding to a commonly used function of the fourth page;
detecting a ninth operation acting on the third view object; and
in response to the ninth operation, displaying the fourth page, wherein the third operation comprises the eighth operation and the ninth operation.
7. The method of any of claims 1-6, wherein the third operation comprises a click operation acting on the first view object of the second page; when the click operation is detected, the electronic device displays a focus effect of the first view object on the second page and then responds to the click operation;
the focus effect of the first view object comprises one or more of the following: the first view object is zoomed in and then zoomed out; and the view objects other than the first view object in the second page change uniformly from a first display form to a second display form.
8. The method of claim 2, wherein the first view object comprises one or more controls, the one or more controls comprising a first control, and the second function comprises adjusting a display position and/or size of the first control in the first view object; the detecting a fourth operation on the first view object of the second page and, in response to the fourth operation, executing the second function of the first view object comprises:
detecting a tenth operation acting on the first view object of the second page;
in response to the tenth operation, displaying indication information, wherein the indication information is used to indicate entering an editing mode of the first view object, and the editing mode of the first view object is used to adjust the display position and/or size of any control in the first view object;
detecting a drag operation acting on the first control in the editing mode of the first view object; and
in response to the drag operation, moving the first control within the display area of the first view object based on an operation track of the drag operation, wherein the fourth operation comprises the tenth operation and the drag operation.
9. The method of claim 2, wherein the detecting a fourth operation on the first view object of the second page and, in response to the fourth operation, executing the second function of the first view object comprises:
detecting an eleventh operation acting on the first view object of the second page;
in response to the eleventh operation, displaying function options corresponding to a plurality of functions of the first view object, the function options including a function option corresponding to the second function;
detecting a twelfth operation acting on the function option corresponding to the second function; and
in response to the twelfth operation, executing the second function, wherein the fourth operation comprises the eleventh operation and the twelfth operation.
10. A method according to any of claims 1-3, wherein after a plurality of view objects of the first application are added to the second page, the second page displays an application identifier of the first application; and the method further comprises:
detecting a thirteenth operation acting on the application identifier of the first application; and
in response to the thirteenth operation, displaying the plurality of view objects of the first application on the second page.
11. A method according to any one of claims 1-3, characterized in that
before the displaying the first page of the first application, the method further comprises:
displaying the second page, wherein the second page comprises a third control;
detecting a fourteenth operation on the third control;
in response to the fourteenth operation, displaying application identifiers of a plurality of applications, the plurality of applications including the first application; and
detecting a fifteenth operation acting on the application identifier of the first application;
the displaying the first page of the first application comprises:
in response to the fifteenth operation, displaying the first page of the first application and entering an extraction mode of view objects, wherein in the extraction mode, the second operation is used to add a view object to the second page.
12. A method according to any one of claims 1 to 3, wherein
before the displaying the first page of the first application, the method further comprises:
displaying a fifth page, wherein the fifth page comprises a switch control for an extraction mode of view objects;
detecting a sixteenth operation acting on the switch control; and
in response to the sixteenth operation, turning on the extraction mode, wherein in the extraction mode, the second operation is used to add a view object to the second page.
13. A method according to any of claims 1-3, wherein in a case where the second operation on the first view object of the first page is detected, before the first view object is added to a second page, the method further comprises:
detecting a seventeenth operation acting on the first page, wherein the touch positions of the seventeenth operation fall within the first view object and the fourth view object; and
in response to the seventeenth operation, displaying an indication frame of the first view object and an indication frame of the fourth view object.
14. A method according to any of claims 1-3, wherein in a case where the second operation on the first view object of the first page is detected, before the first view object is added to a second page, the method further comprises:
detecting an eighteenth operation; and
in response to the eighteenth operation, displaying indication frames respectively corresponding to all view objects of the first page.
15. A method according to any of claims 1-3, wherein, in a case where the second operation on the first view object of the first page is detected, the adding the first view object to a second page comprises:
detecting a nineteenth operation acting on the first view object of the first page;
in response to the nineteenth operation, displaying indication information of the first view object and a second control, wherein the indication information of the first view object is used to indicate that the first view object is selected;
detecting a twentieth operation acting on the second control; and
in response to the twentieth operation, adding the first view object to the second page.
16. A method according to any one of claims 1-3, wherein before the detecting a twentieth operation acting on the second control, the method further comprises:
detecting a twenty-first operation acting on a fifth view object of the first page; and
in response to the twenty-first operation, displaying indication information of the fifth view object, wherein the indication information of the fifth view object is used to indicate that the fifth view object is selected;
the adding the first view object to the second page in response to the twentieth operation comprises:
in response to the twentieth operation, adding the first view object and the fifth view object to the second page.
17. An electronic device comprising a memory and a processor, the memory and the processor being electrically coupled, the memory being configured to store program instructions, the processor being configured to invoke all or a portion of the program instructions stored by the memory to perform the method of any of claims 1-16.
18. A computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-17.
CN202210915485.1A 2022-07-30 2022-07-30 Interface display method and related device Pending CN117519861A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210915485.1A CN117519861A (en) 2022-07-30 2022-07-30 Interface display method and related device
PCT/CN2023/109733 WO2024027570A1 (en) 2022-07-30 2023-07-28 Interface display method and related apparatus

Publications (1)

Publication Number Publication Date
CN117519861A true CN117519861A (en) 2024-02-06

Family

ID=89765097

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210915485.1A Pending CN117519861A (en) 2022-07-30 2022-07-30 Interface display method and related device

Country Status (2)

Country Link
CN (1) CN117519861A (en)
WO (1) WO2024027570A1 (en)

Also Published As

Publication number Publication date
WO2024027570A1 (en) 2024-02-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination