CN114398016B - Interface display method and device

Publication number: CN114398016B (application CN202210029399.0A)
Authority: CN (China)
Prior art keywords: area, interface, touch, display, working
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202210029399.0A
Other languages: Chinese (zh)
Other versions: CN114398016A
Inventors: 于林, 张志成
Current assignee (as listed; accuracy not warranted): Jinhua Hongzheng Technology Co., Ltd.
Original assignee: Jinhua Hongzheng Technology Co., Ltd.
Application filed by Jinhua Hongzheng Technology Co., Ltd.
Priority: CN202210029399.0A
Publication of CN114398016A; application granted; publication of CN114398016B

Abstract

The application provides an interface display method and device, relating to the field of mobile internet technology. The interface display method comprises the following steps: displaying, on a display of a terminal device, a first interface comprising a first area, wherein the first area comprises a plurality of first touch areas in one-to-one correspondence with a plurality of device types and with a plurality of second interfaces; detecting a touch operation on the plurality of first touch areas, and in response to the detected touch operation, displaying on the display a second interface that corresponds to the detected first touch area and comprises a second area, wherein the second area, or the interface jumped to from the second area, comprises a plurality of second touch areas in one-to-one correspondence with a plurality of working modes and with a plurality of third interfaces; and detecting a touch operation on the plurality of second touch areas, and in response, displaying on the display a third interface corresponding to the detected second touch area. This can improve the user's APP experience.

Description

Interface display method and device
Technical Field
The present application relates to the field of mobile internet technologies, and in particular, to an interface display method and apparatus.
Background
With the rapid development of internet technology, on the one hand, a massive number of applications (apps) have emerged, and more and more users access news, entertainment, shopping, office, and other apps through terminal devices such as smartphones and tablet computers (PADs). On the other hand, users often own multiple terminal devices (wearable devices, mobile phones, tablet computers, notebook computers, etc.), and technology for sharing screen content between different terminal devices (i.e., multi-screen interaction technology) can provide a good user experience. However, current APP designs do not make good use of multi-screen interaction technology, which degrades the user experience.
Disclosure of Invention
The embodiments of the present application provide an interface display method and device that improve on existing APP designs so that multi-screen interaction technology can be better applied, thereby improving the user experience.
In a first aspect, an interface display method is provided and applied to a terminal device, where the method includes:
displaying a first interface on a display of the terminal device, wherein the first interface comprises a first area, the first area comprises a plurality of first touch areas, the plurality of first touch areas are in one-to-one correspondence with a plurality of device types, and the plurality of first touch areas are in one-to-one correspondence with a plurality of second interfaces;
detecting a touch operation on the plurality of first touch areas;
in response to the detected touch operation on a first touch area, displaying, on the display of the terminal device, the second interface corresponding to the detected first touch area, wherein the second interface comprises a second area, the second area or the interface jumped to from the second area comprises a plurality of second touch areas, the plurality of second touch areas are in one-to-one correspondence with a plurality of working modes, and the plurality of second touch areas are in one-to-one correspondence with a plurality of third interfaces;
detecting a touch operation on the plurality of second touch areas;
and in response to the detected touch operation on a second touch area, displaying, on the display of the terminal device, the third interface corresponding to the detected second touch area.
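The steps of the first aspect above amount to a two-level dispatch: a touched first touch area selects a device-type-specific second interface, and a touched second touch area selects a working-mode-specific third interface. A minimal sketch follows; all names and table entries are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the two-level interface dispatch described above.
# The dictionaries mirror the one-to-one correspondences in the first aspect.

# First touch areas map one-to-one to device types and second interfaces.
DEVICE_INTERFACES = {
    "first_touch_area_1": ("mobile phone", "second_interface_1"),
    "first_touch_area_2": ("PAD", "second_interface_2"),
    "first_touch_area_3": ("notebook", "second_interface_3"),
}

# Second touch areas map one-to-one to working modes and third interfaces.
MODE_INTERFACES = {
    "second_touch_area_1": ("meeting mode", "third_interface_1"),
    "second_touch_area_2": ("office mode", "third_interface_2"),
}

def on_first_touch(area: str) -> str:
    """Return the second interface to display for a touched first touch area."""
    _device_type, second_interface = DEVICE_INTERFACES[area]
    return second_interface

def on_second_touch(area: str) -> str:
    """Return the third interface to display for a touched second touch area."""
    _working_mode, third_interface = MODE_INTERFACES[area]
    return third_interface
```

Because each mapping is one-to-one, a touch on any listed area determines exactly one interface to display, which is the property the claims rely on.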
In one possible implementation, different second interfaces differ in one or more of the following: page columns, working modules, working module layout, and function modules; and different third interfaces differ in one or more of the following: page columns, working module layout, and function modules.
In one possible implementation, the second area, or the interface jumped to from the second area, includes a first adding area, where the first adding area is used to add a working mode, and the method further includes:
detecting an operation on the interface jumped to from the first adding area;
and in response to the detected operation on the interface jumped to from the first adding area, displaying the added working mode on the display.
In one possible implementation, the second area, or the interface jumped to from the second area, includes a first deletion area, where the first deletion area is used to delete a working mode, and the method further includes:
detecting an operation on the first deletion area;
and in response to the detected operation on the first deletion area, deleting the corresponding working mode.
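The add/delete implementations above can be sketched as a small mutable registry of working modes. This is a hedged illustration with invented names (`WorkingModes`, the mode strings), not the patent's implementation.

```python
# Illustrative sketch of adding/deleting working modes via the first adding
# area and first deletion area; data structures are assumptions.

class WorkingModes:
    def __init__(self, modes=None):
        self.modes = list(modes or [])

    def add(self, mode: str) -> list:
        """Operation on the first adding area: register the mode and
        return the list of working modes shown on the display."""
        if mode not in self.modes:
            self.modes.append(mode)
        return self.modes

    def delete(self, mode: str) -> list:
        """Operation on the first deletion area: remove the
        corresponding working mode."""
        if mode in self.modes:
            self.modes.remove(mode)
        return self.modes
```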
In one possible implementation, the second touch area, or the interface jumped to from the second touch area, includes a configuration area, where the configuration area is used to configure the working mode corresponding to the second touch area, and the method further includes:
detecting an operation on the configuration area;
and in response to the detected operation on the configuration area, displaying the updated third interface on the display of the terminal device.
In one possible implementation, the configuration area includes a text display template configuration area, where the text display template configuration area is used to configure a text display template, and the method further includes:
detecting an operation on the text display template configuration area;
and in response to the detected operation on the text display template configuration area, when text needs to be displayed, displaying the text on the display of the terminal device according to the configured text display template.
In one possible implementation, the configuration area includes a text display template modification area, where the text display template modification area is used to modify the text display template, and the method further includes:
detecting an operation on the text display template modification area;
and in response to the detected operation on the text display template modification area, modifying the text display template.
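The two template operations above (configure = select which template is used when text is displayed; modify = edit a template's contents) can be sketched as follows. The template fields (font size, columns) and all names are invented for illustration only.

```python
# Hedged sketch of the text display template configuration/modification areas.
# Template fields are assumptions; the patent does not specify them.

TEMPLATES = {
    "compact": {"font_size": 12, "columns": 2},
    "large_print": {"font_size": 20, "columns": 1},
}

class ThirdInterface:
    def __init__(self):
        self.template_name = "compact"

    def configure_template(self, name: str) -> dict:
        """Operation on the text display template configuration area:
        select the template applied when text needs to be displayed."""
        if name not in TEMPLATES:
            raise KeyError(f"unknown template: {name}")
        self.template_name = name
        return TEMPLATES[name]

    def modify_template(self, name: str, **changes) -> dict:
        """Operation on the text display template modification area:
        update fields of an existing template in place."""
        TEMPLATES[name].update(changes)
        return TEMPLATES[name]
```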
In one possible implementation, the third interface includes a third area, where the third area includes a plurality of working modules, and the method further includes:
detecting an operation on the plurality of working modules;
and in response to the detected operation on a first working module, running the first working module, where the first working module is one of the plurality of working modules.
In a second aspect, there is provided a terminal device comprising: one or more functional modules configured to implement any one of the methods provided in the first aspect.
In a third aspect, there is provided a terminal device comprising: a memory and a processor, the memory having stored therein a computer program which, when run on the processor, causes the processor to perform any one of the methods provided in the first aspect.
In a fourth aspect, a chip is provided, comprising: a processor and an interface, the processor being coupled to a memory through the interface; when the processor executes a computer-executable program or computer-executable instructions in the memory, any one of the methods provided in the first aspect is performed.
In a fifth aspect, there is provided a computer readable storage medium comprising computer executable instructions which, when run on a computer, cause the computer to perform any of the methods provided in the first aspect.
In a sixth aspect, there is provided a computer program product comprising computer-executable instructions which, when run on a computer, cause the computer to perform any one of the methods provided in the first aspect.
According to the method provided by the application, the same APP can display, on one terminal device, the interfaces corresponding to multiple terminal devices and multiple working modes. First, a user can select the interface suited to the terminal device currently in use: for example, the interface corresponding to a mobile phone when working on a mobile phone, and the interface corresponding to a computer when working on a computer, which improves the user experience. Second, if an interface needs to be shared, it can be adapted to the other device: for example, when working on a mobile phone, if the interface needs to be shared to a computer, the interface corresponding to the computer can be selected on the terminal device so that it fits the computer's screen. Third, a working mode can be selected according to current needs, which improves work efficiency: for example, for a meeting, a meeting mode whose configuration is better suited to meetings can be adopted.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device provided by the present application.
Fig. 2 is a flow chart of an interface display method provided by the application.
Fig. 3 is a schematic diagram of a displayed interface provided by the present application.
Fig. 4 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 5 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 6 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 7 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 8 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 9 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 10 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 11 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 12 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 13 is a schematic diagram of yet another displayed interface provided by the present application.
Fig. 14 is a schematic diagram of an interface display device according to the present application.
Detailed Description
In the description of the present application, "/" means or, unless otherwise indicated, for example, A/B may mean A or B. "and/or" herein is merely an association relationship describing an association object, and means that three relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone. In the description of the present application, unless otherwise indicated, "at least one" means one or more, and "a plurality" means two or more.
In addition, to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", and so on are used to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that these words do not limit quantity or order of execution, and that items labeled "first" and "second" are not necessarily different.
The method provided by the embodiments of the present application can be applied to various terminal devices capable of touch input (e.g., finger touch, stylus touch, mouse touch, etc.); the terminal device may be a wireless terminal or a wired terminal. A terminal device may be a device that provides voice and/or data connectivity to a user, such as a handheld device with wireless connectivity or another processing device connected to a wireless modem. The terminal device may be a smartphone, a satellite radio, a computer, a personal communication service (PCS) phone, virtual reality (VR) glasses, augmented reality (AR) glasses, a machine-type communication terminal, an internet of things terminal, a communication device onboard a vehicle, a communication device onboard an unmanned aerial vehicle, or the like. The terminal device may also be referred to as user equipment (UE), a terminal, a subscriber unit, a subscriber station, a mobile station, a remote station, an access point, an access terminal, a user terminal, a user agent, and so on.
Fig. 1 is a schematic structural diagram of a terminal device according to the present application, where the terminal device 100 includes: radio frequency (RF) circuitry 110, memory 120, other input devices 130, display screen 140, sensors 150, audio circuitry 160, I/O subsystem 170, processor 180, and power supply 190. Those skilled in the art will appreciate that the terminal device structure shown in fig. 1 does not limit the terminal device; it may include more or fewer components than shown, combine certain components, split certain components, or arrange components differently. Those skilled in the art will also appreciate that the display 140 belongs to the user interface (UI), and that the terminal device 100 may include more or fewer user interfaces than shown. The components of the terminal device 100 are described in detail below with reference to fig. 1:
The RF circuit 110 may be used to receive and transmit information or signals during a call. Specifically, it receives downlink information from an access network device (e.g., a base station) and passes it to the processor 180 for processing, and it sends uplink data to the access network device. Typically, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 120 may be used to store software programs and modules, and the processor 180 performs various functional applications and data processing of the terminal device 100 by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the terminal device 100. In addition, memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
Other input devices 130 may be used to receive entered numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device 100. In particular, other input devices 130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen), and the like. The other input devices 130 are connected to the other input device controllers 171 of the I/O subsystem 170 and interact with the processor 180 under control of the other input device controllers 171.
The display 140 may be used to display information input by or provided to the user and the various menus of the terminal device 100, and may also accept user input. Specifically, the display screen 140 may include a display panel 141 and a touch panel 142. The display panel 141 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel 142, also referred to as a touch screen or touch-sensitive screen, may collect contact or contactless operations on or near it (e.g., operations performed by the user with a finger, a stylus, or any other suitable object on or near the touch panel 142, possibly including somatosensory operations; the operation types include single-point control operations, multi-point control operations, and so on) and drive the corresponding connected devices according to a preset program. Optionally, the touch panel 142 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position and gesture of the user's touch, detects the signals produced by the touch operation, and transmits them to the touch controller; the touch controller receives the touch information, converts it into information the processor can handle, sends it to the processor 180, and can receive and execute commands sent by the processor 180. The touch panel 142 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave, or by any technology developed in the future.
Further, the touch panel 142 may overlay the display panel 141, and a user may operate on or near the touch panel 142 overlaid on the display panel 141 based on content displayed by the display panel 141 (including, but not limited to, a soft keyboard, a virtual mouse, virtual keys, icons, etc.), and upon detection of an operation thereon or thereabout by the touch panel 142, the touch panel 142 is passed to the processor 180 via the I/O subsystem 170 to determine user input, and the processor 180 then provides a corresponding visual output on the display panel 141 via the I/O subsystem 170 based on the user input. Although in fig. 1, the touch panel 142 and the display panel 141 implement the input and output functions of the terminal device 100 as two separate components, in some embodiments, the touch panel 142 and the display panel 141 may be integrated to implement the input and output functions of the terminal device 100.
The terminal device 100 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor and a proximity sensor. Wherein the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the terminal device 100 moves to the ear. As one type of motion sensor, the accelerometer sensor can detect the acceleration in all directions (typically three axes), and can detect the gravity and direction when stationary, and can be used for applications for recognizing the gesture of the terminal device (such as horizontal-vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer, knocking), and the like. The terminal device 100 may further be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described herein.
Audio circuitry 160, speaker 161, and microphone 162 may provide an audio interface between the user and the terminal device 100. The audio circuit 160 can convert received audio data into a signal and transmit it to the speaker 161, which converts the signal into sound for output; conversely, the microphone 162 converts collected sound into a signal, which the audio circuit 160 receives and converts into audio data. The audio data is then output to the RF circuit 110 for transmission to another device (e.g., another terminal device), or output to the memory 120 for further processing.
The I/O subsystem 170 is used to control input and output peripherals, which may include other input device controllers 171, sensor controllers 172, and display controllers 173. Optionally, one or more other input control device controllers 171 receive signals from other input devices 130 and/or send signals to other input devices 130, and other input devices 130 may include physical buttons (push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, light mice. It is noted that other input device controllers 171 may be connected to any one or more of the above devices. The display controller 173 in the I/O subsystem 170 receives signals from the display screen 140 and/or transmits signals to the display screen 140. After the display screen 140 detects the user input, the display controller 173 converts the detected user input into an interaction with the user interface object displayed on the display screen 140, i.e., a man-machine interaction is realized. The sensor controller 172 may receive signals from one or more sensors 150 and/or transmit signals to one or more sensors 150.
The processor 180 is a control center of the terminal device 100, connects respective parts of the entire terminal device using various interfaces and lines, and performs various functions and data processing of the terminal device 100 by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the terminal device. Alternatively, the processor 180 may include one or more processing units. Preferably, the processor 180 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal device 100 also includes a power supply 190 (e.g., a battery) that provides power to the various components. Preferably, the power supply may be logically connected to the processor 180 through a power management system, so that functions of managing charge, discharge, power consumption, etc. are implemented through the power management system.
Although not shown, the terminal device 100 may further include a camera, a bluetooth module, etc., which will not be described herein.
In an embodiment of the present application, the processor 180 may execute the method provided by the present application by executing software programs and modules in the memory 120. The method provided by the application can be applied to daily software (such as office software) and can also be applied to an electronic government system, and the application is not limited.
Currently, when using an APP, a user may need to share the content of an interface on a mobile phone to a computer, a television, or a projection screen. On the one hand, the layout of the mobile phone interface may not suit the computer, television, or projection screen, resulting in a poor projection effect; an interface layout suited to those devices therefore needs to be designed. On the other hand, because a computer, television, or projection screen is larger than a mobile phone, interfaces that would otherwise be hidden (for example, interfaces that require navigating multiple interface levels on a mobile phone) can be placed directly in the tab interface, improving the user's operating efficiency. Based on these considerations, the present application provides an interface display method, see fig. 2, which is performed by a terminal device and comprises the following steps 201 to 205:
201. A first interface is displayed on a display of the terminal device, where the first interface comprises a first area, the first area comprises a plurality of first touch areas, the plurality of first touch areas are in one-to-one correspondence with a plurality of device types, and the plurality of first touch areas are in one-to-one correspondence with a plurality of second interfaces.
The first interface may be the tab interface of a certain APP (call it the first APP), i.e., the interface corresponding to a tab, which may also be called a primary interface; after detecting the user's touch operation on the touch area where the first APP is located, the terminal device may display the first interface on its display. The first interface may also be a secondary interface (i.e., an interface jumped to from a primary interface), a tertiary interface (i.e., an interface jumped to from a secondary interface), or a lower-level interface of the first APP; the present application is not limited in this respect.
The device types in the present application may include one or more of a mobile phone, a PAD, a computer, and a projector, and may also include others, and the present application is not limited thereto.
The first area in the present application may be located anywhere in the first interface. The first area comprises a plurality of first touch areas in one-to-one correspondence with a plurality of device types, and the device types are in one-to-one correspondence with a plurality of second interfaces. After a user performs a touch operation on a first touch area, the terminal device can display the corresponding second interface on the display. For example, assuming there are three first touch areas in the first area, the correspondence between first touch areas, device types, and second interfaces may be as shown in Table 1. After the user performs a touch operation on first touch area 1, the terminal device may display second interface 1, the interface corresponding to the mobile phone. After the user performs a touch operation on first touch area 2, the terminal device may display second interface 2, the interface corresponding to the PAD. That is, the present application can display, on one terminal device, the first APP's interfaces for multiple terminal devices.
TABLE 1
First touch area     | Device type                      | Second interface
First touch area 1   | Device type 1 (e.g., cell phone) | Second interface 1
First touch area 2   | Device type 2 (e.g., PAD)        | Second interface 2
First touch area 3   | Device type 3 (e.g., notebook)   | Second interface 3
For example, referring to fig. 3, after the user clicks the first APP in fig. 3 (a), the terminal device may display a tab interface, shown in fig. 3 (b), whose tab name is the workbench; other tab interfaces can be reached by clicking the tab names home page, message, or my. The first area and the first touch areas can be seen in fig. 3 (b), where one first touch area may be referred to as the mobile touch area and first touch area 1 may also be referred to as the desktop touch area. The second interface corresponding to the mobile touch area is the second interface corresponding to the mobile phone (hereinafter, the mobile interface), and the interface corresponding to the desktop touch area is the second interface corresponding to the PAD or notebook (hereinafter, the desktop interface). Fig. 3 (b) currently shows the mobile interface.
Optionally, different second interfaces differ in one or more of the following: page columns, working modules, working module layout, function modules, and so on. For example, the number, position, or names of the page columns (i.e., working columns) on the mobile interface may differ from those on the desktop interface; the position, number, or names of the working modules may differ; the working module layout (e.g., the number of rows and columns) may differ; and the function modules (e.g., modules for setting the working mode) may differ.
For example, the mobile interface displayed on the mobile phone may be as shown in fig. 4 (a) and fig. 4 (b), and the desktop interface as shown in fig. 5. The mobile interface and desktop interface displayed on the mobile phone differ in layout, and the desktop interface includes function modules absent from the mobile interface. Note that because the desktop interface is designed for the screen size of a PAD or computer, a mobile phone may actually display only part of the desktop interface, or display it scaled down; if the desktop interface on the mobile phone is shared to another device, such as a projection screen or computer, the complete interface can be displayed. In addition, because a computer, television, or projection screen is larger than a mobile phone, interfaces that would otherwise be hidden (for example, interfaces that require navigating multiple interface levels on a mobile phone) can be placed directly in the tab interface, improving the user's operating efficiency.
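The two display options just described, showing only part of the desktop interface or scaling it down, can be illustrated with a small geometry sketch. The dimensions and the helper name are hypothetical; the patent does not specify how the adaptation is computed.

```python
# Illustrative sketch (assumed dimensions): fitting a desktop-sized layout
# onto a phone screen either by uniform scaling or by cropping to the
# visible portion.

def fit_desktop_interface(design_w, design_h, screen_w, screen_h, mode="scale"):
    if mode == "scale":
        # Uniform scale factor so the whole desktop layout fits on screen.
        factor = min(screen_w / design_w, screen_h / design_h)
        return round(design_w * factor), round(design_h * factor)
    # "crop": only a screen-sized portion of the layout is actually shown.
    return min(design_w, screen_w), min(design_h, screen_h)
```

For instance, a 1280x800 desktop layout on a 360x640 phone screen scales to 360x225, whereas cropping shows a 360x640 window into the full layout.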
202. Touch operations on the plurality of first touch areas are detected.
In a specific implementation, the terminal device may detect, through a sensor in the terminal device, whether the user performs a touch operation on each displayed area.
203. In response to the detected touch operation on a first touch area, a second interface corresponding to the detected first touch area is displayed on the display of the terminal device. The second interface includes a second area; the second area, or the interface jumped to from the second area, includes a plurality of second touch areas; the plurality of second touch areas are in one-to-one correspondence with a plurality of working modes, and the plurality of second touch areas are in one-to-one correspondence with a plurality of third interfaces.
For example, based on the illustration in fig. 3 (b), if the user performs a touch operation on the mobile touch area, the terminal device may display the mobile interface; if the user performs a touch operation on the desktop touch area, the terminal device may display the desktop interface.
The second area in the present application may be located at any position in the second interface. The plurality of second touch areas included in the second area, or in the interface jumped to from it, correspond one-to-one to a plurality of working modes, and the plurality of working modes correspond one-to-one to a plurality of third interfaces; after the user performs a touch operation on one second touch area, the terminal device may display the third interface corresponding to that second touch area on the display. For example, assuming there are three second touch areas, the correspondence between the second touch areas, the working modes, and the third interfaces can be seen in table 2. After the user performs a touch operation on second touch area 1, the terminal device may display third interface 1 on the display; after the user performs a touch operation on second touch area 2, the terminal device may display third interface 2. That is, the present application can display, on one terminal device, the interfaces of the first APP in a plurality of working modes.
TABLE 2
Second touch area | Working mode | Third interface
Second touch area 1 | Working mode 1 | Third interface 1
Second touch area 2 | Working mode 2 | Third interface 2
Second touch area 3 | Working mode 3 | Third interface 3
The working modes may include a conference mode, a normal mode, and other working modes; these are merely examples, and the present application is not limited thereto.
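The one-to-one correspondences of table 2 amount to a simple lookup from touched area to displayed interface, which can be sketched as below; all names here are illustrative assumptions rather than anything defined by the patent.

```python
# Hypothetical sketch of table 2: each second touch area maps to exactly one
# working mode and one third interface.
TABLE_2 = {
    "second touch area 1": ("working mode 1", "third interface 1"),
    "second touch area 2": ("working mode 2", "third interface 2"),
    "second touch area 3": ("working mode 3", "third interface 3"),
}

def on_second_touch(area: str) -> str:
    """Return the third interface to display for the touched second touch area."""
    mode, third_interface = TABLE_2[area]
    return third_interface
```

The first touch areas (device types) and their second interfaces would form an analogous lookup one level above this one.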
In one case, referring to fig. 4 (a), the second interface includes a second area, and the second area includes a plurality of second touch areas (i.e., second touch area 1 and second touch area 2 in the drawing). Second touch area 1 may be referred to as the normal mode touch area, and second touch area 2 as the conference mode touch area; the third interface corresponding to the normal mode touch area may be referred to as the normal mode interface, and the third interface corresponding to the conference mode touch area as the conference mode interface. Fig. 4 (a) currently shows the normal mode interface.
In another case, referring to fig. 4 (b), the second interface includes a second area. Referring to fig. 6 (a), the user performs a touch operation on the second area, and the interface jumped to, shown in fig. 6 (b), includes a plurality of second touch areas.
204. Touch operations on the plurality of second touch areas are detected.
205. In response to the detected touch operation on a second touch area, a third interface corresponding to the detected second touch area is displayed on the display of the terminal device.
In one case, referring to fig. 4 (a), if the user performs a touch operation on the normal mode touch area, the terminal device may display the normal mode interface; if the user performs a touch operation on the conference mode touch area, the terminal device may display the conference mode interface.
In another case, referring to fig. 6 (b), if the user performs a touch operation on the enable button in the normal mode touch area, the terminal device may display the normal mode interface; if the user performs a touch operation on the close button in the normal mode touch area, the terminal device may display another mode interface (for example, the default working mode interface). Likewise, if the user performs a touch operation on the enable button in the conference mode touch area, the terminal device may display the conference mode interface, and if the user performs a touch operation on the close button in the conference mode touch area, the terminal device may display another mode interface (for example, the default working mode interface). It should be noted that, in fig. 6 (b) and fig. 5, only the settable content of the normal mode is expanded while that of the conference mode is collapsed; the triangle before the conference mode can be clicked to expand it, and the expanded page is similar to that of the normal mode.
Optionally, different third interfaces differ in one or more of the following: page columns, working modules, working-module layout, function modules, and the like.
For example, the normal mode interface under the mobile interface may be seen in fig. 7 (a) and the conference mode interface in fig. 7 (b); compared with the normal mode interface, the conference mode interface places the conference-related working modules in a more prominent position, making them easier for the user to find. Under the desktop interface, the differences between the normal mode interface and the conference mode interface are similar and are not repeated. Of course, only the differences between the normal mode and the conference mode are exemplified here; in actual implementations the differences may be otherwise.
With the method provided by the present application, the same APP can display, on one terminal device, the interfaces corresponding to multiple terminal devices and multiple working modes. First, the user can select the interface suited to the terminal device currently in use — for example, the interface corresponding to the mobile phone when working on a mobile phone, and the interface corresponding to the computer when working on a computer — which improves the user experience. Second, if the interface needs to be shared, it can be adapted to the other device: for example, when working on a mobile phone and sharing to a computer, the interface corresponding to the computer can be selected on the terminal device so as to fit the computer screen. Third, the working mode can be selected according to the current need, which improves working efficiency: for example, when a meeting is required, the conference mode can be adopted, whose configuration is better suited to meetings.
To adapt to users' needs, the working modes in the present application can be added, deleted, or modified by the user; these are described below in three cases.
Case 1: addition of a working mode
In case 1, optionally, the second area, or the interface jumped to from the second area, further includes a first adding area, and the first adding area is used to add a working mode; the method further includes:
detecting an operation on the interface jumped to from the first adding area;
in response to the detected operation on the interface jumped to from the first adding area, displaying the added working mode on the display.
Illustratively, in the case where the second area further includes the first adding area, the first adding area may be seen in fig. 4 (a). In the case where the interface jumped to from the second area includes the first adding area, the first adding area may be seen in fig. 5 or fig. 6 (b). The user jumps to another interface by clicking the first adding area; that interface may provide settings for the name of the working mode, its layout (for example, the number of working modules contained in each row), and the text display template to be used, so that after the user completes the settings, a new working mode is added.
Case 2: deletion of a working mode
In case 2, optionally, the second area, or the interface jumped to from the second area, includes a first deletion area, and the first deletion area is used to delete a working mode; the method further includes:
detecting an operation on the first deletion area;
deleting the corresponding working mode in response to the detected operation on the first deletion area.
For example, in the case where the second area further includes the first deletion area, one deletion area may be added after each working mode for deleting the corresponding working mode. In the case where the interface jumped to from the second area includes the first deletion area, the first deletion area may be seen in fig. 5 or fig. 6 (b). Each working mode may correspond to one first deletion area, and the user can delete the working mode corresponding to a first deletion area by clicking it.
Case 3: modification of a working mode
In case 3, optionally, the second touch area, or the interface jumped to from the second touch area, includes a configuration area, and the configuration area is used to configure the working mode corresponding to the second touch area; the method further includes:
detecting an operation on the configuration area;
in response to the detected operation on the configuration area, displaying the updated third interface on the display of the terminal device.
For example, in the case where the second touch area further includes the configuration area, the configuration area may be seen in fig. 5 or fig. 6 (b). The configuration area may also be in the interface jumped to from the second touch area: for example, the interface jumped to after a long press on the second touch area in fig. 4 may contain the configuration area shown in fig. 5 or fig. 6 (b), and the interface jumped to after a long press on second touch area 1 in fig. 4 may be seen in fig. 8 (a). The third interface displayed first may be a default third interface or the third interface most recently displayed by the terminal device, and the updated third interface is the third interface redisplayed based on the user's operation. The configuration area may be used to configure the page layout, such as the number of working modules per row, and, for example, the text display template to be used.
With the method provided by the present application, the user can, through custom working modes, set up a working mode that meets his or her own working needs, thereby improving working efficiency and the user experience.
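Cases 1-3 amount to add, delete, and modify operations on a set of named working-mode configurations, which can be sketched as below; the class, the stored fields, and the default modes are all illustrative assumptions, not part of the patent.

```python
class WorkingModeManager:
    """Hypothetical sketch of cases 1-3: user-managed working modes."""

    def __init__(self):
        # mode name -> configuration (e.g. number of working modules per row)
        self.modes = {"normal": {"modules_per_row": 2},
                      "conference": {"modules_per_row": 1}}

    def add(self, name, config):
        # case 1: confirm the settings made via the first adding area
        self.modes[name] = dict(config)

    def delete(self, name):
        # case 2: click the first deletion area of this mode
        self.modes.pop(name, None)

    def configure(self, name, **changes):
        # case 3: operate the configuration area; the third interface is then redrawn
        self.modes[name].update(changes)
```

A real implementation would persist these configurations and re-render the corresponding third interface after each change.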
It should be noted that the terminal device may also directly adjust the layout of the working columns on the third interface. For example, referring to fig. 7 (a) and fig. 7 (b), the third interface may include a plurality of working columns, and the user may add a working column by clicking the plus sign in the working column area; the process of adding a working column is similar to the process of adding a working module described below and can be understood by reference, so it is not repeated. For example, referring to fig. 9, based on the working columns shown in fig. 7 (a), the user adds a working column named "representative contact" and adds a working module named "representative directory" under that column.
In case 3, the text display template in the present application can also be configured by the user (for example, whether a template is used), or added, deleted, or modified; these are described below in four cases.
Case (1): configuration of a text display template
In case (1), optionally, the configuration area includes a text display template configuration area, and the text display template configuration area is used to configure the text display template; the method further includes:
detecting an operation on the text display template configuration area;
in response to the detected operation on the text display template configuration area, when text needs to be displayed, displaying the text on the display of the terminal device according to the configured text display template.
For example, the text display template configuration area may be seen in fig. 5 or fig. 6 (b). One working mode may correspond to multiple text display templates. Referring to fig. 5 or fig. 6 (b), the user may click the use button corresponding to a text display template so that, when text needs to be displayed, it is displayed on the display of the terminal device according to the configured text display template.
Case (2): modification of a text display template
In case (2), the configuration area includes a text display template modification area, and the text display template modification area is used to modify the text display template; the method further includes:
detecting an operation on the text display template modification area;
modifying the text display template in response to the detected operation on the text display template modification area.
For example, the text display template modification area may be seen in fig. 5 or fig. 6 (b). Referring to fig. 5 or fig. 6 (b), the user may modify the text display template by clicking the modification button corresponding to it, for example modifying the character spacing, line spacing, font size, heading levels, alignment, graphic display, and so on.
Case (3): deletion of a text display template
In case (3), the configuration area includes a text display template deletion area, and the text display template deletion area is used to delete a text display template; the method further includes:
detecting an operation on the text display template deletion area;
deleting the corresponding text display template in response to the detected operation on the text display template deletion area.
For example, the text display template deletion area may be seen in fig. 5 or fig. 6 (b). Referring to fig. 5 or fig. 6 (b), the user may delete the corresponding text display template by clicking the delete button corresponding to it.
Case (4): addition of a text display template
In case (4), the configuration area includes a text display template adding area, and the text display template adding area is used to add a text display template; the method further includes:
detecting an operation on the text display template adding area;
adding a text display template in response to the detected operation on the text display template adding area.
For example, the text display template adding area may be seen in fig. 5, fig. 6 (b), or fig. 8 (a). Referring to fig. 5, fig. 6 (b), or fig. 8 (a), the user may add a text display template by clicking the add-new-template button. For example, based on the example shown in fig. 8 (a), the interface after adding text display template 4 can be seen in fig. 8 (b).
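Cases (1)-(4) can similarly be sketched as operations on a collection of templates, one of which is currently in use; the class, field names, and default templates below are assumptions for illustration only.

```python
class TemplateManager:
    """Hypothetical sketch of cases (1)-(4): managing text display templates."""

    def __init__(self):
        self.templates = {"template 1": {"font_size": 12},
                          "template 2": {"font_size": 14}}
        self.in_use = "template 1"

    def use(self, name):
        # case (1): click the use button corresponding to a template
        self.in_use = name

    def modify(self, name, **changes):
        # case (2): change spacing, font size, alignment, etc.
        self.templates[name].update(changes)

    def delete(self, name):
        # case (3): click the delete button corresponding to a template
        self.templates.pop(name, None)

    def add(self, name, config):
        # case (4): click the add-new-template button
        self.templates[name] = dict(config)
```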
The text display template in the present application may be a Word or PPT display template. Since the texts uploaded to the first APP may come from multiple sources, their formats (e.g., font size, first-line indentation, alignment, etc.) are not identical, which makes processing them very inefficient. With the method provided by the embodiment of the present application, setting a unified text display template unifies the text format and improves working efficiency. In a specific implementation, before a text display template is applied for display, the text content can be extracted into separate text streams, the attributes of each text stream converted into the corresponding template items, and the template items applied to the template and displayed on the software interface.
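The normalization step just described — extract text streams, then replace each stream's format attributes with the template items — might look roughly like this; the attribute names and the template values are illustrative assumptions.

```python
# Hypothetical template items; the patent leaves the concrete attributes open.
TEMPLATE_ITEMS = {"font_size": 14, "line_spacing": 1.5,
                  "first_line_indent": 2, "alignment": "justify"}

def apply_template(streams, items=TEMPLATE_ITEMS):
    """Convert each extracted text stream's attributes to the template items,
    keeping the content but unifying the display format."""
    out = []
    for stream in streams:
        unified = dict(stream)   # keep e.g. the 'content' field
        unified.update(items)    # overwrite format attributes with template items
        out.append(unified)
    return out
```

Texts from different sources thus render with one consistent format regardless of how they were originally styled.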
Optionally, the user may also manage the working modules, for example delete, add, or move them. Specifically, a working module may be deleted or moved in a first mode, added or deleted in a second mode, and added in a third mode.
In the first mode, the third interface includes a third area, and the third area includes a plurality of working modules. The terminal device detects operations on the plurality of working modules and, in response to a detected operation on a first working module, operates on the first working module, the first working module being one of the plurality of working modules.
For example, referring to fig. 10, when the user long-presses a working module until it begins to shake, the user may drag it to move its position; if another working module already occupies the target position, that module may be moved elsewhere. The user may also click the cross at the upper left of a working module to delete it.
In the second mode, one or more working columns in the third interface each correspond to a management button, and the management button is used to manage the working modules under the corresponding working column. The interface jumped to from the management button includes add and delete buttons for a plurality of working modules. The terminal device detects operations on the add or delete buttons of the plurality of working modules and, in response to a detected operation on the add or delete button of a second working module, adds or deletes the second working module, the second working module being one of the plurality of working modules.
For example, when the user clicks the management button corresponding to the working column named "representative contact" in fig. 11 (a), the working modules that can be added or deleted may be displayed as in fig. 11 (b); the user then clicks add or delete to add or delete the corresponding working module. It should be noted that, for an already-added working module, only the delete button is valid, and for a not-yet-added working module, only the add button is valid. Based on the interface shown in fig. 11 (a), the interface after adding a working module named "duplex list" under the working column named "representative contact" can be seen in fig. 12.
In the third mode, the working module area under one or more working columns in the third interface includes an add icon, and the interface jumped to from the add icon includes add buttons for a plurality of working modules. The terminal device detects operations on the add buttons of the plurality of working modules and, in response to a detected operation on the add button of a third working module, adds the third working module, the third working module being one of the plurality of working modules.
For example, when the user clicks the add icon under the working column named "representative contact" in fig. 13 (a), the working modules that can be added may be displayed as in fig. 13 (b); the user then clicks add to add the corresponding working module.
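The three management modes — drag or delete after a long press, the management button's add/delete list, and the "+" icon — all reduce to list operations on the modules under a working column; this sketch and all of its names are assumptions for illustration, not the patent's implementation.

```python
class WorkingColumn:
    """Hypothetical sketch of managing working modules under one working column."""

    def __init__(self, name, modules=()):
        self.name = name
        self.modules = list(modules)

    def move(self, module, new_index):
        # first mode: long-press until the module shakes, then drag it
        self.modules.remove(module)
        self.modules.insert(new_index, module)

    def delete(self, module):
        # first mode (the upper-left cross) or second mode (the delete button)
        self.modules.remove(module)

    def add(self, module):
        # second/third mode: only valid for a module that is not yet added
        if module not in self.modules:
            self.modules.append(module)
```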
In the embodiment of the present application, the user can conveniently move, delete, or add working modules as needed, and thus adjust the interface to his or her own needs, improving working efficiency and the user experience.
The touch operation in the above embodiments of the present application may be a single click, a double click, a long press, or the like; the present application is not limited in this respect.
In the present application, a developer may design the first APP using existing syntax so that the first APP implements the above functions. The interfaces shown in the various figures of the present application are merely examples; in actual implementations, the positions of the modules in these interfaces may differ, or the modules may be located in interfaces at different levels, and the present application is not limited in this respect. The above examples of the present application mainly take the mobile interface as an example; the operations under the desktop interface, or under interfaces corresponding to other terminal types, are similar and can be understood by reference.
The service scenarios described in the embodiments of the present application are intended to describe the technical solutions of the embodiments more clearly and do not limit them. Those skilled in the art will appreciate that, as new service scenarios emerge, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems.
The foregoing description of the embodiments of the present application has been presented primarily in terms of methods. It will be appreciated that the terminal device, in order to implement the above-mentioned functions, includes at least one of a hardware structure and a software module for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional units of the terminal equipment according to the method example, for example, each functional unit can be divided corresponding to each function, and two or more functions can be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
By way of example, fig. 14 shows a schematic diagram of one possible configuration of the interface display device (denoted as interface display device 140) related to the above-described embodiment, the interface display device 140 including a display unit 1401, a detection unit 1402, and an execution unit 1403. Wherein:
A display unit 1401, configured to display a first interface on a display of the terminal device, where the first interface includes a first area, the first area includes a plurality of first touch areas, the plurality of first touch areas are in one-to-one correspondence with a plurality of device types, and the plurality of first touch areas are in one-to-one correspondence with a plurality of second interfaces;
a detection unit 1402, configured to detect a touch operation on the plurality of first touch areas;
An execution unit 1403, configured to respond to a touch operation of the detected first touch area, display, by using a display unit 1401, a second interface corresponding to the detected first touch area on a display of the terminal device, where the second interface includes a second area, and the second area or an interface skipped by the second area includes a plurality of second touch areas, where the plurality of second touch areas are in one-to-one correspondence with a plurality of working modes, and the plurality of second touch areas are in one-to-one correspondence with a plurality of third interfaces;
the detecting unit 1402 is further configured to detect a touch operation on the plurality of second touch areas;
the execution unit 1403 is further configured to display, by using the display unit 1401, a third interface corresponding to the detected second touch area on a display of the terminal device in response to the detected touch operation of the second touch area.
Optionally, different second interfaces differ in one or more of the following: page columns, working modules, working-module layout, and function modules; different third interfaces differ in one or more of the following: page columns, working modules, working-module layout, and function modules.
Optionally, the second area or the interface jumped to by the second area comprises a first adding area, the first adding area is used for adding the working mode,
A detecting unit 1402, configured to detect an operation on an interface skipped to by the first added area;
The execution unit 1403 is further configured to display, by means of the display unit 1401, an added operation mode on the display in response to the detected operation of the interface skipped to by the first addition area.
Optionally, the second area or the interface jumped to by the second area comprises a first deleting area, the first deleting area is used for deleting the working mode,
A detecting unit 1402 further configured to detect an operation on the first deletion area;
The execution unit 1403 is further configured to delete the corresponding operation mode in response to the detected operation on the first deletion area.
Optionally, the second touch area or the interface jumped to by the second touch area includes a configuration area, where the configuration area is used to configure a working mode corresponding to the second touch area,
A detecting unit 1402, configured to detect an operation on the configuration area;
The execution unit 1403 is further configured to display, through the display unit 1401, the updated third interface on the display of the terminal device in response to the detected operation on the configuration area.
Optionally, the configuration area includes a text display template configuration area, the text display template configuration area is used for configuring a text display template,
A detecting unit 1402, configured to detect an operation on the text display template configuration area;
The execution unit 1403 is further configured to display, by means of the display unit 1401, text according to the detected configured text display template on the display of the terminal device, if text needs to be displayed, in response to the detected operation on the text display template configuration area.
Optionally, the configuration area includes a text display template modification area, the text display template modification area is used for modifying the text display template,
A detecting unit 1402, configured to detect an operation on the text display template modification area;
the execution unit 1403 is further configured to modify the text display template in response to the detected operation of the text display template modification area.
Optionally, the third interface includes a third area, the third area includes a plurality of working modules,
A detecting unit 1402, configured to detect operations on the plurality of work modules;
The executing unit 1403 is further configured to operate a first working module in response to the detected operation on the first working module, where the first working module is one of the plurality of working modules.
The interface display device 140 may be a device or a chip system, for example.
The integrated units of fig. 14 may be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to perform all or part of the steps of the method described in the embodiments of the present application. The storage medium storing the computer software product includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Embodiments of the present application also provide a computer-readable storage medium comprising computer-executable instructions that, when executed on a computer, cause the computer to perform any of the methods described above.
Embodiments of the present application also provide a computer program product comprising computer-executable instructions which, when run on a computer, cause the computer to perform any of the methods described above.
The embodiment of the present application further provides a chip, including a processor and an interface; the processor is coupled to a memory through the interface, and when the processor executes the computer program or instructions in the memory, the processor is caused to perform any of the methods provided by the above embodiments.
In the above embodiments, implementation may be in whole or in part by software, hardware, firmware, or any combination thereof. When a software program is used, implementation may be in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, from one website, computer, server, or data center to another via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., Solid State Disk (SSD)), or the like.
Although the application is described herein in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the application has been described in connection with specific features and embodiments thereof, it is evident that various modifications and combinations may be made without departing from the spirit and scope of the application. Accordingly, the specification and drawings are merely exemplary illustrations of the application as defined by the appended claims, and are deemed to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the application.

Claims (10)

1. An interface display method, applied to a terminal device, the method comprising:
displaying a first interface on a display of the terminal device, wherein the first interface comprises a first area, the first area comprises a plurality of first touch areas, the plurality of first touch areas are in one-to-one correspondence with a plurality of device types, and the plurality of first touch areas are in one-to-one correspondence with a plurality of second interfaces;
detecting a touch operation on the plurality of first touch areas;
in response to the detected touch operation on a first touch area, displaying, on the display of the terminal device, a second interface corresponding to the detected first touch area, wherein the second interface comprises a second area, the second area or an interface jumped to from the second area comprises a plurality of second touch areas, the plurality of second touch areas are in one-to-one correspondence with a plurality of working modes, and the plurality of second touch areas are in one-to-one correspondence with a plurality of third interfaces;
detecting a touch operation on the plurality of second touch areas; and
in response to the detected touch operation on a second touch area, displaying, on the display of the terminal device, a third interface corresponding to the detected second touch area.
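The two-level navigation recited in claim 1 can be sketched as a pair of one-to-one mappings: first touch areas map device types to second interfaces, and second touch areas map working modes to third interfaces. This is a minimal, hypothetical sketch; the class and interface names are illustrative, not taken from the patent.

```python
# Hypothetical sketch of claim 1's navigation flow. The one-to-one
# correspondences are modeled as dictionaries; "displaying" an interface
# is modeled as updating the currently shown interface name.

class InterfaceDisplay:
    def __init__(self, device_interfaces, mode_interfaces):
        # device type -> second interface; working mode -> third interface
        self.device_interfaces = device_interfaces
        self.mode_interfaces = mode_interfaces
        self.current = "first_interface"

    def touch_first_area(self, device_type):
        # a touch on a first touch area displays the corresponding second interface
        self.current = self.device_interfaces[device_type]
        return self.current

    def touch_second_area(self, working_mode):
        # a touch on a second touch area displays the corresponding third interface
        self.current = self.mode_interfaces[working_mode]
        return self.current

ui = InterfaceDisplay(
    device_interfaces={"phone": "second_interface_phone",
                       "tablet": "second_interface_tablet"},
    mode_interfaces={"office": "third_interface_office",
                     "approval": "third_interface_approval"},
)
assert ui.touch_first_area("phone") == "second_interface_phone"
assert ui.touch_second_area("office") == "third_interface_office"
```

The one-to-one correspondence means each touch area resolves to exactly one target interface, which is why a plain dictionary lookup suffices in the sketch.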
2. The method of claim 1, wherein different ones of the second interfaces differ in one or more of the following: page column division, working modules, working module layout, and functional modules; and different ones of the third interfaces differ in one or more of the following: page column division, working module layout, and functional modules.
3. The method according to claim 1 or 2, wherein the second area or an interface jumped to from the second area comprises a first adding area, the first adding area being used for adding a working mode, the method further comprising:
detecting an operation on an interface jumped to from the first adding area; and
in response to the detected operation on the interface jumped to from the first adding area, displaying the added working mode on the display.
4. The method according to claim 1 or 2, wherein the second area or an interface jumped to from the second area comprises a first deletion area, the first deletion area being used for deleting a working mode, the method further comprising:
detecting an operation on the first deletion area; and
in response to the detected operation on the first deletion area, deleting the corresponding working mode.
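Claims 3 and 4 describe an adding area and a deletion area through which the set of working modes can be edited. A minimal, hypothetical sketch of that behavior (the manager class and mode names are illustrative assumptions, not from the patent):

```python
# Hypothetical sketch of claims 3 and 4: operations on the first adding
# area append a working mode; operations on the first deletion area
# remove an existing working mode.

class WorkingModeManager:
    def __init__(self, modes):
        self.modes = list(modes)

    def add_mode(self, mode):
        # corresponds to an operation on the first adding area (claim 3)
        if mode not in self.modes:
            self.modes.append(mode)

    def delete_mode(self, mode):
        # corresponds to an operation on the first deletion area (claim 4)
        if mode in self.modes:
            self.modes.remove(mode)

mgr = WorkingModeManager(["office", "approval"])
mgr.add_mode("meeting")
assert mgr.modes == ["office", "approval", "meeting"]
mgr.delete_mode("approval")
assert mgr.modes == ["office", "meeting"]
```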
5. The method according to claim 1 or 2, wherein the second touch area or an interface jumped to from the second touch area comprises a configuration area, the configuration area being used for configuring the working mode corresponding to the second touch area, the method further comprising:
detecting an operation on the configuration area; and
in response to the detected operation on the configuration area, displaying an updated third interface on the display of the terminal device.
6. The method of claim 5, wherein the configuration area comprises a text display template configuration area, the text display template configuration area being used for configuring a text display template, the method further comprising:
detecting an operation on the text display template configuration area; and
in response to the detected operation on the text display template configuration area, when text needs to be displayed, displaying the text on the display of the terminal device according to the configured text display template.
7. The method of claim 5, wherein the configuration area comprises a text display template modification area, the text display template modification area being used for modifying a text display template, the method further comprising:
detecting an operation on the text display template modification area; and
in response to the detected operation on the text display template modification area, modifying the text display template.
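Claims 6 and 7 together describe a text display template that is first configured and can later be modified, with displayed text rendered according to the current template. A minimal sketch under illustrative assumptions (the template attributes `font` and `size`, and the render format, are invented for the example):

```python
# Hypothetical sketch of claims 6 and 7: a text display template whose
# settings can be configured (claim 6) and modified (claim 7); text is
# rendered with whatever the template currently holds.

class TextTemplate:
    def __init__(self, font="default", size=14):
        self.font = font
        self.size = size

    def configure(self, **settings):
        # stands in for operations on the configuration/modification areas
        for key, value in settings.items():
            setattr(self, key, value)

    def render(self, text):
        # displaying text "according to the configured template"
        return f"[{self.font}:{self.size}] {text}"

tpl = TextTemplate()
tpl.configure(font="serif", size=16)   # configure (claim 6)
assert tpl.render("hello") == "[serif:16] hello"
tpl.configure(size=18)                 # modify (claim 7)
assert tpl.render("hello") == "[serif:18] hello"
```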
8. The method according to claim 1 or 2, wherein the third interface comprises a third area, the third area comprises a plurality of working modules, and the method further comprises:
detecting an operation on the plurality of working modules; and
in response to the detected operation on a first working module, running the first working module, wherein the first working module is one of the plurality of working modules.
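Claim 8's third area holds several working modules, and an operation on one of them runs that module. A minimal sketch, assuming each module can be modeled as a callable keyed by name (the module names are invented for illustration):

```python
# Hypothetical sketch of claim 8: the third area's working modules are
# callables in a dictionary; an operation on a module invokes it.

def run_module(modules, name):
    # running the working module that the detected operation targets
    return modules[name]()

modules = {
    "todo": lambda: "todo opened",
    "mail": lambda: "mail opened",
}
assert run_module(modules, "todo") == "todo opened"
```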
9. A terminal device, comprising: one or more functional modules for implementing the method of any one of claims 1-8.
10. A terminal device, comprising: a memory and a processor, the memory having stored therein a computer program which, when run on the processor, causes the processor to perform the method of any of claims 1-8.
CN202210029399.0A 2022-01-12 Interface display method and device Active CN114398016B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210029399.0A CN114398016B (en) 2022-01-12 Interface display method and device


Publications (2)

Publication Number Publication Date
CN114398016A CN114398016A (en) 2022-04-26
CN114398016B true CN114398016B (en) 2024-06-11


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1622619A (en) * 2004-12-24 2005-06-01 北京中星微电子有限公司 A multi-screen display method and device
CN104461433A (en) * 2014-12-19 2015-03-25 北京奇艺世纪科技有限公司 Individual interface display method and device
CN108563374A (en) * 2018-03-05 2018-09-21 维沃移动通信有限公司 An interface display method and terminal device
CN109917956A (en) * 2019-02-22 2019-06-21 华为技术有限公司 A method for controlling screen display, and an electronic device
CN109997348A (en) * 2017-09-25 2019-07-09 华为技术有限公司 A terminal interface display method, and a terminal
CN110308834A (en) * 2019-04-25 2019-10-08 维沃移动通信有限公司 Method for setting an application icon display mode, and terminal
CN110442297A (en) * 2019-08-08 2019-11-12 Oppo广东移动通信有限公司 Split-screen display method, split-screen display device, and terminal device
CN110688179A (en) * 2019-08-30 2020-01-14 华为技术有限公司 Display method and terminal equipment
CN111190559A (en) * 2019-12-04 2020-05-22 深圳市东向同人科技有限公司 Screen projection control synchronization method, mobile terminal and computer readable storage medium
CN111221450A (en) * 2020-01-02 2020-06-02 杭州网易云音乐科技有限公司 Information display method and device, electronic equipment and storage medium
CN111367456A (en) * 2020-02-28 2020-07-03 青岛海信移动通信技术股份有限公司 Communication terminal and display method in multi-window mode
CN112532869A (en) * 2018-10-15 2021-03-19 华为技术有限公司 Image display method in shooting scene and electronic equipment
CN113711172A (en) * 2019-04-16 2021-11-26 苹果公司 Systems and methods for interacting with companion display modes of an electronic device with a touch-sensitive display


Similar Documents

Publication Publication Date Title
US10275295B2 (en) Method and apparatus for presenting clipboard contents on a mobile terminal
US11054988B2 (en) Graphical user interface display method and electronic device
CN111061574B (en) Object sharing method and electronic device
US11237724B2 (en) Mobile terminal and method for split screen control thereof, and computer readable storage medium
CN109062467B (en) Split screen application switching method and device, storage medium and electronic equipment
US10775979B2 (en) Buddy list presentation control method and system, and computer storage medium
CN105786878B (en) Display method and device of browsing object
CN105975190B (en) Graphical interface processing method, device and system
WO2021083132A1 (en) Icon moving method and electronic device
CN108549519B (en) Split screen processing method and device, storage medium and electronic equipment
US20210352040A1 (en) Message sending method and terminal device
US9798713B2 (en) Method for configuring application template, method for launching application template, and mobile terminal device
CN110908554B (en) Long screenshot method and terminal device
WO2020007144A1 (en) Switching method and device for split screen application, storage medium and electronic device
CN108170329B (en) Display control method and terminal equipment
CN111610903A (en) Information display method and electronic equipment
US10101894B2 (en) Information input user interface
CN105631059B (en) Data processing method, data processing device and data processing system
CN115705124A (en) Application folder control method and device, terminal equipment and storage medium
CN108920086B (en) Split screen quitting method and device, storage medium and electronic equipment
CN109739409B (en) Batch processing method and device and terminal equipment
KR102443123B1 (en) Control method for mobile terminal
CN111309934A (en) Collected resource processing method and electronic equipment
CN110888854A (en) Content sharing method and electronic equipment
CN114398016B (en) Interface display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant