CN114879880A - Electronic device, display method and medium for application thereof


Info

Publication number
CN114879880A
Authority
CN
China
Prior art keywords
display
application
user
area
interface
Prior art date
Legal status
Pending
Application number
CN202110162248.8A
Other languages
Chinese (zh)
Inventor
胡颖峰
王红军
刘诗聪
崔擎誉
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110162248.8A priority Critical patent/CN114879880A/en
Priority to PCT/CN2022/074024 priority patent/WO2022166713A1/en
Publication of CN114879880A publication Critical patent/CN114879880A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to an electronic device and to a display method and medium for applications of the electronic device. The display method includes: displaying a first display interface of a first application on a screen of the electronic device; and displaying a second display interface of a second application together with a first display area of the first application on the screen, wherein a partial area of the second display interface is occluded by the first display area, and the first display area is a portion of the first display interface. With this method, the electronic device can display content from the display interface of the first application over the display interface of the second application or over the desktop in the form of a floating window, so that the user can follow changes in the first application while using the second application, without switching between the two applications.

Description

Electronic device, display method and medium for application thereof
Technical Field
The present application relates to graphical interface display technology in the field of electronic devices, and more particularly, to an electronic device and a display method and medium for applications thereof.
Background
While using the applications of an electronic device, a user sometimes needs to use one application while keeping track of changing content in one or more other applications. For example, after starting a navigation application, the user may want to chat in an instant chat application while still watching for changes to the navigation route in the navigation application. In such a case, the user has to switch back and forth between the navigation application and the instant chat application, which is cumbersome.
Disclosure of Invention
The method of the present application enables an electronic device to display content from the display interface of a first application over the display interface of a second application or over the desktop in the form of a floating window, so that a user can follow changes in the first application while using the second application, without switching between the two applications.
A first aspect of the present application provides an application display method, including: displaying a first display interface of a first application on a screen of an electronic device; and displaying a second display interface of a second application together with a first display area of the first application on the screen, wherein a partial area of the second display interface is occluded by the first display area, and the first display area is a portion of the first display interface.
In an embodiment of the present application, the electronic device opens the first application and the second application, selects the first display area from the first display interface of the first application, and displays the first display area within the second display interface of the second application, the first display area being smaller than the second display interface.
For example, the electronic device may be a mobile phone, the first application may be a navigation application, and the second application may be an instant chat application. The first display interface may be the display interface of the navigation application, the second display interface may be the display interface of the instant chat application, and the first display area may be the area in which navigation information is displayed in the navigation application. The mobile phone displays this navigation-information area within the display interface of the instant chat application, while the rest of the navigation application's display interface remains invisible.
In a possible implementation of the first aspect, when it is detected that the user has chosen to perform multi-application display, the second display interface of the second application and the first display area of the first application are displayed on the screen.
That is, in an embodiment of the present application, the electronic device may prompt the user on the second display interface of the currently running second application to confirm whether to start multi-application display, and once the user confirms, the second display interface of the second application and the first display area of the first application are displayed on the screen simultaneously.
In a possible implementation of the first aspect, the user choosing multi-application display includes the user tapping an icon for displaying the first application in a floating window.
That is, in an embodiment of the present application, the electronic device may display an icon on the second display interface of the currently running second application through which the user confirms that multi-application display is to be performed.
In a possible implementation of the first aspect, the method further includes:
a first display area is selected from the first display interface.
In one possible implementation of the foregoing first aspect, the method further includes:
prompting a user to select a first display area to be subjected to multi-application display from a first display interface;
and determining the first display area according to the selection result of the user.
In a possible implementation of the first aspect, determining the first display area according to a selection result of the user includes:
determining the first display area according to a trajectory formed by the user's gesture operation on the first display interface.
In a possible implementation of the first aspect, when the trajectory formed by the gesture operation is a non-closed region, the non-closed region is completed in order to determine the first display area.
That is, in an embodiment of the present application, the first application may be, for example, a navigation application, the first display interface may be the display interface of the navigation application, and the first display area may be the area of navigation information displayed in the navigation application, that is, a local area of the first display interface. The user can tap a "local window" button in the control center of the mobile phone to enter a selection interface, and there perform a selection gesture, that is, a gesture operation, on the display interface of the navigation application; the mobile phone then determines the local area from the trajectory of the selection gesture. When the trajectory formed by the user's selection gesture is an open arc, the mobile phone can close the gap by directly connecting its endpoints.
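To make the completion step concrete, the following is a minimal Android sketch (the class and method names are illustrative, not taken from the patent): the sampled touch points are assembled into an android.graphics.Path, close() draws the missing segment straight back to the start point, and computeBounds() circumscribes the closed region with a rectangle that can serve as a candidate display area.

    import android.graphics.Path;
    import android.graphics.PointF;
    import android.graphics.RectF;
    import java.util.List;

    /** Hypothetical helper: closes a non-closed selection trajectory and
     *  returns its bounding rectangle as the candidate display region. */
    public final class TrajectoryRegion {

        /** Builds a closed Path from sampled touch points; if the trail does
         *  not end where it started, the gap is closed with a straight line
         *  (the "directly connecting its endpoints" strategy in the text). */
        public static Path closeTrajectory(List<PointF> points) {
            Path path = new Path();
            if (points.isEmpty()) {
                return path;
            }
            PointF first = points.get(0);
            path.moveTo(first.x, first.y);
            for (int i = 1; i < points.size(); i++) {
                PointF p = points.get(i);
                path.lineTo(p.x, p.y);
            }
            path.close(); // draws the closing segment back to the start point
            return path;
        }

        /** Circumscribes the closed region with a rectangle, which the view
         *  system could use as the preview area. */
        public static RectF boundsOf(Path closed) {
            RectF bounds = new RectF();
            closed.computeBounds(bounds, true);
            return bounds;
        }
    }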
In a possible implementation of the first aspect, the method further includes:
adjusting the size and position of the first display area in response to an adjustment operation performed by the user on the first display area, wherein the adjustment operation includes at least one of the following:
moving the first display area;
enlarging the first display area;
reducing the first display area.
That is, in an embodiment of the present application, after the first display area is determined, the user may perform on it, for example, a gesture to expand the display content, a gesture to contract it, or a gesture to move it within the first display area.
In a possible implementation of the first aspect, the method further includes:
acquiring a region of the first display interface in which the display content changes, and determining the region in which the display content changes as the first display area.
That is, in an embodiment of the present application, the first application may be, for example, a navigation application, and the mobile phone may detect, through the view system, the area of displayed navigation information in the navigation application and determine that area as the first display area.
In a possible implementation of the first aspect, the amount of change in the area's displayed content is measured over a preset time period, and the area is determined to be an area of changing display content if the amount of change exceeds a preset change threshold.
That is, in an embodiment of the present application, the mobile phone may, for example, detect the area of displayed navigation information in the navigation application through the view system by monitoring how frequently the display content of the navigation application refreshes, that is, by determining whether the number of changes to the displayed frames within the time period exceeds the preset change threshold.
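One possible reading of this detection scheme is sketched below, under the assumption that the view system can snapshot the candidate region once per refresh; RegionChangeDetector and its method names are hypothetical, not from the patent. It counts how many successive snapshots differ within the observation window and compares the count against the preset change threshold.

    import android.graphics.Bitmap;

    /** Hypothetical change detector: counts how many frames of a candidate
     *  region changed within a time window and compares the count against
     *  a preset threshold, as described above. */
    public final class RegionChangeDetector {
        private final int changeThreshold;   // preset change threshold
        private Bitmap lastFrame;
        private int changeCount;

        public RegionChangeDetector(int changeThreshold) {
            this.changeThreshold = changeThreshold;
        }

        /** Feed one snapshot of the region per refresh; returns true once the
         *  region has changed often enough to count as dynamic content. */
        public boolean onFrame(Bitmap frame) {
            if (lastFrame != null && !frame.sameAs(lastFrame)) {
                changeCount++;
            }
            lastFrame = frame;
            return changeCount > changeThreshold;
        }

        /** Call when the preset observation window expires. */
        public void resetWindow() {
            changeCount = 0;
        }
    }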
In one possible implementation of the first aspect, the first display region is displayed on the screen by means of a floating window.
That is, in the embodiment of the present application, for example, the first display area is displayed in a floating manner on the second display interface of the second application.
In one possible implementation of the first aspect, the first display region is displayed on the screen by means of a floating window by:
the method comprises the steps of setting a first display interface on a first layer, setting a second display interface on a second layer, superposing the first layer on the second layer, and setting the area except a first display area in the first display interface on the first layer to be transparent display.
That is, in the embodiment of the present application, for example, the view system of the mobile phone sets the navigation application on the first layer, sets the instant chat application on the second layer, and the first layer is covered on the second layer. In the first layer, the area of the navigation information of the navigation application is set to be transparent. Thus, the navigation information of the navigation application and the instant chat application can be simultaneously displayed.
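The transparency idea can be sketched as a custom Android View on the first layer that clips all drawing to the selected region, so every pixel outside the region is never painted and stays transparent, letting the second layer show through. RegionClipView is an illustrative name, not a class from the patent, and a real implementation would render the first application's surface rather than a placeholder.

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Rect;
    import android.view.View;

    /** Minimal sketch of the layering idea: a view on the upper (first) layer
     *  that only draws inside the selected display region; everything outside
     *  the region stays transparent, so the second layer shows through. */
    public class RegionClipView extends View {
        private final Rect displayRegion;   // first display area, in view coords

        public RegionClipView(Context context, Rect displayRegion) {
            super(context);
            this.displayRegion = displayRegion;
            setBackgroundColor(0x00000000); // fully transparent backdrop
        }

        @Override
        protected void onDraw(Canvas canvas) {
            // Restrict all drawing to the display region; pixels outside the
            // clip are never touched and remain transparent.
            canvas.clipRect(displayRegion);
            drawApplicationContent(canvas);
        }

        /** Placeholder for rendering the first application's interface. */
        private void drawApplicationContent(Canvas canvas) {
            // ... the navigation application's content would be drawn here ...
        }
    }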
In a possible implementation of the first aspect, first display content is changed to second display content in the first display area in response to a first change operation performed by the user in the first display area.
In a possible implementation of the first aspect, a second change operation performed by the user in the area outside the first display area is received, and an instruction corresponding to the second change operation is transmitted to the second layer.
In a possible implementation of the first aspect, third display content is changed to fourth display content in the second display interface in response to the instruction corresponding to the second change operation.
That is, in an embodiment of the present application, after the user performs a gesture to expand the display content in the floating window of the mobile phone's navigation application, the display content of the navigation application is shown enlarged, that is, the first display content is changed to the second display content. After the user performs a bottom-up swipe gesture in the area outside the floating window of the navigation application, the display content of the mobile phone's instant chat application interface moves upward, that is, the third display content is changed to the fourth display content.
In one possible implementation of the first aspect, the first display region is displayed on the screen in a picture-in-picture manner.
A second aspect of the present application provides an electronic device, comprising:
a memory storing instructions;
a processor coupled to the memory, wherein the instructions stored in the memory, when executed by the processor, cause the electronic device to perform the application display method provided in the first aspect.
A third aspect of the present application provides a readable medium having instructions stored therein which, when executed, cause an electronic device to perform the application display method provided in the first aspect.
Drawings
Fig. 1(a) shows an example of a display interface of a navigation application of an electronic device according to an embodiment of the present application;
FIG. 1(b) illustrates an example of a display interface of a navigation application of an electronic device hovering within a display interface of an instant chat application according to an embodiment of the present application;
FIG. 2 shows a block diagram of a hardware architecture of an electronic device according to an embodiment of the application;
FIG. 3 shows a block diagram of a software architecture of an electronic device according to an embodiment of the application;
fig. 4 shows a method flowchart of a display method of an application of a mobile phone according to an embodiment of the present application;
FIG. 5(a) illustrates an example of a guidance interface indicating that a navigation application supports display by way of a floating window, in accordance with an embodiment of the present application;
fig. 5(b) illustrates an example of a prompt operation of a guidance interface of a navigation application according to an embodiment of the present application;
fig. 5(c) illustrates an example of a display result of a guidance interface of a navigation application according to an embodiment of the present application;
fig. 6(a) shows an example of a selection gesture of a display interface of a mobile phone for acquiring a navigation application according to an embodiment of the present application;
FIG. 6(b) illustrates an example of a preview interface for a local area of a display interface of a navigation application according to an embodiment of the present application;
FIG. 6(c) illustrates an example of an instant chat application in which a local area of a navigation application is displayed in a floating manner, according to an embodiment of the present application;
FIG. 7(a) illustrates an example of display content within a floating window of a navigation application according to an embodiment of the present application;
FIG. 7(b) illustrates an example of a change in display content within a floating window of a navigation application according to an embodiment of the present application;
FIG. 8(a) shows an example of a user performing an operation gesture within a floating window of a navigation application according to an embodiment of the present application;
FIG. 8(b) illustrates another example of a user performing an operation gesture within a floating window of a navigation application according to an embodiment of the present application;
FIG. 8(c) illustrates an example of a change in display content within a floating window of a navigation application according to an embodiment of the present application;
FIG. 9(a) illustrates an example of a user performing an operation gesture within a display interface of an instant chat application in accordance with an embodiment of the present application;
FIG. 9(b) illustrates an example of a change in display content within a display interface of an instant chat application in accordance with embodiments of the application;
FIG. 10(a) shows an example of a user performing a slide-down gesture on a display interface of a video playback application according to an embodiment of the present application;
fig. 10(b) shows an example of a selection interface selected by a user clicking a "local window" button to enter a local area on a display interface of a control center of a mobile phone according to an embodiment of the present application;
FIG. 10(c) shows an example of a user adjusting a preview box in a selection interface selected in a local area according to an embodiment of the present application;
FIG. 10(d) illustrates an example of a user determining a local region of a display interface of a video playback application according to an embodiment of the present application;
FIG. 10(e) illustrates an example of an instant chat application in which a local area of a video playback application is displayed in a floating manner, according to an embodiment of the present application;
FIG. 11(a) illustrates an example of the mobile phone recognizing a dynamically changing area within the display interface of a navigation application according to an embodiment of the present application;
FIG. 11(b) illustrates an example of an instant chat application in which a local area of a navigation application is displayed in a floating manner, according to an embodiment of the present application;
FIG. 12(a) shows an example of a user clicking a floating-window display button on the display interface of a video playback application to initiate the floating display according to an embodiment of the present application;
FIG. 12(b) shows an example of a user clicking a button within the floating window of a video playback application to exit the floating-window display according to an embodiment of the present application;
fig. 13(a) shows an example in which the mobile phone prompts the user that the display content of a navigation application running in the background has changed, and the user confirms displaying the changed content, according to an embodiment of the present application;
fig. 13(b) shows an example of the mobile phone prompting the user with the display content of a navigation application displayed by way of a floating window according to an embodiment of the present application;
FIG. 13(c) illustrates an example of an instant chat application in which a local area of a navigation application is displayed in a floating manner, according to an embodiment of the present application;
FIG. 14(a) shows an example of a user performing a swipe down gesture at a display interface of an instant chat application, in accordance with embodiments of the present application;
fig. 14(b) shows an example of entering a summary interface of a background running application by clicking an "app hover display" button on a display interface of a control center of a mobile phone by a user according to an embodiment of the present application;
FIG. 14(c) shows an example of a summary interface for a background running application according to an embodiment of the present application;
FIG. 14(d) illustrates an example of a user selecting an application displayed by way of a floating display in accordance with an embodiment of the present application;
FIG. 14(e) illustrates an example of an instant chat application in which local regions of a video playback application and a navigation application are hovered, in accordance with embodiments of the present application;
FIG. 15 illustrates an example of a navigation application, an instant chat application, and a reading application and desktop displayed through multiple windows in accordance with an embodiment of the present application;
FIG. 16 illustrates an example of a navigation application displayed by way of scaling according to an embodiment of the present application;
fig. 17(a) shows an example of a display interface of a video playback application according to an embodiment of the present application;
fig. 17(b) shows an example of a video playback application in which a local area of the video playback application is displayed in a floating manner according to an embodiment of the present application;
fig. 18 is a block diagram showing a hardware configuration of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application include, but are not limited to, a display method and medium for an electronic device and applications thereof.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
In order to make it easier for a user to keep track of changes in the display content of one application while using another, the present application discloses an application display method for an electronic device. Specifically, figs. 1(a)-(b) provide a display scenario of applications of an electronic device according to an embodiment of the present application, in which two applications are displayed simultaneously as an example, where the first application is a navigation application 101 and the second application is an instant chat application 102.
Specifically, fig. 1(a) shows the display interface of the navigation application 101. In this scenario, if the user chooses to display the navigation application 101 in a floating window, the electronic device 100 can float the navigation application 101 over another interface. For example, when the user switches to the instant chat application 102, as shown in fig. 1(b), the view system of the electronic device 100 floats the entire display interface of the navigation application 101 within the display interface of the instant chat application 102 as a small window, with the navigation application 101's display interface shown inside the floating window. Based on the user's gesture operations, the electronic device 100 can also drag the floating window within the screen to change its position.
In addition to the view system floating the entire display interface of the navigation application 101 within the display interface of the instant chat application 102 as a small window, in some other embodiments of the present application only the portion of the navigation application 101 whose display content has changed may be shown within the floating window, so that the user can see the change more clearly. For example, fig. 2 shows another scene diagram in which two applications are displayed simultaneously on the screen of the electronic device 100 in an embodiment of the present application. As shown in fig. 2, after the user chooses to display the navigation application 101 in a floating window, the view system of the electronic device 100 floats a partial display area 1011 of the navigation application 101 within the display interface of the instant chat application 102 as a small floating window, where the partial display area 1011 shown in the floating window is the portion of the navigation application 101 whose display content changes over time. Meanwhile, the electronic device 100 can also receive gesture operations that the user performs on the navigation application 101's display interface inside the floating window to change its display content. The user may also click the "close" button 1018 or the "full screen" button 1019 in the partial display area 1011, causing the navigation application 101 to close or to be displayed full screen on the screen of the electronic device 100.
With this method, the electronic device 100 can display, in the form of a floating window, the content of the first application's display interface that the user needs to watch in real time, over the display interface of the second application or over the desktop, making it convenient for the user to follow changes in the first application while using the second application, without switching between the two. Moreover, by showing only part of the first application's display area in the floating window, the area the user cares about can be presented more clearly, avoiding the loss of clarity that results from shrinking the first application's entire display interface into the floating window. It is understood that the first application and the second application may be third-party applications installed on the electronic device 100, or system applications of the electronic device 100 itself; for example, they may be applications such as the desktop and settings of the electronic device 100.
In the scenario diagram shown in fig. 2, the electronic device 100 may be any of a variety of electronic devices. For example, the electronic device 100 includes, but is not limited to, a laptop computer, a desktop computer, a tablet computer, a mobile phone, a server, a wearable device, a head-mounted display, a mobile email device, a portable game console, a portable music player, a reader device, or another electronic device capable of accessing a network. In some embodiments, embodiments of the present application may also be applied to wearable devices worn by a user, for example a smart watch, a bracelet, a piece of jewelry (e.g., a device made as a decorative item such as an earring or a bracelet) or glasses, or as part of a watch, bracelet, piece of jewelry or glasses.
It is understood that various applications, such as a video conference application, an instant chat application, a video playing application, a navigation application, and the like, may be installed on the electronic device 100, and these applications may perform on-screen multi-application display by using the technical solution of the present application.
Fig. 3 is a block diagram of the software configuration of the electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications such as camera, gallery, calendar, phone, map, navigation, WLAN, Bluetooth, music, video, and short message. In an embodiment of the present application, the application packages may include the navigation application 101, the instant chat application 102, the reading application 103, the desktop 104, and the video playback application 105.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, an application service management module, a gesture recognition module, and the like.
The gesture recognition module is used for recognizing gesture operations performed by a user on the screen of the electronic device 100.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and can be used to build applications. A display interface may consist of one or more views, each of which displays visual controls within its area and handles events occurring there. For example, a view may process the events corresponding to gesture operations generated within its area. The display interface of a navigation application may include a view of navigation content and a view of navigation information; the display interface of an instant chat application may include a view of chat content and a view of menus. In an embodiment of the present application, the view system can calculate, from the recognized trajectory of a user's gesture operation, the local area corresponding to that trajectory within a view of an application's display interface, take that local area as the display area, and display it by way of a floating window. In some embodiments, the view system can also determine a display area in an application's display interface automatically. The view system can further place two applications on different layers, one overlaid on the other; here, a layer carries the window corresponding to an application.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications of completed downloads or message alerts. The notification manager may also present notifications that appear in the system's top status bar as a chart or scrolling text, such as notifications of applications running in the background, or notifications that appear on the screen as a dialog window, for example text prompts in the status bar, a prompt tone, vibration of the electronic device, or a flashing indicator light.
The application service management module can manage the connection between an application and its server.
The Android Runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library consists of two parts: one part provides the functions that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in the virtual machine, which executes their Java files as binary files. The virtual machine performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. It can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer lies between hardware and software. It contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
Taking a mobile phone 100 as an example of the electronic device 100, the following describes in detail an example in which a display area is selected from the currently open navigation application 101 of the mobile phone 100 and displayed, by way of a floating window, on the display interface of the instant chat application 102 of the mobile phone 100.
As shown in fig. 4, the display method applied by the mobile phone 100 includes:
s401: the mobile phone 100 starts the navigation application 101, and prompts the user that the navigation application 101 supports displaying the display interface of the navigation application 101 in a floating window manner.
After the user opens the mobile phone 100, the user may tap a "navigation" icon in the user interface (UI) of the mobile phone 100; upon receiving the instruction of the user's tap operation, the mobile phone 100 starts the navigation application 101. When the mobile phone 100 starts the navigation application 101 for the first time, as shown in fig. 5(a), a guidance interface 1013 indicating that the application supports display by way of a floating window is shown in the display interface of the navigation application 101. For example, the guidance interface 1013 may display the prompt "selecting an area in the display interface may display it in the floating window" along with "continue" and "cancel" buttons. After the user taps the "cancel" button 1015 in the guidance interface 1013, the guidance interface 1013 of the navigation application 101 exits. After the user taps the "continue" button 1014, the guidance interface 1013 may use an animation to show the user how to display the navigation application 101's display interface by way of a floating window. For example, as shown in fig. 5(b), the guidance interface 1013 prompts the user to select a display area by performing a gesture operation on the display interface of the navigation application 101 and then to tap the "small window display" button 1016, so that, as shown in fig. 5(c), the display area 1011 is displayed by way of a floating window. It will be appreciated that the display area 1011 of fig. 5(c) also includes two buttons, "close" and "full screen": after the user taps the "close" button 1018 in the display area 1011, the display area 1011 is closed, and after the user taps the "full screen" button 1019, the navigation application 101 is displayed in full screen on the screen of the mobile phone 100. Finally, after the user taps the "got it" button 1020 in the guidance interface 1013 of fig. 5(c), the guidance interface 1013 of the navigation application 101 is closed.
In an embodiment of the present application, the mobile phone 100 may also have started the instant chat application 102 in advance, in addition to the navigation application 101. After the display area 1011 to be shown by way of a floating window is determined in the navigation application 101, the mobile phone 100 can display the display area 1011 on the display interface of the instant chat application 102.
S402: the mobile phone 100 determines the display area 1011 in the display interface of the navigation application 101.
Here, the display area 1011 in the display interface of the navigation application 101 may be a local area in the display interface of the navigation application 101, which may be determined by the mobile phone 100 according to a selection gesture performed by the user on the display interface of the navigation application 101.
As shown in fig. 6(a), in some embodiments, the gesture recognition module of the mobile phone 100 can recognize a selection gesture performed by the user on the display interface of the navigation application 101; the gesture recognition module obtains the trajectory of the selection gesture and sends the area selected by it to the view system of the mobile phone 100. As shown in fig. 6(c), the view system of the mobile phone 100 determines that area to be the display area 1011 in the display interface of the navigation application 101.
At the same time, the view system of the mobile phone 100 can also determine the size and position of the display area 1011 in the display interface of the navigation application 101. For example, the display interface of the navigation application 101 may be 90 mm × 120 mm (length × width), and the view system of the mobile phone 100 may determine that the display area 1011 corresponding to the trajectory of the user's selection gesture is 20 mm × 19 mm (length × width), located 10 mm from the left border and 10 mm from the upper border of the display interface of the navigation application 101. The process by which the mobile phone 100 determines the display area 1011 in the display interface of the navigation application 101 is described in detail below.
In other embodiments, the mobile phone 100 can also determine the display area 1011 in the display interface of the navigation application 101 automatically; implementations of this are described in detail below.
S403: the mobile phone 100 displays the display area 1011 of the navigation application 101 by way of a floating window.
Here, after the display area 1011 of the navigation application 101 is determined, as shown in figs. 6(b) to 6(c), the view system of the mobile phone 100 enters the preview interface 120, in which a "small window display" button 1202 is provided to set the display area 1011 to be displayed by way of a floating window. In response to the user tapping the "small window display" button 1202 in the preview interface 120, the view system of the mobile phone 100 displays the display area 1011 of the navigation application 101, by way of a floating window, at the display position on the display interface of the instant chat application 102 of the mobile phone 100. As described in step S402, the size of the display area 1011 determined by the user may be, for example, 20 mm × 19 mm (length × width), while the default display position may be 10 mm from the upper boundary and 10 mm from the right boundary of the display interface of the instant chat application 102 of the mobile phone 100. That is, the display area 1011 of the navigation application 101 is displayed at the upper right of the display interface of the instant chat application 102.
In an embodiment of the present application, after the display area 1011 of the navigation application 101 is determined, the view system of the mobile phone 100 sets the navigation application 101 on a first layer and the instant chat application 102 on a second layer, with the first layer overlaid on the second layer. That is, the mobile phone 100 runs the navigation application 101 in the first layer and runs the display interface of the instant chat application 102 in the second layer. The mobile phone 100 sets the display area 1011 of the navigation application 101 to be visible, for example an area of 20 mm × 19 mm (length × width) at the determined position in the display interface of the navigation application 101, and sets the area outside the display area to transparent. In addition, the mobile phone 100 moves the navigation application 101 within the first layer so that its display area 1011 is located at the display position on the display interface of the instant chat application 102 in the second layer. In this way, the user can browse the display content of the instant chat application 102 in the second layer through the transparent area outside the display area 1011 of the navigation application 101, while the display area 1011 of the navigation application 101 is shown on the display interface of the instant chat application 102 by way of a floating window, letting the user also follow the display content of that area of the navigation application 101 in real time.
It will be appreciated that after the mobile phone 100 moves the display area 1011 of the navigation application 101 to the display position on the display interface of the instant chat application 102 in the second layer, part of the display interface of the navigation application 101 in the first layer will lie outside the boundaries of the screen of the mobile phone 100. Here, the display position on the display interface of the instant chat application 102 may be a parameter preset in the storage area of the mobile phone 100.
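As an illustrative fragment (the names are hypothetical, and this is greatly simplified relative to a real cross-application compositor), moving the first layer so that the selected region lands at the preset display position can be expressed as a translation of the layer's view:

    import android.graphics.Point;
    import android.graphics.Rect;
    import android.view.View;

    /** Hypothetical positioning step: translates the first layer so that the
     *  selected display area lands at the preset position over the second
     *  layer's interface; the rest of the layer moves off-screen. */
    final class LayerPositioner {
        static void moveRegionTo(View firstLayer, Rect regionInLayer, Point target) {
            // Shift the whole layer; content pushed past the screen edge
            // simply is not shown.
            firstLayer.setTranslationX(target.x - regionInLayer.left);
            firstLayer.setTranslationY(target.y - regionInLayer.top);
        }
    }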
When the display area 1011 of the navigation application 101 is on the display interface of the instant chat application 102, the display content of the navigation application 101 within the display area 1011 changes in real time. That is, after the mobile phone 100 sets the navigation application 101 on the first layer, the navigation application 101 is still running, so the display content of its display interface also changes in real time. Since the display content in the display area 1011 is part of the display content of the navigation application 101's display interface, the display content of the navigation application 101 shown in the display area 1011 over the display interface of the instant chat application 102 of the mobile phone 100 changes in real time as well. For example, as shown in figs. 7(a)-(b), when the navigation information in the display interface of the navigation application 101 changes, for example when the navigation route changes after the vehicle has been driving for some time, the change in navigation information is updated in real time in the display area 1011 over the display interface of the instant chat application 102 of the mobile phone 100.
S404: the mobile phone 100 determines whether a gesture operation performed by the user within the display area 1011 of the navigation application 101 is detected.
After the mobile phone 100 receives a gesture operation performed by the user within the display area 1011 of the navigation application 101, S405 is performed, and the mobile phone 100 changes the display content within the display area in response to the gesture operation.
S405: the mobile phone 100 changes the display content within the display area 1011.
Here, the navigation application 101 of the mobile phone 100 runs in the first layer, and the first layer is overlaid on top of the second layer. The first layer is therefore the current view on the screen of the mobile phone 100; that is, a gesture operation performed by the user on the screen acts on the first layer. If the user's gesture operation falls within the display area 1011 of the navigation application 101, it is equivalent to acting on the display interface of the navigation application 101; after the gesture recognition module of the mobile phone 100 recognizes the gesture operation, the navigation application 101 responds to the corresponding instruction, so that the display content of the navigation application 101 within the display area 1011 changes.
For example, as shown in fig. 8(a), the user performs a gesture to expand the display content of the navigation application 101 within the display area 1011; after the gesture recognition module of the mobile phone 100 recognizes the gesture operation, the navigation application 101 responds to it, and its display content within the display area 1011 is displayed enlarged, while the gesture does not act on the display interface of the instant chat application 102. As another example, as shown in figs. 8(b) to 8(c), the user performs a pull-down gesture on the display content of the navigation application 101 within the display area 1011; after the gesture recognition module recognizes the gesture operation, the navigation application 101 responds to it, and its display content within the display area 1011 moves downward. Likewise, the gesture does not act on the display interface of the instant chat application 102.
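For the expanding gesture specifically, a sketch using Android's standard ScaleGestureDetector follows; contentView stands in for whatever view renders the navigation content inside the floating window and is an assumption of this example, not a component named by the patent.

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.ScaleGestureDetector;
    import android.view.View;

    /** Sketch: let a spread/pinch gesture inside the floating window enlarge
     *  or shrink the rendered navigation content. */
    final class FloatingWindowScaler {
        private final ScaleGestureDetector detector;

        FloatingWindowScaler(Context context, View contentView) {
            detector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector d) {
                        // Apply the incremental scale factor to the content.
                        contentView.setScaleX(contentView.getScaleX() * d.getScaleFactor());
                        contentView.setScaleY(contentView.getScaleY() * d.getScaleFactor());
                        return true;
                    }
                });
        }

        /** Feed touch events that land inside the display area. */
        boolean onTouchEvent(MotionEvent event) {
            return detector.onTouchEvent(event);
        }
    }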
S406: the cellular phone 100 determines whether or not a gesture operation performed by the user in an area other than the display area 1011 of the navigation application 101 is detected.
After the mobile phone 100 receives the gesture operation performed by the user in the area other than the display area 1011 of the navigation application 101, S407 is performed, and the mobile phone 100 changes the display content of the instant chat application 102 in response to the gesture operation.
S407: the handset 100 changes the display content of the instant chat application 102.
Here, as described in S403, although the area outside the display area 1011 of the navigation application 101 is set to transparent, the navigation application 101 is still running in the first layer, so a gesture operation performed by the user in the area outside the display area 1011 still lands on the display interface of the navigation application 101 of the mobile phone 100. In this case, if the mobile phone 100 determines that the gesture operation was performed in the area outside the display area 1011 of the navigation application 101, it obtains the instruction corresponding to the gesture operation, prevents the instruction from being executed in the first layer, and transmits it to the display interface of the instant chat application 102 in the second layer. The display interface of the instant chat application 102 of the mobile phone 100 can then respond to the instruction, so that its display content changes.
For example, as shown in figs. 9(a) to 9(b), the user performs a bottom-up swipe gesture in the area outside the display area 1011 of the navigation application 101, for example in the area of the display interface of the instant chat application 102. After obtaining the instruction corresponding to the gesture operation, the mobile phone 100 determines that it was performed in the area outside the display area 1011 and transmits the instruction corresponding to the bottom-up swipe to the display interface of the instant chat application 102 in the second layer. The display interface of the instant chat application 102 responds to the instruction, and its display content moves upward accordingly.
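The routing rule of S404-S407 can be summarized in a small sketch; TouchRouter is an illustrative name, and a production implementation would also need to copy and coordinate-transform the MotionEvent rather than re-dispatching it directly. Events inside the display area stay on the first layer, events outside it are re-dispatched to the second layer.

    import android.graphics.Rect;
    import android.view.MotionEvent;
    import android.view.View;

    /** Sketch of the routing rule described above: gestures inside the display
     *  area are handled by the first layer; gestures outside it are swallowed
     *  on the first layer and re-dispatched to the second layer. */
    public class TouchRouter {
        private final Rect displayRegion;
        private final View secondLayer;

        public TouchRouter(Rect displayRegion, View secondLayer) {
            this.displayRegion = displayRegion;
            this.secondLayer = secondLayer;
        }

        /** Returns true if the first layer should keep handling the event.
         *  Coordinates are assumed to be in a shared screen space. */
        public boolean route(MotionEvent event) {
            if (displayRegion.contains((int) event.getX(), (int) event.getY())) {
                return true; // let the navigation application respond
            }
            // Forward the instruction to the instant chat application below.
            secondLayer.dispatchTouchEvent(event);
            return false;
        }
    }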
The following describes a process of determining the display area 1011 in the display interface of the navigation application 101 by the mobile phone 100 described in S402.
S402 a: the handset 100 receives a selection gesture performed by a user on the display interface of the navigation application 101.
In an embodiment of the present application, as shown in fig. 6(a), the user may manually select a local area of the display interface of the navigation application 101 through a selection gesture, whose trajectory the gesture recognition module of the mobile phone 100 can recognize. The display interface of the navigation application 101 here may be one that displays changing navigation information, such as the navigation route.
Here, the user may select the local area of the navigation application 101's display interface by performing a knuckle-slide selection gesture on the screen of the mobile phone 100. In this way, conflicts with the default gestures supported by the navigation application 101 can be avoided.
The gesture recognition module may be any module that is currently used to implement gesture recognition or finger joint recognition, and is not limited herein.
S402 b: the handset 100 selects a local area within the display interface of the navigation application 101 according to the trajectory of the selection gesture.
After recognizing the trajectory of the selection gesture of the user, the gesture recognition module of the mobile phone 100 sends the trajectory to the view system of the mobile phone 100. The view system of the mobile phone 100 uses the local area corresponding to the track as a display area according to the local area.
Here, the local area of the display interface of the navigation application 101 selected by the user through the selection gesture may be display content included in a trajectory formed within the display interface of the navigation application 101 by the selection gesture performed by the user. Here, the view system of the mobile phone 100 sets a method for monitoring a gesture event, for example, an ontouchvent method, on a view in which the display interface of the navigation application 101 is located, and the method identifies a trajectory of a selection gesture of the user by starting a gesture recognition module of the mobile phone 100. That is, when the user presses or contacts the screen of the cellular phone 100 with a conductor, the method may detect a location of pressing or contacting the screen of the cellular phone 100, which may be simply referred to as a touch point. When the gesture recognition module detects a contact point, it may be determined that a user's selection gesture is detected. Wherein the selection gesture may be a gesture that slides within the screen of the cell phone 100.
When a user executes a selection gesture in the screen of the mobile phone 100, that is, in the display interface of the navigation application 101, the method for monitoring a gesture event may obtain an x coordinate value and a y coordinate value that the selection gesture passes through in the display interface of the navigation application 101, and after the user completes the selection gesture, the method for monitoring a gesture event may calculate a trajectory formed by the selection gesture according to the x coordinate value and the y coordinate value, and determine a local area corresponding to the trajectory.
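A minimal sketch of such a gesture-event listener, using Android's real View.OnTouchListener and Path APIs but with a hypothetical onRegionSelected hook, could look like this:

    import android.graphics.Path;
    import android.graphics.RectF;
    import android.view.MotionEvent;
    import android.view.View;

    /** Sketch of the gesture-event listener described above: sample the x/y
     *  coordinates the selection gesture passes through, and compute the
     *  region its trajectory covers when the finger lifts. */
    public class SelectionGestureListener implements View.OnTouchListener {
        private final Path trajectory = new Path();

        @Override
        public boolean onTouch(View v, MotionEvent event) {
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                    trajectory.reset();
                    trajectory.moveTo(event.getX(), event.getY());
                    return true;
                case MotionEvent.ACTION_MOVE:
                    trajectory.lineTo(event.getX(), event.getY());
                    return true;
                case MotionEvent.ACTION_UP:
                    trajectory.close(); // complete a non-closed trail
                    RectF bounds = new RectF();
                    trajectory.computeBounds(bounds, true);
                    onRegionSelected(bounds); // hand off to the view system
                    return true;
            }
            return false;
        }

        /** Hypothetical hook: the view system would turn the bounds into the
         *  preview area shown to the user. */
        protected void onRegionSelected(RectF bounds) { }
    }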
As shown in fig. 6(a), the trajectory formed by the user's selection gesture may be a closed region or a non-closed region. When the track formed by the selection gesture of the user is a closed area, the selected local area of the display interface of the navigation application 101 is the display content in the closed area. When the trajectory formed by the selection gesture of the user is a non-closed region, the selected local region of the display interface of the navigation application 101 may be the display content in the region formed between the trajectory formed by the selection gesture of the user and the left-right boundary or the upper-lower boundary of the screen of the mobile phone 100.
For example, when the trajectory formed by the user's selection gesture is a straight line, a curve, or a circular arc, the selected local area of the display interface of the navigation application 101 is the display content in the area formed between that trajectory and the left-right or upper-lower boundaries of the screen. Alternatively, when the trajectory formed by the user's selection gesture is a non-closed region, the selected local area of the display interface of the navigation application 101 may be the display content in the closed region obtained after the non-closed region is completed. For example, when the trajectory is a V-shape or an arc, the selected display content is the content in the closed area obtained after the V-shape or the arc is completed. The completion may directly connect the notched ends, complete the non-closed region according to a symmetry principle, or complete it into a regular pattern according to its shape; the simplest of these strategies is sketched below.
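A non-authoritative sketch of the direct-connection completion strategy, assuming the trajectory is represented as the point list recorded above; closing an Android Path joins the last point back to the first with a straight segment:

    import android.graphics.Path;
    import android.graphics.PointF;
    import java.util.List;

    public final class RegionCompletion {
        // Builds the selected region from the recorded trajectory. For a
        // non-closed trajectory, close() completes it by directly connecting
        // the two notched ends with a straight segment.
        public static Path toSelectedRegion(List<PointF> points) {
            Path path = new Path();
            path.moveTo(points.get(0).x, points.get(0).y);
            for (PointF p : points.subList(1, points.size())) {
                path.lineTo(p.x, p.y);
            }
            path.close();
            return path;
        }
    }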
S402 c: the mobile phone 100 adjusts a local area of the display interface of the navigation application 101 to form a display area of the navigation application 101.
It can be understood that after the view system of the mobile phone 100 determines that the user has selected a local area of the display interface of the navigation application 101 through the selection gesture, the mobile phone 100 may enter the preview interface 120 for that local area. In the preview interface 120, the view system of the mobile phone 100 may expand the local area into a rectangle, or into another shape such as a circle or a diamond frame, as the preview area 1201. For example, as shown in fig. 6(b), when the trajectory formed by the user's selection gesture is a circular closed region, the view system of the mobile phone 100 may form the preview area 1201 by circumscribing a quadrilateral around the circular closed region; the preview area 1201 may, for example, be 20mm × 19mm (length × width). In the preview interface 120, the user may also resize the preview area 1201 through gestures. For example, the user may adjust the size of the display area by dragging the four sides of the rectangle formed by the preview area 1201, or adjust the position of the preview area 1201 within the display interface of the navigation application 101 by pressing and dragging the preview area 1201. The view system of the mobile phone 100 may also store the size of the preview area 1201, as well as its default position, in its own storage area. For example, the size of the preview area 1201 may be 20mm × 19mm (length × width), and the default position may be recorded as the distances from the center of the preview area 1201 to the four sides of the display interface of the navigation application 101.
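A minimal sketch, under the same Path representation as above, of how the circumscribed rectangle of the closed region could serve as the initial preview area 1201:

    import android.graphics.Path;
    import android.graphics.RectF;

    public final class PreviewArea {
        // Returns the circumscribed (bounding) rectangle of the closed region,
        // which serves as the initial rectangular preview area 1201. The user
        // may afterwards resize it by dragging its four sides.
        public static RectF toPreviewArea(Path selectedRegion) {
            RectF bounds = new RectF();
            selectedRegion.computeBounds(bounds, true);
            return bounds;
        }
    }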
It is understood that, in addition to determining the display area 1011 of the navigation application 101 through a selection gesture performed by the user on its display interface as described in step S402, the mobile phone 100 may determine the display area 1011 in step S402 in several other ways, which are described below.
The mobile phone 100 may also select the display area 1011 in the display interface of the navigation application 101 through a display-area selection function provided in the control center of the mobile phone. After receiving a user instruction to start the selection function provided in the control center of the mobile phone 100, the mobile phone 100 enters an application local-area selection mode. In this selection mode, the view system of the mobile phone 100 may select a display area in the display interface of the navigation application 101 as described in S402.
Take the instant chat application 102 and the video playing application 105 running in the mobile phone 100 as an example. In fig. 10(a), the user opens the video playing application 105 and the instant chat application 102 on the mobile phone 100 and switches to the video playing application 105. After the user performs a slide-down gesture on the display interface of the video playing application 105, the mobile phone 100 enters the display interface of the control center 201. In fig. 10(b), a button 2011 named "local window" is provided on the display interface of the control center 201. After the user presses the "local window" button 2011, the mobile phone 100 enters the selection interface 202 for local-area selection on the display interface of the video playing application 105. In fig. 10(c), the selection interface 202 displays a rectangular preview frame 2021 for selecting a local area and a "small window display" button 2022. The user may adjust the size of the selected local area of the display interface of the video playing application 105 by dragging the four sides of the preview frame 2021, or adjust the position of the preview frame within the display interface of the video playing application 105 by pressing and dragging the preview frame 2021, so as to select the local area of the display interface of the video playing application 105 as the display area. In fig. 10(d), after the user determines the display area in the display interface of the video playing application 105, the user may proceed to S403 by clicking the "small window display" button 2022; as shown in fig. 10(e), the view system of the mobile phone 100 then displays the display area 1051 of the video of the display interface of the video playing application 105 on the display interface of the instant chat application 102 of the mobile phone 100 through a floating window.
In another embodiment of the present application, the mobile phone 100 may determine the display area 1011 in the display interface of the navigation application 101 described in the above S402 through a floating window display mode for the display area.
Unlike the method described above, in the floating window display mode, the view system of the mobile phone 100 may detect and determine the areas in the display interface of the navigation application 101 whose display content dynamically changes, and determine the display area 1011 of the navigation application 101 from those areas. An area whose display content dynamically changes here means an area of an application's display interface whose display content changes over a period of time. As shown in fig. 11(a), for the display interface of the navigation application 101, the dynamically changing area may be the area where the map is located, and its view type may be the map view MapView. The display content in the area where the map view is located may change in real time. The view system of the mobile phone 100 may detect each area in the display interface of the navigation application 101 and, when the change of the display content exceeds a change threshold, for example a refresh rate of 10 frames per second (FPS, the number of frames of display content refreshed per second), confirm that the area belongs to an area whose display content dynamically changes. In fig. 11(b), after the view system of the mobile phone 100 determines the area in the navigation application 101 whose content dynamically changes, the mobile phone 100 may extend outward from the center of the dynamically changing area to form a rectangular display area 1011. The view system of the mobile phone 100 then proceeds to S403 and displays the display area 1011 on the display interface of the instant chat application 102 through a floating window.
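As a sketch only, the dynamic-region test reduces to comparing a per-region refresh count against the change threshold; how the per-view refresh count is obtained is an assumption here, since the framework exposes no ready-made per-view counter:

    import android.view.View;
    import java.util.Map;

    public final class DynamicRegionDetector {
        // Change threshold from the text: 10 frames refreshed per second.
        private static final int CHANGE_THRESHOLD_FPS = 10;

        // refreshCounts is a hypothetical map from each view to the number of
        // frames its content refreshed during the last second.
        public static boolean isDynamicallyChanging(Map<View, Integer> refreshCounts,
                                                    View region) {
            Integer framesLastSecond = refreshCounts.get(region);
            return framesLastSecond != null
                    && framesLastSecond >= CHANGE_THRESHOLD_FPS;
        }
    }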
It will be appreciated that in some embodiments, the view system of the mobile phone 100 may determine which views belong to a dynamically changing region from a list of view types pre-stored in its own storage area. As another example, for the video playing application 105, the dynamically changing region it contains may be a video playing view, whose type may be VideoView.
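A sketch of the pre-stored view-type check, with the list contents following the two examples in the text (MapView for maps, VideoView for video playback); the exact list is illustrative:

    import android.view.View;
    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    public final class DynamicViewTypes {
        // View types pre-stored in the storage area that are treated as
        // dynamically changing regions; MapView and VideoView follow the text.
        private static final Set<String> DYNAMIC_VIEW_TYPES =
                new HashSet<>(Arrays.asList("MapView", "VideoView"));

        public static boolean isDynamicViewType(View view) {
            return DYNAMIC_VIEW_TYPES.contains(view.getClass().getSimpleName());
        }
    }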
For the above-mentioned floating window display mode, the mobile phone 100 may further provide a button for entering/exiting the floating window display mode for an application that has been opened. For example, as shown in fig. 12(a) to 12(b), the user may click the "floating window display" button 1052 on the display interface of the video playing application 105, so that the mobile phone 100 determines the display area 1051 of the video in the display interface of the video playing application 105 and displays it in a floating window. After the mobile phone 100 displays the video playing area of the video playing application 105 in the floating window, the user may also click the "exit floating window display" button 1053 in the floating window, so that the mobile phone 100 cancels the floating-window display of the video display area 1051 of the video playing application 105.
In another embodiment of the present application, the floating window display mode of the mobile phone 100 may also be used for an application for which the mobile phone 100 pops up a prompt, to implement the determination of the display area 1011 in the display interface of the navigation application 101 described in the above S402.
Different from the floating window display mode described above, after the mobile phone 100 starts this floating window display mode, when the mobile phone 100 detects that the display content of an application running in the background changes, it prompts the user whether to display that application in a floating window, so that the user can keep track of the application's display content at any time. For example, as shown in fig. 13(a), when the display interface of the instant chat application 102 is displayed on the screen of the mobile phone 100, the upper part of the screen prompts the user that the display content of the display interface of the navigation application 101 running in the background has changed and asks whether to click to view it. If the user clicks the "ignore" button, the mobile phone 100 will no longer prompt the user about changes in the display content of the display interface of the navigation application 101. If the user clicks the "click to view" button, as shown in fig. 13(b), the mobile phone 100 asks the user again whether to display the display interface of the navigation application 101 within the display interface of the instant chat application 102 in the form of a floating window. If the user clicks "no", the mobile phone 100 will not display the display interface of the navigation application 101 through a floating window. If the user clicks "yes", the view system of the mobile phone 100 displays the display interface of the navigation application 101 within the display interface of the instant chat application 102 by way of a floating window, as shown in fig. 13(c).
It is understood that the above process of selecting the display area of the application popped up on the mobile phone 100 and displaying the application through the floating window may be the same as that described in steps S402 and S403, and the description is not repeated here.
In addition to the above-described scenario in which the mobile phone 100 displays a single application on the display interface of another application in a floating window manner, in an embodiment of the present application, the mobile phone 100 may also display multiple applications in a floating window manner.
Here, the mobile phone 100 may start an application floating display function. After this function is started, the view system of the mobile phone 100 may display the applications already running in the background as a list on the screen of the mobile phone 100, and the user may select one or more of these applications so that they are displayed within the display interface of the current application by way of floating windows.
The following description takes the navigation application 101, the instant chat application 102, the reading application 103 and the video playing application 105 running in the mobile phone 100 as an example, where the current application displayed on the screen of the mobile phone 100 is the instant chat application 102. As shown in fig. 14(a), after the user performs a slide-down gesture on the display interface of the instant chat application 102, the mobile phone 100 enters the display interface of the control center 401. In fig. 14(b), a button 4011 named "application floating display" is provided in the display interface of the control center 401. After the user presses the "application floating display" button 4011, the mobile phone 100 enters the list interface 501 of the applications currently running in the background. In fig. 14(c), the list interface 501 displays the identifications of all applications currently running in the background, for example the navigation application 101, the reading application 103 and the video playing application 105, and prompts the user to select the applications to be displayed by way of floating windows. The user may determine whether each application is displayed within the display interface of the current application by clicking on that application's identification. As shown in fig. 14(d), after the user has selected the navigation application 101 and the video playing application 105 in the list interface 501, the user can click the "ok" button; as shown in fig. 14(e), the view system of the mobile phone 100 then displays the display interface of the video playing application 105 and the display interface of the navigation application 101 on the display interface of the instant chat application 102 of the mobile phone 100 through floating windows. In fig. 14(d), the user may also click the "Cancel" button to exit the application floating display function.
In the above step S403, a method of displaying the display area 1011 of the navigation application 101 on the display interface of the instant chat application 102 in the form of a floating window has been described; another implementation of step S403 is described below. In another embodiment of the present application, for the navigation application 101 and the instant chat application 102, the view system of the mobile phone 100 may place the display interface of the instant chat application 102 in a first layer and the display interface of the navigation application 101 in a second layer beneath it, and cut out from the first layer a local area, for example an area of 20mm × 19mm (length × width), whose size is the same as the size of the display area 1011, so that the user can view the display interface of the navigation application 101 in the second layer through this local area. The mobile phone 100 positions the navigation application 101 in the second layer so that its display area 1011 is aligned with the local area cut out of the instant chat application 102 of the mobile phone 100 in the first layer. In this way, the user can browse the display content of the display area 1011 of the display interface of the navigation application 101 in the second layer through the local area of the instant chat application 102; the display area of the navigation application 101 is thus presented on the display interface of the instant chat application 102 of the mobile phone 100 in a manner similar to a floating window, and the user can browse its display content in real time.
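Under the assumption of an Android-style view hierarchy, the cut-out could be sketched by drawing the first layer with a transparent hole at the local area; the PorterDuff CLEAR transfer mode is one common way to punch such a hole, and the class below is illustrative rather than the claimed implementation:

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.PorterDuff;
    import android.graphics.PorterDuffXfermode;
    import android.graphics.RectF;
    import android.view.View;

    public class CutoutOverlayView extends View {
        private final Paint clearPaint = new Paint();
        // Local area of the first layer, aligned with display area 1011 of
        // the navigation application in the second layer beneath.
        private final RectF cutout;

        public CutoutOverlayView(Context context, RectF cutout) {
            super(context);
            this.cutout = cutout;
            clearPaint.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.CLEAR));
            // An offscreen layer is required for the CLEAR transfer mode to
            // erase to transparency instead of drawing black.
            setLayerType(View.LAYER_TYPE_HARDWARE, null);
        }

        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            canvas.drawRect(cutout, clearPaint);  // make the local area transparent
        }
    }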
It is to be understood that the size of the display area here may be the same as the size of the display area determined in step S402, or may be pre-stored in the storage area of the mobile phone 100, for example as 20mm × 10mm (length × width). In some embodiments, the size of the display area may also be the size of the view in which a dynamically changing area of the display interface of the navigation application 101 is located; the mobile phone 100 may obtain the size of that view while obtaining the view type corresponding to the dynamically changing area, and set it as the size of the display area.
It is understood that the method of displaying the display interface of the navigation application 101 by means of a floating window within the display interface of the instant chat application 102 is not limited to that of the mobile phone 100 described in fig. 4. In other embodiments of the present application, the mobile phone 100 launches the navigation application 101, the instant chat application 102, the reading application 103 and the desktop 104. As shown in fig. 15, when the user needs to browse the navigation application 101, the instant chat application 102, the reading application 103 and the desktop 104 at the same time, the window manager of the mobile phone 100 divides the screen of the mobile phone 100 into a plurality of windows, one main window and several auxiliary windows, and displays the navigation application 101, the instant chat application 102, the reading application 103 and the desktop 104 in these windows respectively in a split-screen manner, so that the user can pay attention to the application in each window.
Fig. 16 shows a scene diagram of two applications simultaneously displayed on the screen of the mobile phone 100 in other embodiments of the present application. The instant chat application 102 and the navigation application 101 run on the mobile phone 100. The view system of the mobile phone 100 can set the window of the navigation application 101 as resizable, and when the navigation application 101 is displayed in a floating window, the display style of the display content of the navigation application 101 can be kept the same as the display style of the current display interface of the mobile phone 100. For example, the font size and font style of the navigation application 101 within the floating window are the same as those of the instant chat application 102.
Fig. 17(a) to (b) show scenes in which the display content of an application is displayed by way of a floating window within the screen of the mobile phone 100 in other embodiments of the present application. Fig. 17(a) shows a display interface of the video playing application 105. In fig. 17(b), the view system of the mobile phone 100 displays the display area 1051 of the video being played in the video playing application 105 within the display interface of the video playing application 105 by means of a floating window, and the user can control the video being played in the floating window by performing gesture operations on the floating window. For example, the user may click the floating window to pause the video being played.
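For instance, assuming the floating window hosts a standard Android VideoView (an assumption; the application's actual player is not specified), tap-to-pause could be wired up as follows:

    import android.widget.VideoView;

    public final class FloatingWindowControls {
        // floatingVideoView is the hypothetical VideoView hosted in the
        // floating window.
        public static void enableTapToPause(VideoView floatingVideoView) {
            floatingVideoView.setOnClickListener(v -> {
                if (floatingVideoView.isPlaying()) {
                    floatingVideoView.pause();   // a tap pauses the playing video
                } else {
                    floatingVideoView.start();   // tapping again resumes playback
                }
            });
        }
    }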
Fig. 18 shows a schematic structural diagram of an electronic device 100 according to an embodiment of the application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. In some embodiments, the camera 193 may be used for capturing an image of the scene in the current environment.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In an embodiment of the present application, the internal memory 121 stores a list of view types for determining views belonging to dynamically changing areas.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, take photos with a fingerprint, answer incoming calls with a fingerprint, and so on.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 together form what is also called a "touchscreen". The touch sensor 180K is used to detect a gesture operation acting on or near it. The touch sensor may pass the detected gesture operation to the application processor to determine the type of the touch event. Visual output related to the gesture operation may be provided through the display screen 194. In an embodiment of the present application, the touch sensor 180K is configured to receive gesture operations performed by the user on the touch screen. For example, the touch sensor 180K may detect the user's gesture operation of selecting a local area on the display interface of an application, as well as the user's gesture operation of changing the display content of the display interface of an application.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, gesture operations applied to different applications (e.g., taking a picture, playing audio, etc.) may correspond to different vibration feedback effects. Gesture operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195.
It will be understood that, although the terms "first", "second", etc. may be used herein to describe various features, these features should not be limited by these terms. These terms are used merely for distinguishing and are not intended to indicate or imply relative importance. For example, a first feature may be termed a second feature, and, similarly, a second feature may be termed a first feature, without departing from the scope of example embodiments.
Moreover, various operations will be described as multiple operations separate from one another in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed to imply that these operations are necessarily order dependent, and many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be rearranged. A process may be terminated when its described operations are completed, but may also have additional operations not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
References in the specification to "one embodiment", "an illustrative embodiment", etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment does not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature is described in connection with a particular embodiment, it is within the knowledge of one skilled in the art to effect such a feature in combination with other embodiments, whether or not such embodiments are explicitly described.
The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A), (B) or (A and B)".
As used herein, the term "module" may refer to, be part of, or include: an application-specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it should be understood that such specific arrangement and/or ordering is not required. Rather, in some embodiments, these features may be described in a manner and/or order different from that shown in the illustrative figures. Additionally, the inclusion of a structural or methodical feature in a particular figure does not imply that all embodiments need to include such feature, and in some embodiments may not include such feature, or may be combined with other features.
While the embodiments of the present application have been described in detail with reference to the accompanying drawings, the present application is not limited to the specific applications mentioned in these embodiments; various structures and modifications can readily be implemented with reference to this disclosure to achieve the various advantageous effects mentioned herein. Variations that do not depart from the gist of the disclosure are intended to be within its scope.

Claims (18)

1. An application display method, comprising:
displaying a first display interface of a first application on a screen of an electronic device;
displaying a second display interface of a second application and a first display area of the first application on the screen, wherein a partial area of the second display interface is blocked by the first display area, and the first display area is a part of the first display interface.
2. The method according to claim 1, wherein, in a case where a user selection of multi-application display is detected, the second display interface of the second application and the first display area of the first application are displayed on the screen.
3. The method of claim 2, wherein the user selection of multi-application display comprises the user clicking an icon for floating display of the first application.
4. The method of claim 1, further comprising:
selecting the first display area from the first display interface.
5. The method of claim 1, further comprising:
prompting a user to select a first display area to be subjected to multi-application display from the first display interface;
and determining the first display area according to the selection result of the user.
6. The method of claim 5, wherein determining the first display area according to the selection result of the user comprises:
and determining the first display area according to a track formed by the gesture operation of the user on the first display interface.
7. The method according to claim 6, wherein in the case that the trajectory formed by the gesture operation is a non-closed region, the first display region is determined after the non-closed region is completed.
8. The method of claim 1, further comprising:
adjusting the size and/or the position of the first display area in response to an adjustment operation of the first display area by a user, wherein the adjustment operation of the user comprises at least one of the following operations:
moving the first display area;
enlarging the first display area;
and reducing the first display area.
9. The method of claim 1, further comprising:
acquiring a region of the first display interface whose display content changes, and determining the region whose display content changes as the first display region.
10. The method according to claim 9, wherein a change amount of the picture in a region is detected within a preset time period, and in a case where the change amount exceeds a preset change threshold, the region is determined to be a region whose display content changes.
11. The method of claim 1, wherein the first display area is displayed on the screen by way of a floating window.
12. The method of claim 11, wherein the first display area is displayed on the screen by way of a floating window in the following manner:
the first display interface is arranged on a first layer, the second display interface is arranged on a second layer, the first layer is superposed on the second layer, and the area except the first display area in the first display interface on the first layer is arranged to be transparent.
13. The method of claim 12, wherein first display content in the first display area is changed to second display content in response to a first change operation performed by the user in the first display area.
14. The method according to claim 12, wherein a second change operation of the user in an area outside the first display area is received, and an instruction corresponding to the second change operation is transmitted to the second layer.
15. The method of claim 14, wherein a third display content is changed to a fourth display content within the second display interface in response to an instruction corresponding to the second change operation.
16. The method of claim 1, wherein the first display region is displayed on the screen in a picture-in-picture manner.
17. An electronic device, comprising:
a memory storing instructions;
a processor coupled to the memory, wherein the instructions stored by the memory, when executed by the processor, cause the electronic device to perform the application display method of any one of claims 1 to 16.
18. A readable medium having instructions stored therein, wherein the instructions, when run on an electronic device, cause the electronic device to perform the application display method of any one of claims 1 to 16.