WO2021063074A1 - Split-screen display method and electronic device - Google Patents
Split-screen display method and electronic device
- Publication number: WO2021063074A1 (application PCT/CN2020/102488)
- Authority: WO — WIPO (PCT)
- Prior art keywords: display, task, screen, split, electronic device
- Prior art date
Classifications
- G09G5/14 — Display of multiple viewports
- G06F3/04886 — GUI interaction techniques using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
- G06F3/04817 — GUI interaction techniques based on displayed interaction objects, using icons
- G06F3/0486 — Drag-and-drop
- G06F3/04883 — GUI interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G09G5/001 — Arbitration of resources in a display system, e.g. control of access to frame buffer by video controller and/or main processor
- G06F2203/04803 — Split screen, i.e. subdividing the display area or the window area into separate subareas
- G06F2203/04808 — Several contacts: gestures triggering a specific function when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen
- G06F3/147 — Digital output to display device using display panels
- G09G2330/045 — Protection against panel overheating
- G09G2354/00 — Aspects of interface with display user
- G09G2370/20 — Details of the management of multiple sources of image data
Definitions
- This application relates to the field of terminal technology, and in particular to a split-screen display method and an electronic device.
- An electronic device can provide a split-screen mode to meet users' need to operate multiple applications at the same time; for example, the electronic device can display the windows of two applications simultaneously.
- In the prior art, the electronic device must be manually triggered by the user to enter split-screen mode. For example, while the electronic device displays the window of application A in full screen, the user can press and hold the recent-tasks button so that the electronic device enters split-screen mode, and then manually select the application to be displayed in the split-screen window, for example by tapping the icon of application B, so that application A and application B are finally displayed in different windows.
- In short, the electronic device in the prior art must be manually triggered by the user to enter split-screen mode; the operation process is relatively cumbersome and not very intelligent.
- The embodiments of the present invention provide a split-screen display method and an electronic device, which achieve the technical effect of triggering the electronic device to start the split-screen display mode through a task, improving the intelligence of the electronic device.
- In a first aspect, an embodiment of the present invention provides a split-screen display method.
- The method includes: when a first application is running to perform a first task, displaying a display interface corresponding to the first task on the display screen; receiving, on the display interface corresponding to the first task, a first operation for starting a second task; and, in response to the first operation, starting a split-screen display mode. That is to say, when the electronic device is running the first application to perform the first task, receiving a first operation for starting a second task on the display interface corresponding to the first task triggers the electronic device to enter split-screen mode through the second task.
- The split-screen mode can therefore be triggered without the user performing additional manual operations, which achieves the technical effect of triggering split-screen display based on a task and improves the intelligence of the electronic device.
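The claimed flow can be modeled as a small state transition. This is only an illustrative sketch, not the patent's actual implementation; the class and method names (`ScreenState`, `startTask`) are assumptions introduced for the example.

```java
// Minimal sketch of the claimed flow: while a first task is displayed, an
// operation that starts a second task switches the screen directly to
// split-screen, without a separate manual split-screen gesture.
class SplitScreenFlow {
    /** Screen state: full screen shows one task; split screen shows two. */
    record ScreenState(String firstTask, String secondTask) {
        boolean isSplit() { return secondTask != null; }
    }

    /** Starting a second task from the first task's interface enters split screen. */
    static ScreenState startTask(ScreenState current, String newTask) {
        return new ScreenState(current.firstTask(), newTask);
    }

    public static void main(String[] args) {
        ScreenState s = new ScreenState("Taobao:browse", null); // full screen
        s = startTask(s, "WeChat:chat"); // first operation starts the second task
        System.out.println(s.isSplit()); // true
    }
}
```

The point of the sketch is that no intermediate "enter split-screen mode" user action appears between the two states; the task-start operation itself causes the transition.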
- The electronic device may display the display interface corresponding to the first task in full screen on the display screen.
- In this way, task-triggered split-screen display can be realized from the full-screen display mode, and full-screen mode can be switched to split-screen mode without additional manual operations by the user, which improves the intelligence of the interaction.
- The second task may be a task in a second application that is different from the first application. That is to say, the technical solution of the embodiment of the present invention can be applied to the scenario of cross-application split screen, which improves the user experience.
- Alternatively, the second task may be a task in the first application. That is to say, the technical solution can also be applied to the scenario of split screen within a single application, which further improves the intelligence of split-screen display and the user experience.
- When the electronic device starts the split-screen display mode in response to the first operation, it may generate at least two display windows on the display screen, the at least two display windows including a first display window and a second display window; the display interface corresponding to the first task is displayed in the first display window, and the display interface corresponding to the second task is displayed in the second display window. That is, when the second task is triggered, the electronic device can directly enter split-screen mode and display the display interface of the second task in the split-screen window, which improves the interaction efficiency of the electronic device.
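Generating the two display windows amounts to dividing the screen bounds. The following sketch shows one way this could be computed; `Rect` and `splitScreen` are assumed helper names for illustration, not APIs from the patent.

```java
// Illustrative sketch: divide the screen bounds side by side into a first
// display window (first task) and a second display window (second task).
class SplitWindows {
    record Rect(int left, int top, int right, int bottom) {}

    static Rect[] splitScreen(Rect screen) {
        int mid = (screen.left() + screen.right()) / 2;
        Rect first = new Rect(screen.left(), screen.top(), mid, screen.bottom());
        Rect second = new Rect(mid, screen.top(), screen.right(), screen.bottom());
        return new Rect[] { first, second };
    }

    public static void main(String[] args) {
        Rect[] w = splitScreen(new Rect(0, 0, 1080, 2340));
        System.out.println(w[0]); // Rect[left=0, top=0, right=540, bottom=2340]
        System.out.println(w[1]); // Rect[left=540, top=0, right=1080, bottom=2340]
    }
}
```

An equal left/right split is just one choice; the patent leaves the window geometry open, and a top/bottom or unequal division would work the same way.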
- Alternatively, when the electronic device starts the split-screen display mode in response to the first operation, it may generate a view of the display interface corresponding to the second task and display the view on the display interface corresponding to the first task; in response to a second operation on the view, the position of the split-screen display window is determined according to the second operation; the split-screen display window is displayed at that position, and the display interface corresponding to the second task is displayed in the split-screen display window.
- In other words, the display position of the split-screen window can be determined based on the received second operation; that is, the user can indicate the display position of the split-screen window by performing the second operation, which further improves the intelligence of the interaction and the user experience.
- The second operation may include multiple sub-operation steps, to avoid false triggering of the split-screen display and improve its reliability.
- The second operation may include a first sub-operation, which may be an operation of dragging the view, or a copy of the view, toward a preset orientation (for example, up, down, left, or right).
- The electronic device determines the preset orientation as the position of the split-screen display window and, after receiving the first sub-operation, displays the split-screen display window at that position.
- The second operation may also include a second sub-operation before the first sub-operation, which may be a single-finger long press or a two-finger long press on the view.
- That is, the user must long-press the view with one or two fingers before being able to drag it, which prevents the user from accidentally triggering the drag and improves the reliability of split-screen display.
- The split-screen mode can also be canceled, restoring the full-screen display interface of the first task.
- The second operation may further include a third sub-operation after the first sub-operation, which may be an operation of tapping the view.
- After receiving the third sub-operation, the electronic device displays the display interface corresponding to the second task in the split-screen display window. That is to say, after dragging the view to the specified position, the user also needs to tap the view to confirm that the display interface of the second task should be displayed in the split-screen window, which better prevents accidental triggering of the drag and improves the reliability of split-screen display.
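The three sub-operations above (second sub-operation: long press; first sub-operation: drag to a preset orientation; third sub-operation: tap to confirm) form a natural state machine. The following is an illustrative sketch; all names are assumptions introduced for the example.

```java
// Illustrative state machine for the three sub-operations: a long press arms
// the drag, the drag places the split-screen window, and a final tap confirms
// displaying the second task in it. Out-of-order events are ignored, which is
// what prevents accidental triggering.
class SplitGesture {
    enum Phase { IDLE, ARMED, PLACED, CONFIRMED }

    private Phase phase = Phase.IDLE;
    Phase phase() { return phase; }

    void onLongPress()    { if (phase == Phase.IDLE)   phase = Phase.ARMED; }     // second sub-operation
    void onDragToPreset() { if (phase == Phase.ARMED)  phase = Phase.PLACED; }    // first sub-operation
    void onTapView()      { if (phase == Phase.PLACED) phase = Phase.CONFIRMED; } // third sub-operation
    void cancel()         { phase = Phase.IDLE; } // abort restores full-screen display

    public static void main(String[] args) {
        SplitGesture g = new SplitGesture();
        g.onTapView(); // ignored: a stray tap alone cannot trigger split screen
        g.onLongPress(); g.onDragToPreset(); g.onTapView();
        System.out.println(g.phase()); // CONFIRMED
    }
}
```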
- An embodiment of the present invention further provides an electronic device including a display screen; one or more processors; a memory; multiple applications; and one or more computer programs, wherein the one or more computer programs are stored in the memory and include instructions.
- When the instructions are invoked and executed by the one or more processors, the electronic device is caused to execute the following steps: when a first application is running to perform a first task, display the display interface corresponding to the first task on the display screen; receive a first operation for starting a second task on the display interface corresponding to the first task; and, in response to the first operation, start the split-screen display mode.
- When the instructions are invoked and executed by the one or more processors, the electronic device may also be caused to display the display interface corresponding to the first task on the display screen in full screen.
- The second task is a task in the first application; or the second task is a task in a second application, the second application being different from the first application.
- When the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following steps: generate at least two display windows on the display screen, the at least two display windows including a first display window and a second display window; display the display interface corresponding to the first task in the first display window, and display the display interface corresponding to the second task in the second display window.
- When the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following steps: generate a view of the display interface corresponding to the second task and display the view on the display interface corresponding to the first task; in response to a second operation on the view, determine the position of the split-screen display window according to the second operation; display the split-screen display window at that position, and display the display interface corresponding to the second task in the split-screen display window.
- The second operation includes a first sub-operation, which is an operation of dragging the view or a copy of the view toward a preset orientation.
- When the instructions are invoked and executed by the one or more processors, the electronic device is caused to determine the preset orientation as the position of the split-screen display window and, after receiving the first sub-operation, to display the split-screen display window at that position.
- The second operation further includes a second sub-operation before the first sub-operation, which is a single-finger long press or a two-finger long press on the view.
- The second operation further includes a third sub-operation after the first sub-operation, which is an operation of tapping the view; when the instructions are invoked and executed by the one or more processors, the electronic device is also caused to perform the following step: after receiving the third sub-operation, display the display interface corresponding to the second task in the split-screen display window.
- An embodiment of the present invention provides an electronic device that includes modules/units for executing the method of the first aspect or any possible design of the first aspect; these modules/units can be implemented by hardware, or by hardware executing corresponding software.
- An embodiment of the present invention provides a computer storage medium including computer instructions which, when executed on an electronic device, cause the electronic device to execute the split-screen display method described in the first aspect or any possible design of the first aspect.
- An embodiment of the present invention provides a program product which, when run on a computer, causes the computer to execute the split-screen display method described in the first aspect or any possible design of the first aspect.
- An embodiment of the present invention provides a chip, coupled with a memory in an electronic device, which is used to call a computer program stored in the memory and execute the technical solution of the first aspect or any design of the first aspect; in the embodiments of the present invention, "coupled" means that two components are directly or indirectly combined with each other.
- An embodiment of the present invention provides a graphical user interface on an electronic device having a display screen, one or more memories, and one or more processors configured to execute one or more computer programs stored in the one or more memories; the graphical user interface includes the graphical user interface displayed when the electronic device executes the split-screen display method of the first aspect or any possible design of the first aspect.
- FIG. 1 is a schematic diagram of triggering an electronic device to enter split-screen mode in the prior art;
- FIG. 2 is a schematic diagram of the hardware structure of an electronic device in an embodiment of the present invention;
- FIG. 3 is a schematic diagram of the software structure of an electronic device in an embodiment of the present invention;
- FIG. 5 is a schematic diagram of a split-screen display solution in an embodiment of the present invention;
- FIG. 6 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 7 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 8A is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 8B is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 9 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 10 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 11 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 12 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 13 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 14 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 15 is a schematic diagram of another split-screen display solution in an embodiment of the present invention;
- FIG. 16 is a schematic diagram of the hardware structure of another electronic device in an embodiment of the present invention.
- the application (application, app for short) involved in the embodiments of the present invention is a software program that can implement one or more specific functions.
- multiple applications can be installed in a terminal device, for example, instant messaging applications, video applications, audio applications, image capturing applications, and so on.
- Instant messaging applications may include, for example, short message applications, WeChat, WhatsApp Messenger, Line, Instagram, Kakao Talk, DingTalk, etc.
- Image shooting applications may include, for example, camera applications (the system camera or third-party camera applications).
- Video applications may include, for example, YouTube, Twitter, Douyin, iQiyi, Tencent Video, etc.
- Audio applications may include Google Music, Kugou Music, Xiami, QQ Music, and so on.
- the applications mentioned in the following embodiments may be applications that have been installed when the terminal device leaves the factory, or may be applications downloaded from the network or obtained by other terminal devices when the user uses the terminal device.
- the split screen involved in the embodiment of the present invention refers to a technology in which a software system divides a physical screen into several display areas, which can display multiple application pages at the same time.
- the display window involved in the embodiment of the present invention also referred to herein as "application window”, “window”, etc., refers to a display area in which a display interface of an application is displayed.
- One application can correspond to one application window.
- An application window can be reduced or enlarged, so that the display interface in the application window can be reduced or enlarged.
- In some scenarios, the display screen of the electronic device may display only one window.
- The window may be displayed on the display screen in full screen or non-full screen, which is not limited in the embodiments of the present invention.
- In full-screen display, the window occupies the entire effective display area of the display screen (or the maximum display area that the system allows a window to occupy).
- In non-full-screen display, the window occupies only part of the effective display area of the display screen; that is, the display area occupied by the window is smaller than the maximum display area that the system allows the window to occupy.
- the display screen of an electronic device can also display multiple windows at the same time.
- the display area of the display screen is divided into several display areas, and one display area is a display window.
- the user interface of different applications can be displayed in different display windows.
- The split-screen window (or split-screen display window) involved in the embodiments of the present invention refers to the display window that newly appears after the split-screen display mode is triggered.
- For example, the electronic device initially displays application A in full screen, that is, the display screen has only one full-screen window displaying application A.
- After split screen is triggered, the display screen is divided into two display areas, which display application A and application B respectively; the display area corresponding to application B is then called the split-screen window.
- "At least one" in the embodiments of the present invention means one or more; "multiple" means two or more.
- References in this specification to "one embodiment", "some embodiments", etc. mean that one or more embodiments of the present invention include a particular feature, structure, or characteristic described in conjunction with that embodiment. Therefore, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless specifically emphasized otherwise.
- The terms "including", "having", and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
- FIG. 1 is an example of the electronic device entering the split screen mode in the prior art.
- the user browses "Taobao”
- he intends to share a certain product with his WeChat friends.
- Figure 1 (A) the user has copied the Taobao password of the product and intends to send it to his WeChat friends, but he does not want to quit the current one.
- You can manually start the split-screen mode and open the WeChat interface in the split-screen window to perform sharing operations.
- the user can trigger the electronic device to enter the split screen mode by long pressing the historical task key, and the display state after entering the split screen mode is as shown in Figure 1 (C).
- the screen is divided into two display windows from one display window. Taobao is displayed in the left window, and the desktop is displayed in the right window (ie split-screen window). Then the user selects "WeChat” on the desktop to display the WeChat interface in the right window.
- the user can then use "Taobao" to browse products and use "WeChat" to chat with friends, as shown in Figure 1 (D). However, this process requires the user to manually perform multiple operations to trigger the split-screen mode and to display the Taobao and WeChat interfaces in separate windows, which is very cumbersome.
- Another way to start the split-screen display is to start it by gesture; for example, sliding a knuckle across the center of the screen splits the screen in two.
- the display effect after the gesture is activated is similar to (C) in Figure 1.
- Then the user selects the application (such as WeChat) that needs to be displayed in the split-screen window (the desktop). It can be seen that starting the split-screen display through gestures also requires the user to manually perform fairly cumbersome operations, and it also requires the user to pay an additional learning cost.
- the electronic device needs to be manually triggered by the user to enter the split screen mode, and the operation process is relatively cumbersome.
- In addition, the split-screen mode in the prior art can only split the screen across applications (i.e., between two different applications), such as the aforementioned "Taobao" and "WeChat", which imposes certain functional limitations. Therefore, the split-screen display solution in the prior art has the problem of low intelligence.
- embodiments of the present invention provide a split-screen display method and electronic device.
- When the electronic device is running a first application to perform a first task, it displays the display interface of the first task (the first display interface) on the display screen.
- When the electronic device receives an operation to start a second task, it directly starts the split-screen display mode, divides the screen into multiple display windows, displays the first display interface in a first display window, and runs the second task and displays the display interface of the second task (the second display interface) in a second display window.
- The second task may be another task in the first application, or a task in another application, such as the second application, which is not limited here.
- the embodiments of the present invention can be applied to electronic devices having display screens, for example, portable electronic devices such as mobile phones, folding-screen phones, and tablet computers; non-portable electronic devices such as desktop computers and televisions; wearable electronic devices such as bracelets, watches, and wearable helmets; vehicle-mounted devices; and smart home devices (for example, televisions), which are not limited in the embodiments of the present invention.
- The following takes the electronic device being a mobile phone as an example.
- FIG. 2 shows a schematic structural diagram of the mobile phone.
- As shown in FIG. 2, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, etc.
- the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
- the processor 110 may include one or more processing units.
- the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
- the different processing units may be independent devices or integrated in one or more processors.
- the controller may be the nerve center and command center of the mobile phone 100. The controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
- a memory may also be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
- the memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and improves the efficiency of the system.
- the execution of the split-screen display method in the embodiment of the present invention may be controlled by the processor 110 or completed by calling other components, for example, calling the processing program of the embodiment of the present invention stored in the internal memory 121, or calling, through the external memory interface 120, the processing program of the embodiment of the present invention stored in a third-party device, to control the display screen 194 to perform the split-screen display operation.
- the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100.
- the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save pictures, videos and other files in an external memory card.
- the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
- the processor 110 executes various functional applications and data processing of the mobile phone 100 by running instructions stored in the internal memory 121.
- the internal memory 121 may include a storage program area and a storage data area.
- the storage program area can store an operating system, and software codes of at least one application (for example, an iQiyi application, a WeChat application, etc.).
- the data storage area can store data (such as images, videos, etc.) generated during the use of the mobile phone 100.
- the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
- the internal memory 121 may be used to store the computer executable program code of the split-screen display method proposed by the embodiment of the present invention, where the executable program code includes instructions.
- the processor 110 can run the computer executable program code of the split-screen display method stored in the internal memory 121, so that the mobile phone 100 can complete the split-screen display method proposed in the embodiment of the present invention.
- the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
- the USB interface 130 can be used to connect a charger to charge the mobile phone 100, and can also be used to transfer data between the mobile phone 100 and peripheral devices.
- the charging management module 140 is used to receive charging input from the charger.
- the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
- the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
- the wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
- the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
- Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
- the antenna can be used in combination with a tuning switch.
- the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the mobile phone 100.
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
- the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
- the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
- at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
- at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
- the wireless communication module 160 can provide wireless communication solutions applied to the mobile phone 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other wireless communication solutions.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
- the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves to radiate through the antenna 2.
- the antenna 1 of the mobile phone 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
- the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
- the display screen 194 can be used to display information input by the user or information provided to the user, as well as various menus of the mobile phone 100; it can also accept user input, such as a user's touch operation, and display the display interface of an application, etc.
- the display screen 194 includes a display panel.
- the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
- the display screen 194 may include a display panel and a touch panel.
- the display panel can be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
- the touch panel, also known as a touch screen or touch-sensitive screen, can collect the user's contact or non-contact operations on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory; the operations may also include somatosensory operations, and include single-point control operations, multi-point control operations, and other types of operations), and drive the corresponding connection device according to a preset program.
- the touch panel may include two parts: a touch detection device and a touch controller.
- the touch detection device detects the user's touch position and posture, and detects the signal brought by the input operation, and transmits the signal to the touch controller;
- the touch controller receives the touch information from the touch detection device, converts it into information that the processor can process, and then sends it to the processor 110; it can also receive and execute commands sent by the processor 110.
- the touch panel can be implemented using multiple types such as resistive, capacitive, infrared, and surface acoustic wave, and any technology developed in the future can also be used to implement the touch panel.
- the touch panel can cover the display panel, and the user can, according to the content displayed on the display panel (the displayed content includes but is not limited to a soft keyboard, a virtual mouse, virtual buttons, icons, etc.), perform an operation on or near the touch panel covering the display panel. After the touch panel detects the operation on or near it, it transmits the operation to the processor 110 to determine the user input, and then the processor 110 provides corresponding visual output on the display panel according to the user input.
- In the embodiment of the present invention, when the touch detection device in the touch panel detects a touch operation input by the user, it sends a signal corresponding to the detected touch operation to the touch controller in real time, and the touch controller converts the signal into contact coordinates and sends them to the processor 110. The processor 110 determines, according to the received contact coordinates, that the touch operation is specifically an operation to start the second task, and then, in response to the touch operation input by the user, starts the split-screen mode, divides the display area of the display screen 194 into multiple display windows (for example, a first display window and a second display window), starts the second task, displays the second task in the second display window, and switches the first task previously displayed in full screen to be displayed in the first display window.
- the mobile phone 100 may include one or N display screens 194, where N is a positive integer greater than 1.
- The N display screens 194 can be foldably or flexibly connected. When the multiple display screens 194 are folded, the electronic device is convenient to carry; when the display screens 194 are unfolded, the user can conveniently view content on a large screen, improving the user experience.
- the split-screen display method in the embodiment of the present invention can be applied to a single display screen alone, or to multiple display screens that, when unfolded, are connected to form a large screen as a whole.
- the camera 193 is used to capture still images or videos.
- the camera 193 may include a front camera and a rear camera.
- the mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
- the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
- the pressure sensor 180A may be provided on the display screen 194.
- the gyro sensor 180B may be used to determine the movement posture of the mobile phone 100.
- In some embodiments, the gyro sensor 180B can determine the angular velocity of the mobile phone 100 around three axes (i.e., the x, y, and z axes).
- the gyro sensor 180B can be used for image stabilization.
- the air pressure sensor 180C is used to measure air pressure.
- the mobile phone 100 uses the air pressure value measured by the air pressure sensor 180C to calculate the altitude to assist positioning and navigation.
- the magnetic sensor 180D includes a Hall sensor.
- the mobile phone 100 can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
- For example, when the mobile phone 100 is a flip phone, the mobile phone 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and further set features such as automatic unlocking upon flip opening according to the detected opening and closing state of the holster or flip cover.
- the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (generally three axes). When the mobile phone 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and apply to applications such as horizontal and vertical screen switching, pedometers and so on.
- the distance sensor 180F is used to measure distance, and the mobile phone 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the mobile phone 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
- the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
- the light emitting diode may be an infrared light emitting diode.
- the mobile phone 100 emits infrared light to the outside through the light emitting diode.
- the mobile phone 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100.
- When insufficient reflected light is detected, the mobile phone 100 can determine that there is no object near the mobile phone 100.
- the mobile phone 100 can use the proximity light sensor 180G to detect that the user holds the mobile phone 100 close to the ear to talk, so as to automatically turn off the screen to save power.
- the proximity light sensor 180G can also be used in the leather case mode and the pocket mode to automatically unlock and lock the screen.
- the ambient light sensor 180L is used to sense the brightness of the ambient light.
- the mobile phone 100 can adaptively adjust the brightness of the display 194 according to the perceived brightness of the ambient light.
- the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
- the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the mobile phone 100 is in the pocket to prevent accidental touch.
- the fingerprint sensor 180H is used to collect fingerprints.
- the mobile phone 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
- the temperature sensor 180J is used to detect temperature.
- the mobile phone 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 performs a reduction in the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
- In some embodiments, when the temperature is lower than another threshold, the mobile phone 100 heats the battery 142 to avoid an abnormal shutdown of the mobile phone 100 due to low temperature.
- In some other embodiments, the mobile phone 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
- the touch sensor 180K is also called a "touch panel".
- the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
- the touch sensor 180K is used to detect touch operations acting on or near it.
- the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
- the visual output related to the touch operation can be provided through the display screen 194.
- the touch sensor 180K may also be disposed on the surface of the mobile phone 100, at a position different from that of the display screen 194.
- the bone conduction sensor 180M can acquire vibration signals.
- the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone mass of the human voice.
- the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
- the button 190 includes a power-on button, a volume button, and so on.
- the button 190 may be a mechanical button. It can also be a touch button.
- the mobile phone 100 can receive key input, and generate key signal input related to user settings and function control of the mobile phone 100.
- the motor 191 can generate vibration prompts.
- the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback. For example, touch operations that act on different applications (such as photographing, audio playback, etc.) can correspond to different vibration feedback effects.
- the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
- the SIM card interface 195 is used to connect to the SIM card.
- the SIM card can be connected to and separated from the mobile phone 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195.
- the components shown in Figure 2 do not constitute a specific limitation on the mobile phone.
- the mobile phone may also include more or fewer components than those shown in the figure, or combine some components, or split some components, or arrange the components differently.
- the mobile phone 100 shown in FIG. 2 is taken as an example for introduction.
- the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the mobile phone 100.
- the mobile phone 100 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
- the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
- the hardware structure of the mobile phone 100 is described above, and the software structure of the mobile phone 100 is described below.
- the software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
- the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 100.
- the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
- the Android system is divided into four layers, which from top to bottom are the application (Applications) layer (referred to as the "application layer" in this article), the application framework (Application Framework) layer (referred to as the "framework layer" in this article), the Android runtime and system library layer (referred to as the "system runtime library layer" in this article), and the kernel layer.
- there is at least one application program running in the application program layer.
- these applications can be window programs, system setting programs, contact programs, SMS programs, clock programs, camera applications, etc. included in the operating system, or applications developed by third-party developers, such as instant messaging programs, photo beautification programs, game programs, etc.
- the application package in the application layer is not limited to the above examples, and may actually include other application packages, which is not limited in the embodiment of the present invention.
- the view tree (ViewTree) in the application layer is the view structure in the application interface.
- one display interface in an application can correspond to one ViewTree.
- In the embodiment of the present invention, the developer can mark, in the ViewTree corresponding to a display interface of an application, the View controls (such as Buttons, ImageViews, etc.) that respond to dragging to form a split screen; for example, the draggable WeChat interface view in Figure 13 is a marked View control.
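The marking mechanism above can be pictured with a small framework-free sketch. Note that `ViewNode`, the `splitDraggable` flag, and `collectDraggable` are illustrative names assumed here, not the actual Android ViewTree API:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for a node in an application's view tree.
class ViewNode {
    final String name;
    final boolean splitDraggable;  // developer's mark: dragging this control forms a split screen
    final List<ViewNode> children = new ArrayList<>();

    ViewNode(String name, boolean splitDraggable) {
        this.name = name;
        this.splitDraggable = splitDraggable;
    }

    ViewNode add(ViewNode child) {
        children.add(child);
        return this;
    }

    // Walks the tree and collects every control marked as split-draggable,
    // which is what the system must know before reacting to a drag gesture.
    static List<String> collectDraggable(ViewNode root) {
        List<String> marked = new ArrayList<>();
        if (root.splitDraggable) marked.add(root.name);
        for (ViewNode c : root.children) marked.addAll(collectDraggable(c));
        return marked;
    }
}
```

A tree with two marked controls would yield exactly those two names, in tree order, when collected.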
- the system user interface (system user interface, SystemUI) is a system-level UI component that has system-level global permissions.
- SystemUI includes a drag starter (DragStarter). DragStarter is used to process the response logic of the user's drag gesture and to decide in which orientation (up, down, left, or right) to start a new split-screen window.
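The orientation decision DragStarter makes can be sketched as a pure function of the drag vector. This is a simplified assumption (dominant axis plus sign); the text does not specify the real decision rule:

```java
// Which edge of the screen the new split-screen window should open on.
enum SplitDirection { UP, DOWN, LEFT, RIGHT }

class DragStarter {
    // Decide the orientation from the drag displacement (dx, dy).
    // Screen coordinates are assumed: x grows rightward, y grows downward.
    static SplitDirection decide(float dx, float dy) {
        if (Math.abs(dx) >= Math.abs(dy)) {
            return dx >= 0 ? SplitDirection.RIGHT : SplitDirection.LEFT;
        }
        return dy >= 0 ? SplitDirection.DOWN : SplitDirection.UP;
    }
}
```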
- the framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions.
- the application framework layer is equivalent to a processing center, which decides to let applications in the application layer take actions.
- In the embodiment of the present invention, an open API can be added to the framework layer for upper-layer applications to call. For example, the function setLaunchSplitScreenLocation(int location) can be added to the original activity option (ActivityOption, a parameter) of the unified control center of the split-screen window.
- The supported values of location include LEFT (left), RIGHT (right), UP (up), and BOTTOM (down).
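How an upper-layer application might use such an API can be sketched as follows. Only the setLaunchSplitScreenLocation(int) signature and the four location values come from the text; the ActivityOption class body, the integer constants, and the default value are illustrative assumptions:

```java
// Stand-in for the activity-option parameter extended with the new split-screen API.
class ActivityOption {
    // The four supported launch locations named by the framework-layer API (values assumed).
    static final int LEFT = 0, RIGHT = 1, UP = 2, BOTTOM = 3;

    private int launchSplitScreenLocation = RIGHT;  // assumed default

    // API from the text: lets the caller choose where the split-screen window opens.
    void setLaunchSplitScreenLocation(int location) {
        if (location < LEFT || location > BOTTOM) {
            throw new IllegalArgumentException("unknown location: " + location);
        }
        launchSplitScreenLocation = location;
    }

    int getLaunchSplitScreenLocation() {
        return launchSplitScreenLocation;
    }
}
```

An application would set the location on the option object before handing it to the framework when starting the second task.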
- the application framework layer in the embodiment of the present invention includes an activity manager service (AMS) module, a window manager service (WMS) module, a split-screen window unified control center (MultiWinSupervisor), an input manager service (InputManagerService, IMS), and so on.
- Input management is used for input monitoring and processing of touch events.
- the unified control center for split-screen windows is used to coordinate the scheduling of split-screen windows, respond upward to the call instructions of the application layer, and issue call instructions to the lower-layer system services (such as AMS, WMS, etc.). For example, when it detects that a marked View control is dragged, it triggers the execution of split-screen-related instructions by AMS, WMS, etc., thereby achieving passive triggering of the split screen. For another example, when an upper-layer application actively calls the ActivityOption interface, it triggers the execution of split-screen-related instructions by AMS, WMS, etc., thereby achieving active triggering of the split screen.
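The two trigger paths, passive (a marked View control is dragged) and active (an application calls the ActivityOption interface), can be modeled as one dispatcher that issues instructions downward to AMS and WMS. All class and method names here are illustrative stand-ins, not the actual framework symbols:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of the split-screen window unified control center.
class MultiWinSupervisor {
    final List<String> issued = new ArrayList<>();  // instructions sent to lower-layer services

    // Passive trigger: a View control marked for split screen was dragged.
    void onMarkedViewDragged(String viewName) {
        splitScreen("drag:" + viewName);
    }

    // Active trigger: an upper-layer app called the ActivityOption interface.
    void onActivityOptionCall(String app) {
        splitScreen("api:" + app);
    }

    // Both paths converge here: issue split-screen instructions to AMS and WMS.
    private void splitScreen(String reason) {
        issued.add("AMS.startActivityInSplitWindow(" + reason + ")");
        issued.add("WMS.addSplitWindow(" + reason + ")");
    }
}
```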
- the activity management module is used to: manage the life cycle of each application and the usual navigation back functions, such as controlling the exit, opening, and back of the application.
- the window management module is used to manage and draw all window interfaces, such as controlling the display size, position, and level of the window.
- the framework layer may also include functional services, such as content provision, telephone management, resource management, notification management, etc., which are not limited in the embodiment of the present invention.
- the system runtime layer provides support for the upper layer, that is, the framework layer.
- the Android operating system will run the C/C++ library contained in the system runtime layer to realize the functions of the framework layer.
- the kernel layer is the layer between hardware and software. As shown in Figure 3, the kernel layer contains at least the touch screen driver. Of course, during specific implementation, the kernel layer may also include other drivers, such as camera drivers, audio drivers, etc., which are not limited in the embodiment of the present invention.
- the split-screen display method provided by the embodiment of the present invention includes:
- the mobile phone displays the display interface of the first task on the display screen and receives a first operation to start a second task, where the display interface corresponding to the second task is different from the display interface corresponding to the first task.
- In response to the first operation, the mobile phone starts the split-screen display mode, divides the screen into multiple display windows, displays the display interface corresponding to the first task in a first display window, and starts the second task and displays the display interface corresponding to the second task in a second display window.
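The two steps above can be condensed into a small state model. ScreenController and its members are illustrative names; the logic follows the description: the first task is shown full screen, and the operation starting the second task switches directly to split-screen mode:

```java
// Minimal model of the split-screen transition described in the method.
class ScreenController {
    enum Mode { FULL_SCREEN, SPLIT }

    Mode mode = Mode.FULL_SCREEN;
    String firstWindow;   // display interface shown in the first display window
    String secondWindow;  // display interface shown in the second display window

    // Step 1: run the first task and show its interface full screen.
    void showFirstTask(String firstInterface) {
        mode = Mode.FULL_SCREEN;
        firstWindow = firstInterface;
        secondWindow = null;
    }

    // Step 2: in response to the first operation, enter split-screen mode directly:
    // the first task keeps the first window, and the second task's interface
    // appears in the second window.
    void onStartSecondTask(String secondInterface) {
        mode = Mode.SPLIT;
        secondWindow = secondInterface;
    }
}
```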
- Before the first operation is received, the display screen may display the display interface corresponding to the first task in full screen (also referred to herein as the first display interface; for example, see (A) in FIG. 5), where the first task is running and displaying the product browsing interface in the Taobao application.
- the first operation may be an input operation performed by the user, and the type of the input operation may be touch input, voice input, somatosensory input, hovering input, etc., which is not limited here.
- the second task may be an associated task triggered during the execution of a task by the first application, such as a WeChat sharing task triggered in Taobao, or a task triggered separately by the user, for example, the user inputs the voice command "open WeChat" while browsing Taobao; the embodiment of the present invention does not restrict this.
- the following description mainly takes, as an example for detailed introduction, the case where the second task is an associated task triggered during the execution of a task by the first application.
- the touch panel in the display screen detects the signal generated by the touch operation input by the user, and the sensor converts the detected signal into information that can be processed by the processor and passes it to the processor. The kernel layer running in the processor generates position data corresponding to the operation based on this information (specifically, this may include contact coordinates and the timestamps corresponding to the contact coordinates), and uploads the collected finger position data to the framework layer. The IMS in the framework layer determines the first operation performed by the user (for example, the user clicks the "Go to WeChat and paste the Taobao password" control shown in (A) of FIG. 5), whose intention is to start the second task (that is, to start WeChat). The IMS reports the event of starting the second task to the application in the application layer (that is, Taobao), and the application responds to the user's operation by starting the task of opening WeChat.
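The reporting chain above (touch driver, kernel position data, framework-layer IMS, application) can be sketched as a small pipeline. The function names and the control id `share_to_wechat` are hypothetical stand-ins, not Android internals:

```python
# Illustrative sketch of the event chain described above:
# touch signal -> kernel position data -> IMS hit test -> application.

def kernel_layer(raw_signal):
    # Kernel turns the raw touch signal into position data.
    return {"x": raw_signal[0], "y": raw_signal[1], "timestamp": raw_signal[2]}

def input_manager_service(event, hit_test):
    # IMS decides which control the contact hit and reports it upward.
    control = hit_test(event["x"], event["y"])
    return {"control": control, "event": event}

def application(report):
    # The app interprets the tapped control as "start the second task".
    if report["control"] == "share_to_wechat":
        return "start:WeChat"
    return None

hit = lambda x, y: "share_to_wechat" if (x, y) == (100, 200) else "other"
result = application(input_manager_service(kernel_layer((100, 200, 1.0)), hit))
assert result == "start:WeChat"
```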
- the number of divided windows can be two.
- the display screen is divided into two display windows arranged left and right.
- the first display window on the left displays the display interface corresponding to the first task (the Taobao product browsing page), and the second display window on the right displays the display interface corresponding to the second task (the WeChat page).
- the positions of the first display window and the second display window can be exchanged.
- the second display window on the left displays the display interface corresponding to the second task, and the first display window on the right displays the display interface corresponding to the first task.
- the Taobao interface and WeChat interface in the drawings are only used to illustrate the first task and the second task. In specific implementation, the first task and the second task may also be tasks in other applications.
- the first display window and the second display window may be arranged not only left and right, as shown in FIG. 5, but also top and bottom.
- the display screen is divided into upper and lower windows, with the first display window on the top and the second display window on the bottom.
- the positions of the first display window and the second display window in FIG. 6 can also be interchanged.
- the shape and size of the first display window and the second display window may be the same, as shown in FIG. 5 and FIG. 6, for example.
- the shape and size of the first display window and the second display window can also be different.
- the area of the second display window is smaller than the area of the first display window, so that the user can view and use the first display window more conveniently.
- the user may set in the system in advance the positional relationship and size relationship that the first display window and the second display window will have after the mobile phone enters split-screen mode.
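The layout choices discussed above (left/right or top/bottom arrangement, equal or unequal window sizes) amount to computing two window rectangles from the screen size. The function below, with its parameters and the example screen dimensions, is an assumption for illustration only:

```python
# Sketch of the split-screen window geometry described above.

def split_windows(width, height, arrangement="left_right", first_ratio=0.5):
    """Return (first_window, second_window) rects as (x, y, w, h)."""
    if arrangement == "left_right":
        w1 = int(width * first_ratio)
        return (0, 0, w1, height), (w1, 0, width - w1, height)
    elif arrangement == "top_bottom":
        h1 = int(height * first_ratio)
        return (0, 0, width, h1), (0, h1, width, height - h1)
    raise ValueError(arrangement)

# Equal left/right split, as in FIG. 5 ...
a, b = split_windows(1080, 2340)
assert a == (0, 0, 540, 2340) and b == (540, 0, 540, 2340)
# ... or a larger first window, so it is easier to view and use.
a, b = split_windows(1080, 2340, "top_bottom", first_ratio=2 / 3)
assert a[3] > b[3]
```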
- the number of divided windows can also be greater than two.
- the display screen is divided into three display windows, where the third display window is used to display the system desktop, so that the user can also monitor and operate the desktop at the same time.
- the third display window can also be an interface that displays other applications.
- the third display window displays the interface of the video app "iQIYI", so that the user can use "iQIYI", "Taobao", and "WeChat" at the same time.
- the first display window, the second display window, and the third display window in the embodiment of the present invention can also be arranged in other ways, for example, top and bottom, or in other combined arrangements, which are not limited here.
- the shapes and sizes of the first display window, the second display window, and the third display window may be the same or different, and there is no limitation here.
- the second task and the first task may be tasks in different applications, that is, cross-application split-screen display scenes.
- the application corresponding to the first task is Taobao
- the application corresponding to the second task is WeChat.
- the first task and the second task may be tasks in the same application, that is, an in-application split-screen display scene.
- a split-screen startup function can be added to the first application.
- a split-screen display menu option can be added to the control used to trigger the second task, and this option binds the function of triggering the second task with the function of starting split-screen mode, so that by clicking the split-screen display menu option the user simultaneously starts the split-screen display mode and triggers the second task.
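The binding just described (one menu option that both enters split-screen mode and triggers the second task) can be sketched as follows. The option names and callback shapes are hypothetical, not an Android or application API:

```python
# Sketch of binding a "split-screen" menu option to the control that
# triggers the second task: one tap performs both functions.

def on_menu_option(option, start_task, enter_split_screen):
    if option == "open_in_split_screen":
        enter_split_screen()  # bound split-screen-mode function
        start_task()          # bound second-task trigger
    elif option == "open":
        start_task()          # normal, non-split behavior

calls = []
on_menu_option("open_in_split_screen",
               start_task=lambda: calls.append("task"),
               enter_split_screen=lambda: calls.append("split"))
assert calls == ["split", "task"]
```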
- the display screen may also display the display interface corresponding to the first task in a non-full screen.
- if, before receiving the first operation, the display screen is already in split-screen mode and already shows two or more display windows, then after the mobile phone receives the first operation, in one possible design the mobile phone can further split the display area that corresponded to the first task before the first operation was received, and display the interface of the first task and the interface of the second task in that further-divided area. For example, referring to (A) in FIG. 11, the mobile phone display screen is divided into a fourth display window and a fifth display window; the fourth display window displays the desktop, and the fifth display window displays the Taobao interface.
- after the user clicks "Go to WeChat to paste to a friend" in the fifth display window (the first operation), the mobile phone further divides the area where the fifth display window is located into the first display window and the second display window, and then displays the WeChat interface in the second display window.
- the mobile phone may also adjust the fourth display window, such as adaptively adjusting the size or position of the fourth display window.
- in another possible design, after the mobile phone receives the first operation, the display interface of the second task can be displayed in a display window other than the one corresponding to the first task, or the entire display screen can be divided again.
- the mobile phone display screen is divided into a fourth display window and a fifth display window.
- the fourth display window displays the desktop
- the fifth display window displays the Taobao interface
- after the user clicks "Go to WeChat and paste to a friend" in the fifth display window (the first operation), the mobile phone replaces the desktop displayed in the fourth display window with the display interface corresponding to the second task, that is, the WeChat interface (or, equivalently, replaces the fourth display window with the second display window, the second display window being the window displaying the WeChat interface).
- after replacing the desktop displayed in the fourth display window with the display interface corresponding to the second task, that is, the WeChat interface, the mobile phone can also adjust the fourth display window and the fifth display window, for example, adaptively adjusting their size or position.
- the embodiment of the present invention also provides a solution in which the user controls the display position of the split-screen window. Specifically, after the mobile phone receives the first operation and starts the split-screen display mode, it first pops up a view associated with the second task on the first display interface (for example, a thumbnail of the second display interface). The user can then perform a drag operation on the view, moving it toward the top/bottom/left/right of the screen, and the position of the split-screen window (the second display window) is determined according to the direction of the view's movement.
- the touch panel in the display screen detects the signal brought by the touch operation input by the user, and the sensor converts the detected signal into information that can be processed by the processor and transmits it to the processor.
- the kernel layer running in the processor generates the position data corresponding to the operation based on this information (specifically, this may include the contact coordinates and the timestamps corresponding to the contact coordinates); the kernel layer uploads the collected finger position data to the framework layer, and the IMS in the framework layer determines that the operation is a preset gesture operation (such as dragging) and reports the gesture to the DragStarter in the SystemUI in the application layer.
- DragStarter processes the response logic of the user's drag gesture, decides in which direction (up, down, left, or right) to open the new split-screen window, and issues instructions to the unified split-screen window control center to control the split-screen display; in this way, split-screen display can be passively triggered based on user operations.
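The direction decision DragStarter is described as making, from the drag displacement to one of UP/BOTTOM/LEFT/RIGHT, can be sketched with a simple dominant-axis heuristic. The heuristic and function name are assumptions; the patent does not specify the actual algorithm:

```python
# Sketch of choosing the split-screen side from a drag gesture, as
# described for DragStarter above. Screen y grows downward.

def split_direction(start, end):
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):              # dominant horizontal movement
        return "RIGHT" if dx > 0 else "LEFT"
    return "BOTTOM" if dy > 0 else "UP"  # dominant vertical movement

assert split_direction((500, 500), (900, 520)) == "RIGHT"   # drag right
assert split_direction((500, 500), (480, 900)) == "BOTTOM"  # drag down
assert split_direction((500, 500), (450, 100)) == "UP"      # drag up
```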
- the application developer of the first application can specify the view (View) to be bound to the split screen event through the layout (Layout) configuration file or API call.
- the mobile phone determines that the second task needs to be started after the first operation is received.
- the user can drag it to a specified position (supported positions: left-LEFT, right-RIGHT, up-UP, and down-BOTTOM), and the phone forms a split-screen window (the second display window) at the specified position and displays the display interface of the second task in the split-screen display window.
- alternatively, the mobile phone may form the split-screen window in the designated orientation and then, after receiving the user's operation of clicking the view, display the display interface of the second task in the split-screen display window.
- the judgment algorithm by which the mobile phone determines where to open the split-screen window is as follows:
- the mobile phone responds to the first operation and starts the split screen mode.
- a thumbnail pops up on the Taobao interface, as shown in Figure 13(A).
- the user can drag the thumbnail, for example to the right as shown in (B) in FIG. 13, downward as shown in (C) in FIG. 13, upward as shown in (D) in FIG. 13, or to the left as shown in (E) in FIG. 13, and so on.
- the mobile phone determines the position of the split-screen window according to the moving direction of the view.
- when the moving direction is downward, the display screen is divided into two windows arranged top and bottom, with the second display window (displaying the second task, WeChat) on the bottom and the first display window (displaying the first task, Taobao) on the top.
- the positions of the split-screen display windows of other drag methods are also similar, and will not be repeated here.
- a copy (shadow) of the view that can be dragged by the user is generated.
- the user can then drag the view copy to the specified position to achieve the effect of displaying the second task on a split screen. If the user instead clicks on the view after it is displayed, the mobile phone opens the new page normally (that is, it exits the interface of the first task and displays the interface of the second task in a full-screen overlay). In this way, after performing the first operation, the user can choose whether to use split-screen display according to need, which can improve user experience.
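The two outcomes just described, dragging the popped-up view versus clicking it, can be sketched as a small dispatcher. All names are illustrative assumptions:

```python
# Sketch of the gesture handling described above: drag -> split-screen
# window on the chosen side; click -> open the second task full screen.

def handle_view_gesture(gesture, direction=None):
    if gesture == "drag":
        # Split screen: first task keeps a window, second task opens beside it.
        return {"mode": "split", "second_window": direction}
    if gesture == "click":
        # Normal navigation: second task replaces the first, full screen.
        return {"mode": "fullscreen", "second_window": None}
    raise ValueError(gesture)

assert handle_view_gesture("drag", "RIGHT") == {"mode": "split", "second_window": "RIGHT"}
assert handle_view_gesture("click")["mode"] == "fullscreen"
```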
- the initial display position of the view may also be at the upper/lower/left/right edge position of the first display interface, for example, as shown in FIG. 14. In this way, the occlusion of the main display content in the first display interface by the view can be further reduced.
- the display windows of the entire display screen can be re-divided according to the user's drag operation, and the two newly divided display windows display, respectively, the display interface of the first task (or the display interface that last received a user input operation, namely the Taobao interface) and the display interface corresponding to the second task, as shown in FIG. 15.
- the effect of flexibly updating the split-screen display window according to user needs can be achieved, and user experience can be further improved.
- the area of the view does not exceed a set threshold area, for example, does not exceed one-third, one-fifth, etc. of the area of the first display interface. In this way, the occlusion of the content of the first display interface by the view can be reduced, and the user experience can be improved.
- the view may be displayed in a semi-transparent manner, which can further reduce the occlusion of the main display content in the first display interface by the view and improve the visual effect.
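The thumbnail-size constraint above, keeping the view's area under a set fraction of the first display interface, is a one-line check. The threshold values and function name are assumptions for illustration:

```python
# Sketch of the view-area threshold described above (e.g. at most 1/3
# or 1/5 of the first display interface, to reduce occlusion).

def fits_threshold(view_w, view_h, screen_w, screen_h, max_fraction=1 / 3):
    return (view_w * view_h) <= max_fraction * (screen_w * screen_h)

assert fits_threshold(360, 640, 1080, 2340)        # small thumbnail: allowed
assert not fits_threshold(1080, 1600, 1080, 2340)  # too large: rejected
```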
- the method provided by the embodiment of the present invention is introduced from the perspective of the electronic device (mobile phone 100) as the execution subject.
- the terminal device may include a hardware structure and/or a software module, and realize the above-mentioned functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a certain one of the above-mentioned functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraints of the technical solution.
- an embodiment of the present invention also provides an electronic device 1600 for executing the steps of the split-screen display method in the foregoing embodiment of the present invention.
- the electronic device 1600 includes: a display screen 1601; one or more processors 1602; a memory 1603; a plurality of applications; and one or more computer programs; wherein the one or more computer programs are stored in the memory, the one or more computer programs include instructions, and when the instructions are invoked and executed by the one or more processors, the electronic device implements the above-mentioned split-screen display method in the embodiment of the present invention.
- the processor 1602 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
- the methods, steps, and logical block diagrams disclosed in the embodiments of the present invention can be implemented or executed.
- the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
- the steps of the method disclosed in combination with the embodiments of the present invention may be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and software modules in the decoding processor.
- the software module may be located in a storage medium mature in the field, such as random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory, electrically erasable programmable memory, or a register.
- the storage medium is located in the memory, and the processor reads the instructions in the memory and completes the steps of the above method in combination with its hardware.
- the processor 1602 may be the processor 110
- the display screen 1601 may be the display screen 194
- the memory 1603 may be the internal memory 121.
- an embodiment of the present invention also provides an electronic device, which includes modules/units that execute the above-mentioned split-screen display method of the embodiments of the present invention; these modules/units can be implemented by hardware, or by hardware executing corresponding software.
- an embodiment of the present invention also provides a computer storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to execute the above-mentioned split-screen display method in the embodiment of the present invention.
- the embodiment of the present invention also provides a program product, which when the program product runs on a computer, causes the computer to execute the above-mentioned split-screen display method in the embodiment of the present invention.
- an embodiment of the present invention also provides a chip, which is coupled with a memory in an electronic device, and is used to call a computer program stored in the memory and execute the above-mentioned split-screen display method in the embodiment of the present invention;
- “coupled” means that two components are directly or indirectly combined with each other.
- the embodiments of the present invention also provide a graphical user interface on an electronic device, the electronic device having a display screen, one or more memories, and one or more processors, the one or more processors being configured to execute one or more computer programs stored in the one or more memories, and the graphical user interface including the graphical user interface displayed when the electronic device executes the above-mentioned split-screen display method in the embodiment of the present invention.
- the term "when" can be interpreted as meaning "if", "after", "in response to determining", or "in response to detecting".
- the phrase "when determining ..." or "if (a stated condition or event) is detected" can be interpreted as meaning "if it is determined ...", "in response to determining ...", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)".
- the computer program product includes one or more computer instructions.
- the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
- the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
- the computer instructions may be transmitted from a website, computer, server, or data center.
- the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
- the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Claims (19)
- A split-screen display method, characterized in that the method comprises: when running a first application to execute a first task, displaying a display interface corresponding to the first task on a display screen; receiving, on the display interface corresponding to the first task, a first operation for starting a second task; and in response to the first operation, starting a split-screen display mode.
- The method according to claim 1, characterized in that displaying the display interface corresponding to the first task on the display screen when running the first application to execute the first task comprises: displaying the display interface corresponding to the first task on the display screen in full screen.
- The method according to claim 1, characterized in that the second task is a task in the first application; or the second task is a task in a second application, and the second application is different from the first application.
- The method according to claim 1, characterized in that starting the split-screen display mode in response to the first operation comprises: generating at least two display windows on the display screen, the at least two display windows including a first display window and a second display window; displaying the display interface corresponding to the first task in the first display window, and displaying a display interface corresponding to the second task in the second display window.
- The method according to claim 1, characterized in that starting the split-screen display mode in response to the first operation comprises: generating a view of a display interface corresponding to the second task, and displaying the view on the display interface corresponding to the first task; in response to a second operation on the view, determining a split-screen display window position according to the second operation; and displaying a split-screen display window at the split-screen display window position, and displaying the display interface corresponding to the second task in the split-screen display window.
- The method according to claim 5, characterized in that the second operation comprises a first sub-operation, the first sub-operation being an operation of dragging the view or a copy of the view to a preset orientation; determining the split-screen display window position according to the second operation comprises: determining the preset orientation as the split-screen display window position; and the method further comprises: after receiving the first sub-operation, displaying the split-screen display window at the split-screen display window position.
- The method according to claim 6, characterized in that the second operation further comprises a second sub-operation preceding the first sub-operation, the second sub-operation being a one-finger long-press or two-finger long-press operation on the view.
- The method according to claim 6, characterized in that the second operation further comprises a third sub-operation following the first sub-operation, the third sub-operation being an operation of clicking the view; and the method further comprises: after receiving the third sub-operation, displaying the display interface corresponding to the second task in the split-screen display window.
- An electronic device, characterized by comprising: a display screen; one or more processors; a memory; a plurality of applications; and one or more computer programs; wherein the one or more computer programs are stored in the memory, the one or more computer programs include instructions, and when the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following steps: when running a first application to execute a first task, displaying a display interface corresponding to the first task on the display screen; receiving, on the display interface corresponding to the first task, a first operation for starting a second task; and in response to the first operation, starting a split-screen display mode.
- The electronic device according to claim 9, characterized in that when the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following step: displaying the display interface corresponding to the first task on the display screen in full screen.
- The electronic device according to claim 9, characterized in that the second task is a task in the first application; or the second task is a task in a second application, and the second application is different from the first application.
- The electronic device according to claim 9, characterized in that when the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following steps: generating at least two display windows on the display screen, the at least two display windows including a first display window and a second display window; displaying the display interface corresponding to the first task in the first display window, and displaying a display interface corresponding to the second task in the second display window.
- The electronic device according to claim 9, characterized in that when the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following steps: generating a view of a display interface corresponding to the second task, and displaying the view on the display interface corresponding to the first task; in response to a second operation on the view, determining a split-screen display window position according to the second operation; and displaying a split-screen display window at the split-screen display window position, and displaying the display interface corresponding to the second task in the split-screen display window.
- The electronic device according to claim 13, characterized in that the second operation comprises a first sub-operation, the first sub-operation being an operation of dragging the view or a copy of the view to a preset orientation; when the instructions are invoked and executed by the one or more processors, the electronic device is caused to perform the following step: determining the preset orientation as the split-screen display window position; and when the instructions are invoked and executed by the one or more processors, the electronic device is further caused to perform the following step: after receiving the first sub-operation, displaying the split-screen display window at the split-screen display window position.
- The electronic device according to claim 14, characterized in that the second operation further comprises a second sub-operation preceding the first sub-operation, the second sub-operation being a one-finger long-press or two-finger long-press operation on the view.
- The electronic device according to claim 14, characterized in that the second operation further comprises a third sub-operation following the first sub-operation, the third sub-operation being an operation of clicking the view; and when the instructions are invoked and executed by the one or more processors, the electronic device is further caused to perform the following step: after receiving the third sub-operation, displaying the display interface corresponding to the second task in the split-screen display window.
- A computer storage medium, characterized by comprising computer instructions which, when run on an electronic device, cause the electronic device to execute the split-screen display method according to any one of claims 1 to 8.
- A program product, characterized in that when the program product runs on a computer, the computer is caused to execute the split-screen display method according to any one of claims 1 to 8.
- A graphical user interface on an electronic device, characterized in that the electronic device has a display screen, one or more memories, and one or more processors, the one or more processors being configured to execute one or more computer programs stored in the one or more memories, and the graphical user interface comprises the graphical user interface displayed when the electronic device executes the split-screen display method according to any one of claims 1 to 8.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022519756A JP2022549945A (ja) | 2019-09-30 | 2020-07-16 | 分割スクリーン表示方法及び電子デバイス |
US17/764,426 US20220357845A1 (en) | 2019-09-30 | 2020-07-16 | Split-screen display method and electronic device |
KR1020227011757A KR20220058953A (ko) | 2019-09-30 | 2020-07-16 | 화면 분할 디스플레이 방법 및 전자 디바이스 |
CN202080034954.8A CN113811844A (zh) | 2019-09-30 | 2020-07-16 | 一种分屏显示方法与电子设备 |
EP20871830.4A EP4024183B1 (en) | 2019-09-30 | 2020-07-16 | Method for split-screen display and electronic apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910938898.XA CN110865744B (zh) | 2019-09-30 | 2019-09-30 | 一种分屏显示方法与电子设备 |
CN201910938898.X | 2019-09-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021063074A1 true WO2021063074A1 (zh) | 2021-04-08 |
Family
ID=69652287
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/102488 WO2021063074A1 (zh) | 2019-09-30 | 2020-07-16 | 一种分屏显示方法与电子设备 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220357845A1 (zh) |
EP (1) | EP4024183B1 (zh) |
JP (1) | JP2022549945A (zh) |
KR (1) | KR20220058953A (zh) |
CN (2) | CN110865744B (zh) |
WO (1) | WO2021063074A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114690977A (zh) * | 2021-04-22 | 2022-07-01 | 广州创知科技有限公司 | 一种基于弹性波的交互唤起方法及装置 |
USD969828S1 (en) * | 2021-01-12 | 2022-11-15 | Beijing Zitiao Network Technology Co., Ltd. | Display screen or portion thereof with a graphical user interface |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110865744B (zh) * | 2019-09-30 | 2021-12-14 | 华为技术有限公司 | 一种分屏显示方法与电子设备 |
CN113497835B (zh) * | 2020-04-01 | 2023-10-20 | 华为技术有限公司 | 多屏交互方法、电子设备及计算机可读存储介质 |
CN111694475B (zh) * | 2020-04-27 | 2022-04-22 | 华为技术有限公司 | 终端控制方法、装置及终端设备 |
CN111897418B (zh) * | 2020-07-02 | 2021-08-17 | 珠海格力电器股份有限公司 | 一种分屏显示方法、装置、设备及存储介质 |
CN114363462B (zh) * | 2020-09-30 | 2023-01-06 | 华为技术有限公司 | 一种界面显示方法、电子设备及计算机可读介质 |
CN114327666B (zh) * | 2020-09-30 | 2024-04-09 | 华为技术有限公司 | 应用启动方法、装置和电子设备 |
CN112433693B (zh) * | 2020-12-11 | 2023-06-23 | 维沃移动通信(杭州)有限公司 | 分屏显示方法、装置及电子设备 |
KR20230023386A (ko) | 2021-08-10 | 2023-02-17 | 삼성전자주식회사 | 디스플레이 모듈 출력 방법 및 상기 방법을 수행하는 전자 장치 |
CN116820314A (zh) * | 2021-09-22 | 2023-09-29 | 荣耀终端有限公司 | 一种悬浮窗显示方法及电子设备 |
CN114546549A (zh) * | 2022-01-24 | 2022-05-27 | 中国第一汽车股份有限公司 | 应用程序的控制方法、手势处理装置、智能终端及车辆 |
CN114510166B (zh) * | 2022-04-01 | 2022-08-26 | 深圳传音控股股份有限公司 | 操作方法、智能终端及存储介质 |
WO2023245311A1 (zh) * | 2022-06-20 | 2023-12-28 | 北京小米移动软件有限公司 | 一种窗口调整方法、装置、终端及存储介质 |
- CN115423578B (zh) * | 2022-09-01 | 2023-12-05 | 广东博成网络科技有限公司 | 基于微服务容器化云平台的招投标方法和系统 |
CN116048317B (zh) * | 2023-01-28 | 2023-08-22 | 荣耀终端有限公司 | 一种显示方法及装置 |
CN117931043A (zh) * | 2023-12-20 | 2024-04-26 | 荣耀终端有限公司 | 坐标转换方法、设备、芯片及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140157163A1 (en) * | 2012-11-30 | 2014-06-05 | Hewlett-Packard Development Company, L.P. | Split-screen user interface |
CN104898952A (zh) * | 2015-06-16 | 2015-09-09 | 魅族科技(中国)有限公司 | 一种终端分屏实现方法及终端 |
CN106598429A (zh) * | 2016-11-29 | 2017-04-26 | 北京小米移动软件有限公司 | 移动终端的窗口调整方法及装置 |
CN108804004A (zh) * | 2018-05-03 | 2018-11-13 | 珠海格力电器股份有限公司 | 一种分屏控制方法、装置、存储介质及终端 |
CN110865744A (zh) * | 2019-09-30 | 2020-03-06 | 华为技术有限公司 | 一种分屏显示方法与电子设备 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102069014B1 (ko) * | 2012-09-25 | 2020-02-12 | 삼성전자 주식회사 | 휴대단말기의 분리화면 제어장치 및 방법 |
KR102099646B1 (ko) * | 2012-09-25 | 2020-04-13 | 삼성전자 주식회사 | 휴대단말의 분할화면 전환 장치 및 방법 |
CN103268190A (zh) * | 2013-06-05 | 2013-08-28 | 四目信息科技(上海)有限公司 | 触屏电子设备基于ios操作***实现视图拖拽操作的方法 |
JP6368462B2 (ja) * | 2013-08-01 | 2018-08-01 | シャープ株式会社 | 情報処理装置、情報処理方法及びそのプログラム |
CN103412711A (zh) * | 2013-08-27 | 2013-11-27 | 宇龙计算机通信科技(深圳)有限公司 | 文档对比参照方法和装置 |
WO2015192375A1 (zh) * | 2014-06-20 | 2015-12-23 | 华为技术有限公司 | 应用界面的展示方法、装置及电子设备 |
KR102383103B1 (ko) * | 2014-08-13 | 2022-04-06 | 삼성전자 주식회사 | 전자 장치 및 이의 화면 표시 방법 |
CN104331246A (zh) * | 2014-11-19 | 2015-02-04 | 广州三星通信技术研究有限公司 | 在终端中进行分屏显示的设备和方法 |
JP6553719B2 (ja) * | 2016-10-31 | 2019-07-31 | ベイジン シャオミ モバイル ソフトウェア カンパニーリミテッド | 画面分割表示方法および装置 |
CN106970769A (zh) * | 2017-03-24 | 2017-07-21 | 北京小米移动软件有限公司 | 分屏显示方法及装置 |
US20180286277A1 (en) * | 2017-03-28 | 2018-10-04 | Insy Shah | System and methodology for regulating instructional multimedia applications |
CN107515691A (zh) * | 2017-07-31 | 2017-12-26 | 努比亚技术有限公司 | 一种触控显示方法及移动终端、存储介质 |
CN107908382B (zh) * | 2017-11-10 | 2020-03-03 | 维沃移动通信有限公司 | 一种分屏显示方法及移动终端 |
CN107908351B (zh) * | 2017-11-30 | 2021-07-13 | 北京小米移动软件有限公司 | 应用界面的显示方法、装置及存储介质 |
CN108632462A (zh) * | 2018-04-19 | 2018-10-09 | Oppo广东移动通信有限公司 | 分屏显示的处理方法、装置、存储介质及电子设备 |
CN110244893B (zh) * | 2019-05-05 | 2022-02-25 | 华为技术有限公司 | 一种分屏显示的操作方法及电子设备 |
-
2019
- 2019-09-30 CN CN201910938898.XA patent/CN110865744B/zh active Active
-
2020
- 2020-07-16 KR KR1020227011757A patent/KR20220058953A/ko not_active Application Discontinuation
- 2020-07-16 EP EP20871830.4A patent/EP4024183B1/en active Active
- 2020-07-16 WO PCT/CN2020/102488 patent/WO2021063074A1/zh unknown
- 2020-07-16 CN CN202080034954.8A patent/CN113811844A/zh active Pending
- 2020-07-16 US US17/764,426 patent/US20220357845A1/en active Pending
- 2020-07-16 JP JP2022519756A patent/JP2022549945A/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140157163A1 (en) * | 2012-11-30 | 2014-06-05 | Hewlett-Packard Development Company, L.P. | Split-screen user interface |
CN104898952A (zh) * | 2015-06-16 | 2015-09-09 | 魅族科技(中国)有限公司 | 一种终端分屏实现方法及终端 |
CN106598429A (zh) * | 2016-11-29 | 2017-04-26 | 北京小米移动软件有限公司 | 移动终端的窗口调整方法及装置 |
CN108804004A (zh) * | 2018-05-03 | 2018-11-13 | 珠海格力电器股份有限公司 | 一种分屏控制方法、装置、存储介质及终端 |
CN110865744A (zh) * | 2019-09-30 | 2020-03-06 | 华为技术有限公司 | 一种分屏显示方法与电子设备 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4024183A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD969828S1 (en) * | 2021-01-12 | 2022-11-15 | Beijing Zitiao Network Technology Co., Ltd. | Display screen or portion thereof with a graphical user interface |
CN114690977A (zh) * | 2021-04-22 | 2022-07-01 | 广州创知科技有限公司 | 一种基于弹性波的交互唤起方法及装置 |
CN114690977B (zh) * | 2021-04-22 | 2023-11-21 | 广州创知科技有限公司 | 一种基于弹性波的交互唤起方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
EP4024183B1 (en) | 2024-04-24 |
EP4024183A4 (en) | 2022-11-23 |
CN110865744A (zh) | 2020-03-06 |
JP2022549945A (ja) | 2022-11-29 |
KR20220058953A (ko) | 2022-05-10 |
CN113811844A (zh) | 2021-12-17 |
EP4024183A1 (en) | 2022-07-06 |
US20220357845A1 (en) | 2022-11-10 |
CN110865744B (zh) | 2021-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021063074A1 (zh) | 一种分屏显示方法与电子设备 | |
WO2021043223A1 (zh) | 一种分屏显示方法及电子设备 | |
WO2021013158A1 (zh) | 显示方法及相关装置 | |
WO2021129326A1 (zh) | 一种屏幕显示方法及电子设备 | |
WO2021037084A1 (zh) | 一种分屏显示方法与电子设备 | |
WO2021036628A1 (zh) | 一种具有折叠屏的设备的触控方法与折叠屏设备 | |
WO2021057868A1 (zh) | 一种界面切换方法及电子设备 | |
WO2021063090A1 (zh) | 一种建立应用组合的方法与电子设备 | |
- WO2020062294A1 (zh) | 系统导航栏的显示控制方法、图形用户界面及电子设备 | |
WO2021057343A1 (zh) | 一种对电子设备的操作方法及电子设备 | |
WO2022068483A1 (zh) | 应用启动方法、装置和电子设备 | |
WO2021063098A1 (zh) | 一种触摸屏的响应方法及电子设备 | |
US20220291794A1 (en) | Display Method and Electronic Device | |
EP4099669A1 (en) | Method for creating application shortcuts, electronic device, and system | |
WO2021037223A1 (zh) | 一种触控方法与电子设备 | |
WO2021078032A1 (zh) | 用户界面的显示方法及电子设备 | |
WO2022057512A1 (zh) | 分屏方法、装置及电子设备 | |
WO2022089060A1 (zh) | 一种界面显示方法及电子设备 | |
EP3958106A1 (en) | Interface display method and electronic device | |
WO2022017393A1 (zh) | 显示交互***、显示方法及设备 | |
WO2021057699A1 (zh) | 具有柔性屏幕的电子设备的控制方法及电子设备 | |
CN114115618A (zh) | 一种应用窗口显示方法与电子设备 | |
US20230236714A1 (en) | Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device | |
- WO2023029983A1 (zh) | 一种控件内容的拖拽方法、电子设备及系统 | |
WO2023226922A1 (zh) | 卡片管理方法、电子设备及计算机可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20871830 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022519756 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112022005991 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 20227011757 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020871830 Country of ref document: EP Effective date: 20220331 |
|
ENP | Entry into the national phase |
Ref document number: 112022005991 Country of ref document: BR Kind code of ref document: A2 Effective date: 20220329 |