WO2024125301A1 - Display method and electronic device - Google Patents

Display method and electronic device

Info

Publication number
WO2024125301A1
Authority
WO
WIPO (PCT)
Prior art keywords
window
electronic device
screen
interface
application
Prior art date
Application number
PCT/CN2023/135028
Other languages
English (en)
Chinese (zh)
Inventor
胡萌
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024125301A1

Description

  • the embodiments of the present application relate to the field of terminal devices, and more specifically, to a display method and an electronic device.
  • users may need to obtain the content of the application interface or information related to the interface content and further share or record it.
  • they can obtain the information or content of the application interface through functions such as screenshot, screen recording, or screen recognition, save it locally, or share it with other users.
  • functions such as screenshot, screen recording, or screen recognition all target the entire screen, and the operation process is cumbersome, which degrades the user experience.
  • the embodiments of the present application provide a display method and an electronic device, which can improve the convenience of operation and the user experience when a user needs to share or record information or content of other windows through one window in a multi-window mode.
  • a display method which is applied to an electronic device, the method comprising: displaying a first interface, the first interface comprising a first window and a second window, the first window being a window allowing information insertion; receiving a first user input to the first window; performing a first operation on the second window according to the first user input; and inserting information corresponding to the execution result of the first operation into the first window.
  • the first interface displayed by the electronic device includes a first window and a second window.
  • the electronic device can determine that the target window of the first user input is the second window, and perform a first operation on the second window, and then insert information corresponding to the execution result of the first operation into the first window.
  • the information corresponding to the execution result of the first operation can be text, a link or a file. Since the information corresponding to the execution result of the first operation is directly inserted into the window, the steps of user operation are reduced and the convenience is improved.
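The claimed flow — receive a first user input in the first window, perform the first operation on the second window, and insert information corresponding to the execution result into the first window — can be sketched as follows. This is an illustrative model only; all class, function, and variable names are hypothetical and do not come from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Window:
    name: str
    inserted: list = field(default_factory=list)  # information inserted into this window

def handle_first_user_input(first: Window, second: Window, operation) -> None:
    """Perform the first operation on the second window, then insert the result into the first."""
    result = operation(second)       # e.g. screenshot / screen recording / recognition
    first.inserted.append(result)    # insert the execution result directly into the first window

def screenshot(target: Window) -> str:
    # stand-in for capturing only the target window rather than the entire screen
    return f"screenshot_of_{target.name}.png"

notes = Window("notes")
video = Window("video_player")
handle_first_user_input(notes, video, screenshot)  # one user input; no manual cropping or saving
```

Because the result lands directly in the first window, the user's manual steps (capture, crop, save, re-insert) collapse into a single input, which is the convenience improvement the paragraph above describes.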
  • the first window may include an input box.
  • when the information corresponding to the execution result of the first operation is text or a link, the text or link can be directly inserted into the input box.
  • when the information corresponding to the execution result of the first operation is a file, the icon of the file can be inserted into the input box, or the icon of the file can be displayed on the interface.
  • the first window may not include an input box, but the window may allow files to be inserted.
  • the window may be a window of a file compression application, and the user can generally drag the file or select the file from the file library to place the file in the area to be compressed and compress it.
  • the electronic device can directly determine the target window of the first operation as the second window, and insert the file corresponding to the execution result of the first operation into the area to be compressed, and the user can directly click to compress it later.
  • the first window can also be an email application.
  • the first window displays the main interface of the email application. There may be no input box on the main interface, but the user can trigger the opening of a new email editing interface by dragging to various functional areas of the main interface, such as dragging to the inbox, and directly insert the file into the new email editing window.
  • the email application can open the new email editing interface by default, and directly insert the file corresponding to the execution result of the first operation into the new email editing interface in the first window.
  • the application of the first window may also be other applications that allow information insertion, which will not be described in detail here.
  • the first user input may be a click input on a control or a gesture input on an interface of the first window, etc.
  • the first operation may be a screenshot operation, a screen recording operation or a screen recognition operation.
  • the electronic device may directly insert the screenshot image file, the screen recording video file or the text, link or file recognized by the screen into the first window, so as to facilitate the user's recording or subsequent sharing.
  • the first operation directly targets the second window required by the user, which eliminates additional editing actions by the user (for example, cropping a full-screen screenshot) and improves the user experience.
  • the first operation is any one or more of a screenshot operation, a screen recording operation, a screen recognition operation, or a sound recognition operation; when the first operation is a screenshot operation, the information corresponding to the execution result of the first operation is a picture file obtained according to the screenshot operation; or, when the first operation is a screen recording operation, the information corresponding to the execution result of the first operation is a video file obtained according to the screen recording operation; or, when the first operation is a screen recognition operation, the information corresponding to the execution result of the first operation is at least one of text or links obtained according to the screen recognition operation, and the screen recognition operation includes text recognition or object recognition; or, when the first operation is the sound recognition operation, the information corresponding to the execution result of the first operation is at least one of a music file or text information obtained according to the sound recognition operation.
  • the information corresponding to the execution result of the first operation may also be a link to the music.
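The enumeration above pairs each kind of first operation with the kind of information that gets inserted. A minimal sketch of that mapping (the dictionary keys and descriptive values below are illustrative, not patent terminology):

```python
# Illustrative mapping of the first operation to the inserted information,
# following the enumeration above.
OPERATION_RESULT = {
    "screenshot": "picture file",
    "screen_recording": "video file",
    "screen_recognition": "text or link",       # text recognition or object recognition
    "sound_recognition": "music file or text",  # may also be a link to the music
}

def result_info(operation: str) -> str:
    """Return the kind of information inserted for a given first operation."""
    return OPERATION_RESULT[operation]
```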
  • the first window includes a first control
  • the first user input is a click operation on the first control
  • the first control is set on the application interface of the first window.
  • the application of the first window or the operating system of the electronic device can determine that the target window of the operation triggered by the user's click on the control is the second window.
  • the application of the first window can determine the coordinates of the target window of the first operation according to the coordinates of the first window, and transmit the coordinates to the operating system of the electronic device.
  • the corresponding module of the operating system can perform the first operation on the second window according to the coordinates, and directly insert the information corresponding to the execution result of the first operation into the first window or the input box of the first window, thereby improving the user experience.
  • the operating system of the electronic device can determine the position or coordinates of the second window according to the position or coordinates of the first window, and perform the first operation on the second window; the operating system of the electronic device can also directly determine the position or coordinates of the second window according to the relevant information of window management.
  • the first user input is a gesture input on an interface of the first window.
  • the first user input may also be a shortcut key, a drop-down box, etc. corresponding to the first operation on the electronic device.
  • the present application does not limit the form of the first user input corresponding to the first operation.
  • the first interface includes only two windows, and the first window and the second window are two windows of a split-screen interface of the electronic device.
  • the method before performing the first operation on the second window, the method also includes: determining the position of the second window based on the position of the first window; wherein performing the first operation on the second window includes: performing the first operation on the second window based on the position of the second window.
  • the coordinates of the other window can be determined according to the coordinates of one of the windows.
  • the operation of determining the position of the second window can be performed by the application of the first window or the operating system of the electronic device, so that the application of the first window can transmit the position or coordinates of the second window to the operating system of the electronic device, and then the operating system performs the first operation on the second window.
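Under the assumption of a two-window split-screen layout, the second window's rectangle follows from the first window's rectangle and the screen size. A minimal geometric sketch (the coordinate convention, with the origin at the top-left and rectangles as `(x, y, width, height)`, is assumed here and not specified by the patent):

```python
def second_window_rect(first, screen_w, screen_h):
    """Given the first window's rectangle in a two-window split screen,
    return the rectangle occupied by the second window."""
    x, y, w, h = first
    if w < screen_w:                      # side-by-side split: second window fills the other column
        second_x = 0 if x > 0 else w
        return (second_x, 0, screen_w - w, screen_h)
    second_y = 0 if y > 0 else h          # top-bottom split: second window fills the other row
    return (0, second_y, screen_w, screen_h - h)
```

For example, with a 2000x1200 screen and the first window occupying the left half `(0, 0, 1000, 1200)`, the second window is `(1000, 0, 1000, 1200)`; either the first window's application or the operating system could compute this and hand the coordinates to the module performing the first operation.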
  • the first interface also includes a third window, and before performing the first operation on the second window, the method also includes: prompting the user to determine the target window of the first operation; receiving a second user input from the user, the second user input being used to determine that the target window is the second window.
  • the electronic device when the first interface displayed by the electronic device includes more than two windows, the electronic device can prompt the user to determine the target window of the first operation after the user performs the first user input, and the user can perform the second user input to determine that the target window of the first operation is the second window. Therefore, the technical solution of the embodiment of the present application can be applied in a multi-window scenario and has a wide range of applications.
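The target-window selection described above can be sketched as follows: with exactly two windows the target is unambiguous, while with three or more the device prompts the user and a second user input identifies the target (the helper name and calling convention are hypothetical):

```python
def choose_target(windows, first_window, second_user_input=None):
    """Determine the target window of the first operation."""
    candidates = [w for w in windows if w != first_window]
    if len(candidates) == 1:
        return candidates[0]  # two-window case: the other window is the target
    # more than two windows: a second user input is needed to pick the target
    if second_user_input is None:
        raise ValueError("prompt the user to determine the target window")
    return second_user_input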
  • the application corresponding to the first window is a note application or a memo application.
  • the application of the first window can also be a document application, a chat application, etc., so that when the user uses this type of application, he can conveniently record or share information of other windows in the application, such as screenshots and videos of other windows, thereby improving the speed of recording and the user experience.
  • the application corresponding to the second window is a video application or a document application.
  • an electronic device comprising: a display unit, used to display a first interface, the first interface comprising a first window and a second window, the first window being a window allowing information insertion; a processing unit, used to: receive a first user input to the first window; perform a first operation on the second window according to the first user input; and insert information corresponding to the execution result of the first operation into the first window.
  • the first operation is any one or more of a screenshot operation, a screen recording operation, a screen recognition operation, or a sound recognition operation; when the first operation is a screenshot operation, the information corresponding to the execution result of the first operation is a picture file obtained according to the screenshot operation; or, when the first operation is a screen recording operation, the information corresponding to the execution result of the first operation is a video file obtained according to the screen recording operation; or, when the first operation is a screen recognition operation, the information corresponding to the execution result of the first operation is at least one of text or links obtained according to the screen recognition operation, and the screen recognition operation includes text recognition or object recognition; or, when the first operation is the sound recognition operation, the information corresponding to the execution result of the first operation is at least one of a music file or text information obtained according to the sound recognition operation.
  • the first window includes a first control
  • the first user input is a click operation on the first control
  • the first user input is a gesture input on the interface of the first window.
  • the first interface includes only two windows, and the first window and the second window are two windows of a split-screen interface of the electronic device.
  • the processing unit is further used to: determine the position of the second window based on the position of the first window; the processing unit is specifically used to: perform the first operation on the second window based on the position of the second window.
  • the first interface also includes a third window
  • the processing unit is further used to: prompt the user to determine the target window of the first operation; receive the user's second user input, and the second user input is used to determine that the target window is the second window.
  • the application corresponding to the first window is a note application or a memo application.
  • the application corresponding to the second window is a video application or a document application.
  • a computer storage medium storing computer instructions; when the computer instructions are executed on an electronic device, the electronic device executes the method according to the first aspect or any one of the implementations of the first aspect.
  • an electronic device comprising: a memory for storing computer instructions; and a processor for executing the computer instructions stored in the memory, so that the electronic device executes the method described in the first aspect or any one of the implementations of the first aspect.
  • a chip system characterized in that it comprises at least one processor, and when program instructions are executed in the at least one processor, the at least one processor executes the method described in the first aspect or any one implementation of the first aspect.
  • a chip comprising a processor and a data interface, wherein the processor reads instructions stored in a memory through the data interface to execute the method described in the first aspect and any possible implementation of the first aspect.
  • the chip may further include a memory, wherein the memory stores instructions.
  • the processor is used to execute instructions stored in the memory. When the instructions are executed, the processor is used to execute the method described in the first aspect and any possible implementation manner of the first aspect.
  • the above chip may specifically be a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
  • FIG. 1 is a schematic diagram of the hardware structure of an electronic device to which an embodiment of the present application is applicable.
  • FIG. 2 is a schematic diagram of the software structure of an electronic device to which an embodiment of the present application is applicable.
  • FIG. 3 shows a schematic interface diagram of a user recording video window content through a note application in a multi-window scenario.
  • FIG. 4 shows a schematic flow chart of a display method provided in an embodiment of the present application.
  • FIG. 5 shows a schematic diagram of a multi-window scenario provided in an embodiment of the present application.
  • FIG. 6 shows a schematic diagram of a user interface of a display method provided in an embodiment of the present application.
  • FIG. 7 shows a schematic diagram of another user interface of the display method provided in an embodiment of the present application.
  • FIG. 8 shows a schematic diagram of window coordinates of an electronic device provided in an embodiment of the present application.
  • FIG. 9 shows a schematic diagram of a user interface of a display method provided in an embodiment of the present application.
  • FIG. 10 shows a schematic diagram of another user interface of the display method provided in an embodiment of the present application.
  • FIG. 11 shows a schematic diagram of another user interface of the display method provided in an embodiment of the present application.
  • FIG. 12 shows a schematic block diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 13 shows a schematic block diagram of another electronic device provided in an embodiment of the present application.
  • "A and/or B" can represent three cases: A exists alone, both A and B exist, or B exists alone, where A and B can be singular or plural.
  • the character "/" generally indicates that the objects associated before and after are in an "or" relationship.
  • references to "one embodiment” or “some embodiments” etc. described in this specification mean that a particular feature, structure or characteristic described in conjunction with the embodiment is included in one or more embodiments of the present application.
  • the phrases "in one embodiment", "in some embodiments", "in other embodiments", "in some other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized in other ways.
  • the terms “including”, “comprising”, “having” and their variations all mean “including but not limited to”, unless otherwise specifically emphasized in other ways.
  • the method provided in the embodiments of the present application is applied to electronic devices, including but not limited to mobile phones, tablet computers, vehicle-mounted devices, wearable devices, augmented reality (AR)/virtual reality (VR) devices, laptop computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), smart screens, and other electronic devices with display screens.
  • the embodiments of the present application do not impose any restrictions on the specific types of electronic devices.
  • FIG1 shows a schematic diagram of the hardware structure of an electronic device provided in an embodiment of the present application.
  • the electronic device 100 may include: a processor 110, a memory 120, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 191, a display screen 192, a button 193, etc.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU).
  • Different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller may generate an operation control signal according to the instruction operation code and the timing signal to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be retrieved directly from the memory, avoiding repeated access, reducing the waiting time of the processor 110, and thus improving the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the processor 110 and the touch sensor 180E can communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the processor 110 and the camera 191 can communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 192 can communicate through a DSI interface to implement the display function of the electronic device 100.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used to connect to the battery 142. While the charging management module 140 is charging the battery 142, the power management module 141 can also be used to power the electronic device 100.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle number, and battery health status.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
  • antenna 1 can be reused as a diversity antenna for a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
  • the mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least some of the functional modules of the mobile communication module 150 may be arranged in the processor 110.
  • at least some of the functional modules of the mobile communication module 150 may be arranged in the same device as at least some of the modules of the processor 110.
  • the wireless communication module 160 can provide wireless communication solutions including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and the like applied to the electronic device 100.
  • the wireless communication module 160 can be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the frequency of the electromagnetic wave signal and performs filtering processing, and sends the processed signal to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, modulate the frequency of the signal, amplify the signal, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the electronic device 100 implements the display function through a GPU, a display screen 192, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 192 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 192 is used to display images, videos, etc.
  • the display screen 192 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 192, where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through ISP, camera 191, video codec, GPU, display screen 192 and application processor.
  • ISP is used to process the data fed back by camera 191.
  • Camera 191 is used to capture static images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor.
  • the electronic device 100 may include 1 or N cameras 191, where N is a positive integer greater than 1.
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
  • the memory 120 is used to store data and/or instructions.
  • the memory 120 may include an internal memory.
  • the internal memory is used to store computer executable program codes, which include instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory.
  • the internal memory may include a program storage area and a data storage area. Among them, the program storage area may store an operating system; the program storage area may also store one or more applications (such as a gallery, contacts, etc.).
  • the data storage area may store data (such as images, contacts, etc.) created during the use of the electronic device 100.
  • the internal memory may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more disk storage devices, flash memory devices, universal flash storage (UFS), etc.
  • the processor 110 may execute instructions stored in the internal memory, and/or instructions stored in a memory provided in the processor 110, so that the electronic device 100 executes the display method provided in the embodiment of the present application.
  • the memory 120 may also include an external memory, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory may communicate with the processor 110 via an external memory interface to implement a data storage function. For example, files such as music and videos may be stored in the external memory.
  • the electronic device 100 can implement audio functions such as audio playback and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an acceleration sensor 180C, a distance sensor 180D, a touch sensor 180E, and some other sensors.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A can be set on the display screen 192.
  • the capacitive pressure sensor can be a parallel plate capacitor comprising at least two layers of conductive material.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A.
  • the electronic device 100 can also calculate the touch position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities can correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
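The short-message example above dispatches different instructions from the same touch position based on touch intensity. A minimal sketch of that dispatch (the threshold value here is hypothetical; the patent does not specify one):

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized intensity threshold

def sms_icon_action(touch_intensity: float) -> str:
    """Map the intensity of a touch on the short-message icon to an instruction."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view short message"        # light touch
    return "create new short message"      # firm touch at or above the threshold
```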
  • the gyroscope sensor 180B is also called an angular velocity sensor, which can be used to determine the motion posture of the electronic device 100.
  • the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 180B.
  • the gyroscope sensor 180B can be used for anti-shake shooting.
  • the gyroscope sensor 180B can also be used for navigation and somatosensory game scenes. For example, the gyroscope can fully monitor the displacement of the player's hand, thereby achieving various game operation effects, such as changing the horizontal screen to the vertical screen, turning in a racing game, and so on.
  • the acceleration sensor 180C can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180C can also be used to identify the posture of the electronic device 100, and is applied to applications such as horizontal and vertical screen switching, pedometers, etc.
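  • As an illustration of how the acceleration sensor supports landscape/portrait switching, the gravity components measured along the device axes can be compared. This is a simplified sketch that assumes raw, unfiltered readings and ignores the z axis; a real system would filter and debounce them:

```python
def classify_orientation(ax: float, ay: float) -> str:
    """Classify portrait vs. landscape from the gravity components
    measured along the device's x and y axes (simplified: ignores the
    z axis and face-up/face-down states)."""
    if abs(ay) >= abs(ax):
        # gravity dominates along the short axis: device held upright
        return "portrait"
    return "landscape"
```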
  • the distance sensor 180D is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180D to measure the distance to achieve fast focusing.
  • the touch sensor 180E is also called a "touch panel".
  • the touch sensor 180E can be set on the display screen 192, and the touch sensor 180E and the display screen 192 form a touchscreen, also called a "touch screen".
  • the touch sensor 180E is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 192.
  • the touch sensor 180E can also be set on the surface of the electronic device, which is different from the position of the display screen 192.
  • the sensor module 180 may include more or fewer sensors, depending on actual needs, which will not be described in detail here.
  • the key 193 may include a power key, a volume key, etc.
  • the key 193 may be a mechanical key or a touch key.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the above describes a possible hardware structure diagram of the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. Taking an Android system with a layered architecture as an example, the software structure of the electronic device 100 is described below.
  • FIG2 shows a software structure diagram of an electronic device provided in an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor.
  • the layers communicate with each other through software interfaces.
  • the system is divided into four layers, from top to bottom: application layer, application framework layer, system runtime layer (including system library and Android runtime) and kernel layer. Below the kernel layer is the hardware layer.
  • the application layer may include a series of application (APP) packages. As shown in FIG2 , the application package may include camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and other applications.
  • the application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 2, the application framework layer may include a window manager, content provider, phone manager, resource manager, notification manager, view system, card service engine, etc.
  • the window manager is used to manage window programs.
  • the window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the phone manager is used to provide communication functions of the electronic device 100, such as management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for applications, such as localized strings, icons, images, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also be a notification that appears in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is emitted, the terminal device vibrates, the indicator light flashes, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying images, etc.
  • the view system can be used to build applications.
  • a display interface can be composed of one or more views.
  • a display interface including a text notification icon can include a view for displaying text and a view for displaying images.
  • the system runtime library layer (libraries) can be divided into two parts: system libraries and Android runtime.
  • Android runtime is the Android operating environment, including the core library and virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the Android core library.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library is the support of the application framework and may include multiple functional modules, such as: surface manager, media libraries, two-dimensional graphics engine (such as SGL), three-dimensional graphics processing library (such as OpenGL ES), image processing library, etc.
  • the surface manager is used to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis and layer processing, etc.
  • a 2D graphics engine is a drawing engine for 2D drawings.
  • the kernel layer is the layer between hardware and software, which is used to provide essential functions of the operating system such as file management, memory management, process management, network protocol stack, etc.
  • the kernel layer includes at least display driver, camera driver, audio driver, sensor driver, Bluetooth driver, etc.
  • multi-window mode is also called multi-screen mode or the split-screen function.
  • the user can split the screen of the electronic device, for example, divide the vertical screen into two upper and lower windows, or divide the horizontal screen into two left and right windows, where each window runs an application independently, and the two applications do not affect each other.
  • the display modes of the application window include full-screen display, split-screen display, and floating display.
  • Full-screen display, also known as full-screen mode, means that only one application window is displayed on the screen, and the interface of the application fills the entire screen.
  • the state of the window is defined as the full screen state, and accordingly, the application displayed in the window is defined as the full screen display application.
  • Split-screen display, also known as split-screen mode, refers to two or more application windows each occupying a portion of the screen, where no two application windows overlap, each application window can change size, and the position of an application window is either immovable or can only be moved to a fixed position, for example by swapping the positions of two application windows.
  • the state of the window is defined as the split-screen state, and accordingly, the application displayed by the window is defined as the split-screen display application.
  • Floating display, also known as floating mode, refers to at least one application window floating above other display interfaces and partially covering them.
  • That is, when the user does not operate the floating application window, the floating application window remains in a fixed position and does not change as the other display interfaces change.
  • the at least one application window is displayed in layers on the screen, and the application windows can partially or completely overlap each other. Each application window can be resized and moved.
  • when a window is displayed in a floating state, for example, suspended above a window displayed in full screen or above a window displayed in split screen, the state of the window is defined as the floating state, and accordingly, the application displayed in the window is defined as a floating display application.
  • the window interface information may be, for example, a screenshot of the window interface, a video of the window interface, or information obtained after further processing of the window interface, such as text obtained by window interface recognition, or text converted from the content of the window interface, etc.
  • the user can then record the obtained window interface information in the windows of other applications or share it through other applications.
  • the user is watching a video in one window and taking notes in another window.
  • the user wants to capture a picture of the current video interface and insert the picture into the interface of the note application (record the information of the video window interface in the window of the note application).
  • the user can take a screenshot of the interface 310 of the current electronic device through the screenshot control on the current interface or drop-down box, gestures, or shortcut keys of the electronic device (for example, pressing the volume key and the power key at the same time), and then the thumbnail 311 of the screenshot image will be displayed on the interface 310, as shown in Figure 3 (a).
  • the user can click on the thumbnail 311 of the screenshot image, and the interface 320 for editing the screenshot image will appear on the display screen, as shown in Figure 3 (b). The user can crop the screenshot image to a suitable size by moving the control 321 at the edge of the screenshot image; for example, the user can move the control 321 so that the screenshot image is cropped to the size shown in 331 in Figure 3 (c), that is, only the part 331 of the video window in the screenshot image is retained.
  • the user can click the control 332 on the interface 330 to save the cropped screenshot image.
  • the user can click the control 341 on the interface 340 shown in (d) of Figure 3, and click "Select from Album" in the pop-up option box 342 to select the cropped screenshot image from the thumbnails in the album, thereby inserting the screenshot image of the video window into the note-taking application.
  • the above uses screenshots as an example to illustrate the process of a user obtaining information related to one window (screenshot image) and inserting the information into another window application in multi-window mode.
  • the process is similar to the above operation.
  • the whole process involves many steps, and for some of the information, such as screen recording files, special video editing software is required for cropping, so the user needs to exit the split-screen interface and use video cropping software to further process the screen recording files, which degrades the user experience.
  • the embodiment of the present application provides a display method, which can improve the convenience of operation and enhance the user experience when a user needs to record or share information related to another window in an application in one window in a multi-window mode.
  • the method includes:
  • S410: Display a first interface, the first interface including a first window and a second window, the first window being a window that allows information to be inserted.
  • S420: Receive a first user input to the first window.
  • S430: Execute a first operation on the second window according to the first user input.
  • the first interface includes a first window and a second window.
  • the first window and the second window may include a full-screen window.
  • the first window is a full-screen window 511
  • the second window is a floating window 512.
  • the floating window 512 may include some controls for window operations. For example, the user clicks on the control 5121 to change the window 512 from a floating window to a full-screen window; the user clicks on the control 5122 to minimize the window 512, and after minimization, the icon of the application of the window can be displayed on the side of the display screen, thereby reducing the impact of the window on the full-screen window 511.
  • when the user clicks the icon, the window 512 can be displayed on the interface again as a floating window; when the user clicks on the control 5123, the floating window 512 can be closed.
  • the first window and the second window may not include a full-screen window.
  • the display screen includes a window obtained by splitting the screen and a floating window.
  • the window obtained by splitting the screen may include window 521, window 522, and window 523, and there are controls as shown in 5211 between the split-screen windows.
  • the user can change the size of each split-screen window by moving the control; the floating window may include window 524 and window 525, and each floating window may still include some of the controls described above.
  • the first window may be window 521
  • the second window may be window 522, that is, the first window and the second window are both split-screen windows.
  • the first window may be window 521
  • the second window may be window 524, that is, the first window and the second window are respectively a split-screen window and a floating window.
  • the first window may be window 524
  • the second window may be window 525, that is, the first window and the second window are both floating windows.
  • it should be understood that FIG. 5 (a) may also include other floating windows, and FIG. 5 (b) may also include other split-screen windows and/or floating windows.
  • the types of the first window and the second window may be interchangeable.
  • the first window may be a floating window 512
  • the second window may be a full-screen window 511.
  • the first window is a window that allows information to be inserted, and the information may be text, a link or a file. That is, the first window allows text, a link or a file to be inserted.
  • the first window includes an input box, which may be an input box of an application, such as an input box of a chat application; or other applications, such as an input box of a note application, which may include memos, notepads, documents, and other note applications; the input box may also be, for example, an input box for commenting on a user or service. Text, links, icons and/or names of files may be inserted into the input box.
  • alternatively, when the window includes an input box and a file is inserted, the icon or name of the file may not be displayed in the input box, but may be displayed at other locations of the first window, such as below the input box.
  • the first window may not include an input box, but allows the insertion of files.
  • the first window displays the interface of a file compression application, and when a file is inserted, the icon and/or name of the file will be displayed in the area to be compressed of the file compression application.
  • the first window displays the main interface of an email application, and when a file is inserted, the icon and/or name of the file may be displayed in the default new email editing window as an attachment to the new email.
  • the electronic device receives a first user input on the first window, and performs a first operation on the second window according to the first user input.
  • the first user input may correspond to the first operation, and the first user input may be an input by the user on the interface of the application in the first window, and the first user input may be in various forms.
  • the following takes the case where the first user input of the user corresponds to a screenshot instruction, and the first operation is that the electronic device takes a screenshot of the second window as an example to introduce various forms of the first user input.
  • a first control is set in the application of the first window, and the first user input is a click input of the user to the first control.
  • the first interface 610 may include a first window 601 and a second window 602.
  • the first window 601 runs a note-taking application, and there is an input box on the interface of the application (the input box refers to a location that allows user input).
  • the input box may be in an activated state, and the user may click the screenshot control 611 on the application interface of the first window 601, so that the electronic device may perform subsequent screenshot operations according to the user's click input to the screenshot control.
  • the user may perform gesture input on the interface of the first window.
  • the gesture input may correspond to the screenshot instruction of the application of the first window, for example, the user may perform finger/knuckle tapping on the interface of the application of the first window or slide the finger on the screen in a preset manner and/or trajectory to trigger the screenshot instruction of the application of the first window.
  • the user may perform finger knuckle tapping on the interface of the first window 701 shown in (a) of FIG. 7 , so that the electronic device will perform a subsequent screenshot operation.
  • after receiving the first user input, the electronic device determines the position or coordinates of the target window (the second window) on which the first operation corresponding to the first user input is to be performed.
  • the electronic device can use a window other than the first window that receives the first user input as the target window, that is, the second window as the target window of the first operation. For example, when the first window and the second window are two windows formed by splitting the first interface, the electronic device can determine that the target window is the second window. For another example, when the first window is a full-screen window 511 as shown in (a) of FIG5 , it can be determined that the second window is a floating window 512; and if the first window is a floating window 512, it can be determined that the second window is a full-screen window 511.
  • the electronic device can determine the position or coordinates of the second window based on the position or coordinates of the first window, and then perform the first operation on the second window based on the position or coordinates of the second window.
  • the application of the first window can know that the first user input corresponds to an instruction for performing a first operation on other windows after receiving the first user input.
  • the application of the first window can determine the position or coordinates of the second window according to at least one of the position or coordinates of the first window, so that the electronic device can perform the first operation on the second window according to the position or coordinates of the second window.
  • a coordinate system can be established with the upper left corner of the display interface of the electronic device as the coordinate origin O, with the direction parallel to the time display direction as the x-axis direction, and the direction perpendicular to the time display direction as the y-axis direction (the coordinate axis rotates according to the screen state when the electronic device is in landscape mode or portrait mode).
  • the display interface is as shown in (a) of FIG8 , assuming that the length of the longer side of the display interface is h, the length of the shorter side is w, and the width of the split-screen control 801 is d.
  • the position or coordinates of the first window 810 are: upper left corner vertex (x1, 0), lower right corner vertex (h, w).
  • the application of the first window 810 can receive the first user input and determine the coordinates of the second window 820 that needs to be captured based on the coordinates of the first window 810.
  • the coordinates of the second window 820 are: upper left corner vertex (0, 0), lower right corner vertex (x1-d, w).
  • the application of the first window 810 can send a screenshot instruction to the operating system of the electronic device, and the instruction can carry the position (coordinates) of the second window 820. After receiving the screenshot instruction, the screenshot module of the operating system can take a screenshot according to the coordinates of the second window 820.
  • the position or coordinates of the first window 820 are: upper left vertex (0, 0), lower right vertex (x2, w).
  • the application of the first window 820 can determine that the target window of the screenshot, the second window 810, is located on the right side of the first window 820 according to the coordinates of the first window 820, and the position or coordinates of the second window 810 are: upper left vertex (x2+d, 0), lower right vertex (h, w).
  • the first window 820 can carry the position or coordinates of the second window 810 in the screenshot instruction and send the corresponding instruction to the screenshot module.
  • the display interface is as shown in (b) of FIG8 , assuming that the length of the longer side of the display interface is h, the length of the shorter side is w, and the width of the split-screen control 801 is d.
  • the position or coordinates of the first window 830 are: upper left vertex (0, 0), lower right vertex (w, y1).
  • the application of the first window 830 can determine that the target window of the screenshot, the second window 840, is located below the first window 830 according to the coordinates of the first window 830, and the position or coordinates of the second window 840 are: upper left vertex (0, y1+d), lower right vertex (w, h).
  • the first window 830 can carry the position or coordinates of the second window 840 in the screenshot instruction and send the corresponding instruction to the screenshot module.
  • the position or coordinates of the first window 840 are: upper left vertex (0, y2), lower right vertex (w, h).
  • the application of the first window 840 can determine that the target window of the screenshot, the second window 830, is located above the first window 840 according to the coordinates of the first window 840, and the position or coordinates of the second window 830 are: upper left vertex (0, 0), lower right vertex (w, y2-d).
  • the first window 840 can carry the position or coordinates of the second window 830 in the screenshot instruction and send the corresponding instruction to the screenshot module.
  • the above introduces the method for determining the position or coordinates of the second window by taking a left-right split screen as an example; when the positions of the two windows are interchanged, the method is similar.
  • the above also introduces the method for determining the position or coordinates of the second window by taking an up-down split screen as an example; when the positions of the two windows are interchanged, the method is similar and will not be repeated here.
  • when the electronic device is in landscape or portrait mode, the coordinate axes can be rotated adaptively according to the screen orientation so that the x-axis remains parallel to the display direction of the time.
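  • Under the definitions above (long side h along the x-axis, short side w, divider width d), the coordinate derivation for the left-right split of FIG. 8 (a) can be sketched as follows; representing each window as a pair of corner vertices is an assumption for illustration:

```python
def second_window_rect(first_rect, h, w, d):
    """Given the first window's rectangle ((x_tl, y_tl), (x_br, y_br))
    on a left-right split screen of size h (long side, x-axis) by w
    (short side, y-axis) with a split-screen divider of width d,
    return the second window's rectangle, as derived for FIG. 8 (a)."""
    (x_tl, y_tl), (x_br, y_br) = first_rect
    if x_tl > 0:
        # first window on the right (upper left vertex (x1, 0)):
        # second window spans from the left edge to x1 - d
        return ((0, 0), (x_tl - d, w))
    # first window on the left (lower right vertex (x2, w)):
    # second window spans from x2 + d to the right edge h
    return ((x_br + d, 0), (h, w))
```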
  • the position or coordinates of the second window may be determined by the operating system of the electronic device. If the display interface of the electronic device includes only two windows, after receiving the first user input, the operating system of the electronic device may directly use the window other than the first window as the target window of the first operation. Specifically, when the user performs the first user input in the first window, the application of the first window may send the instruction of the first operation to the operating system.
  • the first user input of the user may be preset by the application of the first window or the operating system in a multi-window scenario and when the input box of the first window is in an activated state, and the first user input of the user is used to perform the first operation on the window other than the first window.
  • the application of the first window when the application of the first window receives the first user input, it sends the instruction of the first operation to the operating system and instructs the operating system to perform the first operation on another window.
  • after the application of the first window sends the instruction of the first operation, the operating system of the electronic device determines that the target window of the first operation is the second window. Therefore, when the display screen of the electronic device includes only two windows, the operating system of the electronic device may determine the position or coordinates of the second window, and perform the first operation on the second window according to the position or coordinates of the second window, such as a screenshot, screen recording, or screen recognition operation.
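  • When the display screen includes only two windows, the target-window choice reduces to taking the complement of the window that received the first user input; a minimal sketch with hypothetical window identifiers:

```python
def pick_target_window(windows, first_window):
    """With exactly two windows on screen, the target window of the
    first operation is simply the window other than the one that
    received the first user input."""
    assert len(windows) == 2 and first_window in windows
    return windows[0] if windows[1] == first_window else windows[1]
```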
  • it can be preset, or instructed by the application of the first window, whether to hide or display the first window when the operating system of the electronic device performs the first operation.
  • for example, when the first operation is a screenshot or screen recording and the first window is a floating window, the floating first window can be hidden while the screenshot or screen recording is performed, so as not to affect the interface of the application of the second window.
  • the user may be prompted to determine the target window of the first operation, and the electronic device may receive the second user input and determine the target window (the second window) of the first operation according to the second user input.
  • the electronic device may prompt the user to select a target window by means of a pop-up window, so that the user may click on one of the windows, and the electronic device may use the window clicked by the user as the target window and perform the first operation on the window.
  • the above describes the steps for the electronic device to determine the target window of the first operation in different scenarios.
  • the information corresponding to the execution result of the first operation can be inserted into the first window.
  • the information corresponding to the first operation is a screenshot image.
  • the electronic device can directly insert the screenshot images 621 and 721 into the input box of the first window, thereby reducing the user's operations and improving the user's experience.
  • the above takes the case where the first user input acts on the first window (that is, the first user input is a click operation on a control of the first window or a gesture operation on the first window) as an example to introduce the technical solution of the embodiment of the present application.
  • the control related to the first operation clicked by the user can be located in a drop-down box of the electronic device, near a side frame, or on a floating button for assisting user operations.
  • the user or the operating system can preset that the screenshot or screen recording operation in the multi-window mode is performed on a per-window basis by default.
  • when the user clicks the screenshot or screen recording control in the drop-down box of the electronic device, if the electronic device determines that it is currently in multi-window mode, it can prompt the user to select the target window for the screenshot or screen recording. If the electronic device determines that, before the first user input (clicking the screenshot or screen recording control), the first window or the input box of the first window was in an activated state, the file corresponding to the screenshot or screen recording can be directly inserted into the first window after the user selects the target window, thereby improving the convenience of user operation.
  • the activation state of the window or input box can be ignored, and the user can be directly prompted to determine the insertion window of the file corresponding to the screenshot or screen recording, that is, the electronic device can prompt the user to determine the target window (second window) for the screenshot or screen recording operation and the target insertion window (first window) for the file of the screenshot or screen recording operation.
  • the first operation is any one or more of a screenshot operation, a screen recording operation, a screen recognition operation, or a sound recognition operation;
  • the first operation may be a screenshot operation, and the information corresponding to the execution result of the first operation is a picture file obtained according to the screenshot operation; or, the first operation may be a screen recording operation, and the information corresponding to the execution result of the first operation is a video file obtained according to the screen recording operation; or, the first operation may be a screen recognition operation, and the information corresponding to the execution result of the first operation is text or a link obtained according to the screen recognition operation, and the screen recognition operation includes text recognition or object recognition; or, the first operation may be a sound recognition operation, and the information corresponding to the execution result of the first operation is at least one of a music file or text information obtained according to the sound recognition operation.
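  • The correspondence between the first operation and the inserted information enumerated above can be sketched as a simple lookup; the operation and result labels are illustrative, not API names:

```python
# Mapping from the first operation to the type of information inserted
# into the first window, per the enumeration above. Labels are
# illustrative assumptions, not platform API names.
OPERATION_RESULT = {
    "screenshot": "picture file",
    "screen_recording": "video file",
    "screen_recognition": "text or link",
    "sound_recognition": "music file or text information",
}

def result_info(operation):
    """Return the kind of information produced by the first operation."""
    return OPERATION_RESULT[operation]
```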
  • window 902 is a video window
  • window 901 is a window of a note-taking application.
  • the user can click on the interface screen recognition control 911.
  • after the application of the first window receives the user's click operation, it can determine the coordinates of the target window 902 for screen recognition according to the coordinates of window 901, send a screen recognition request to the operating system of the electronic device according to the coordinates of the window, and carry the coordinates of window 902 in the request.
  • the operating system of the electronic device can automatically recognize window 902 and insert the recognized text into the input box of window 901, as shown in FIG. 9(b).
  • screen recognition is recognition of text
  • object recognition may also be performed, such as commodity recognition, allowing the user to insert a commodity link in an input box.
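The screen-recognition flow in FIG. 9 can be sketched as follows. The mock OS service, the window fields, and the function names are all assumptions made for illustration; a real system would invoke the platform's OCR/recognition service with the target window's screen rectangle:

```python
# Hypothetical sketch of the FIG. 9 flow: the note-taking window (901) asks a
# recognition service to process the video window (902), and the result is
# inserted into window 901's input box.
from dataclasses import dataclass, field

@dataclass
class Window:
    win_id: int
    rect: tuple                      # (x, y, width, height) in screen coordinates
    input_box: list = field(default_factory=list)

class MockRecognitionService:
    """Stands in for the operating system's screen-recognition service."""
    def __init__(self, screen_content):
        self.screen_content = screen_content  # window id -> visible content

    def recognize(self, rect, target_id, mode):
        content = self.screen_content[target_id]
        if mode == "text":
            return content["text"]             # OCR result
        if mode == "object":
            return content["commodity_link"]   # e.g. a recognized commodity's link
        raise ValueError(mode)

def recognize_and_insert(service, first_window, second_window, mode="text"):
    # The first window's application carries the target window's coordinates
    # in the recognition request; the result goes straight into its input box.
    result = service.recognize(second_window.rect, second_window.win_id, mode)
    first_window.input_box.append(result)
    return result

note = Window(901, (0, 0, 540, 1200))
video = Window(902, (540, 0, 540, 1200))
svc = MockRecognitionService({902: {"text": "subtitle line",
                                    "commodity_link": "https://shop.example/item"}})
recognize_and_insert(svc, note, video, mode="text")
print(note.input_box)  # text recognized from window 902 lands in window 901
```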
  • window 1001 is a chat window
  • window 1002 is a video window.
  • the user can click the screen recording control 1011 on the interface 1010 so that the electronic device can record the window 1002.
  • the user can click the screen recording status control 1012 to end the screen recording, so that the recorded file is directly inserted into the window 1001 without any other user operation.
  • the icon and name corresponding to the file are displayed below the input box, as shown in FIG. 10(b).
  • the user can then click the "Send" control on the interface 1020 to share the recorded file with others.
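The per-window recording flow of FIG. 10 can be sketched like this. The recorder class and file naming are assumptions for illustration; the point is that recording is scoped to the target window's rectangle and the finished file is inserted into the chat window's input box with no further user action:

```python
# Hypothetical sketch of the FIG. 10 flow: record only the target window,
# then insert the resulting file directly into the chat input.
class WindowRecorder:
    def __init__(self):
        self._target = None
        self._frames = 0

    def start(self, target_rect):
        self._target = target_rect
        self._frames = 0

    def capture_frame(self):
        # A real implementation would crop each captured screen frame
        # to the target window's rectangle.
        self._frames += 1

    def stop(self):
        name = "REC_%dx%d_%dframes.mp4" % (self._target[2], self._target[3], self._frames)
        self._target = None
        return name   # stands in for the saved video file

def record_and_insert(recorder, target_rect, chat_input, frames=3):
    recorder.start(target_rect)
    for _ in range(frames):
        recorder.capture_frame()
    video_file = recorder.stop()
    chat_input.append(video_file)  # inserted directly, no extra user steps
    return video_file

chat_input = []
record_and_insert(WindowRecorder(), (540, 0, 540, 1200), chat_input)
print(chat_input)
```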
  • the display screen of the electronic device includes four windows, among which window 1111, window 1112 and window 1113 are split-screen windows, window 1114 is a floating window, and window 1113 can be currently in an activated state.
  • the user can click on the screenshot control 1115 on the floating button of the electronic device, at which time, a prompt box 1116 prompting the user to input can appear on the interface 1110, prompting the user to select a screenshot window (the target window for the screenshot).
  • the user can select any one of the four windows on the interface 1110 for screenshot.
  • the user can click on the interface of window 1114, so that a control 1126 appears on the border of window 1114 (interface 1120 of FIG. 11(b)), indicating that the selected target window is window 1114; the electronic device can then insert the screenshot image file of window 1114 into window 1113 (for example, into the input box of the window), as shown in interface 1130 of FIG. 11(c). The user can click the "Send" control on the interface 1130 to share the screenshot file with others.
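Selecting the target window among several candidates, as in FIG. 11, amounts to hit-testing the user's tap against the window rectangles. The sketch below is illustrative only; it assumes floating windows are tested first because they sit above split-screen windows:

```python
# Hypothetical sketch of target-window selection in FIG. 11: after the
# screenshot control is tapped, the next tap is hit-tested against all
# windows, topmost (floating) windows first.
def pick_target_window(windows, tap_x, tap_y):
    """windows: list of (win_id, (x, y, w, h), is_floating)."""
    ordered = sorted(windows, key=lambda w: not w[2])  # floating windows first
    for win_id, (x, y, w, h), _ in ordered:
        if x <= tap_x < x + w and y <= tap_y < y + h:
            return win_id
    return None

windows = [
    (1111, (0, 0, 400, 600), False),
    (1112, (400, 0, 400, 600), False),
    (1113, (0, 600, 800, 600), False),
    (1114, (300, 200, 250, 300), True),   # floating window overlapping 1111/1112
]
# A tap inside the floating window selects it even though a split-screen
# window also contains that point.
print(pick_target_window(windows, 350, 250))  # -> 1114
```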
  • the display method provided by the embodiment of the present application is introduced above with reference to FIGS. 4 to 11
  • the electronic device provided by the embodiment of the present application is introduced below with reference to FIGS. 12 and 13.
  • FIG. 12 shows an electronic device 1200 provided in an embodiment of the present application.
  • the electronic device 1200 includes a display unit 1210 and a processing unit 1220 .
  • the electronic device 1200 may include units for executing the method of FIG. 4. Moreover, the units in the electronic device 1200, together with the other operations and/or functions described above, are respectively for implementing the corresponding processes of the method embodiment of FIG. 4.
  • the electronic device 1200 includes: a display unit 1210, used to display a first interface, the first interface includes a first window and a second window, the first window is a window that allows information insertion; a processing unit 1220, used to: receive a first user input from a user to the first window; perform a first operation on the second window according to the first user input; and insert information corresponding to the execution result of the first operation into the first window.
  • the first operation is any one or more of a screenshot operation, a screen recording operation, a screen recognition operation, or a sound recognition operation; when the first operation is a screenshot operation, the information corresponding to the execution result of the first operation is a picture file obtained according to the screenshot operation; or, when the first operation is a screen recording operation, the information corresponding to the execution result of the first operation is a video file obtained according to the screen recording operation; or, when the first operation is a screen recognition operation, the information corresponding to the execution result of the first operation is text or a link obtained according to the screen recognition operation, and the screen recognition operation includes text recognition or object recognition; or, when the first operation is a sound recognition operation, the information corresponding to the execution result of the first operation is at least one of a music file or text information obtained according to the sound recognition operation.
  • the first window includes a first control
  • the first user input is a click operation on the first control
  • the first user input is a gesture input on the interface of the first window.
  • the first window and the second window are two windows of the split-screen interface of the electronic device.
  • the processing unit 1220 is further used to: determine the position of the second window according to the position of the first window; the processing unit 1220 is specifically used to: perform the first operation on the second window according to the position of the second window.
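Determining the second window's position from the first window's position is straightforward in a two-pane split screen: the second window occupies the remainder of the screen. The function below is a hedged sketch under that assumption; rectangle conventions are illustrative:

```python
# Hypothetical sketch: derive the second (split-screen) window's rectangle
# from the first window's rectangle and the screen size.
def second_window_rect(screen_w, screen_h, first_rect):
    x, y, w, h = first_rect
    if w == screen_w:                  # horizontal split: panes stacked vertically
        other_y = h if y == 0 else 0
        return (0, other_y, screen_w, screen_h - h)
    # vertical split: panes side by side
    other_x = w if x == 0 else 0
    return (other_x, 0, screen_w - w, screen_h)

# First window is the left pane; the second pane fills the rest of the screen.
print(second_window_rect(1080, 2340, (0, 0, 540, 2340)))  # -> (540, 0, 540, 2340)
```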
  • the first interface also includes a third window
  • the processing unit 1220 is further used to: prompt the user to determine the target window of the first operation; receive the user's second user input, and the second user input is used to determine that the target window is the second window.
  • the application corresponding to the first window is a note application or a memo application.
  • the application corresponding to the second window is a video application or a document application.
  • FIG. 13 shows an electronic device 1300 provided in an embodiment of the present application.
  • the electronic device 1300 shown in FIG. 13 may correspond to the electronic device described above.
  • the electronic device 1300 may be a specific example of the electronic device in FIG1 .
  • the electronic device 1300 includes: a processor 1320.
  • the processor 1320 is used to implement corresponding control management operations.
  • the processor 1320 is used to support the electronic device 1300 to perform the method, operation or function of the aforementioned embodiment.
  • the electronic device 1300 may also include: a memory 1310 and a communication interface 1330; the processor 1320, the communication interface 1330 and the memory 1310 may be connected to each other, for example through a bus 1340.
  • the communication interface 1330 is used to support the electronic device to communicate with other devices, etc.
  • the memory 1310 is used to store program codes and data of the electronic device.
  • the processor 1320 calls the code or data stored in the memory 1310 to implement corresponding operations.
  • the memory 1310 may be coupled to the processor or not.
  • the coupling in the embodiments of the present application is an indirect coupling or communication connection between electronic devices, units or modules, which can be electrical, mechanical or other forms, and is used for information exchange between electronic devices, units or modules.
  • the processor 1320 can be a central processing unit, a general processor, a digital signal processor, an application-specific integrated circuit, a field programmable gate array or other programmable logic device, a transistor logic device, a hardware component or any combination thereof. It can implement or execute various exemplary logic blocks, modules and circuits described in conjunction with the disclosure of this application.
  • the processor can also be a combination that implements a computing function, such as a combination of one or more microprocessors, a combination of a digital signal processor and a microprocessor, and the like.
  • the communication interface 1330 can be a transceiver, a circuit, a bus, a module or other types of communication interfaces.
  • the bus 1340 can be a peripheral component interconnect standard (PCI) bus or an extended industry standard architecture (EISA) bus, etc.
  • the bus can be divided into an address bus, a data bus, a control bus, etc. For ease of representation, only one thick line is used to represent the bus in FIG. 13, but this does not mean that there is only one bus or one type of bus.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the units is only a logical function division. There may be other division methods in actual implementation, such as multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed.
  • Another point is that the mutual coupling or direct coupling or communication connection shown or discussed can be through some interfaces, indirect coupling or communication connection of devices or units, which can be electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling a computer device (which can be a personal computer, server, or network device, etc.) to execute the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program codes.

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a display method and an electronic device, the display method being applied to the electronic device. The method comprises: displaying a first interface, the first interface comprising a first window and a second window, the first window being a window into which information is allowed to be inserted; receiving a first user input of a user for the first window; performing a first operation on the second window according to the first user input; and inserting, into the first window, information corresponding to the execution result of the first operation. With the display method and the electronic device described in the embodiments of the present application, in a multi-window mode, when a user needs to share or record information or content of other windows by means of one window, the convenience of operation and the user experience can be improved.
PCT/CN2023/135028 2022-12-14 2023-11-29 Procédé d'affichage et dispositif électronique WO2024125301A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211608702.9A CN118193092A (zh) 2022-12-14 2022-12-14 显示方法和电子设备
CN202211608702.9 2022-12-14

Publications (1)

Publication Number Publication Date
WO2024125301A1 true WO2024125301A1 (fr) 2024-06-20

Family

ID=91393581

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/135028 WO2024125301A1 (fr) 2022-12-14 2023-11-29 Procédé d'affichage et dispositif électronique

Country Status (2)

Country Link
CN (1) CN118193092A (fr)
WO (1) WO2024125301A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306607A (zh) * 2020-10-30 2021-02-02 维沃移动通信有限公司 截图方法和装置、电子设备和可读存储介质
CN114281439A (zh) * 2020-09-18 2022-04-05 华为技术有限公司 分屏方法、装置及电子设备
US20220247857A1 (en) * 2019-06-25 2022-08-04 Huawei Technologies Co., Ltd. Full-screen display method for mobile terminal and device


Also Published As

Publication number Publication date
CN118193092A (zh) 2024-06-14
