WO2023273845A1 - Multi-application screen recording method and device - Google Patents

Multi-application screen recording method and device

Info

Publication number
WO2023273845A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
window
electronic device
audio
interface
Prior art date
Application number
PCT/CN2022/098273
Other languages
English (en)
French (fr)
Inventor
马丽
孟庆彬
陈刚
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP22831665.9A (published as EP4344221A1)
Publication of WO2023273845A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/455 Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4334 Recording operations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4781 Games

Definitions

  • the present application relates to the field of terminals, in particular to a multi-application screen recording method and device.
  • the Android emulator can simulate the Android operating system on a non-Android system (such as a computer operating system), realize the installation, operation, and uninstallation of Android applications on the computer, and allow users to experience Android games and other Android applications on the computer.
  • the Android emulator can realize keyboard and mouse mapping, game assistance, screenshots, screen recording and other functions, and screen recording is one of the indispensable functions.
  • the Android emulator currently only supports single-window screen recording: after the user opens multiple windows in the Android emulator, simultaneous recording of the multiple windows is not supported, and when the focus window is switched, the recorded content also switches to the new window. This cannot meet the user's need to record multiple windows at the same time.
  • the present application provides a multi-application screen recording method and device, which can record screens of multiple Android applications separately.
  • a multi-application screen recording method is provided, including: receiving a first operation of the user in a first application window of the electronic device; in response to the first operation, the electronic device records the screen content of the first application window; receiving a second operation of the user in a second application window of the electronic device; in response to the second operation, the electronic device records the screen content of the second application window; after the electronic device has recorded the screen content of the second application window for a first duration, it stops recording the screen content of the first application window.
  • the first application and the second application run in the Android system, and the first application window and the second application window are displayed in the non-Android system.
  • the window of an Android application running in the Android system is displayed on a non-Android system (for example, Windows, Linux, etc.).
  • the electronic device can start to record the screen of the content of the Android application window according to the user's operation on the application window.
  • the electronic device obtains the display identifier corresponding to the Android application in the Android system according to the window identifier of the Android application window, and captures the image data of the Android application window according to the display identifier; the electronic device also obtains the audio identifier corresponding to the Android application in the Android system according to the window identifier of the Android application window.
  • the audio data of the Android application is captured according to the audio identifier.
  • the electronic device can generate a screen recording file according to the image data and audio data of the Android application to complete the screen recording. Since the screen recording channel of each Android application is independent, it is possible to record screens of multiple Android applications separately.
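  • As a rough illustration of the independent per-window screen recording channels described above, the following Java sketch maps a window identifier to its display identifier and audio identifier and runs one recorder per window. WindowRegistry, RecorderChannel, and ChannelFactory are hypothetical placeholders, not names disclosed by this application.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/** Minimal sketch of independent per-window recording channels (all type names hypothetical). */
public class MultiWindowRecordingManager {

    /** Resolves the display ID and audio ID that the Android side associates with a window ID. */
    public interface WindowRegistry {
        int displayIdForWindow(int windowId);
        int audioIdForWindow(int windowId);
    }

    /** One independent capture/encode pipeline for a single application window. */
    public interface RecorderChannel {
        void start();          // grab image data by display ID and audio data by audio ID
        void stopAndFinish();  // encode, mux, and write this window's own recording file
    }

    public interface ChannelFactory {
        RecorderChannel create(int displayId, int audioId);
    }

    private final Map<Integer, RecorderChannel> channels = new ConcurrentHashMap<>();
    private final WindowRegistry registry;
    private final ChannelFactory factory;

    public MultiWindowRecordingManager(WindowRegistry registry, ChannelFactory factory) {
        this.registry = registry;
        this.factory = factory;
    }

    /** Start an independent recording for one window; recordings of other windows are unaffected. */
    public void startRecording(int windowId) {
        channels.computeIfAbsent(windowId, id -> {
            RecorderChannel channel = factory.create(
                    registry.displayIdForWindow(id),   // window ID -> display ID (image path)
                    registry.audioIdForWindow(id));    // window ID -> audio ID (audio path)
            channel.start();
            return channel;
        });
    }

    /** Stop only this window's recording, producing its own screen recording file. */
    public void stopRecording(int windowId) {
        RecorderChannel channel = channels.remove(windowId);
        if (channel != null) {
            channel.stopAndFinish();
        }
    }
}
```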
  • the electronic device includes a first screen recording file and a second screen recording file, wherein the first screen recording file is generated by recording the content of the first application window, and the second screen recording file is generated by recording the content of the second application window.
  • in this method, the screen recording channel of each Android application is independent; when recording the screen, each Android application generates its own screen recording file, so multiple Android applications can be recorded at the same time and multiple screen recording files are generated accordingly.
  • the non-Android system includes a Windows system
  • the method further includes: the Android system creates a first display corresponding to the first application window, and creates a second display corresponding to the second application window.
  • the Windows system generates the interface of the first application according to the image data of the first display, and generates the first screen recording file according to the interface of the first application; the Windows system also generates the interface of the second application according to the image data of the second display, and generates the second screen recording file according to the interface of the second application.
  • the Android system creates a corresponding display for each application window, and the window data generated by each application is associated with a display (indicated by a display identifier); when recording the screen, the image data corresponding to the application is obtained according to the display identifier, so that image recording of the application can be realized.
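  • As an illustration of the "one display per application window" idea above, the sketch below uses Android's public DisplayManager.createVirtualDisplay API to create a dedicated display for a window and return its display identifier; the class name, size, density, flag, and consumer Surface are assumptions for illustration, and the emulator described here may create displays through a different, internal mechanism.

```java
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

/** Sketch: create one dedicated display per application window and expose its display ID. */
public class PerWindowDisplayFactory {

    private final DisplayManager displayManager;

    public PerWindowDisplayFactory(DisplayManager displayManager) {
        this.displayManager = displayManager;
    }

    /** Create a display bound to one application window and return the display ID used for recording. */
    public int createDisplayForWindow(int windowId, int width, int height, int dpi, Surface consumer) {
        VirtualDisplay display = displayManager.createVirtualDisplay(
                "app-window-" + windowId,   // name ties the display to the window identifier
                width, height, dpi,
                consumer,                   // image data composited for this window lands here
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
        return display.getDisplay().getDisplayId();
    }
}
```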
  • the Windows system generating the interface of the first application according to the image data of the first display includes: the Windows system receives, from the Android system, the image data of the first display and a first synthesis instruction for the image data of the first display; according to the first synthesis instruction, a first Windows synthesis instruction whose instruction format matches the Windows system is obtained, and the first Windows synthesis instruction is used to synthesize and render the image data of the first display to generate the interface of the first application.
  • the Windows system generating the interface of the second application according to the image data of the second display includes: the Windows system receives, from the Android system, the image data of the second display and a second synthesis instruction for the image data of the second display; according to the second synthesis instruction, a second Windows synthesis instruction whose instruction format matches the Windows system is obtained, and the second Windows synthesis instruction is used to synthesize and render the image data of the second display to generate the interface of the second application.
  • the Windows system generates the interface of the application according to the displayed image data corresponding to the application.
  • the method further includes: the Windows system acquires the first display identifier according to the first application window identifier, and acquires the interface of the first application according to the first display identifier; the Windows system acquires the second display identifier according to the second application window identifier, and acquires the interface of the second application according to the second display identifier.
  • the method further includes: the Windows system creates a first local window corresponding to the first display, and displays the interface of the first application in the first local window; the Windows system creates a second local window corresponding to the second display, and displays the interface of the second application in the second local window. That is, simultaneous display of the first application window and the second application window can be realized.
  • the Windows system stops obtaining the interface of the first application according to the first display identifier; that is, it stops displaying the first application and stops image recording of the first application.
  • the Android system stops generating the first displayed image data, and stops sending the first displayed image data to the Windows system.
  • the non-Android system includes a Windows system
  • the method further includes: the Android system creates a first audio track instance corresponding to the first application window, and creates a second audio track instance corresponding to the second application window; the Windows system obtains a first audio identifier according to the first application window identifier, where the first audio identifier is used to indicate the first audio track instance; the Windows system obtains the first audio track instance data according to the first audio identifier, and generates the first screen recording file according to the first audio track instance data; the Windows system obtains a second audio identifier according to the second application window identifier, where the second audio identifier is used to indicate the second audio track instance; the Windows system obtains the second audio track instance data according to the second audio identifier, and generates the second screen recording file according to the second audio track instance data.
  • the Android system generates a corresponding audio track instance (indicated by an audio identifier) for each application window; when recording the screen, the audio track instance data corresponding to the application is obtained according to the audio identifier, so that audio recording of the application can be realized.
  • when the first application switches to the background, the Android system stops generating the first audio track instance data; the Windows system stops obtaining the first audio track instance data according to the first audio identifier, and stops generating the first screen recording file according to the first audio track instance data. In this way, audio recording can be stopped when the application switches to the background.
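  • The following sketch illustrates this per-window audio path under the assumption of hypothetical AudioTrackRegistry and AudioSink helpers: the window identifier is resolved to an audio identifier, the corresponding audio track instance data is pulled in a loop, and the loop ends when no more data is produced (for example, when the application switches to the background).

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/** Sketch of per-window audio capture (AudioTrackRegistry and AudioSink are hypothetical helpers). */
public class PerWindowAudioRecorder {

    public interface AudioTrackRegistry {
        int audioIdForWindow(int windowId);   // window ID -> audio ID
        byte[] readTrackData(int audioId);    // next PCM chunk of that audio track instance, null when stopped
    }

    public interface AudioSink {
        void write(byte[] pcm);               // feed the encoder of this window's recording file
        void close();
    }

    private final Map<Integer, Thread> captureThreads = new ConcurrentHashMap<>();

    public void start(int windowId, AudioTrackRegistry registry, AudioSink sink) {
        int audioId = registry.audioIdForWindow(windowId);
        Thread worker = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                byte[] pcm = registry.readTrackData(audioId);
                if (pcm == null) break;       // e.g. the Android side stopped generating track data
                sink.write(pcm);
            }
            sink.close();
        });
        captureThreads.put(windowId, worker);
        worker.start();
    }

    /** Stop only this window's audio recording; other windows keep recording. */
    public void stop(int windowId) {
        Thread worker = captureThreads.remove(windowId);
        if (worker != null) {
            worker.interrupt();
        }
    }
}
```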
  • recording the screen content of the first application window by the electronic device includes: recording the image of the first application window by the electronic device; or recording the audio of the first application by the electronic device; or , the electronic device records the first application window image and audio.
  • the electronic device can obtain the display identifier corresponding to the Android application in the Android system according to the window identifier of the Android application window, capture the image data of the Android application window according to the display identifier, generate an image file according to the image data, and complete image recording of the content displayed in the Android application window.
  • the electronic device can obtain the audio identifier corresponding to the Android application in the Android system according to the window identifier of the Android application window, capture the audio data of the Android application according to the audio identifier, generate an audio file according to the audio data, and complete audio recording of the Android application window.
  • the electronic device can generate a video file according to the image data and audio data of the Android application, and complete the video recording of the Android application window. That is to say, the electronic device can record only the image of the Android application, or only the audio of the Android application, or both the image and the audio of the Android application.
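  • The file-generation step just described can be pictured with Android's MediaMuxer API: depending on the user's selection, only an image (video) track, only an audio track, or both are written to the output file. This is a simplified sketch; the encoded samples are assumed to come from encoders elsewhere in the pipeline.

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.io.IOException;
import java.nio.ByteBuffer;

/** Sketch: write an image-only, audio-only, or combined recording file for one window. */
public class RecordingFileWriter {

    private final MediaMuxer muxer;
    private int videoTrack = -1;
    private int audioTrack = -1;
    private boolean started;

    public RecordingFileWriter(String outputPath) throws IOException {
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    /** Pass null for the track the user did not ask for ("image", "audio" or "video" mode). */
    public void addTracks(MediaFormat videoFormat, MediaFormat audioFormat) {
        if (videoFormat != null) videoTrack = muxer.addTrack(videoFormat);
        if (audioFormat != null) audioTrack = muxer.addTrack(audioFormat);
        if (videoTrack >= 0 || audioTrack >= 0) {
            muxer.start();
            started = true;
        }
    }

    public void writeVideoSample(ByteBuffer data, MediaCodec.BufferInfo info) {
        if (started && videoTrack >= 0) muxer.writeSampleData(videoTrack, data, info);
    }

    public void writeAudioSample(ByteBuffer data, MediaCodec.BufferInfo info) {
        if (started && audioTrack >= 0) muxer.writeSampleData(audioTrack, data, info);
    }

    public void finish() {
        if (started) muxer.stop();
        muxer.release();
    }
}
```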
  • recording the screen content of the second application window by the electronic device includes: recording the image of the second application window by the electronic device.
  • the electronic device can record the first application window image and the second application window image respectively at the same time; or record the second application window image while recording the first application window audio (the first application window audio and the second application window image can also be merged into a new video file); or record the second application window image while recording the first application window video.
  • the electronic device can also receive input from a microphone, and combine the recorded Android application window image and the voice input from the microphone into a video file, realizing music and dubbing for the Android application image and improving user enjoyment.
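  • For the microphone dubbing case mentioned above, a sketch using the standard AudioRecord API is given below; the sample rate and buffer handling are assumptions, the RECORD_AUDIO permission is taken for granted, and the encoder/muxer the PCM chunks feed into is omitted.

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

/** Sketch: capture microphone PCM so it can be muxed with a window's recorded image stream. */
public class MicCapture {

    private static final int SAMPLE_RATE = 44100;   // assumed sample rate

    /** Open and start the microphone; requires the RECORD_AUDIO permission. */
    public AudioRecord openMic() {
        int minBuf = AudioRecord.getMinBufferSize(
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord record = new AudioRecord(
                MediaRecorder.AudioSource.MIC,      // voice input from the microphone
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf);
        record.startRecording();
        return record;
    }

    /** Read one chunk of microphone PCM to be encoded and combined with the window image track. */
    public int readChunk(AudioRecord record, byte[] buffer) {
        return record.read(buffer, 0, buffer.length);
    }
}
```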
  • recording the screen content of the second application window by the electronic device includes: recording audio of the second application by the electronic device.
  • the electronic device can record the audio of the second application while recording the image of the first application window (the first application window image and the audio of the second application can also be further combined into a new video file); or record the first application audio and the second application audio respectively at the same time; or record the second application audio while recording the video of the first application window.
  • recording the screen content of the second application window by the electronic device includes: recording the image and audio of the second application window by the electronic device.
  • the electronic device can record the video of the second application when recording the image of the first application window; or record the first application video and the second application video respectively at the same time; or record the second application video while recording the audio of the first application window .
  • the method further includes: the electronic device displays the first application window, and displays the second application window.
  • the electronic device simultaneously displays the first application window and the second application window, realizing displaying multiple Android application windows.
  • an electronic device comprises: a processor; a memory; a display screen; and a computer program, wherein the computer program is stored on the memory, and when the computer program is executed by the processor, the electronic device is made to perform the following steps : receiving the first operation of the user in the first application window of the electronic device; in response to the first operation, the electronic device records the content of the first application window; receiving the second operation of the user in the second application window of the electronic device; In response to the second operation, the electronic device records the content of the second application window; after recording the content of the second application window for a first duration, the electronic device stops recording the content of the first application window.
  • the first application and the second application run in the Android system, and the first application window and the second application window are displayed in the non-Android system.
  • the electronic device includes a first screen recording file and a second screen recording file, wherein the first screen recording file is generated by recording the content of the first application window, and the second screen recording file is generated by recording the content of the second application window.
  • recording the screen content of the first application window by the electronic device includes: recording the image of the first application window by the electronic device; or recording the audio of the first application by the electronic device; or , the electronic device records the first application window image and audio.
  • recording the screen content of the second application window by the electronic device includes: recording the image of the second application window by the electronic device.
  • recording the screen content of the second application window by the electronic device includes: recording audio of the second application by the electronic device.
  • recording the screen content of the second application window by the electronic device includes: recording the image and audio of the second application window by the electronic device.
  • when the computer program is executed by the processor, the electronic device is further caused to perform the following steps: displaying the first application window, and displaying the second application window.
  • a computer-readable storage medium stores a computer program (also referred to as an instruction or code), and when the computer program is executed by the electronic device, the electronic device executes the method in the first aspect or any one of the implementation manners in the first aspect.
  • a computer program product is provided.
  • when the computer program product runs on the electronic device, the electronic device is made to execute the method in the first aspect or any one of the implementation manners of the first aspect.
  • in a fifth aspect, a chip system is provided, which includes a processor and an interface circuit.
  • the interface circuit is used to perform transceiving functions and send instructions to the processor.
  • when the instructions are executed by the processor, the processor is made to execute the method in the first aspect or any one of the implementation manners of the first aspect.
  • Fig. 1A is a schematic diagram of an example of an Android emulator running scenario
  • Fig. 1B is a schematic diagram of an example of an Android emulator running scenario
  • Fig. 2 is a schematic diagram of the architecture of an Android emulator running on an electronic device
  • Fig. 3 is a schematic diagram of a scene example of a screen recording method in an Android emulator
  • FIG. 4A is a schematic diagram of a scene example of a multi-application screen recording method provided by the present application.
  • FIG. 4B is a schematic diagram of a scene example of a multi-application screen recording method provided by the present application.
  • FIG. 4C is a schematic diagram of a scene example of a multi-application screen recording method provided by the present application.
  • FIG. 5 is a schematic diagram of the hardware structure of an electronic device applicable to a multi-application screen recording method provided by the present application;
  • FIG. 6 is a schematic diagram of a scene example of a multi-application screen recording method provided by the present application.
  • FIG. 7 is a schematic diagram of a multi-application screen recording method provided by the present application.
  • FIG. 8 is a schematic diagram of a multi-application screen recording method provided by the present application.
  • FIG. 9 is a schematic diagram of a multi-application screen recording method provided by the present application.
  • FIG. 10 is a schematic diagram of a scene example of a multi-application screen recording method provided by the present application.
  • FIG. 11 is a schematic diagram of a multi-application screen recording method provided by the present application.
  • FIG. 12 is a schematic diagram of the structure and composition of an electronic device provided by the present application.
  • references to "one embodiment” or “some embodiments” or the like in this specification means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments," unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • the term “connected” includes both direct and indirect connections, unless otherwise stated.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Thus, a feature defined as “first” and “second” may explicitly or implicitly include one or more of these features.
  • words such as “exemplary” or “for example” are used as examples, illustrations or illustrations. Any embodiment or design scheme described as “exemplary” or “for example” in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as “exemplary” or “such as” is intended to present related concepts in a concrete manner.
  • Android emulators can emulate the Android operating system on non-Android systems. Users can install, run, and uninstall Android applications in the Android operating system. Exemplarily, the computer 100 is equipped with a non-Android operating system; for example, Windows, Linux, and the like. An Android emulator is installed on the operating system of the computer 100 . The user can run the Android emulator on the computer 100 to realize the installation, operation and uninstallation of Android applications on the computer 100 .
  • the desktop of the computer 100 includes an “emulator” icon 101 .
  • the user can use an input device (such as a mouse) of the computer 100 to click on the "emulator” icon 101 to run the Android emulator.
  • the computer 100 displays an Android emulator main interface 102 .
  • the Android emulator main interface 102 includes an "Application 1" icon, an "Application 2" icon, and an "Application 3" icon;
  • the user can click the "Application 1" icon to open Android application 1, the "Application 2" icon to open Android application 2, and the "Application 3" icon to open Android application 3.
  • the desktop of the computer 100 includes an "Application 1" icon 103, an "Application 2" icon 104, and an "Application 3" icon 105; the user can click the "Application 1" icon 103 to open Android application 1, click the "Application 2" icon 104 to open Android application 2, and click the "Application 3" icon 105 to open Android application 3.
  • FIG. 2 shows a schematic diagram of an Android emulator running on an electronic device.
  • exemplary embodiments of the electronic device 100 include, but are not limited to, portable devices (such as laptops), fixed devices (such as PCs), or servers running Windows, Linux, or other operating systems.
  • FIG. 2 takes the electronic device 100 equipped with a Windows system as an example.
  • the Windows system 10 supports installing and running multiple apps.
  • the emulator 11 is an App that simulates the Android operating system, and supports installation and operation of Android applications (such as video, music, smart home App, mobile games, etc.); For example, as shown in FIG. 2, Android applications such as Application 1, Application 2, and Application 3 are installed on the Android system.
  • the user can open the application 1, the application 2 or the application 3, so that the application 1, the application 2 or the application 3 runs in the Android system.
  • when application 1, application 2, or application 3 runs, an application 1 window, an application 2 window, or an application 3 window is generated accordingly.
  • the Windows system 10 manages the application 1 window, the application 2 window, and the application 3 window.
  • the Windows system 10 receives the user's operation on the user interface (UI) in the application 1 window, the application 2 window, or the application 3 window, generates an instruction corresponding to the user operation, and sends the instruction to the corresponding Android application running in the Android system for processing.
  • a user can record a screen of an Android application.
  • Screen recording is to record the window content (image, audio, etc.) of the Android application as a screen recording file (such as an image file, audio file, video file, etc.).
  • FIG. 3 shows a scene example of a screen recording method in an Android emulator.
  • the main interface 210 of the Android emulator includes an icon 211 of "application 1", an icon 212 of "application 2" and an icon 213 of "application 3".
  • the user can use the input device of the computer 100 to click the "application 1" icon 211 to open the application 1; click the "application 2" icon 212 to open the application 2; click the "application 3" icon 213 to open the application 3.
  • the Android emulator main interface 210 also includes a function bar 214 .
  • the function bar 214 includes a "screen recording” button, a "full screen” button, a “screen rotation” button, a “positioning” button, and the like.
  • the "screen recording” button is used to start the screen recording function; the “full screen” button is used to display the window in full screen on the computer 100 screen; the “screen rotation” button is used to rotate the display direction of the window; the “positioning” button is used to determine the device s position.
  • a "video recording” page 215 is displayed on the main interface 210 of the Android emulator.
  • the "video recording” page 215 includes an input box for inputting a storage address of the screen recording file.
  • the “Video Recording” page 215 also includes a "Start” button. In response to the user's click operation on the "Start” button, the computer 100 starts video recording.
  • the screen recorded by the computer 100 is the display content of the current focus window in the Android emulator window.
  • when the focus window is switched, the recording content also switches accordingly.
  • the user opens the application 1, and the interface 220 of the application 1 is displayed in the Android emulator window.
  • the recorded content is the content displayed on the interface 220 .
  • the user opens the application 2 again, and the interface 230 of the application 2 is displayed in the Android emulator window.
  • the recorded content switches to the content displayed on the interface 230. This cannot meet users' need to record multiple windows in the Android emulator at the same time.
  • FIG. 4A shows a scene example of a screen recording method in an Android emulator provided by an embodiment of the present application.
  • the desktop 310 of the computer 100 includes an "application 1" icon 311 , an "application 2" icon 312 and an "application 3" icon 313 .
  • the user can use the input device of the computer 100 to click the "application 1" icon 311 to open the application 1; click the "application 2" icon 312 to open the application 2; click the "application 3" icon 313 to open the application 3.
  • the computer 100 displays the interface 320 of the application 1.
  • the computer 100 displays the interface 330 of the application 2.
  • the windows carrying the interface 320 of application 1 and the interface 330 of application 2 are windows on the Windows side, and the content displayed in the windows is generated by running the Android applications in the emulator.
  • the interface 320 of application 1 includes a “menu” 321
  • the interface 330 of application 2 includes a “menu” 331 .
  • a user initiates video recording of application 1 .
  • a drop-down menu 322 is displayed on the interface 320 .
  • Pull-down menu 322 includes a "Record Screen” option.
  • the user can use the input device of the computer 100 to click the “screen recording” option in the drop-down menu 322 to start video recording of the application 1 .
  • the computer 100 starts to record the window content of the application 1 .
  • exemplarily, as shown in (d) of FIG. 4A, a "video recording" toolbar 323 is displayed on the interface 320 of application 1; the "video recording" toolbar 323 includes a recording duration, a pause/continue button, an end button, etc. The user can click the pause/continue button to pause or continue the screen recording of application 1, and click the end button to stop the screen recording of application 1.
  • a drop-down menu 332 is displayed on the interface 330 .
  • Pull-down menu 332 includes a "Record Screen” option.
  • the user can use the input device of the computer 100 to click on the “record screen” option in the drop-down menu 332 to start video recording for the application 2 .
  • the computer 100 starts to record the window content of the application 2 .
  • video recording " tool bar 333 is displayed on the interface 330 of application 2; " video recording " tool bar 323 includes recording duration, pause button, end button etc., the user can continue by clicking pause button to pause or continue screen recording for application 2, and stop recording screen for application 2 by clicking the end button.
  • the computer 100 records the screens of the application 1 and the application 2 respectively; after a period of time, the user clicks the end button in the "video recording” toolbar 323 of the application 1 to stop recording the screen of the application 1.
  • the computer 100 displays the “Video Recording” toolbar 323 and the “Video Recording” toolbar 333 as an example. In some other examples, when the computer 100 records the screen of the application, the "video recording" toolbar may not be displayed.
  • the user can also select the name and storage path of the screen recording file.
  • the computer 100 displays a video recording page 324; the user can input the storage name and storage address of the screen recording file of application 1 in the video recording page 324.
  • the video recording page 324 also includes a "Start" button.
  • the computer 100 starts to record the window content of the application 1, and the interface 320 of the application 1 displays a "video recording” toolbar.
  • FIG. 4C in response to the user's click operation on the “screen recording” option in the drop-down menu 332 , the computer 100 displays a video recording page 334 .
  • the user can input the storage name and storage address of the recorded file of the application 2 in the video recording page 334 .
  • the video recording page 334 also includes a "Start” button.
  • the computer 100 starts recording the window content of the application 2, and the interface 330 of the application 2 displays a "video recording” toolbar.
  • the user can separately record screens of multiple Android application windows.
  • the application window can be recorded whether it is a focused window or a non-focused window.
  • the multi-application screen recording method provided in the embodiment of the present application can be applied to an electronic device installed with an Android emulator.
  • electronic devices may include personal computers (personal computers, PCs), notebook computers, tablet computers, netbooks, handheld computers, smart home devices (such as smart TVs, smart screens, large screens, smart speakers, etc.), vehicle-mounted computers, etc.
  • the embodiment of the present application does not impose any limitation on this.
  • the foregoing electronic device may include a structure as shown in FIG. 5 .
  • the electronic device 100 includes a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, a display screen 150, an antenna, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an input device 180, and so on.
  • the structure shown in the embodiment of the present application does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the illustrations, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), and/or a neural-network processing unit (neural-network processing unit, NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic equipment.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled with a touch sensor, a charger, a flashlight, a camera, etc. through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor through the I2C interface, so that the processor and the touch sensor communicate through the I2C bus interface to realize the touch function of the electronic device.
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 150, keyboard and other peripheral devices.
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the display screen 150 through a DSI interface to realize the display function of the electronic device.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the display screen 150 , the wireless communication module 160 , the audio module 170 , the input device 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device, and can also be used to transmit data between the electronic device and peripheral devices.
  • the electronic device is connected to a peripheral input device through the interface, such as a keyboard, a mouse, etc.; it can also be used to connect an earphone to play audio through the earphone.
  • This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device.
  • the electronic device may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 150 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the electronic device realizes the display function through the GPU, the display screen 150, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 150 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 150 is used to display images, videos and the like.
  • the display screen 150 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the display screen 150 is also referred to as a screen.
  • the wireless communication function of the electronic device can be realized through the antenna, the wireless communication module 160 and so on.
  • Antennas are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device, such as wireless local area networks (wireless local area networks, WLAN) (for example, a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), and infrared technology (infrared, IR).
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic wave and radiate it through the antenna.
  • the antenna of the electronic device is coupled to the wireless communication module 160, so that the electronic device can communicate with the network and other devices through wireless communication technology.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when an electronic device selects a frequency point, a digital signal processor is used to perform Fourier transform on the frequency point energy, etc.
  • Video codecs are used to compress or decompress digital video.
  • An electronic device may support one or more video codecs.
  • the electronic device can play or record video in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of electronic devices can be realized through NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the electronic device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device can listen to music and the like through the speaker 170A.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the input device 180 may include a keyboard, a mouse, and the like.
  • the keyboard is used to input English letters, numbers, punctuation marks, etc. into the electronic equipment, so as to issue commands to the electronic equipment, input data, etc.
  • the mouse is an indicator for positioning the vertical and horizontal coordinates of the electronic device display system, and is used to input instructions to the electronic device.
  • the input device 180 may be connected to the electronic device through a wired connection, for example, the input device 180 may be connected to the electronic device through a GPIO interface, a USB interface, or the like.
  • the input device 180 can also be connected to the electronic device in a wireless manner, for example, the input device 180 is connected to the electronic device through bluetooth, infrared and the like.
  • the user can separately record screens of multiple Android applications running on the electronic device 100 .
  • the multi-application screen recording method provided by the embodiment of the present application will be described in detail below with reference to the accompanying drawings.
  • the user can start the screen recording on the user interface (user interface, UI) of the Android application.
  • the electronic device 100 displays a video recording page 601 .
  • the user can input the storage name and storage address of the screen recording file in the video recording page 601 .
  • the video recording page 601 also includes a "Start" button.
  • the electronic device 100 starts recording the screen of the Android application.
  • the video recording page 601 includes an “image” option 602 , an “audio” option 603 and a “video” option 604 .
  • the "image” option 602 indicates that only images are recorded when the Android application is recorded
  • the "audio” option 603 indicates that only the audio is recorded when the Android application is recorded
  • the "video” option 604 indicates that images and audio are recorded when the Android application is recorded.
  • the user can select the “image” option 602 to record the image of application 1; select the “audio” option 603 to record the audio of application 1; select the “video” option 604 to record the video (including image and audio) of application 1.
  • in response to the user's click operation on the "screen recording" option in application 1, the electronic device 100 starts recording the screen of the Android application, and records the video (including image and audio) of application 1.
  • the electronic device receives the user's operation of starting screen recording on the user interface of the Android application (for example, receiving the user's click operation on the "Start" button in FIG. 6, or receiving the user's click operation on the "screen recording" option in (c) of FIG. 4A), starts the screen recording, and records the window content (image, audio, or video) of the Android application.
  • the electronic device generates a screen recorder (screen recorder) unit corresponding to the Android application window according to a window identity (Window ID) of the application window.
  • buttons such as "Start", "Pause", "Continue", and "End" are included on the UI, and the electronic device can start, pause, continue, or end the screen recording of the application window according to the user's click operation on the "Start", "Pause", "Continue", or "End" button on the UI.
  • the "video recording" toolbar 323 in the interface 320 of the application 1 includes a recording duration, a pause and continue button, an end button, etc., and the user can pause the application 1 by clicking the pause and continue button. To record the screen or continue to record the screen, you can click the end button to stop the screen recording of the application 1, etc.
  • the electronic device starts image recording of an Android application according to a user operation.
  • the video producer module in the screen recorder unit obtains the corresponding display identity (Display ID) according to the window identity; the display indicated by the display identity is associated with the Android application window, and one Android application window is presented on one display.
  • the electronic device captures the image data of the Android application window from the composite rendering component according to the display identifier, sends the image data to the recorder encoder module for video encoding and encapsulation, generates an image file, and completes image recording of the content displayed in the Android application window.
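  • The image-recording path just described can be sketched with Android's MediaCodec and VirtualDisplay APIs: the encoder's input surface is set as the consumer of the display associated with the window, so frames composited for that display flow directly into the encoder. The resolution and bitrate values below are placeholders, and draining the encoder into a file is omitted.

```java
import android.hardware.display.VirtualDisplay;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

/** Sketch: route one window's display output into a video encoder for image recording. */
public class WindowImageRecorder {

    public MediaCodec startEncoder(VirtualDisplay windowDisplay, int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);   // placeholder bitrate
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);        // placeholder frame rate
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        Surface input = encoder.createInputSurface();   // image data of this window's display goes here
        windowDisplay.setSurface(input);                // route the window's display output to the encoder
        encoder.start();
        return encoder;                                 // drained elsewhere and written to the image file
    }
}
```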
  • the electronic device starts audio recording of the Android application according to a user operation.
  • the audio producer module in the screen recorder unit obtains corresponding audio identifiers according to the window identifiers, and each audio identifier is associated with a different Android application window.
  • the electronic device captures the audio data of the Android application window from the audio buffer according to the audio identifier, sends the audio data to the recorder encoder module for encoding and encapsulation, generates an audio file, and completes audio recording of the audio content of the Android application window.
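  • The audio path can be sketched in the same spirit: PCM chunks grabbed from the window's audio buffer (looked up by its audio identifier) are fed into an AAC encoder whose output is then written to the audio or video file. Sample rate, channel count, and bitrate are assumptions, and the output-draining loop is omitted.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

/** Sketch: encode one window's captured PCM audio to AAC for its recording file. */
public class WindowAudioEncoder {

    public MediaCodec createAacEncoder() throws IOException {
        MediaFormat format = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 2);
        format.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 128_000);     // placeholder bitrate

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        return encoder;
    }

    /** Feed one PCM chunk pulled from the window's audio buffer into the encoder. */
    public void queuePcm(MediaCodec encoder, byte[] pcm, long presentationTimeUs) {
        int inputIndex = encoder.dequeueInputBuffer(10_000);      // wait up to 10 ms for an input buffer
        if (inputIndex >= 0) {
            encoder.getInputBuffer(inputIndex).put(pcm);
            encoder.queueInputBuffer(inputIndex, 0, pcm.length, presentationTimeUs, 0);
        }
    }
}
```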
  • the electronic device starts video recording of the Android application according to a user operation, that is, records the image and audio of the Android application.
  • the video producer module in the screen recorder unit obtains the corresponding display identifier according to the window identifier, and the electronic device captures the image data of the Android application window from the composite rendering component according to the display identifier; the audio producer module in the screen recorder unit obtains the corresponding audio identifier according to the window identifier, and the electronic device captures the audio data of the Android application window from the audio buffer according to the audio identifier.
  • the recorder encoder module performs video encoding and encapsulation of image data and audio data, generates video files, and completes video recording of Android application window content.
  • FIG. 8 and FIG. 9 take an electronic device that runs two Android applications and displays two Android application windows as an example. It can be understood that the multi-application screen recording method provided in the embodiments of the present application is also applicable to more than two Android application windows.
  • a user runs application 1 and application 2 on the electronic device.
  • the user uses the input device of the computer 100 to click the "application 1" icon 311 to open application 1, and clicks the "application 2" icon 312 to open application 2.
  • an Android application includes one or more interfaces, and each interface corresponds to an Activity; an Activity includes one or more layers (layer). Each layer consists of one or more elements (controls).
  • the framework layer (framework) of Android system includes window management service (window manager service, WMS), surface composition (SurfaceFlinger) module, activity management service (Activity manager service, AMS) and so on.
  • WMS is used for window management (for example, adding a window, deleting a window, modifying a window, etc.).
  • when application 1 starts, WMS creates window 1 corresponding to application 1; when application 2 starts, WMS creates window 2 corresponding to application 2.
  • the display management unit creates a corresponding display (Display) for each window, and establishes a one-to-one correspondence between the window and the Display, that is, establishes a one-to-one correspondence between the window identity (Window ID) and the display identity (Display ID).
  • the display management unit creates display 1 for window 1 and display 2 for window 2; and establishes a correspondence between display 1 and application 1, and establishes a correspondence between display 2 and application 2.
  • the Android system sends the display identifier of each display and the window information (for example, window ID, number of layers, layer IDs, etc.) to the host (Host).
  • Host creates a local window (native window) corresponding to each display according to the display identifier and window information.
  • the Host includes a multi-window management (multi window manager) module, which is used to implement business logic such as creation and destruction of native windows, and window operations.
  • the multi-window management module creates a local window 1 corresponding to the window 1, and creates a local window 2 corresponding to the window 2.
  • the surface composition (SurfaceFlinger) module is used to obtain the display data of each Android application interface from the WMS (for example, the number of layers included in the interface and the display elements of each layer).
  • SurfaceFlinger determines the synthesis command corresponding to each display data (for example, OPENGL ES command, HWComposer command) according to the display data of the Android application interface, and transmits the synthesis command corresponding to the display data to the Host.
  • the Host's composite rendering component (which can invoke the graphics rendering and composition functions of the Android system) converts the received composition instructions into an instruction format that matches the Host operating system, uses the composition instructions to perform graphic composition on the display data and generate the image of each layer of the Android application interface, and then composites and renders the layer images to generate the interface corresponding to the Activity.
  • the composite rendering component sends the generated interface to the corresponding native window for display according to the display identifier and window information.
  • SurfaceFlinger obtains the display data 1 of the Activity of the application 1 from the WMS, and transmits the display data 1 and the corresponding synthesis instruction to the Host.
  • the composite rendering component of the Host uses composite instructions to composite the display data 1 to generate images of each layer; and composites and renders the images of each layer to generate the interface of the application 1.
  • the composite rendering component sends the interface of application 1 to the corresponding local window 1 for display according to the display identifier and window information of application 1.
  • SurfaceFlinger obtains the display data 2 of the Activity of the application 2 from the WMS, and transmits the display data 2 and the corresponding synthesis instructions to the Host.
  • the composite rendering component of the Host uses composite instructions to composite the display data 2 to generate images of each layer; and composites and renders the images of each layer to generate the interface of the application 2.
  • the composite rendering component sends the interface of the application 2 to the corresponding local window 2 for display according to the display identifier and window information of the application 2. For example, as shown in (b) of FIG. 4A , the electronic device 100 displays an interface 320 of application 1 and an interface 330 of application 2 . The electronic device 100 displays the window of the application 1 and the window of the application 2 at the same time.
  • the user can record the window image of application 1, and can also record the window image of application 2.
  • each interface of the Android application generates an Activity in the application layer.
  • WMS assigns one or more surfaces (Surface) to each Android application, and one Activity corresponds to one Surface; WMS manages the display order, size, position, etc. of Surface.
  • the Android application draws the display data of the Activity to the Surface, puts it into the cache queue and waits for the consumer to process it.
  • SurfaceFlinger obtains the layer corresponding to the Surface from the cache queue for image synthesis; it also sends the synthesis command (OPENGL ES command or HWComposer command) corresponding to the display data to the Host.
  • the WMS assigns a window to each Android application.
  • the display management unit obtains the Display ID and window information corresponding to the window from the WMS, and obtains the display data of each layer of the Android application interface from the surface composition module.
  • the display management unit sends the Display ID, window information and interface display data to the Host.
  • Host creates a native window corresponding to each display (Display) based on the Display ID and window information. For example, display 1 corresponds to local window 1, and display 2 corresponds to local window 2.
  • the functions of the display management unit in FIG. 8 and FIG. 9 are realized by the Android system; in other embodiments, the functions of the display management unit can also be realized by an emulator.
  • the composite rendering component of the Host includes a composition module and a rendering module.
  • the composition module uses composition instructions to composite the display data of each interface and generate the image of each layer of the interface; the rendering module renders the images of each layer to generate the interface.
  • the composite rendering component sends the interface to the native window (native window) corresponding to the display (Display) for display according to the Display ID. For example, the interface corresponding to display 1 (application 1) is sent to local window 1 for display, and the interface corresponding to display 2 (application 2) is sent to local window 2 for display.
  • the user starts image recording of the application 1 window.
  • the electronic device generates a screen recording unit 1 of an application 1 .
  • screen recording unit 1 obtains the corresponding display identifier (display 1) according to the window identifier (window 1) of application 1, grabs the color buffer corresponding to display 1 in the composite rendering component to obtain the interface corresponding to display 1, performs video encoding and encapsulation on the interface corresponding to display 1, and generates image file 1.
  • during image recording, through the one-to-one correspondence between the window identifier and the display identifier of the application window, the interface corresponding to the application window (that is, the display content of the application window) is captured in the Android system, realizing directed recording of the display content of the application window.
  • whether the application window is a focused window or a non-focused window does not affect the recording of the display content of the application window.
  • the user opens application 2.
  • the window of application 2 is the focused window, and the window of application 1 changes from the focused window to the non-focused window. Image recording of the Application 1 window is not affected.
  • the user starts the image recording of the application 2.
  • the electronic device generates a screen recording unit 2 of the application 2 .
  • screen recording unit 2 obtains the corresponding display identifier (display 2) according to the window identifier (window 2) of application 2, grabs the color buffer corresponding to display 2 in the composite rendering component to obtain the interface corresponding to display 2, performs video encoding and encapsulation on the interface corresponding to display 2, and generates image file 2.
  • the application switches from being displayed in the foreground to running in the background.
  • the computer 100 displays an interface 320 of application 1 and an interface 330 of application 2 .
  • the user performs video recording on the application 1, and the interface 320 of the application 1 displays a "video recording" toolbar 323 .
  • the user also performs video recording on the application 2, and a "video recording” toolbar 333 is displayed on the interface 330 of the application 2.
  • the interface 320 of Application 1 includes a "minimize” button.
  • the user can click the "minimize” button on the interface 320 to switch the application 1 to run in the background.
  • the computer 100 stops displaying the interface 320 of the application 1 in response to receiving the user's operation of clicking the “minimize” button on the interface 320 .
  • the methods that change the life cycle of an Activity include onCreate (create), onStart (start), onResume (resume), onPause (pause), onStop (stop), onDestroy (destroy), etc.; for ease of understanding, this application regards an Activity that has executed one of these methods as having entered the state of that method.
  • the application is switched from being displayed in the foreground to running in the background, and the host (Windows) notifies the Android system that the application is switched to running in the background.
  • the life cycle of the Activity corresponding to the application enters onPause.
  • the Activity pauses to generate display data.
  • the activity corresponding to application 1 suspends generating the display data of application 1
  • the composite rendering component suspends sending the interface of application 1 to local window 1.
  • the screen recording unit 1 of the application 1 on the host computer suspends capturing the color buffer (color buffer) corresponding to the display 1 from the composite rendering component, that is, suspends the screen recording of the application 1.
  • the host notifies the Android system that the application switches to the foreground display.
  • the life cycle of the Activity corresponding to the application exits onPause and enters onResume.
  • the Activity continues to generate display data. For example, when application 1 switches to the foreground display, the corresponding Activity of application 1 continues to generate the display data of application 1, and the composite rendering component continues to send the interface of application 1 to local window 1. Local window 1 continues to display the interface of application 1.
  • the screen recording unit 1 of the application 1 on the host computer continues to capture the color buffer corresponding to the display 1 from the composite rendering component, that is, continues to perform screen recording on the application 1.
  • application 1 is a video playback application.
  • App 1 switches to running in the background, pauses the video playback, and pauses the screen recording of the video playback interface. App 1 switches from running in the background to displaying in the foreground, continues to play the video, and continues to record the screen of the video playback interface.
  • before notifying the Android system that the application has switched to the background, the host judges whether the application is recording the screen; if the application is recording the screen, the host does not notify the Android system that the application enters the background (which can be understood as not notifying the Android system of the change in the application's foreground/background status), so that the application is no longer displayed on the Windows side but can still record the screen normally (a sketch of this decision is given after this list).
  • the composite rendering component of the host stops sending the interface of the application to the local window for display.
  • the composite rendering component stops sending the interface of application 1 to the local window 1 for display.
  • the application switches from running in the background to displaying in the foreground, and the composite rendering component of the host continues to send the interface of the application to the local window for display.
  • the composite rendering component continues to send the interface of application 1 to local window 1 for display.
  • the screen recording unit on the host can obtain the interface of the application from the composite rendering component, without affecting the screen recording of the application.
  • application 1 is a video playback application.
  • Application 1 switches to running in the background, stops displaying the video playback interface, and the video continues to play in the background, without affecting the screen recording of the video playback interface.
  • a native window is created on the host for each Android application; the native windows correspond one-to-one to the Displays in the Android system (a sketch of this multi-window management is given after this list).
  • the display data and compositing instructions corresponding to the Display in the Android system are transmitted to the host for rendering and compositing, and then sent to the corresponding native window for display.
  • when recording an application window image, the corresponding display identifier is obtained according to the window identifier of the application window, and the interface of the Android application is obtained from the Android system according to the display identifier, that is, the image of the application window is obtained.
  • the image recording channel of each application window is independent, realizing independent image recording for each application window.
  • FIG. 11 uses the electronic device running three Android applications as an example for illustration. It can be understood that the multi-application screen recording method provided in the embodiment of the present application is also applicable to the situation of more than three Android applications.
  • multiple Android applications are run on the electronic device, including application 1, application 2 and application 3, for example.
  • when each Android application starts, an audio track instance corresponding to the audio stream of the Android application is created and registered with the audio composition (AudioFlinger) unit.
  • an audio track instance corresponds to an audio identifier.
  • the data of each audio track instance is transferred to AudioFlinger through the corresponding buffer queue.
  • the buffer queue is a first-in first-out (first input first output, FIFO) buffer queue.
  • AudioFlinger synthesizes the audio track instance data into an audio stream, and transmits the audio stream of each Android application to the audio mixer (audio mixer) unit for mixing, and the mixed audio is transmitted to audio hardware (such as a speaker) for playback. It can be understood that what the electronic device plays is the mixed audio of application 1, application 2 and application 3.
  • the electronic device receives the user's operation of starting audio recording, and generates a screen recording unit corresponding to the application window; and, the audio track instance corresponding to each Android application is registered to the corresponding audio listening (audio listener) unit.
  • the user starts the audio recording of application 1 to generate screen recording unit 1; the user starts the audio recording of application 2 to generate screen recording unit 2; the user starts the audio recording of application 3 to generate screen recording unit 3.
  • Each audio monitoring unit obtains the audio track instance data from the corresponding buffer queue, and transmits it to the audio buffer of the host through the audio channel.
  • each screen recording unit obtains the corresponding audio identifier according to the window identifier; the audio capture module in the screen recording unit obtains the audio data of the corresponding Android application from the audio buffer according to the audio identifier, and encodes and encapsulates the audio data to generate an audio file corresponding to the Android application window (a sketch of this per-window audio path is given after this list).
  • the function of the audio capture (audio capture) module can be realized by the audio generation (audio producer) module in Figure 7.
  • screen recording unit 1 obtains the audio data of window 1 (application 1) from the audio buffer to generate audio file 1;
  • screen recording unit 2 obtains the audio data of window 2 (application 2) from the audio buffer to generate audio file 2;
  • the screen recording unit 3 acquires the audio data of the window 3 (application 3) from the audio buffer, and generates an audio file 3.
  • the application switches from displaying in the foreground to running in the background. In one implementation manner, the application switches from displaying in the foreground to running in the background, and the host notifies the Android system that the application switches to running in the background.
  • the app pauses generating track instance data.
  • the audio listening unit pauses sending the app's track instance data to the audio buffer.
  • the screen recording unit of the application on the host suspends obtaining the audio data of the application from the audio buffer, that is, suspends audio recording of the application.
  • the application switches from running in the background to the foreground display, and the host notifies the Android system that the application switches to the foreground display.
  • the app continues to generate track instance data.
  • the audio listening unit continues to send the app's track instance data to the audio buffer.
  • the screen recording unit of the application on the host continues to acquire the audio data of the application from the audio buffer, that is, continues to record the audio of the application.
  • a corresponding audio listening unit is created for the audio recording of each Android application window, and the audio listening unit grabs the corresponding audio data from the audio track instance of the Android application, realizing directed capture and recording of the Android application's audio stream.
  • each audio listening unit is independent, and each audio data recording channel is independent, realizing independent audio recording for each application window.
  • the user initiates video recording of the application window.
  • the electronic device uses the method shown in Figure 9 to obtain the display data of the Android application, uses the method shown in Figure 11 to obtain the audio data of the Android application, and mixes the display data and audio data to generate a video file of the Android application, so that video recording is performed independently for each application window (a sketch of this muxing step is given after this list).
  • the electronic device mixes images and audio of different Android applications. For example, the electronic device performs image recording on application 1 and audio recording on application 2, and mixes the window image of application 1 with the audio of application 2 to generate a video file. In this way, the user can combine the images and audio of different Android applications into a new video, and can also add soundtracks to Android application window images, improving user enjoyment (a sketch of this mixed-source case is given after this list).
  • the electronic device mixes the image of the Android application with the audio input to the device.
  • application 1 is a video-type Android application
  • the electronic device acquires the window image of application 1; the electronic device also receives voice input from the microphone; the image in the window of application 1 is mixed with the voice input from the microphone to generate a video file.
  • the user can add soundtracks and dubbing to the images of the Android application, improving user enjoyment.
  • the foregoing embodiments of the present application are described by taking an Android emulator running on an electronic device as an example.
  • the Android system and the non-Android system can run on different electronic devices.
  • the above-mentioned functions of the Android system are implemented on the first electronic device, and the above-mentioned host functions are implemented on the second electronic device.
  • the Android system and the non-Android system run on different electronic devices.
  • the specific implementation manners of the above functional modules can refer to the corresponding descriptions in the above embodiments, and will not be repeated here.
  • the above-mentioned electronic device includes corresponding hardware structures and/or software modules for performing each function.
  • the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software, in combination with the example units and algorithm steps described in the embodiments disclosed herein. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be regarded as exceeding the scope of the embodiments of the present application.
  • the embodiments of the present application may divide the above-mentioned electronic device into functional modules according to the above-mentioned method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 12 shows a schematic diagram of a possible structure of the electronic device involved in the above embodiments.
  • the electronic device 1100 includes: a processing unit 1101 , a storage unit 1102 , a display unit 1103 and an audio unit 1104 .
  • the processing unit 1101 is configured to control and manage the actions of the electronic device 1100 .
  • it can be used to acquire the display data of the Android application, and can also be used to acquire the audio data of the Android application, and/or other processing steps in the embodiment of the present application.
  • the storage unit 1102 is used for storing program codes and data of the electronic device 1100 . For example, it can be used to save image files, audio files, video files, etc.
  • the display unit 1103 is used for displaying the interface of the electronic device 1100 .
  • it can be used to display the UI of Android applications, etc.
  • the audio unit 1104 is used for the electronic device 1100 to receive audio input or play audio.
  • the unit modules in the electronic device 1100 include but are not limited to the processing unit 1101 , the storage unit 1102 , the display unit 1103 and the audio unit 1104 .
  • the electronic device 1100 may further include a power supply unit and the like. The power supply unit is used to supply power to the electronic device 1100 .
  • the processing unit 1101 may be a processor or a controller, such as a central processing unit (central processing unit, CPU), a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application-specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic devices, transistor logic devices, hardware components, or any combination thereof.
  • the storage unit 1102 may be a memory.
  • the display unit 1103 may be a display screen.
  • the audio unit 1104 may be a microphone, a speaker, and the like.
  • the processing unit 1101 is a processor (processor 110 as shown in FIG. 5)
  • the storage unit 1102 can be a memory (internal memory 121 as shown in FIG. 5)
  • the display unit 1103 is a display screen (the display screen 150 shown in FIG. 5; the display screen 150 may be a touch screen, and the touch screen may integrate a display panel and a touch panel)
  • the audio unit 1104 is an audio module (such as the audio module 170 shown in FIG. 5 ).
  • the electronic device 1100 provided in the embodiment of the present application may be the electronic device 100 shown in FIG. 5 .
  • the above-mentioned processor, memory, display screen, speaker, microphone, etc. may be connected together, for example, through a bus.
  • the embodiment of the present application also provides a computer-readable storage medium, where computer program code is stored, and when the processor executes the computer program code, the electronic device executes the method in the foregoing embodiments.
  • the embodiment of the present application also provides a computer program product, which causes the computer to execute the method in the foregoing embodiments when the computer program product is run on the computer.
  • the electronic device 1100, the computer-readable storage medium and the computer program product provided in the embodiments of the present application are all used to execute the corresponding methods provided above; therefore, for the beneficial effects that they can achieve, reference may be made to the beneficial effects of the corresponding methods above, which will not be repeated here.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include various media capable of storing program code, such as USB flash drives, removable hard disks, ROMs, magnetic disks, or optical disks.
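
A minimal Java sketch of the host-side multi-window management described in the list above: one native window is created per Android Display, keyed by its Display ID, and each composited interface is presented to the matching native window. All names here (MultiWindowManager, NativeWindowFactory, NativeWindow) are illustrative assumptions, not APIs from the patent, the Android framework, or Windows.

```java
import java.nio.ByteBuffer;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class MultiWindowManager {
    interface NativeWindow {
        void show(ByteBuffer composedInterface);   // present one composited interface
        void destroy();
    }
    interface NativeWindowFactory {
        NativeWindow create(int displayId, String title, int width, int height);
    }

    private final NativeWindowFactory factory;
    private final Map<Integer, NativeWindow> byDisplayId = new ConcurrentHashMap<>();

    public MultiWindowManager(NativeWindowFactory factory) {
        this.factory = factory;
    }

    /** Called when the Android side reports a new Display plus its window info. */
    public void onDisplayCreated(int displayId, String title, int width, int height) {
        byDisplayId.computeIfAbsent(displayId, id -> factory.create(id, title, width, height));
    }

    /** Called by the composite rendering component after it has rendered one interface. */
    public void present(int displayId, ByteBuffer composedInterface) {
        NativeWindow window = byDisplayId.get(displayId);
        if (window != null) {
            window.show(composedInterface);
        }
    }

    /** Called when the corresponding Android application window is closed. */
    public void onDisplayDestroyed(int displayId) {
        NativeWindow window = byDisplayId.remove(displayId);
        if (window != null) {
            window.destroy();
        }
    }
}
```

Keeping this one-to-one Display-to-window mapping on the host is what lets a single application be addressed by its Display ID when its window is recorded.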
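
A minimal Java sketch of the per-window image recording path described in the list above: the screen recorder unit resolves the window ID to a Display ID, grabs frames for that display from the composite rendering component's color buffer, and hands them to the recorder encoder. CompositeRenderer, VideoEncoder and the mapping table are hypothetical placeholders.

```java
import java.nio.ByteBuffer;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ScreenRecorderUnit {
    /** One-to-one mapping from Android window ID to the Display ID it is presented on. */
    static final Map<Integer, Integer> WINDOW_TO_DISPLAY = new ConcurrentHashMap<>();

    interface CompositeRenderer {                      // host-side composite rendering component
        ByteBuffer grabColorBuffer(int displayId);     // latest composited frame for one display
    }
    interface VideoEncoder {                           // feeds the recorder encoder module
        void encodeFrame(ByteBuffer rgbaFrame, long presentationTimeUs);
        void finish(String outputPath);
    }

    private final int displayId;
    private final CompositeRenderer renderer;
    private final VideoEncoder encoder;
    private volatile boolean paused;

    public ScreenRecorderUnit(int windowId, CompositeRenderer renderer, VideoEncoder encoder) {
        this.displayId = WINDOW_TO_DISPLAY.get(windowId);   // resolve window ID -> Display ID
        this.renderer = renderer;
        this.encoder = encoder;
    }

    /** Called on every compose tick; captures only this window's display. */
    public void onFrameTick(long presentationTimeUs) {
        if (paused) {
            return;                                    // e.g. the Activity entered onPause
        }
        ByteBuffer frame = renderer.grabColorBuffer(displayId);
        if (frame != null) {
            encoder.encodeFrame(frame, presentationTimeUs);
        }
    }

    public void pause()  { paused = true;  }           // application switched to the background
    public void resume() { paused = false; }           // application back in the foreground
    public void stop(String outputPath) { encoder.finish(outputPath); }
}
```

Because the capture is keyed by Display ID rather than by the focused window, whether the window has focus has no effect on what is recorded.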
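
A minimal Java sketch of the per-window audio path described in the list above: an audio listener copies one application's audio track data, before system-wide mixing, into a host-side audio buffer keyed by audio ID, and the audio capture module of the matching screen recording unit drains only that ID. All types are illustrative assumptions.

```java
import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class PerWindowAudioRecording {
    /** Host-side audio buffer, keyed by audio ID (one audio track instance per application). */
    static final Map<Integer, Queue<byte[]>> AUDIO_BUFFER = new ConcurrentHashMap<>();
    /** One-to-one mapping from window ID to the audio ID of that application's audio track. */
    static final Map<Integer, Integer> WINDOW_TO_AUDIO = new ConcurrentHashMap<>();

    /** Guest-side listener: copies one application's track data out of its FIFO before mixing. */
    static class AudioListener {
        private final int audioId;

        AudioListener(int audioId) {
            this.audioId = audioId;
        }

        /** Called for each PCM chunk the application writes to its audio track instance. */
        void onTrackData(byte[] pcmChunk) {
            AUDIO_BUFFER.computeIfAbsent(audioId, id -> new ConcurrentLinkedQueue<>())
                        .add(pcmChunk.clone());        // pre-mix data: other apps are excluded
        }
    }

    /** Host-side capture: a screen recording unit drains only its own audio ID. */
    static class AudioCapture {
        private final int audioId;

        AudioCapture(int windowId) {
            this.audioId = WINDOW_TO_AUDIO.get(windowId);   // resolve window ID -> audio ID
        }

        byte[] poll() {
            Queue<byte[]> queue = AUDIO_BUFFER.get(audioId);
            return queue == null ? null : queue.poll();
        }
    }
}
```

Because the data is taken per audio track instance rather than from the mixed output, each window's recording excludes the audio of the other applications.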
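
A minimal Java sketch of the recorder encoder step described in the list above, which encodes captured frames and audio chunks and muxes them into one file per window. The Muxer and codec interfaces stand in for whatever encoding library a host implementation would actually use; they are assumptions, not a specific API.

```java
import java.nio.ByteBuffer;

public class RecorderEncoder {
    interface Muxer {                                   // container writer (e.g. an MP4 muxer)
        void writeVideoSample(ByteBuffer encodedFrame, long presentationTimeUs);
        void writeAudioSample(ByteBuffer encodedAudio, long presentationTimeUs);
        void finish(String outputPath);
    }
    interface VideoCodec { ByteBuffer encode(ByteBuffer rgbaFrame); }
    interface AudioCodec { ByteBuffer encode(byte[] pcmChunk); }

    private final Muxer muxer;
    private final VideoCodec videoCodec;
    private final AudioCodec audioCodec;

    public RecorderEncoder(Muxer muxer, VideoCodec videoCodec, AudioCodec audioCodec) {
        this.muxer = muxer;
        this.videoCodec = videoCodec;
        this.audioCodec = audioCodec;
    }

    /** Image-only recording uses just this path; audio-only recording uses just the other. */
    public void onFrame(ByteBuffer rgbaFrame, long presentationTimeUs) {
        muxer.writeVideoSample(videoCodec.encode(rgbaFrame), presentationTimeUs);
    }

    public void onAudio(byte[] pcmChunk, long presentationTimeUs) {
        muxer.writeAudioSample(audioCodec.encode(pcmChunk), presentationTimeUs);
    }

    public void stop(String outputPath) {
        muxer.finish(outputPath);
    }
}
```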
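
A minimal Java sketch of the mixed-source case described in the list above, in which the image stream of one application is combined with the audio stream of another application or with microphone input. The frame source, audio source and file sink are abstract so that either audio origin can be plugged in; all interfaces are illustrative assumptions.

```java
import java.nio.ByteBuffer;

public class MixedSourceRecording {
    interface FrameSource { ByteBuffer nextFrame(); }   // e.g. application 1's display capture
    interface AudioSource { byte[] nextChunk(); }       // e.g. application 2's track, or the mic
    interface VideoFileSink {                           // encoder + muxer behind one interface
        void addFrame(ByteBuffer frame, long presentationTimeUs);
        void addAudio(byte[] pcm, long presentationTimeUs);
        void close(String outputPath);
    }

    /** Pulls from independently chosen image and audio sources and muxes them into one file. */
    public static void record(FrameSource video, AudioSource audio,
                              VideoFileSink sink, int frameCount, String outputPath) {
        long presentationTimeUs = 0;
        for (int i = 0; i < frameCount; i++) {
            ByteBuffer frame = video.nextFrame();
            if (frame != null) {
                sink.addFrame(frame, presentationTimeUs);
            }
            byte[] pcm = audio.nextChunk();
            if (pcm != null) {
                sink.addAudio(pcm, presentationTimeUs);
            }
            presentationTimeUs += 33_333;               // ~30 fps; a real host would use clocks
        }
        sink.close(outputPath);
    }
}
```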
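
A minimal Java sketch of the host-side foreground/background policy described in the list above: when the user minimizes a window that is being recorded, the host skips the background notification, so the Activity keeps producing display and audio data and recording continues; otherwise the notification is forwarded and recording pauses along with onPause. GuestNotifier and RecordingRegistry are hypothetical interfaces.

```java
public class WindowLifecycleBridge {
    interface GuestNotifier {                            // channel from the host to the Android system
        void notifyMovedToBackground(int windowId);      // the Activity will enter onPause
        void notifyMovedToForeground(int windowId);      // the Activity will enter onResume
    }
    interface RecordingRegistry {
        boolean isRecording(int windowId);               // is any screen recording unit active?
    }

    private final GuestNotifier notifier;
    private final RecordingRegistry recordings;

    public WindowLifecycleBridge(GuestNotifier notifier, RecordingRegistry recordings) {
        this.notifier = notifier;
        this.recordings = recordings;
    }

    /** Called when the user minimizes the application's native window on the Windows side. */
    public void onNativeWindowMinimized(int windowId) {
        if (recordings.isRecording(windowId)) {
            // Skip the notification: the Activity stays resumed and keeps producing
            // display and audio data, so recording continues while nothing is shown.
            return;
        }
        notifier.notifyMovedToBackground(windowId);      // any later recording pauses with onPause
    }

    /** Called when the user restores the native window. */
    public void onNativeWindowRestored(int windowId) {
        notifier.notifyMovedToForeground(windowId);
    }
}
```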

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

本申请实施例公开了一种多应用录屏方法及装置,涉及终端领域。电子设备接收用户在第一应用窗口内的第一操作;响应于第一操作,对第一应用窗口内容录屏;接收用户在第二应用窗口内的第二操作;响应于第二操作,对第二应用窗口内容录屏;电子设备对第二应用窗口内容录屏第一时长后,停止对第一应用窗口内容录屏。其中,第一应用和第二应用运行在安卓***内,第一应用窗口和第二应用窗口显示在非安卓***。电子设备可以对非安卓***上的安卓应用分别录屏,满足用户同时录制多个安卓应用窗口的需求。

Description

一种多应用录屏方法及装置
本申请要求于2021年06月30日提交国家知识产权局、申请号为202110738194.5、申请名称为“一种多应用录屏方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端领域,尤其涉及一种多应用录屏方法及装置。
背景技术
随着智能手机、移动网络的普及,***行业不断发展壮大,***体验向着沉浸式体验发展。然而在手机上玩游戏会存在耗电快、易发烫、手指遮挡、操作不精确等问题,影响用户游戏体验,于是安卓模拟器应运而生。安卓模拟器能够在非安卓***(比如电脑操作***)上模拟安卓操作***,实现在电脑上安装、运行、卸载安卓应用,让用户在电脑上也能体验安卓游戏和其它安卓应用。
安卓模拟器可以实现键鼠映射、游戏辅助、截图、录屏等功能,录屏是其中不可或缺的功能之一。但是,目前安卓模拟器只支持单窗口录屏。用户在安卓模拟器中打开多个窗口后,不支持多个窗口同时录制;并且,在窗口切换时,录制内容也会随之切换到新的窗口;无法满足用户同时录制多个窗口的需求。
发明内容
本申请提供一种多应用录屏方法及装置,可以实现对多个安卓应用分别录屏。
第一方面,提供一种多应用录屏方法。该方法包括:接收用户在电子设备第一应用窗口内的第一操作;响应于所述第一操作,电子设备对第一应用窗口内容录屏;接收用户在电子设备第二应用窗口内的第二操作;响应于第二操作,电子设备对第二应用窗口内容录屏;电子设备对第二应用窗口内容录屏第一时长后,停止对第一应用窗口内容录屏。其中,第一应用和第二应用运行在安卓***内,第一应用窗口和第二应用窗口显示在非安卓***。
在该方法中,运行在安卓***内的安卓应用的窗口在非安卓***(比如,
Figure PCTCN2022098273-appb-000001
Windows、Linux等)显示。电子设备可以根据用户在应用窗口的操作,启动对安卓应用窗口内容录屏。电子设备根据安卓应用窗口的窗口标识获取该安卓应用在安卓***中对应的显示标识,根据显示标识抓取该安卓应用窗口的图像数据;根据安卓应用窗口的窗口标识获取该安卓应用在安卓***中对应的音频标识,根据音频标识抓取该安卓应用的音频数据。电子设备可以根据安卓应用的图像数据和音频数据生成录屏文件,完成录屏。由于每个安卓应用的录屏通道都是独立的,可以实现对多个安卓应用分别录屏。
根据第一方面,电子设备包括第一录屏文件和第二录屏文件,其中,第一录屏文件是对第一应用窗口内容录屏生成的,第二录屏文件是对第二应用窗口内容进行录屏生成的。
在该方法中,每个安卓应用的录屏通道都是独立的。在录屏时,每个安卓应用生成对应的录屏文件。对多个安卓应用同时录屏,相应的生成多个录屏文件。
根据第一方面,或者以上第一方面的任意一种实现方式,非安卓***包括Windows***,该方法还包括:安卓***创建第一应用窗口对应的第一显示,创建第二应用窗口对应的第二显示;Windows***根据第一显示的图像数据生成第一应用的界面;根据第一应用的界面生成第一录屏文件;Windows***还根据第二显示的图像数据生成第二应用的界面;根据第二应用的界面生成第二录屏文件。
在该方法中,安卓***为每个应用窗口创建对应的显示,每个应用生成的窗口数据关联到一个显示(用显示标识指示),录屏时根据显示标识获取该应用对应的显示的图像数据,即可实现对该应用的图像录制。
根据第一方面,或者以上第一方面的任意一种实现方式,Windows***根据第一显示的图像数据生成第一应用的界面包括:Windows***从安卓***接收第一显示的图像数据,和第一显示的图像数据的第一合成指令;根据第一合成指令获取指令格式与Windows***匹配的第一Windows合成指令,采用第一Windows合成指令对第一显示的图像数据进行合成和渲染,生成第一应用的界面。
根据第一方面,或者以上第一方面的任意一种实现方式,Windows***根据第二显示的图像数据生成第二应用的界面包括:Windows***从安卓***接收第二显示的图像数据,和第二显示的图像数据的第二合成指令;根据第二合成指令获取指令格式与Windows***匹配的第二Windows合成指令,采用第二Windows合成指令对第二显示的图像数据进行合成和渲染,生成第二应用的界面。
在该方法中,Windows***根据应用对应的显示的图像数据生成该应用的界面。
根据第一方面,或者以上第一方面的任意一种实现方式,该方法还包括:Windows***根据第一应用窗口标识获取第一显示标识,根据第一显示标识获取第一应用的界面;Windows***根据第二应用窗口标识获取第二显示标识,根据第二显示标识获取第二应用的界面。
根据第一方面,或者以上第一方面的任意一种实现方式,该方法还包括:Windows***创建第一显示对应的第一本地窗口,在第一本地窗口显示第一应用的界面;Windows***创建第二显示对应的第二本地窗口,在第二本地窗口显示第二应用的界面。即可以实现同时显示第一应用窗口和第二应用窗口。
根据第一方面,或者以上第一方面的任意一种实现方式,如果第一应用切换到后台运行,响应于第一应用切换到后台运行,Windows***停止根据第一显示标识获取第一应用的界面。即停止显示第一应用,且停止对第一应用进行图像录制。安卓***停止生成第一显示的图像数据,并停止向Windows***发送第一显示的图像数据。
根据第一方面,或者以上第一方面的任意一种实现方式,非安卓***包括Windows***,该方法还包括:安卓***创建第一应用窗口对应的第一音轨实例,创建第二应用窗口对应的第二音轨实例;Windows***根据第一应用窗口标识获取第一音频标识;第一音频标识用于指示第一音轨实例;Windows***根据第一音频标识获取第一音轨实例数据,根据第一音轨实例数据生成第一录屏文件;Windows***根据第二应用窗口标识获取第二音频标识;第二音频标识用于指示第二音频实例;Windows***根据第二音频标识获取第二音轨实例数据;根据第二音轨实例数据生成第二录屏文件。
在该方法中,安卓***为每个应用窗口生成对应的音轨实例(用音频标识指示),录屏时根据音频标识获取该应用对应的音轨实例数据,即可实现对该应用的音频录制。
根据第一方面,或者以上第一方面的任意一种实现方式,如果第一应用切换到后台 运行,响应于第一应用切换到后台运行,安卓***停止生成第一音轨实例数据;Windows***停止根据第一音频标识获取第一音轨实例数据,并停止根据第一音轨实例数据生成第一录屏文件。这样就可以实现,应用切换到后台运行时,停止音频录制。
根据第一方面,或者以上第一方面的任意一种实现方式,电子设备对第一应用窗口内容录屏包括:电子设备录制第一应用窗口图像;或者,电子设备录制第一应用的音频;或者,电子设备录制第一应用窗口图像和音频。
在该方法中,电子设备可以根据安卓应用窗口的窗口标识获取该安卓应用在安卓***中对应的显示标识,根据显示标识抓取该安卓应用窗口的图像数据,并根据图像数据生成图像文件,完成对安卓应用窗口显示内容的图像录制。电子设备可以根据安卓应用窗口的窗口标识获取该安卓应用在安卓***中对应的音频标识,根据音频标识抓取该安卓应用的音频数据,并根据音频数据生成音频文件,完成对安卓应用窗口的音频录制。电子设备可以根据安卓应用的图像数据和音频数据生成视频文件,完成对安卓应用窗口的视频录制。也就是说,电子设备可以仅录制安卓应用的图像,或仅录制安卓应用的音频,或录制安卓应用的图像和音频。
根据第一方面,或者以上第一方面的任意一种实现方式,电子设备对第二应用窗口内容录屏包括:电子设备录制第二应用窗口图像。
在该方法中,电子设备可以同时分别录制第一应用窗口图像和第二应用窗口图像;或者录制第一应用窗口音频时录制第二应用窗口图像(还可以进一步将第一应用窗口音频与第二应用窗口图像合并为新的视频文件);或者录制第一应用窗口视频时录制第二应用窗口图像。
在一种实现方式中,电子设备还可以接收麦克风输入,将录制的安卓应用窗口图像与麦克风的语音输入合并为视频文件;实现为安卓应用图像配乐、配音等,提高用户使用乐趣。
根据第一方面,或者以上第一方面的任意一种实现方式,电子设备对第二应用窗口内容录屏包括:电子设备录制第二应用的音频。
在该方法中,电子设备可以在录制第一应用窗口图像时录制第二应用的音频(还可以进一步将第一应用窗口图像与第二应用音频合并为新的视频文件);或者同时分别录制第一应用音频与第二应用音频;或者录制第一应用窗口视频时录制第二应用音频。
根据第一方面,或者以上第一方面的任意一种实现方式,电子设备对第二应用窗口内容录屏包括:电子设备录制第二应用窗口图像和音频。
在该方法中,电子设备可以在录制第一应用窗口图像时录制第二应用的视频;或者同时分别录制第一应用视频与第二应用视频;或者录制第一应用窗口音频时录制第二应用视频。
根据第一方面,或者以上第一方面的任意一种实现方式,该方法还包括:电子设备显示第一应用窗口,并显示第二应用窗口。
这样,电子设备同时显示第一应用窗口和第二应用窗口,实现显示多安卓应用窗口。
第二方面,提供一种电子设备。该电子设备包括:处理器;存储器;显示屏;以及计算机程序,其中所述计算机程序存储在所述存储器上,当所述计算机程序被所述处理器执行时,使得所述电子设备执行以下步骤:接收用户在电子设备第一应用窗口内的第一操作;响应于所述第一操作,电子设备对第一应用窗口内容录屏;接收用户在电子设备第二应用窗口内的第二操作;响应于第二操作,电子设备对第二应用窗口内容录屏; 电子设备对第二应用窗口内容录屏第一时长后,停止对第一应用窗口内容录屏。其中,第一应用和第二应用运行在安卓***内,第一应用窗口和第二应用窗口显示在非安卓***。
根据第二方面,电子设备包括第一录屏文件和第二录屏文件,其中,第一录屏文件是对第一应用窗口内容录屏生成的,第二录屏文件是对第二应用窗口内容进行录屏生成的。
根据第二方面,或者以上第二方面的任意一种实现方式,电子设备对第一应用窗口内容录屏包括:电子设备录制第一应用窗口图像;或者,电子设备录制第一应用的音频;或者,电子设备录制第一应用窗口图像和音频。
根据第二方面,或者以上第二方面的任意一种实现方式,电子设备对第二应用窗口内容录屏包括:电子设备录制第二应用窗口图像。
根据第二方面,或者以上第二方面的任意一种实现方式,电子设备对第二应用窗口内容录屏包括:电子设备录制第二应用的音频。
根据第二方面,或者以上第二方面的任意一种实现方式,电子设备对第二应用窗口内容录屏包括:电子设备录制第二应用窗口图像和音频。
根据第二方面,或者以上第二方面的任意一种实现方式,当所述计算机程序被所述处理器执行时,还使得所述电子设备执行以下步骤:电子设备显示第一应用窗口,并显示第二应用窗口。
第二方面以及第二方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第三方面,提供一种计算机可读存储介质。计算机可读存储介质存储有计算机程序(也可称为指令或代码),当该计算机程序被电子设备执行时,使得电子设备执行第一方面或第一方面中任意一种实施方式的方法。
第三方面以及第三方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第四方面,提供一种计算机程序产品。当计算机程序产品在电子设备上运行时,使得电子设备执行第一方面或第一方面中任意一种实施方式的方法。
第四方面以及第四方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
第五方面,提供一种芯片***。芯片***包括处理器和接口电路。接口电路用于执行收发功能,并将指令发送给处理器。当所述指令被处理器执行时,使得处理器执行第一方面或第一方面中任意一种实施方式的方法。
第五方面以及第五方面中任意一种实现方式所对应的技术效果,可参见上述第一方面及第一方面中任意一种实现方式所对应的技术效果,此处不再赘述。
附图说明
图1A为一种安卓模拟器运行场景实例示意图;
图1B为一种安卓模拟器运行场景实例示意图;
图2为一种电子设备上运行安卓模拟器的架构示意图;
图3为一种安卓模拟器中录屏方法场景实例示意图;
图4A为本申请提供的一种多应用录屏方法场景实例示意图;
图4B为本申请提供的一种多应用录屏方法场景实例示意图;
图4C为本申请提供的一种多应用录屏方法场景实例示意图;
图5为本申请提供的一种多应用录屏方法所适用的电子设备的硬件结构示意图;
图6为本申请提供的一种多应用录屏方法场景实例示意图;
图7为本申请提供的一种多应用录屏方法示意图;
图8为本申请提供的一种多应用录屏方法示意图;
图9为本申请提供的一种多应用录屏方法示意图;
图10为本申请提供的一种多应用录屏方法场景实例示意图;
图11为本申请提供的一种多应用录屏方法示意图;
图12为本申请提供的一种电子设备的结构组成示意图。
具体实施方式
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个或两个以上(包含两个)。术语“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。术语“连接”包括直接连接和间接连接,除非另外说明。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
安卓模拟器能够在非安卓***上模拟安卓操作***。用户可以在安卓操作***中安装、运行、卸载安卓应用。示例性的,电脑100搭载非安卓操作***;比如,Windows,Linux等。电脑100的操作***上安装安卓模拟器。用户可以在电脑100上运行安卓模拟器,实现在电脑100上安装、运行、卸载安卓应用。
在一种示例中,如图1A所示,电脑100的桌面包括“模拟器”图标101。用户可以使用电脑100的输入设备(比如鼠标)点击“模拟器”图标101运行安卓模拟器。响应于用户对“模拟器”图标101的点击操作,电脑100显示安卓模拟器主界面102。安卓 模拟器主界面102包括“应用1”图标,“应用2”图标和“应用3”图标;用户可以点击“应用1”图标打开安卓应用1,点击“应用2”图标打开安卓应用2,点击“应用3”图标打开安卓应用3。
在另一种示例中,如图1B所示,电脑100的桌面包括“应用1”图标103,“应用2”图标104和“应用3”图标105;用户可以点击“应用1”图标103打开安卓应用1,点击“应用2”图标104打开安卓应用2,点击“应用3”图标105打开安卓应用3。
示例性的,图2示出一种电子设备上运行安卓模拟器的架构示意图。如图2所示,电子设备100的示例性实施例包括但不限于搭载
Figure PCTCN2022098273-appb-000002
Windows、Linux或者其它操作***的便携设备(比如笔记本电脑),固定设备(比如PC),或服务器等。图2以电子设备100搭载Windows***为例。该Windows***10支持安装、运行多个App。其中,模拟器11是模拟安卓操作***的App,支持安装、运行安卓应用(比如视频,音乐,智能家居App,***等);实现在电子设备100上运行安卓应用。比如,如图2所示,安卓***上安装应用1、应用2、应用3等安卓应用。用户可以打开应用1、应用2或应用3,使得应用1、应用2或应用3在安卓***内运行。应用1、应用2或应用3运行时分别生成应用1窗口、应用2窗口或应用3窗口。Windows***10管理应用1窗口、应用2窗口和应用3窗口。在一些实施例中,Windows***10接收用户在应用1窗口、应用2窗口或应用3窗口内用户界面(user interface,UI)上的操作,生成用户操作对应的指令;并将指令发送给安卓***内运行的对应安卓应用进行处理。
在一种示例中,用户可以对安卓应用进行录屏。录屏,即将安卓应用的窗口内容(图像、音频等)录制为录屏文件(比如,图像文件、音频文件、视频文件等)。
在一种示例中,图3示出了一种安卓模拟器中录屏方法场景实例。如图3的(a)所示,安卓模拟器主界面210包括“应用1”图标211,“应用2”图标212和“应用3”图标213。用户可以使用电脑100的输入设备点击“应用1”图标211打开应用1;点击“应用2”图标212打开应用2;点击“应用3”图标213打开应用3。安卓模拟器主界面210还包括功能栏214。功能栏214包括“录屏”按钮,“全屏”按钮,“屏幕旋转”按钮,“定位”按钮等。“录屏”按钮用于启动录屏功能;“全屏”按钮用于将窗口在电脑100屏幕上全屏显示;“屏幕旋转”按钮用于旋转窗口的显示方向;“定位”按钮用于确定本设备的位置。示例性的,响应于用户对“录屏”按钮的点击操作,如图3的(b)所示,安卓模拟器主界面210上显示“视频录制”页面215。“视频录制”页面215包括输入框,用于输入录屏文件的存储地址。“视频录制”页面215还包括“开始”按钮。响应于用户对“开始”按钮的点击操作,电脑100开始进行视频录制。在一种实现方式中,电脑100录制的画面是安卓模拟器窗口内当前焦点窗口的显示内容。焦点窗口改变,录制内容也随之切换。示例性的,如图3的(c)所示,用户打开应用1,安卓模拟器窗口内显示应用1的界面220。录制内容为界面220显示内容。如图3的(d)所示,用户再打开应用2,安卓模拟器窗口内显示应用2的界面230。录制内容切换为界面230显示内容。无法满足用户同时录制安卓模拟器内多个窗口的需求。
本申请实施例提供一种多应用录屏方法,可以实现对多个安卓应用分别录屏。图4A示出了本申请实施例提供的一种安卓模拟器中录屏方法的场景实例。示例性的,如图4A的(a)所示,电脑100的桌面310包括“应用1”图标311,“应用2”图标312和“应用3”图标313。用户可以使用电脑100的输入设备点击“应用1”图标311打开应用1;点击“应用2”图标312打开应用2;点击“应用3”图标313打开应用3。比如,响应 于用户对“应用1”图标311的点击操作,如图4A的(b)所示,电脑100显示应用1的界面320。响应于用户对“应用2”图标312的点击操作,电脑100显示应用2的界面330。示例性的,如图4A的(b)所示,应用1的界面320与应用2的界面330承载的窗口均为Windows侧的窗口,窗口内显示的内容为模拟器内的安卓应用运行产生的内容。应用1的界面320包括“菜单”321,应用2的界面330包括“菜单”331。在一种示例中,用户启动对应用1的视频录制。如图4A的(c)所示,响应于用户对“菜单”321的点击操作,界面320上显示下拉菜单322。下拉菜单322包括“录屏”选项。用户可以使用电脑100的输入设备点击下拉菜单322中“录屏”选项,启动对应用1的视频录制。响应于用户对下拉菜单322中“录屏”选项的点击操作,电脑100开始对应用1的窗口内容进行录制。示例性的,如图4A的(d)所示,应用1的界面320上显示“视频录制”工具栏323;“视频录制”工具栏323包括录制时长、暂停继续按钮、结束按钮等,用户可以通过点击暂停继续按钮对应用1暂停录屏或继续录屏,可以通过点击结束按钮对应用1停止录屏等。
在应用1录屏过程中,用户还可以启动对应用2的视频录制。示例性的,响应于用户对“菜单”331的点击操作,如图4A的(e)所示,界面330上显示下拉菜单332。下拉菜单332包括“录屏”选项。用户可以使用电脑100的输入设备点击下拉菜单332中“录屏”选项,启动对应用2的视频录制。响应于用户对下拉菜单332中“录屏”选项的点击操作,电脑100开始对应用2的窗口内容进行录制。如图4A的(f)所示,应用2的界面330上显示“视频录制”工具栏333;“视频录制”工具栏323包括录制时长、暂停继续按钮、结束按钮等,用户可以通过点击暂停继续按钮对应用2暂停录屏或继续录屏,可以通过点击结束按钮停止对应用2录屏等。示例性的,电脑100对应用1和应用2分别进行录屏;一段时间后,用户点击应用1“视频录制”工具栏323中结束按钮,停止对应用1录屏。需要说明的是,图4A以电脑100显示“视频录制”工具栏323和“视频录制”工具栏333为例。在另一些示例中,电脑100对应用进行录屏时,可以不显示“视频录制”工具栏。
可选的,在一些示例中,用户还可以选择录屏文件的名称和存储路径。示例性的,如图4B所示,响应于用户对下拉菜单322中“录屏”选项的点击操作,电脑100显示视频录制页面324;用户可以在视频录制页面324中输入应用1录屏文件的存储名称和存储地址。视频录制页面324还包括“开始”按钮。响应于用户对视频录制页面324中“开始”按钮的点击操作,电脑100开始对应用1的窗口内容进行录制,应用1的界面320上显示“视频录制”工具栏。如图4C所示,响应于用户对下拉菜单332中“录屏”选项的点击操作,电脑100显示视频录制页面334。用户可以在视频录制页面334中输入应用2录制文件的存储名称和存储地址。视频录制页面334还包括“开始”按钮。响应于用户对视频录制页面334中“开始”按钮的点击操作,电脑100开始对应用2的窗口内容进行录制,应用2的界面330上显示“视频录制”工具栏。
如图4A-图4C所示,本申请实施例提供的多应用录屏方法,用户可以对多个安卓应用窗口分别录屏。应用窗口是焦点窗口或非焦点窗口都可以录制。
本申请实施例提供的多应用录屏方法可以应用于安装了安卓模拟器的电子设备。比如,电子设备可以包括个人电脑(personal computer,PC)、笔记本电脑、平板电脑、上网本、手持计算机、智能家居设备(比如,智能电视、智慧屏、大屏、智能音箱等)、车载电脑等,本申请实施例对此不做任何限制。
在一种示例中,上述电子设备可以包括如图5所示结构。
电子设备100包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,显示屏150,天线,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,输入装置180等。
可以理解的是,本申请实施例示意的结构并不构成对该电子设备的具体限定。在本申请另一些实施例中,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(Application processor,AP),图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了***的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器,充电器,闪光灯,摄像头等。例如:处理器110可以通过I2C接口耦合触摸传感器,使处理器与触摸传感器通过I2C总线接口通信,实现电子设备的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常 被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏150,键盘等***器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和显示屏150通过DSI接口通信,实现电子设备的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与显示屏150,无线通信模块160,音频模块170,输入装置180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备充电,也可以用于电子设备与***设备之间传输数据。比如,电子设备通过该接口连接***输入设备,例如键盘、鼠标等;也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备的结构限定。在本申请另一些实施例中,电子设备也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏150,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备通过GPU,显示屏150,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏150和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏150用于显示图像,视频等。显示屏150包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。本申请实施例中,显示屏150也被称为屏幕。
电子设备的无线通信功能可以通过天线,无线通信模块160等实现。
天线用于发射和接收电磁波信号。电子设备中的每个天线可用于覆盖单个或多个通 信频带。不同的天线还可以复用,以提高天线的利用率。
无线通信模块160可以提供应用在电子设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星***(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
在一些实施例中,电子设备的天线和无线通信模块160耦合,使得电子设备可以通过无线通信技术与网络以及其他设备通信。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备可以支持一种或多种视频编解码器。这样,电子设备可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作***,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备可以通过扬声器170A收听音乐等。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
输入装置180可以包括键盘、鼠标器等。键盘用于将英文字母、数字、标点符号等输入电子设备,从而向电子设备发出命令,输入数据等。鼠标器是电子设备显示***纵横坐标定位的指示器,用于向电子设备输入指令等。其中,输入装置180可以通过有线连接方式连接电子设备,比如,输入装置180通过GPIO接口、USB接口等连接电子设备。输入装置180还可以通过无线方式连接电子设备,比如,输入装置180通过蓝牙、红外等方式连接电子设备。
本申请实施例提供的多应用录屏方法,用户可以对电子设备100上运行的多个安卓应用分别录屏。下面结合附图对本申请实施例提供的多应用录屏方法进行详细介绍。
用户可以在安卓应用的用户界面(user interface,UI)上启动录屏。示例性的,如图6所示,响应于用户对应用1中“录屏”选项的点击操作,电子设备100显示视频录制页面601。用户可以在视频录制页面601中输入录屏文件的存储名称和存储地址。视频录制页面601还包括“开始”按钮。响应于用户对“开始”按钮的点击操作,电子设备100开始启动对该安卓应用录屏。在一种示例中,视频录制页面601包括“图像”选项602,“音频”选项603和“视频”选项604。“图像”选项602表示对安卓应用录屏时仅录制图像,“音频”选项603表示对安卓应用录屏时仅录制音频,“视频”选项604表示对安卓应用录屏时录制图像和音频。用户可以选择“图像”选项602,录制应用1的图像;选择“音频”选项603,录制应用1的音频;选择“视频”选项604录制应用1的视频(包括图像和音频)。在另一些示例中,如图4A所示,响应于用户对应用1中“录屏”选项的点击操作,电子设备100开始启动对该安卓应用录屏,录制应用1的视频(包括图像和音频)。
可以理解的,对不同应用进行录屏时,可以选择不同录制类型。比如,对应用1录屏时,选择“图像”选项602,录制应用1的图像;对应用2录屏时,选择“音频”选项603,录制应用2的音频。示例性的,电子设备录制应用1的图像时,录制应用2的音频。再比如,对应用1录屏时,选择“图像”选项602,录制应用1的图像;对应用2录屏时,选择“视频”选项604,录制应用2的图像和音频。电子设备录制应用1的图像时,录制应用2的图像和音频。
示例的,如图7所示,电子设备接收用户在安卓应用的用户界面上启动录屏的操作(比如,接收到用户对图6中“开始”按钮的点击操作或者接收到用户对图4A的(c)中“录屏”选项的点击操作),启动录屏,对该安卓应用的窗口内容(图像、音频或视频)进行录制。电子设备根据应用窗口的窗口标识(window identity,Window ID)生成安卓应用窗口对应的屏幕录制(screen recorder)单元。可选的,UI上包括“开始”、“暂停”、“继续”、“结束”等按钮,电子设备可以根据用户在UI上对“开始”、“暂停”、“继续”或“结束”按钮的点击操作,开始、暂停、继续或结束对该应用窗口的录屏。示例性的,如图4A的(d)所示,应用1的界面320内“视频录制”工具栏323包括录制时长、暂停继续按钮、结束按钮等,用户可以通过点击暂停继续按钮对应用1暂停录屏或继续录屏,可以通过点击结束按钮对应用1停止录屏等。
在一种示例中,电子设备根据用户操作启动对安卓应用的图像录制。屏幕录制(screen recorder)单元中视频生成(video producer)模块根据窗口标识获取对应的显示标识(display identity,Display ID),显示标识指示的显示与安卓应用窗口关联,一个安卓应用窗口呈现于一个显示上。电子设备根据显示标识从合成渲染组件中抓取该安卓应用窗口的图像数据,并将图像数据发送给记录编码(recorder encoder)模块进行视频编码和封装,生成图像文件,完成对安卓应用窗口显示内容的图像录制。
在一种示例中,电子设备根据用户操作启动对安卓应用的音频录制。屏幕录制(screen recorder)单元中音频生成(audio producer)模块根据窗口标识获取对应的音频标识,每一个音频标识分别与不同的一个安卓应用窗口关联。电子设备根据音频标识从音频缓冲区抓取该安卓应用窗口的音频数据,将音频数据发送给记录编码(recorder encoder)模块进行视频编码和封装,生成音频文件,完成对安卓应用窗口音频内容的音频录制。
在一种示例中,电子设备根据用户操作启动对安卓应用的视频录制,即录制安卓应用的图像和音频。屏幕录制(screen recorder)单元中视频生成(video producer)模块根据窗口标识获取对应的显示标识,电子设备根据显示标识从合成渲染组件中抓取该安卓应用窗口的图像数据;屏幕录制(screen recorder)单元中音频生成(audio producer)模块根据窗口标识获取对应的音频标识,电子设备根据音频标识从音频缓冲区抓取该安卓应用窗口的音频数据。记录编码(recorder encoder)模块对图像数据和音频数据进行视频编码和封装,生成视频文件,完成对安卓应用窗口内容的视频录制。
下面结合图8和图9对电子设备生成图像文件的过程进行详细介绍。
需要说明的是,图8和图9以电子设备运行两个安卓应用,显示两个安卓应用窗口为例进行说明。可以理解的,本申请实施例提供的多应用录屏方法同样适用于两个以上安卓应用窗口的情况。
在一种示例中,用户在电子设备上运行应用1和应用2。示例性的,如图4A的(a)所示,用户使用电脑100的输入设备点击“应用1”图标311打开应用1;点击“应用2”图标312打开应用2。
如图8所示,应用1和应用2启动运行,在安卓***内分别生成应用1和应用2对应的活动(Activity)。其中,一个安卓应用包括一个或多个界面,每个界面对应一个Activity;一个Activity包括一个或多个图层(layer)。每个图层包括一个或多个元素(控件)。
安卓***的框架层(framework)包括窗口管理服务(window manager service,WMS),表面合成(SurfaceFlinger)模块,活动管理服务(Activity manager service,AMS)等。
WMS用于窗口管理(比如,新增窗口、删除窗口、修改窗口等)。应用1启动,WMS创建应用1对应的窗口1;应用2启动,WMS创建应用2对应的窗口2。显示管理单元为每个窗口创建对应的显示(Display),并建立窗口与Display的一一对应关系,即建立窗口标识(window identity,Window ID)与显示标识(display identity,Display ID)的一一对应关系。比如,显示管理单元为窗口1创建显示1,为窗口2创建显示2;并建立显示1与应用1的对应关系,建立显示2与应用2的对应关系。
安卓***将每个显示的显示标识以及窗口信息(比如,窗口标识,图层个数,图层标识等)发送给主机(Host)。Host根据显示标识和窗口信息创建每个显示对应的本地窗口(native window)。示例性的,Host包括多窗口管理(multi window manager)模块,用于实现native window的创建、销毁以及窗口操作等业务逻辑。比如,多窗口管理模块 创建窗口1对应的本地窗口1,创建窗口2对应的本地窗口2。
表面合成(SurfaceFlinger)模块用于从WMS获取各个安卓应用界面的显示数据(比如,界面包括的图层个数,每个图层的显示元素)。SurfaceFlinger根据安卓应用界面的显示数据确定每个显示数据对应的合成指令(比如,OPENGL ES指令,HWComposer指令),并将显示数据对应的合成指令传输给Host。
Host的合成渲染组件(可以调用安卓***的图形渲染合成功能)将接收的合成指令转化为与Host操作***匹配的指令格式,采用合成指令对显示数据进行图形合成,生成安卓应用界面各个图层的图像;并对各个图层的图像进行合成和渲染,生成Activity对应的界面。合成渲染组件根据显示标识和窗口信息将生成的界面送到对应的native window进行显示。示例性的,SurfaceFlinger从WMS获取应用1的Activity的显示数据1,将显示数据1和对应的合成指令传输给Host。Host的合成渲染组件采用合成指令对显示数据1进行图形合成,生成各个图层的图像;并将各个图层图像进行合成渲染,生成应用1的界面。合成渲染组件根据应用1的显示标识和窗口信息,将应用1的界面送至对应的本地窗口1进行显示。SurfaceFlinger从WMS获取应用2的Activity的显示数据2,将显示数据2和对应的合成指令传输给Host。Host的合成渲染组件采用合成指令对显示数据2进行图形合成,生成各个图层的图像;并将各个图层图像进行合成渲染,生成应用2的界面。合成渲染组件根据应用2的显示标识和窗口信息,将应用2的界面送至对应的本地窗口2进行显示。比如,如图4A的(b)所示,电子设备100显示应用1的界面320和应用2的界面330。电子设备100同时显示应用1的窗口和应用2的窗口。
用户可以录制应用1的窗口图像,还可以录制应用2的窗口图像。在一种示例中,如图9所示,安卓应用启动后,安卓应用的每个界面在应用层对应产生一个Activity。WMS为每个安卓应用分配一个或多个表面(Surface),其中一个Activity对应一个Surface;WMS管理Surface的显示顺序、尺寸、位置等。安卓应用作为生产者将Activity的显示数据绘制到Surface,放入缓存队列等待消费者处理。SurfaceFlinger作为消费者从缓存队列获取Surface对应的图层,进行图像合成;还将显示数据对应的合成指令(OPENGL ES指令或HWComposer指令)发送给Host。
WMS为每个安卓应用分配一个窗口。显示管理单元从WMS获取该窗口对应的Display ID和窗口信息,从表面合成模块获取安卓应用界面各个图层的显示数据。显示管理单元将Display ID,窗口信息和界面显示数据发送给Host。Host根据Display ID和窗口信息创建每个显示(Display)对应的native window。比如,显示1对应本地窗口1,显示2对应本地窗口2。需要说明的是,图8和图9中显示管理单元的功能由安卓***实现;在另一些实施例中,显示管理单元的功能也可以由模拟器实现。
Host的合成渲染组件包括合成模块和渲染模块,合成模块采用合成指令对每个界面的显示数据进行图形合成,生成界面各个图层的图像;渲染模块对各个图层的图像进行渲染,生成界面。合成渲染组件根据Display ID将界面送到显示(Display)对应的本地窗口(native window)进行显示。比如,将显示1(应用1)对应的界面送至本地窗口1进行显示,将显示2(应用2)对应的界面送至本地窗口2进行显示。
用户启动应用1窗口的图像录制。电子设备生成应用1的屏幕录制单元1。屏幕录制单元1根据应用1的窗口标识(窗口1)获取对应的显示标识(显示1);在合成渲染组件中抓取显示1对应的颜色缓存(color buffer),获取显示1对应的界面,对显示1对应的界面进行视频编码和封装,生成图像文件1。
在图像录制过程中,通过应用窗口的窗口标识与显示标识之间一一对应关系,在安卓***中抓取应用窗口对应的界面(即应用窗口显示内容),实现定向录制应用窗口显示内容。该应用窗口是焦点窗口或非焦点窗口都不影响录制该应用窗口显示内容。比如,在对应用1的窗口录制过程中,用户打开应用2。应用2的窗口是焦点窗口,应用1的窗口由焦点窗口变为非焦点窗口。应用1窗口的图像录制不受影响。
在一种示例中,在应用1录制过程中,用户启动应用2的图像录制。电子设备生成应用2的屏幕录制单元2。屏幕录制单元2根据应用2的窗口标识(窗口2)获取对应的显示标识(显示2);在合成渲染组件中抓取显示2对应的颜色缓存(color buffer),获取显示2对应的界面,对显示2对应的界面进行视频编码和封装,生成图像文件2。
在一些实施例中,对应用进行图像录制过程中,应用从前台显示切换到后台运行。示例性的,如图10所示,电脑100显示应用1的界面320,和应用2的界面330。用户对应用1进行视频录制,应用1的界面320上显示“视频录制”工具栏323。用户还对应用2进行视频录制,应用2的界面330上显示“视频录制”工具栏333。应用1的界面320上包括“最小化”按钮。用户可以点击界面320上“最小化”按钮,使应用1切换到后台运行。示例性的,电脑100响应于接收到用户点击界面320上“最小化”按钮的操作,停止显示应用1的界面320。
改变Activity的生命周期的方法包括创建(onCreate)、开始(onStart)、继续(onResume)、暂停(onPause)、停止(onStop)、销毁(onDestroy)等,为了方便理解,本申请对执行了上述方法中的一个即为进入了该方法的状态。
在一种实现方式中,应用从前台显示切换到后台运行,主机(Windows)通知安卓***该应用切换到后台运行。应用对应的Activity的生命周期进入onPause。在onPause生命周期内,该Activity暂停生成显示数据。比如,应用1切换到后台运行,应用1对应的Activity暂停生成应用1的显示数据,合成渲染组件暂停向本地窗口1送应用1的界面。主机上应用1的屏幕录制单元1暂停从合成渲染组件中抓取显示1对应的颜色缓存(color buffer),即暂停对应用1的屏幕录制。如果应用从后台运行切换到前台显示,主机通知安卓***该应用切换到前台显示。应用对应的Activity的生命周期退出onPause,进入onResume。该Activity继续生成显示数据。比如,应用1切换到前台显示,应用1对应的Activity继续生成应用1的显示数据,合成渲染组件继续向本地窗口1送应用1的界面。本地窗口1继续显示应用1的界面。主机上应用1的屏幕录制单元1继续从合成渲染组件中抓取显示1对应的颜色缓存(color buffer),即继续对应用1进行屏幕录制。比如,应用1是视频播放应用。应用1切换到后台运行,暂停播放视频,并且暂停对视频播放界面录屏。应用1从后台运行切换为前台显示,继续播放视频,并继续对视频播放界面录屏。
在另一种可能的方式中,主机(Windows)通知安卓***该应用切换到后台运行前,判断该应用是否正在进行录屏,如果该应用正在进行录屏,不通知安卓***该应用进入后台(即可理解为不做通知该应用前后台状态变化的操作),从而实现该应用不在Windows侧显示,仍然能够正常录屏的效果。
在另一种实现方式中,应用从前台显示切换到后台运行,主机的合成渲染组件停止将该应用的界面送到本地窗口显示。比如,应用1切换到后台运行,合成渲染组件停止将应用1的界面送到本地窗口1显示。应用从后台运行切换到前台显示,主机的合成渲染组件继续将该应用的界面送到本地窗口显示。比如,应用1切换到前台显示,合成渲 染组件继续将应用1的界面送到本地窗口1显示。当应用在后台运行时,主机上屏幕录制单元可以从合成渲染组件获取该应用的界面,不影响对该应用进行屏幕录制。比如,应用1是视频播放应用。应用1切换到后台运行,停止显示播放视频界面,视频继续在后台播放,不影响对该视频播放界面进行录屏。
在本申请实施例提供的多应用录屏方法中,在主机上为每个安卓应用创建一个native window;native window与安卓***中Display一一对应。将安卓***中Display(显示)对应的显示数据与合成指令传输到主机进行渲染合成,并送到对应的native window进行显示。录制应用窗口图像时,根据应用窗口的窗口标识获取对应的显示标识,并根据显示标识在安卓***中获取安卓应用的界面,即获取应用窗口的图像。每个应用窗口的图像录制通道都是独立的,实现对每个应用窗口独立进行图像录制。
下面结合图11对电子设备生成音频文件的过程进行详细介绍。
需要说明的是,图11以电子设备运行三个安卓应用为例进行说明。可以理解的,本申请实施例提供的多应用录屏方法同样适用于三个以上安卓应用的情况。
在一种示例中,如图11所示,电子设备上运行多个安卓应用,比如包括应用1,应用2和应用3。每个安卓应用启动时,创建该安卓应用音频流对应的音轨(audio track)实例,并注册到音频合成(AudioFlinger)单元。示例性的,一个音轨实例对应一个音频标识。每个音轨实例的数据经过对应的缓存队列传输到AudioFlinger。示例性的,该缓存队列为先入先出(first input first output,FIFO)缓存队列。AudioFlinger将音轨实例数据合成音频流,并将各个安卓应用的音频流传输至音频混合(audio mixer)单元进行混合,混合后的音频传输至音频硬件(比如扬声器)进行播放。可以理解的,电子设备播放的是应用1,应用2和应用3的混合音频。
电子设备接收到用户启动音频录制的操作,生成应用窗口对应的屏幕录制单元;并且,每个安卓应用对应的音轨实例注册到对应的音频监听(audio listener)单元。比如,用户启动应用1的音频录制,生成屏幕录制单元1;用户启动应用2的音频录制,生成屏幕录制单元2;用户启动应用3的音频录制,生成屏幕录制单元3。每个音频监听单元从对应的缓存队列获取音轨实例数据,通过音频通道传输到主机的音频缓冲区。每个屏幕录制单元根据窗口标识获取对应的音频标识;屏幕录制单元中音频采集(audio capture)模块根据音频标识从音频缓冲区获取对应安卓应用的音频数据,对音频数据进行编码、封装等处理,生成安卓应用窗口对应的音频文件。示例性的,该音频采集(audio capture)模块的功能可以由图7中音频生成(audio producer)模块实现。比如,屏幕录制单元1从音频缓冲区获取窗口1(应用1)的音频数据,生成音频文件1;屏幕录制单元2从音频缓冲区获取窗口2(应用2)的音频数据,生成音频文件2;屏幕录制单元3从音频缓冲区获取窗口3(应用3)的音频数据,生成音频文件3。
在一些实施例中,对应用进行音频录制过程中,应用从前台显示切换到后台运行。在一种实现方式中,应用从前台显示切换到后台运行,主机通知安卓***该应用切换到后台运行。该应用暂停生成音轨实例数据。音频监听单元暂停向音频缓冲区发送该应用的音轨实例数据。主机上该应用的屏幕录制单元暂停从音频缓冲区获取该应用的音频数据,即暂停对该应用进行音频录制。应用从后台运行切换到前台显示,主机通知安卓***该应用切换到前台显示。该应用继续生成音轨实例数据。音频监听单元继续向音频缓冲区发送该应用的音轨实例数据。主机上该应用的屏幕录制单元继续从音频缓冲区获取该应用的音频数据,即继续对该应用进行音频录制。
在本申请实施例提供的多应用录屏方法中,为每个安卓应用窗口的音频录制创建对应的音频监听单元,音频监听单元从安卓应用的音轨实例中抓取对应的音频数据,实现安卓应用音频流的定向抓取录制。各个音频监听单元是独立的,音频数据录制通道是独立的,实现对每个应用窗口独立进行音频录制。
在另一些实施例中,用户启动对应用窗口进行视频录制。电子设备采用图9所示方法获取安卓应用的显示数据,采用图11所示方法获取该安卓应用的音频数据,并将显示数据和音频数据进行混流,生成安卓应用的视频文件,实现对每个应用窗口独立进行视频录制。
在一些实施例中,电子设备将不同安卓应用的图像和音频进行混流。比如,电子设备对应用1进行图像录制,对应用2进行音频录制,将应用1窗口图像和应用2的音频混流,生成视频文件。这样,用户可以将不同安卓应用的图像和音频合成新的视频,还可以对安卓应用窗口图像进行配乐等,提高用户使用乐趣。
在一些实施例中,电子设备将安卓应用的图像与输入设备的音频混流。比如,应用1是视频类安卓应用,电子设备获取应用1窗口图像;电子设备还接收麦克风的语音输入;将应用1窗口图像与麦克风的语音输入混流,生成视频文件。这样,用户可以为安卓应用图像配乐、配音等,提高用户使用乐趣。
需要说明的是,本申请上述实施例以电子设备上运行安卓模拟器为例进行说明。在另一些实施例中,安卓***与非安卓***可以运行在不同的电子设备上。比如,本申请实施例提供的多应用录屏方法中,上述安卓***的功能在电子设备一上实现,上述主机功能在电子设备二上实现。安卓***与非安卓***运行在不同电子设备上,上述各个功能模块的具体实现方式可以参考上述实施例相应描述,此处不再赘述。
可以理解的是,上述电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
本申请实施例可以根据上述方法示例对上述电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在一种示例中,请参考图12,其示出了上述实施例中所涉及的电子设备的一种可能的结构示意图。该电子设备1100包括:处理单元1101,存储单元1102,显示单元1103和音频单元1104。
其中,处理单元1101,用于对电子设备1100的动作进行控制管理。例如,可以用于获取安卓应用的显示数据,还可以用于获取安卓应用的音频数据,和/或本申请实施例中其他处理步骤。
存储单元1102用于保存电子设备1100的程序代码和数据。例如,可以用于保存图像文件,音频文件,视频文件等。
显示单元1103用于显示电子设备1100的界面。例如,可以用于显示安卓应用的UI 等。
音频单元1104用于电子设备1100接收音频输入或播放音频。
当然,上述电子设备1100中的单元模块包括但不限于上述处理单元1101,存储单元1102,显示单元1103和音频单元1104。例如,电子设备1100中还可以包括电源单元等。电源单元用于对电子设备1100供电。
其中,处理单元1101可以是处理器或控制器,例如可以是中央处理器(central processing unit,CPU),数字信号处理器(digital signal processor,DSP),专用集成电路(application-specific integrated circuit,ASIC),现场可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。存储单元1102可以是存储器。显示单元1103可以是显示屏。音频单元1104可以是麦克风、扬声器等。
例如,处理单元1101为处理器(如图5所示的处理器110),存储单元1102可以为存储器(如图5所示的内部存储器121),显示单元1103为显示屏(如图5所示的显示屏150,该显示屏150可以为触摸屏,该触摸屏中可以集成显示面板和触控面板),音频单元1104为音频模块(如图5所示的音频模块170)。本申请实施例所提供的电子设备1100可以为图5所示的电子设备100。其中,上述处理器、存储器、显示屏、扬声器、麦克风等可以连接在一起,例如通过总线连接。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序代码,当处理器执行该计算机程序代码时,电子设备执行上述实施例中的方法。
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述实施例中的方法。
其中,本申请实施例提供的电子设备1100、计算机可读存储介质或者计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以使用硬件的形式实现,也可以使用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以 是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (18)

  1. 一种多应用录屏方法,其特征在于,包括:
    接收用户在电子设备第一应用窗口内的第一操作;
    响应于所述第一操作,电子设备对所述第一应用窗口内容录屏;
    接收用户在电子设备第二应用窗口内的第二操作;
    响应于所述第二操作,电子设备对所述第二应用窗口内容录屏;
    其中,所述第一应用和所述第二应用运行在安卓***内,所述第一应用窗口和所述第二应用窗口显示在非安卓***;
    电子设备对所述第二应用窗口内容录屏第一时长后,停止对所述第一应用窗口内容录屏。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    根据所述第一应用窗口内容生成第一录屏文件;
    根据所述第二应用窗口内容生成第二录屏文件。
  3. 根据权利要求2所述的方法,其特征在于,所述非安卓***包括Windows***,所述方法还包括:
    所述安卓***创建所述第一应用窗口对应的第一显示;
    所述安卓***创建所述第二应用窗口对应的第二显示;
    所述Windows***根据所述第一显示的图像数据生成所述第一应用的界面,根据所述第一应用的界面生成所述第一录屏文件;
    所述Windows***根据所述第二显示的图像数据生成所述第二应用的界面,根据所述第二应用的界面生成所述第二录屏文件。
  4. 根据权利要求3所述的方法,其特征在于,
    所述Windows***根据所述第一显示的图像数据生成所述第一应用的界面包括:
    所述Windows***从所述安卓***接收所述第一显示的图像数据,和所述第一显示的图像数据的第一合成指令;
    根据所述第一合成指令获取指令格式与所述Windows***匹配的第一Windows合成指令,采用所述第一Windows合成指令对所述第一显示的图像数据进行合成和渲染,生成所述第一应用的界面;
    所述Windows***根据所述第二显示的图像数据生成所述第二应用的界面包括:
    所述Windows***从所述安卓***接收所述第二显示的图像数据,和所述第二显示的图像数据的第二合成指令;
    根据所述第二合成指令获取指令格式与所述Windows***匹配的第二Windows合成指令,采用所述第二Windows合成指令对所述第二显示的图像数据进行合成和渲染,生成所述第二应用的界面。
  5. The method according to claim 3 or 4, wherein the method further comprises:
    obtaining, by the Windows system, a first display identifier based on an identifier of the first application window, and obtaining the interface of the first application based on the first display identifier; and
    obtaining, by the Windows system, a second display identifier based on an identifier of the second application window, and obtaining the interface of the second application based on the second display identifier.
  6. The method according to any one of claims 3 to 5, wherein the method further comprises:
    creating, by the Windows system, a first local window corresponding to the first display, and displaying the interface of the first application in the first local window; and
    creating, by the Windows system, a second local window corresponding to the second display, and displaying the interface of the second application in the second local window.
  7. The method according to any one of claims 3 to 6, wherein the method further comprises:
    in response to the first application being switched to run in the background, stopping, by the Windows system, obtaining the interface of the first application based on the first display identifier.
  8. The method according to claim 7, wherein the method further comprises:
    in response to the first application being switched to run in the background, stopping, by the Android system, generating the image data of the first display, and stopping sending the image data of the first display to the Windows system.
  9. The method according to any one of claims 2 to 8, wherein the non-Android system comprises a Windows system, and the method further comprises:
    creating, by the Android system, a first audio track instance corresponding to the first application window;
    creating, by the Android system, a second audio track instance corresponding to the second application window;
    obtaining, by the Windows system, a first audio identifier based on an identifier of the first application window, wherein the first audio identifier is used to indicate the first audio track instance;
    obtaining, by the Windows system, data of the first audio track instance based on the first audio identifier, and generating the first screen recording file based on the data of the first audio track instance;
    obtaining, by the Windows system, a second audio identifier based on an identifier of the second application window, wherein the second audio identifier is used to indicate the second audio track instance; and
    obtaining, by the Windows system, data of the second audio track instance based on the second audio identifier, and generating the second screen recording file based on the data of the second audio track instance.
  10. The method according to claim 9, wherein the method further comprises:
    in response to the first application being switched to run in the background, stopping, by the Windows system, obtaining the data of the first audio track instance based on the first audio identifier, and stopping generating the first screen recording file based on the data of the first audio track instance.
  11. The method according to claim 10, wherein the method further comprises:
    in response to the first application being switched to run in the background, stopping, by the Android system, generating the data of the first audio track instance.
  12. The method according to any one of claims 1 to 11, wherein the recording, by the electronic device, content of the first application window comprises:
    recording, by the electronic device, images of the first application window; or
    recording, by the electronic device, audio of the first application; or
    recording, by the electronic device, images of the first application window and audio of the first application.
  13. The method according to any one of claims 1 to 12, wherein the recording, by the electronic device, content of the second application window comprises:
    recording, by the electronic device, images of the second application window.
  14. The method according to any one of claims 1 to 13, wherein the recording, by the electronic device, content of the second application window comprises:
    recording, by the electronic device, audio of the second application.
  15. The method according to any one of claims 1 to 14, wherein the recording, by the electronic device, content of the second application window comprises:
    recording, by the electronic device, images of the second application window and audio of the second application.
  16. An electronic device, comprising a processor, a memory, and a display screen, wherein the memory stores one or more computer programs, the one or more computer programs comprise instructions, and when the instructions are executed by the processor, the electronic device is caused to perform the method according to any one of claims 1 to 15.
  17. A computer-readable storage medium storing instructions, wherein when the instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 15.
  18. A computer program product, wherein when the computer program product runs on a computer, the computer is caused to perform the method according to any one of claims 1 to 15.
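Claims 3 to 8 above recite one display per application window on the Android side, with the Windows side generating and recording each window's interface independently. The Java sketch below is only an illustrative approximation in stock Android terms, under assumed names (PerWindowCapture and FrameSink are hypothetical): it creates a dedicated virtual display per window so that each window's frames can feed its own recording pipeline. It does not show the claimed composition-instruction translation or the Windows-side compositor.

```java
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.Image;
import android.media.ImageReader;
import android.view.Surface;

/**
 * Illustrative sketch only: one off-screen display per application window,
 * so each window's frames can be captured and encoded independently.
 */
public final class PerWindowCapture {

    /** Hypothetical consumer that forwards frames to a per-window recorder. */
    public interface FrameSink {
        void onFrame(Image frame);
    }

    private final DisplayManager displayManager;

    public PerWindowCapture(DisplayManager displayManager) {
        this.displayManager = displayManager;
    }

    /** Creates a virtual display dedicated to one application window. */
    public VirtualDisplay createWindowDisplay(String windowId, int width, int height,
                                              int dpi, FrameSink sink) {
        // The ImageReader supplies the Surface that receives this window's rendered frames.
        ImageReader reader = ImageReader.newInstance(width, height,
                PixelFormat.RGBA_8888, /* maxImages = */ 2);
        reader.setOnImageAvailableListener(r -> {
            Image image = r.acquireLatestImage();
            if (image != null) {
                sink.onFrame(image);   // hand the frame to this window's recorder
                image.close();
            }
        }, /* handler = */ null);

        Surface surface = reader.getSurface();
        // A separate display per window keeps the two recordings independent of each other.
        return displayManager.createVirtualDisplay(
                "record-" + windowId, width, height, dpi, surface,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY);
    }
}
```

In the claimed method, stopping a window's capture when its application moves to the background (claims 7, 8, 10, and 11) would correspond to releasing the VirtualDisplay and closing the ImageReader created above for that window.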
PCT/CN2022/098273 2021-06-30 2022-06-10 Multi-application screen recording method and apparatus WO2023273845A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22831665.9A EP4344221A1 (en) 2021-06-30 2022-06-10 Multi-application screen recording method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110738194.5A CN115531889A (zh) 2021-06-30 2021-06-30 一种多应用录屏方法及装置
CN202110738194.5 2021-06-30

Publications (1)

Publication Number Publication Date
WO2023273845A1 true WO2023273845A1 (zh) 2023-01-05

Family

ID=84689993

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/098273 WO2023273845A1 (zh) 2021-06-30 2022-06-10 一种多应用录屏方法及装置

Country Status (3)

Country Link
EP (1) EP4344221A1 (zh)
CN (1) CN115531889A (zh)
WO (1) WO2023273845A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116737305A (zh) * 2023-08-14 2023-09-12 北京麟卓信息科技有限公司 一种基于图层动态合成的跨运行环境录屏优化方法及***

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010110765A1 (en) * 2009-03-23 2010-09-30 Thomson Licensing Method and apparatus for recording video sequence of screen displays
CN104142852A (zh) * 2014-08-04 2014-11-12 福州靠谱网络有限公司 在电脑上实现安卓模拟器图形加速方法
CN110417991A (zh) * 2019-06-18 2019-11-05 华为技术有限公司 一种录屏方法及电子设备
CN111768330A (zh) * 2019-03-30 2020-10-13 华为技术有限公司 图像处理方法及计算机***
CN111866423A (zh) * 2020-07-07 2020-10-30 广州三星通信技术研究有限公司 用于电子终端的录屏方法及相应设备
CN112114733A (zh) * 2020-09-23 2020-12-22 青岛海信移动通信技术股份有限公司 一种截屏、录屏方法、移动终端及计算机存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010110765A1 (en) * 2009-03-23 2010-09-30 Thomson Licensing Method and apparatus for recording video sequence of screen displays
CN104142852A (zh) * 2014-08-04 2014-11-12 福州靠谱网络有限公司 在电脑上实现安卓模拟器图形加速方法
CN111768330A (zh) * 2019-03-30 2020-10-13 华为技术有限公司 图像处理方法及计算机***
CN110417991A (zh) * 2019-06-18 2019-11-05 华为技术有限公司 一种录屏方法及电子设备
CN111866423A (zh) * 2020-07-07 2020-10-30 广州三星通信技术研究有限公司 用于电子终端的录屏方法及相应设备
CN112114733A (zh) * 2020-09-23 2020-12-22 青岛海信移动通信技术股份有限公司 一种截屏、录屏方法、移动终端及计算机存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Night god Android emulator records video", 3 February 2021 (2021-02-03), pages 1 - 3, XP093019004, Retrieved from the Internet <URL:https://jingyan.***.com/article/455a99508578c7e0662778bd.html> [retrieved on 20230130] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116737305A (zh) * 2023-08-14 2023-09-12 北京麟卓信息科技有限公司 一种基于图层动态合成的跨运行环境录屏优化方法及***
CN116737305B (zh) * 2023-08-14 2023-10-27 北京麟卓信息科技有限公司 一种基于图层动态合成的跨运行环境录屏优化方法及***

Also Published As

Publication number Publication date
EP4344221A1 (en) 2024-03-27
CN115531889A (zh) 2022-12-30

Similar Documents

Publication Publication Date Title
WO2020238774A1 (zh) 一种通知消息的预览方法及电子设备
WO2021057830A1 (zh) 一种信息处理方法及电子设备
CN112394895B (zh) 画面跨设备显示方法与装置、电子设备
WO2021115194A1 (zh) 一种应用图标的显示方法及电子设备
WO2021121052A1 (zh) 一种多屏协同方法、***及电子设备
WO2022100304A1 (zh) 应用内容跨设备流转方法与装置、电子设备
CN114040242B (zh) 投屏方法、电子设备和存储介质
WO2022143082A1 (zh) 一种图像处理方法和电子设备
WO2023030099A1 (zh) 跨设备交互的方法、装置、投屏***及终端
WO2022242393A1 (zh) 控制方法、装置、电子设备及可读存储介质
WO2024045801A1 (zh) 用于截屏的方法、电子设备、介质以及程序产品
CN114996168A (zh) 一种多设备协同测试方法、测试设备及可读存储介质
WO2023273845A1 (zh) 一种多应用录屏方法及装置
CN114035721B (zh) 触控屏显示方法、装置及存储介质
WO2021052488A1 (zh) 一种信息处理方法及电子设备
WO2021082911A1 (zh) 一种内容传输方法和终端设备
WO2023179123A1 (zh) 蓝牙音频播放方法、电子设备及存储介质
WO2023001043A1 (zh) 一种显示内容方法、电子设备及***
WO2024001766A1 (zh) 屏幕录制和共享的方法及电子设备
WO2021227847A9 (zh) 一种文件应用方法及装置
CN116700660B (zh) 音频播放方法和电子设备
WO2024037542A1 (zh) 一种触控输入的方法、***、电子设备及存储介质
WO2024109481A1 (zh) 窗口控制方法及电子设备
WO2024109511A1 (zh) 媒体会话管理方法、电子设备及计算机可读存储介质
WO2023283941A1 (zh) 一种投屏图像的处理方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22831665

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022831665

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022831665

Country of ref document: EP

Effective date: 20231221

NENP Non-entry into the national phase

Ref country code: DE