CN116048351A - Screen capturing method and electronic equipment - Google Patents

Screen capturing method and electronic equipment

Info

Publication number
CN116048351A
Authority
CN
China
Prior art keywords
window
screen
screen capturing
windows
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210891491.8A
Other languages
Chinese (zh)
Inventor
王钇杰
何正�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210891491.8A priority Critical patent/CN116048351A/en
Publication of CN116048351A publication Critical patent/CN116048351A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a screen capturing method and an electronic device, relating to the field of image display. The screen capturing method includes: displaying a plurality of windows; and, in response to a screen capturing command, synthesizing the layers in which target windows among the plurality of windows are located, to obtain a target picture.

Description

Screen capturing method and electronic equipment
Technical Field
The application relates to the field of image display, in particular to a screen capturing method and electronic equipment.
Background
As shown in fig. 1, when an electronic device such as a mobile phone displays a floating window (also called a floating ball) F, performing a scrolling screen capture causes the floating window to appear multiple times in the generated picture; that is, the generated picture does not match what the mobile phone actually displays, resulting in a poor user experience.
Disclosure of Invention
The embodiments of the application provide a screen capturing method and an electronic device for realizing screen capture of specified displayed windows.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, a screen capturing method is provided, including: displaying a plurality of windows; and, in response to a screen capturing command, synthesizing the layers in which target windows among the plurality of windows are located, to obtain a target picture.
According to the screen capturing method provided by the embodiments of the application, a picture is obtained by synthesizing the layers in which the target windows among the plurality of displayed windows are located, thereby realizing screen capture of specific displayed windows.
In one possible implementation, synthesizing the layers in which the target windows among the plurality of windows are located to obtain the target picture includes: acquiring the type of a window that is not to be captured; searching a window tree, according to that type, for the layer in which the window that is not to be captured is located, where each leaf node of the window tree includes information on the layer in which a window is located; and synthesizing the layers of the plurality of windows other than the layer in which the window that is not to be captured is located, to obtain the target picture. The window that is not to be captured is thus excluded by its window type, yielding the target picture.
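The type-based exclusion described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the field names (`kind`, `layer`) and type labels are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Window:
    name: str
    kind: str   # window type, e.g. "application" or "floating" (assumed labels)
    layer: int  # identifier of the layer the window is drawn on

def layers_to_composite(windows, excluded_kind):
    """Keep every layer except those of windows whose type is excluded from capture."""
    return [w.layer for w in windows if w.kind != excluded_kind]

windows = [Window("settings", "application", 1), Window("floating_ball", "floating", 2)]
print(layers_to_composite(windows, "floating"))  # prints [1]
```

Synthesizing only the returned layers produces a picture without the floating window, matching the behavior described above.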
In one possible implementation, synthesizing the layers in which the target windows among the plurality of windows are located to obtain the target picture includes: acquiring the task identifier of a window that is to be captured; searching a window tree, according to that task identifier, for the layer in which the window to be captured is located, where each leaf node of the window tree includes information on the layer in which a window is located; and synthesizing the layers in which the windows to be captured are located, to obtain the target picture. The window to be captured is thus determined by its corresponding task identifier, yielding the target picture.
In one possible implementation, the screen capturing command is a scrolling screen capturing command, and capturing target windows among the plurality of windows to generate the target picture includes: in response to the scrolling screen capturing command, synthesizing the layers in which the plurality of windows are located to obtain a first picture; scrolling to update the content displayed by the target window; synthesizing the layer in which the target window is located to obtain a second picture; and synthesizing the first picture and the second picture in screen-capture order to obtain the target picture. Scrolling screen capture is thus realized by synthesizing a plurality of pictures, and the windows other than the target window are captured only once, which avoids the content of the same window being captured multiple times during a scrolling screen capture.
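As a rough illustration of synthesizing the first and second pictures in screen-capture order, the sketch below treats each picture as a list of pixel rows and simply appends later target-window captures below the first full-screen capture. The names are hypothetical, and real stitching would additionally crop any rows that overlap between successive captures:

```python
def stitch_in_capture_order(first_picture, later_target_pictures):
    """Append each later target-window capture below the first full-screen capture."""
    result = list(first_picture)
    for picture in later_target_pictures:
        result.extend(picture)  # naive: assumes successive captures do not overlap
    return result

page1 = ["row-a", "row-b"]   # first picture: all displayed layers, captured once
page2 = ["row-c"]            # second picture: target-window layer only, after scrolling
print(stitch_in_capture_order(page1, [page2]))  # prints ['row-a', 'row-b', 'row-c']
```

Because only the first capture includes the non-target layers, any floating window appears at most once in the stitched result.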
In one possible implementation, the target window is a window other than the floating window (floating ball).
In a second aspect, there is provided an electronic device comprising a processor and a memory in which instructions are stored which, when executed by the processor, perform a method as described in the first aspect and any of its embodiments.
In a third aspect, there is provided a computer readable storage medium comprising instructions which, when executed on an electronic device, cause the electronic device to perform the method of the first aspect and any implementation thereof.
In a fourth aspect, there is provided a computer program product comprising instructions which, when run on an electronic device as described above, cause the electronic device to perform the method of the first aspect and any of its embodiments.
In a fifth aspect, a chip system is provided, the chip system including a processor configured to support an electronic device in implementing the functions referred to in the first aspect. In one possible design, the chip system may further include interface circuitry, which may be used to receive signals from other devices (e.g., a memory) or to send signals to other devices (e.g., a communication interface). The chip system may include a chip, and may also include other discrete devices.
For the technical effects of the second to fifth aspects, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a scrolling screen capturing generated picture according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a program running on a processor according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a screen capturing method according to an embodiment of the present application;
fig. 5 is a schematic diagram of a mobile phone display interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a scrolling screen according to an embodiment of the present application;
fig. 7 is a schematic diagram of a finger joint double click provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of a finger joint sliding gesture according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a screen capture button in a drop-down menu according to an embodiment of the present application;
FIG. 10 is a schematic diagram of windows arranged along the Z-axis of a display (i.e., the direction perpendicular to the display) according to an embodiment of the present application;
FIG. 11 is a flowchart of another screen capturing method according to an embodiment of the present application;
FIG. 12 is a flowchart of another screen capturing method according to an embodiment of the present application;
FIG. 13 is a flowchart of another screen capturing method according to an embodiment of the present application;
FIG. 14 is a schematic view of another scrolling screen shot provided by an embodiment of the present application;
fig. 15 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
Some concepts related to the present application will be described first.
The terms "first," "second," and the like in the embodiments of the present application are used for the purpose of distinguishing between similar features and not necessarily for the purpose of indicating a relative importance, quantity, order, or the like.
The terms "exemplary" or "such as" and the like, as used in connection with embodiments of the present application, are intended to be exemplary, or descriptive. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
The terms "coupled" and "connected" in connection with embodiments of the present application are to be construed broadly, and may refer, for example, to a direct physical connection, or to an indirect connection via electronic components such as resistors, inductors, capacitors, or other electronic devices.
The embodiments of the application provide an electronic device, which may be any device with a display function and may be movable or fixed. The electronic device may be deployed on land (e.g., indoor or outdoor, hand-held or vehicle-mounted), on water (e.g., on a ship), or in the air (e.g., on an aircraft, balloon, or satellite). The electronic device may be referred to as a user equipment (UE), an access terminal, a terminal unit, a subscriber unit, a terminal station, a mobile station (MS), a terminal agent, a terminal apparatus, or the like. For example, the electronic device may be a cell phone, a tablet computer, a notebook computer, a smart bracelet, a smart watch, a headset, a smart speaker, a virtual reality (VR) device, an augmented reality (AR) device, a terminal in industrial control, a terminal in self-driving (unmanned driving), a terminal in telemedicine (remote medical), a terminal in a smart grid, a terminal in transportation safety, a terminal in a smart city, a terminal in a smart home, etc. The embodiments of the application do not limit the specific type and structure of the electronic device. One possible configuration of the electronic device is described below.
Taking a mobile phone as an example of the electronic device, fig. 2 shows one possible structure of the electronic device 101. The electronic device 101 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a power management module 240, a battery 241, a wireless charging coil 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a subscriber identification module (SIM) card interface 295, and the like.
The sensor module 280 may include, among other things, a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 101. In other embodiments of the present application, the electronic device 101 may include more or fewer components than shown, certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units. For example, the processor 210 may be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), an application processor (AP), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), a programmable logic device (PLD), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a baseband processor, a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. For example, the processor 210 may be an application processor AP. Alternatively, the processor 210 may be integrated in a system on chip (SoC), or in an integrated circuit (IC) chip. The processor 210 may include an analog front end (AFE) and a micro controller unit (MCU) in an IC chip.
The controller may be a neural hub and a command center of the electronic device 101, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that the processor 210 has just used or recycled. If the processor 210 needs to reuse the instruction or data, it may be called directly from the memory. Repeated accesses are avoided and the latency of the processor 210 is reduced, thereby improving the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a USB interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative and does not limit the structure of the electronic device 101. In other embodiments of the present application, the electronic device 101 may also use different interfacing manners, or a combination of multiple interfacing manners from the foregoing embodiments.
The power management module 240 is configured to receive a charging input from a charger. The charger may be a wireless charger (such as a wireless charging base of the electronic device 101 or other devices capable of wirelessly charging the electronic device 101), or may be a wired charger. For example, the power management module 240 may receive a charging input of a wired charger through the USB interface 230. The power management module 240 may receive wireless charging input through a wireless charging coil 242 of the electronic device.
The power management module 240 may also supply power to the electronic device while charging the battery 241. The power management module 240 receives input from the battery 241 to power the processor 210, the internal memory 221, the external memory interface 220, the display 294, the camera 293, the wireless communication module 260, and the like. The power management module 240 may also be configured to monitor parameters of the battery 241 such as battery capacity, battery cycle times, battery health (leakage, impedance), etc. In other embodiments, the power management module 240 may also be disposed in the processor 210.
The wireless communication function of the electronic device 101 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 101 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on the electronic device 101. The wireless communication module 260 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 101. In some embodiments, antenna 1 and mobile communication module 250 of electronic device 101 are coupled, and antenna 2 and wireless communication module 260 are coupled, such that electronic device 101 may communicate with a network and other devices via wireless communication techniques.
The electronic device 101 implements display functions through a GPU, a display screen 294, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display 294 is used to display images, videos, and the like. The display 294 includes a display panel and a touch screen. In some embodiments, the electronic device 101 may include 1 or N displays 294, N being a positive integer greater than 1.
The electronic device 101 may implement a photographing function through an ISP, a camera 293, a video codec, a GPU, a display screen 294, an application processor, and the like. The ISP is used to process the data fed back by the camera 293. In some embodiments, the ISP may be provided in the camera 293. The camera 293 is used to capture still images or video. In some embodiments, the electronic device 101 may include 1 or N cameras 293, N being a positive integer greater than 1.
The external memory interface 220 may be used to connect external memory cards, such as Micro Secure Digital (Micro SD) cards, to expand the storage capability of the electronic device 101. The external memory card communicates with the processor 210 through the external memory interface 220 to implement data storage functions. For example, files such as music and video are stored in the external memory card.
Internal memory 221 may be used to store computer executable program code that includes instructions. The processor 210 executes various functional applications of the electronic device 101 and data processing by executing instructions stored in the internal memory 221. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The memory to which embodiments of the present application relate may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The electronic device 101 may implement audio functionality through an audio module 270, speaker 270A, receiver 270B, microphone 270C, headphone interface 270D, application processor, and so forth. Such as music playing, recording, etc.
The audio module 270 is used to convert digital audio information into an analog audio signal for output, and to convert an analog audio input into a digital audio signal. In some embodiments, the audio module 270, or some of its functional modules, may be disposed in the processor 210. Speaker 270A, also referred to as a "horn," converts audio electrical signals into sound signals. Receiver 270B, also referred to as an "earpiece," converts audio electrical signals into sound signals. Microphone 270C, also referred to as a "mike," converts sound signals into electrical signals. The electronic device 101 may be provided with at least one microphone 270C. The earphone interface 270D is for connecting a wired earphone, and may be the USB interface 230, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
Keys 290 include a power key, volume keys, etc. The keys 290 may be mechanical keys or touch keys. The electronic device 101 may receive key inputs and generate key signal inputs related to user settings and function controls of the electronic device 101. The motor 291 may generate a vibration alert, and may be used for incoming-call vibration alerting or touch vibration feedback. The indicator 292 may be an indicator light used to indicate a state of charge, a change in power, a message, a missed call, a notification, etc. The SIM card interface 295 is for connecting a SIM card. The SIM card may be inserted into or removed from the SIM card interface 295 to enable contact with and separation from the electronic device 101. The electronic device 101 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. In some embodiments, the electronic device 101 employs an embedded SIM (eSIM) card, which may be embedded in the electronic device 101 and not separable from it.
The programs executed by the processor 210 may be based on an operating system, for example the Android operating system, the Windows operating system, and the like. As shown in FIG. 3, taking programs based on the Android operating system as an example, the programs run by the processor 210 are layered by function and may include an application layer, a framework layer, and a base layer, together with an algorithm library, a hardware abstraction layer, a kernel layer, and a driver layer.
The base layer includes a hardware abstraction layer (HAL) and a driver layer for driving the hardware resources of the hardware layer. The driver layer may include a GPU driver, which drives the display panel in the display screen to display content, and a touch screen driver, which obtains the user's touch operations from the touch screen in the display screen. The hardware abstraction layer abstracts the hardware and includes a display module, a touch module, and the like. The display module abstracts the display panel in the display screen, so that the display algorithm in the algorithm library can call the abstracted display panel; the touch module abstracts the touch screen in the display screen, so that the touch algorithm in the algorithm library can call the abstracted touch screen.
The framework layer implements basic algorithms based on the functions of the hardware abstraction layer and provides them for the application layer to call. For example, the screen capturing algorithm provided by the embodiments of the application executes the screen capturing method provided by the embodiments of the application, and the touch algorithm generates a screen capturing command to trigger the screen capturing operation of the screen capturing application.
The application layer may include a screen capture application having a screen capture (particularly a scrolling screen capture) function that, when a screen capture command is detected, invokes a screen capture algorithm to effect a screen capture operation and generate a picture.
As described above, when an electronic device such as a mobile phone displays a floating window, performing a scrolling screen capture causes multiple floating windows to appear in the generated picture; that is, the generated picture does not match the windows actually displayed by the mobile phone, resulting in a poor user experience. Accordingly, embodiments of the present application provide a screen capturing method that improves the user experience by capturing only specified target windows (e.g., the windows other than the floating window) when multiple windows are displayed.
As shown in fig. 4, the screen capturing method includes:
s101, displaying a plurality of windows.
For example, as shown in fig. 5, the mobile phone displays a "Settings" interface S and a floating window F, where the "Settings" interface S and the floating window F correspond to different windows.
S102, in response to a screen capturing command, synthesizing the layers in which target windows among the plurality of windows are located, to obtain a target picture.
The screen captures referred to herein may be single-page screen captures or scrolling screen captures. A single-page screen capture captures the page of content currently displayed on the display screen. A scrolling screen capture (also referred to as a multi-page screen capture) applies when the display interface of an application is longer than one page: the remaining content is displayed by scrolling and captured, and the captures are spliced into the application's complete display interface. For example, as shown in fig. 6, the "Settings" interface is longer than one page; by scrolling the "Settings" interface upwards, the three display interfaces on the left are obtained, and after screen capture and splicing, the complete "Settings" interface on the right is obtained.
As shown in fig. 7 to 9, the screen capture command may be triggered by, but is not limited to, a finger-joint double tap (fig. 7), a finger-joint swipe gesture (e.g., the S-shaped swipe gesture in fig. 8), clicking a screen capture button in a drop-down menu (fig. 9), and the like. A finger-joint double tap can trigger a single-page screen capture command; a finger-joint swipe gesture can trigger a scrolling screen capture command; and the screen capture button can expose a button for scrolling screen capture, clicking which triggers the scrolling screen capture command.
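The mapping from trigger gestures to screen-capture commands described above can be sketched as a simple dispatch table. The gesture and command names below are illustrative only and do not correspond to an actual handset API:

```python
def command_for(gesture):
    """Map a detected gesture to the screen-capture command it triggers."""
    mapping = {
        "knuckle_double_tap": "single_page_capture",     # fig. 7
        "knuckle_s_swipe": "scroll_capture",             # fig. 8
        "menu_scroll_capture_button": "scroll_capture",  # fig. 9
    }
    return mapping.get(gesture, "no_capture")

print(command_for("knuckle_s_swipe"))  # prints scroll_capture
```

In the layered architecture of FIG. 3, such a mapping would sit in the touch algorithm, which generates the command that the screen capture application then acts on.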
As shown in FIG. 10, in the Android window management service (window manager service, WMS), the windows are arranged along the Z-axis of the display (i.e., perpendicular to the display), and the floating window is located above the "Settings" interface; the WMS is the Android service used for window management.
For window management. In the Java file Window manager service.java of the WMS, the data of a plurality of windows are used as a plurality of leaf nodes of a window tree, are organized and stored according to the tree structure, and the data of each window can be traversed through the root node of the window tree. The data of the window includes information such as a type of the window (e.g., a floating window, an application window), a task identifier (task id), coordinates, a size, a scaling, a corresponding surface control (surface control) object, etc., where each window corresponds to a task identifier, and the surface control (surface control) object corresponding to the window includes a layer where the window is located.
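The per-window data described above can be sketched as a plain Java class (the class and field names below are illustrative; the real WindowManagerService uses its own WindowState and related classes):

```java
// Illustrative sketch of the data kept for each window in the window tree.
public class WindowDataDemo {
    static class WindowData {
        final String type;           // e.g. "FLOATING_WINDOW" or "APPLICATION_WINDOW"
        final int taskId;            // each window corresponds to one task identifier
        final int x, y;              // coordinates
        final int width, height;     // size
        final float scale;           // scaling
        final String surfaceControl; // stands in for the SurfaceControl object,
                                     // which contains the layer of the window

        WindowData(String type, int taskId, int x, int y,
                   int width, int height, float scale, String surfaceControl) {
            this.type = type; this.taskId = taskId; this.x = x; this.y = y;
            this.width = width; this.height = height; this.scale = scale;
            this.surfaceControl = surfaceControl;
        }
    }

    public static void main(String[] args) {
        WindowData floating =
                new WindowData("FLOATING_WINDOW", 2, 600, 100, 200, 200, 1.0f, "floatLayer");
        System.out.println(floating.type + " task=" + floating.taskId);
        // prints FLOATING_WINDOW task=2
    }
}
```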
The format of the target picture is not limited in this application, and may be, for example, bitmap (bitmap), joint photographic experts group (joint photographic experts group, JPEG), or the like.
In the prior art, all displayed layers are synthesized to obtain a picture. The screen capturing method provided by the embodiments of the present application instead synthesizes only the layers where the target windows among the displayed windows are located, thereby realizing a screen capture of specific displayed windows.
In one possible embodiment, the type of window that is not to be captured may be acquired first; for example, in the present application, the type of window not to be captured may be the floating window. The layer where each non-captured window is located is then found in the window tree according to that type. The target windows are the displayed windows other than the non-captured windows, so synthesizing the layers of the multiple windows other than the layers of the non-captured windows yields the target picture. Specifically, as shown in fig. 11, the method includes:
S201, the screen capture application, by calling the system user interface (SystemUI) service, sends the type of the window that is not to be captured to the screenshot interface function of the extended surface control (SurfaceControlEx) class.
The system user interface (SystemUI) service is located at the application layer and provides notifications, status bar information display, screen capture, and the like. The extended surface control (SurfaceControlEx) class is located at the framework layer.
The screenshot interface function of the surface control (SurfaceControl) class in the framework layer is not open to the application layer, so the application layer cannot call it directly. Therefore, the screenshot interface function of the SurfaceControl class is extended to obtain the screenshot interface function of the extended surface control (SurfaceControlEx) class, which applications in the application layer can call.
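This wrapper pattern can be sketched in plain Java (the class and method names below are illustrative stand-ins, not the actual framework API):

```java
// Illustrative sketch: an application-visible wrapper class delegating to a
// framework-layer screenshot interface that is hidden from applications.
public class SurfaceControlExDemo {
    // Stand-in for the hidden framework-layer screenshot interface.
    static class HiddenSurfaceControl {
        static String captureLayers(String[] excludedTypes) {
            return "picture-without-" + String.join(",", excludedTypes);
        }
    }

    // The extended class simply re-exports the hidden call so that the
    // application layer has an entry point it is allowed to invoke.
    public static String screenshotExcluding(String[] excludedTypes) {
        return HiddenSurfaceControl.captureLayers(excludedTypes);
    }

    public static void main(String[] args) {
        System.out.println(screenshotExcluding(new String[]{"FLOATING_WINDOW"}));
        // prints picture-without-FLOATING_WINDOW
    }
}
```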
When there are a plurality of types of windows that are not to be captured, the system user interface (SystemUI) service may transmit the plurality of types in the form of an array.
S202, the screenshot interface function of the extended surface control (SurfaceControlEx) class sends the type of the window that is not to be captured to the screenshot interface function of the surface control (SurfaceControl) class.
S203, the screenshot interface function of the surface control (SurfaceControl) class sends the type of the window that is not to be captured to the screenshot interface function of the WMS service.
S204, the screenshot interface function of the WMS service searches the window tree for windows of the same type as the window that is not to be captured, and sends either the layers of the non-captured windows or the layers of the captured windows to the screenshot interface function of the surface control (SurfaceControl) class.
Specifically, the screenshot interface function of the WMS service compares the window type stored in each leaf node of the window tree with the type of the window that is not to be captured. If the types are the same, the surface control (SurfaceControl) object corresponding to the window (i.e., the layer where the window is located) is added to a list array; if the types are different, the next leaf node is checked, until the whole window tree has been traversed. The list array is then sent to the screenshot interface function of the SurfaceControl class; in this case, the screenshot interface function of the WMS service sends the layers of the windows that are not captured.
Alternatively, the screenshot interface function of the WMS service compares the window type stored in each leaf node with the type of the window that is not to be captured, adds the SurfaceControl object corresponding to the window (i.e., the layer where the window is located) to the list array when the types are different, and checks the next leaf node when the types are the same, until the whole window tree has been traversed. The list array is then sent to the screenshot interface function of the SurfaceControl class; in this case, the screenshot interface function of the WMS service sends the layers of the windows that are captured.
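The S204 traversal can be sketched in pure Java (WindowNode and the type strings are illustrative; the real WMS stores its window tree in its own container classes):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of the S204 traversal: walk the window tree and
// collect the layers of the windows that should appear in the screenshot.
public class CaptureFilterDemo {
    static class WindowNode {
        final String type;
        final String layer; // stands in for the SurfaceControl layer
        final List<WindowNode> children = new ArrayList<>();
        WindowNode(String type, String layer) { this.type = type; this.layer = layer; }
    }

    // Collect layers whose window type is NOT in excludedTypes.
    static List<String> capturedLayers(WindowNode node, List<String> excludedTypes) {
        List<String> out = new ArrayList<>();
        if (!excludedTypes.contains(node.type)) {
            out.add(node.layer);
        }
        for (WindowNode child : node.children) {
            out.addAll(capturedLayers(child, excludedTypes));
        }
        return out;
    }

    public static void main(String[] args) {
        WindowNode root = new WindowNode("ROOT", "rootLayer");
        root.children.add(new WindowNode("APPLICATION_WINDOW", "settingsLayer"));
        root.children.add(new WindowNode("FLOATING_WINDOW", "floatLayer"));
        // Excluding the floating window leaves only the layers to composite.
        System.out.println(capturedLayers(root, Arrays.asList("FLOATING_WINDOW", "ROOT")));
        // prints [settingsLayer]
    }
}
```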
S205, the screenshot interface function of the surface control (SurfaceControl) class sends the layers of the non-captured windows or the layers of the captured windows to the surface compositing (SurfaceFlinger) service.
The surface compositing (SurfaceFlinger) service is used to composite multiple layers to obtain the target picture.
S206, the screenshot interface function of the surface control (SurfaceControl) class synthesizes the layers where the captured windows among the multiple windows are located, to obtain the target picture.
If the layers of the non-captured windows are received, the screenshot interface function of the surface control (SurfaceControl) class synthesizes the layers of the multiple windows other than the layers of the non-captured windows to obtain the target picture. If the layers of the captured windows are received, the screenshot interface function of the SurfaceControl class directly synthesizes those layers to obtain the target picture.
S207, the screenshot interface function of the surface control (SurfaceControl) class sends the target picture to the screen capture application, passing in turn through the screenshot interface function of the extended surface control (SurfaceControlEx) class and the system user interface (SystemUI) service.
During the transmission of the target picture, inter-process communication is involved, so the target picture is not passed up to the application layer directly. Instead, the target picture is serialized into a cache, and the address or handle of the target picture is sent through an Android interface definition language (AIDL) interface to the screenshot interface function of the extended surface control (SurfaceControlEx) class, which then delivers the target picture to the application layer; this improves data transmission efficiency.
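The handle-passing idea can be sketched in pure Java (the cache, handle, and method names are illustrative; in Android the handle would cross the process boundary through an AIDL interface and the data would live in shared memory, not an in-process map):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: serialize the picture once into a cache and pass
// only a small handle across the (simulated) IPC boundary.
public class PictureHandleDemo {
    // Stands in for the cache holding the serialized target picture.
    static final Map<Integer, byte[]> CACHE = new HashMap<>();
    static int nextHandle = 1;

    // "Serialize" the target picture into the cache; return a handle.
    static int publish(byte[] pictureBytes) {
        int handle = nextHandle++;
        CACHE.put(handle, pictureBytes);
        return handle; // only this small value crosses the boundary
    }

    // The receiving side resolves the handle back to the picture data.
    static byte[] resolve(int handle) {
        return CACHE.get(handle);
    }

    public static void main(String[] args) {
        int handle = publish(new byte[]{1, 2, 3});
        System.out.println(resolve(handle).length); // prints 3
    }
}
```

The design point is that the picture bytes are written once and only a fixed-size handle is copied per hop, which is why the patent text says this improves data transmission efficiency.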
In another possible implementation, the task identifier (taskId) of the window to be captured (i.e., the target window) may be acquired first. The layer where the captured window is located is then found in the window tree according to the task identifier, and synthesizing the layers where the target windows are located yields the target picture. Specifically, as shown in fig. 12, the method includes:
S301, the screen capture application, by calling the system user interface (SystemUI) service, sends the task identifier (taskId) of the window to be captured to the screenshot interface function of the extended surface control (SurfaceControlEx) class.
When there are a plurality of windows to be captured, the system user interface (SystemUI) service may transmit the task identifiers (taskIds) of the plurality of windows in the form of an array.
S302, the screenshot interface function of the extended surface control (SurfaceControlEx) class sends the task identifier (taskId) of the window to be captured to the screenshot interface function of the surface control (SurfaceControl) class.
S303, the screenshot interface function of the surface control (SurfaceControl) class sends the task identifier (taskId) of the window to be captured to the screenshot interface function of the WMS service.
S304, the screenshot interface function of the WMS service searches the window tree for windows whose task identifier (taskId) matches that of the window to be captured, and sends either the layers of the non-captured windows or the layers of the captured windows to the screenshot interface function of the surface control (SurfaceControl) class.
Specifically, the screenshot interface function of the WMS service compares the task identifier (taskId) stored in each leaf node of the window tree with the task identifier of the window to be captured. If the task identifiers are the same, the surface control (SurfaceControl) object corresponding to the window (i.e., the layer where the window is located) is added to a list array; if they are different, the next leaf node is checked, until the whole window tree has been traversed. The list array is then sent to the screenshot interface function of the SurfaceControl class; in this case, the screenshot interface function of the WMS service sends the layers of the windows that are captured.
Alternatively, the screenshot interface function of the WMS service compares the task identifier (taskId) stored in each leaf node with that of the window to be captured, adds the SurfaceControl object corresponding to the window (i.e., the layer where the window is located) to the list array when the task identifiers are different, and checks the next leaf node when they are the same, until the whole window tree has been traversed. The list array is then sent to the screenshot interface function of the SurfaceControl class; in this case, the screenshot interface function of the WMS service sends the layers of the windows that are not captured.
S305, the screenshot interface function of the surface control (SurfaceControl) class sends the layers of the non-captured windows or the layers of the captured windows to the surface compositing (SurfaceFlinger) service.
S306, the screenshot interface function of the surface control (SurfaceControl) class synthesizes the layers where the captured windows among the multiple windows are located, to obtain the target picture.
This step refers to step S206, and will not be described in detail herein.
S307, the screenshot interface function of the surface control (SurfaceControl) class sends the target picture to the screen capture application, passing in turn through the screenshot interface function of the extended surface control (SurfaceControlEx) class and the system user interface (SystemUI) service.
This step refers to step S207 and will not be described in detail herein.
For scrolling screen capturing, as shown in fig. 13, a screen capturing method provided in an embodiment of the present application may include:
S401, displaying a plurality of windows.
For example, as shown in fig. 14, the mobile phone displays a "setup" interface S and a floating window F, where the "setup" interface S and the floating window F correspond to different windows.
S402, receiving a scrolling screen capturing command.
The scrolling screen capture command may be understood with reference to step S102 and is not repeated here. If this is the first capture of the scrolling screen capture, step S403 is performed; otherwise, steps S404-S406 are performed.
S403, synthesizing the layers where the windows are located to obtain a first picture.
Illustratively, as shown in fig. 14, all the layers, including those of the "setup" interface S and the floating window F, are synthesized to obtain the first picture P1. At this point, the native screenshot interface function (i.e., the screenshot interface function of the surface control (SurfaceControl) class) may be invoked to obtain the first picture.
S404, scrolling and updating the content displayed by the target window in the plurality of windows.
By way of example, as shown in fig. 14, by scrolling the "setup" interface S downwards, the contents of the "setup" interface S are displayed page by page.
S405, synthesizing the layer where the target window is located to obtain a second picture.
For example, as shown in fig. 14, the second pictures (P2 and P3) are obtained by synthesizing the layer where the "setup" interface S is located; when the target window has a large amount of content, there may be a plurality of second pictures.
Other contents of this step may refer to steps S102, S201-S207, S301-S307, and will not be described here.
S406, synthesizing the first picture and the second picture according to the screen capturing sequence to obtain a target picture.
Illustratively, as shown in FIG. 14, the pictures P1-P3 are synthesized in a screen capturing order to obtain the target picture T.
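The stitching in S406 can be sketched as a vertical concatenation of pixel arrays (a simplification: the real implementation composites bitmaps and must also trim any overlapping regions between successive captures):

```java
// Illustrative sketch of S406: stitch the captures vertically in
// screen-capture order to form the target picture.
public class StitchDemo {
    // Each picture is rows x cols of pixel values; all share the same width.
    static int[][] stitchVertically(int[][][] picturesInCaptureOrder) {
        int rows = 0;
        for (int[][] p : picturesInCaptureOrder) rows += p.length;
        int cols = picturesInCaptureOrder[0][0].length;
        int[][] target = new int[rows][cols];
        int row = 0;
        for (int[][] p : picturesInCaptureOrder) {
            for (int[] line : p) {
                System.arraycopy(line, 0, target[row++], 0, cols);
            }
        }
        return target;
    }

    public static void main(String[] args) {
        int[][] p1 = {{1, 1}, {1, 1}}; // first picture (all layers, incl. floating window)
        int[][] p2 = {{2, 2}};         // second pictures (target window layer only)
        int[][] p3 = {{3, 3}};
        int[][] t = stitchVertically(new int[][][]{p1, p2, p3});
        System.out.println(t.length + "x" + t[0].length); // prints 4x2
    }
}
```

Because only the first picture P1 contains all layers, the floating window appears exactly once in the stitched target picture.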
This embodiment realizes a scrolling screen capture whose final picture contains the floating window only once, improving the user experience.
As shown in fig. 15, an embodiment of the present application further provides a chip system. The chip system 60 includes at least one processor 601 and at least one interface circuit 602, which may be interconnected by wires. The processor 601 is configured to support the electronic device in implementing the steps of the foregoing method embodiments, for example, the methods shown in fig. 4, 11, 12 and 13, and the at least one interface circuit 602 may be configured to receive signals from other devices (e.g., a memory) or to send signals to other devices (e.g., a communication interface). The chip system may include a chip, and may also include other discrete devices.
Embodiments of the present application also provide a computer-readable storage medium including instructions that, when executed on an electronic device described above, cause the electronic device to perform the steps of the method embodiments described above, e.g., performing the methods shown in fig. 4, 11, 12, 13.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on the above-described electronic device, cause the electronic device to perform the steps of the method embodiments described above, for example, performing the methods shown in fig. 4, 11, 12, 13.
Technical effects concerning the chip system, the computer-readable storage medium, the computer program product refer to the technical effects of the previous method embodiments.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple modules or components may be combined or integrated into another device, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interface, indirect coupling or communication connection of devices or modules, electrical, mechanical, or other form.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physically separate, i.e., may be located in one device, or may be distributed over multiple devices. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated in one device, or each module may exist alone physically, or two or more modules may be integrated in one device.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line (Digital Subscriber Line, DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device including one or more servers, data centers, etc. that can be integrated with the medium. The usable medium may be a magnetic medium (e.g., a floppy Disk, a hard Disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (7)

1. A method of screen capturing, comprising:
displaying a plurality of windows;
and responding to the screen capturing command, and synthesizing the layer where the target window in the plurality of windows is positioned to obtain a target picture.
2. The method of claim 1, wherein the synthesizing the layer of the target window in the plurality of windows to obtain the target picture includes:
acquiring the type of a window which is not subjected to screen capturing;
searching a layer where the window without screen capturing is located from a window tree according to the type, wherein each leaf node in the window tree comprises layer information where one window is located;
and synthesizing the layers except the layer where the window which does not screen capturing is located in the plurality of windows to obtain the target picture.
3. The method of claim 1, wherein the synthesizing the layer of the target window in the plurality of windows to obtain the target picture includes:
acquiring a task identifier of a window for screen capturing;
searching a layer where the window for screen capturing is located from a window tree according to the task identifier, wherein each leaf node in the window tree comprises layer information where one window is located;
and synthesizing the layers of the windows for screen capturing in the plurality of windows to obtain the target picture.
4. A method according to any one of claims 1-3, wherein the screen capture command is a scrolling screen capture command; and the synthesizing the layer where the target window in the plurality of windows is located to obtain the target picture includes:
responding to the scrolling screen capturing command, and synthesizing the layers where the windows are positioned to obtain a first picture;
scrolling and updating the content displayed by the target window;
synthesizing the layer where the target window is located to obtain a second picture;
and synthesizing the first picture and the second picture according to the screen capturing sequence to obtain the target picture.
5. The method of claim 4, wherein the target window is a window other than the floating ball.
6. An electronic device comprising a processor and a memory, the memory storing instructions that, when executed by the processor, perform the method of any of claims 1-5.
7. A computer readable storage medium comprising instructions which, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-5.
CN202210891491.8A 2022-07-27 2022-07-27 Screen capturing method and electronic equipment Pending CN116048351A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210891491.8A CN116048351A (en) 2022-07-27 2022-07-27 Screen capturing method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116048351A true CN116048351A (en) 2023-05-02

Family

ID=86114051

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210891491.8A Pending CN116048351A (en) 2022-07-27 2022-07-27 Screen capturing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116048351A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111290675A (en) * 2020-03-02 2020-06-16 Oppo广东移动通信有限公司 Screenshot picture sharing method and device, terminal and storage medium
CN113552986A (en) * 2020-04-07 2021-10-26 华为技术有限公司 Multi-window screen capturing method and device and terminal equipment
US20220050565A1 (en) * 2019-04-29 2022-02-17 Vivo Mobile Communication Co.,Ltd. Screenshot method and terminal device
CN114489429A (en) * 2022-01-29 2022-05-13 青岛海信移动通信技术股份有限公司 Terminal device, long screen capture method and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination