WO2021190524A1 - Screenshot processing method, graphical user interface, and terminal - Google Patents

Screenshot processing method, graphical user interface, and terminal

Info

Publication number
WO2021190524A1
WO2021190524A1 (PCT/CN2021/082531)
Authority
WO
WIPO (PCT)
Prior art keywords
interface
terminal
screenshot image
screenshot
display screen
Prior art date
Application number
PCT/CN2021/082531
Other languages
English (en)
Chinese (zh)
Inventor
Xiong Liudong (熊刘冬)
Li Chundong (李春东)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021190524A1

Classifications

    • G06F9/451 — Execution arrangements for user interfaces (G Physics; G06 Computing; G06F Electric digital data processing; G06F9/00 Arrangements for program control)
    • G06F3/0481 — GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 — GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0488 — GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This application relates to the field of human-computer interaction technology, and in particular to a method for processing screenshots, a graphical user interface, and a terminal.
  • This application provides a screenshot processing method, a graphical user interface, and a terminal. After the user takes a screenshot, the user can edit the screenshot while using the current client, and after the client interface is refreshed, the edited content matches the current interface.
  • the present application provides a screenshot processing method applied to a terminal.
  • The method may include: the terminal displays a first interface and a second interface on a display screen, where the interface content of the second interface is a first screenshot image of the first interface; when the terminal detects the input of a first operation, a second screenshot image of the first interface is acquired, where the second screenshot image is the image displayed by the first interface at the moment the terminal detects the first operation; and the first screenshot image in the second interface is replaced with the second screenshot image.
  • When the terminal acquires the first screenshot image of the first interface, it displays the first screenshot image in the second interface, so that the user can view the first interface and the second interface at the same time.
  • When the terminal detects the input of the first operation (for example, a refresh operation), the terminal obtains a second screenshot image of the first interface according to the interface content currently displayed on the first interface, and the interface content of the second interface is replaced with the second screenshot image.
  • In this way, the effect of refreshing the screenshot image is achieved: whenever the display content of the first interface changes, the user can replace the display content of the second interface at any time.
  • In some embodiments, the terminal displaying the first interface and the second interface on the display screen includes: the terminal displays the first interface on the display screen; when the terminal detects the input of the second operation, the terminal displays the first interface and the second interface on the display screen, where the interface content of the second interface is a first screenshot image of the first interface, and the first screenshot image is the image displayed by the first interface when the terminal detects the input second operation.
  • After acquiring the first screenshot image of the first interface, the terminal triggers the second interface and displays the first screenshot image on it. There is no overlapping area between the second interface and the first interface.
  • In this way, the operation of the client in the first interface is not affected, and the user can use the first interface and the second interface at the same time.
  • the first interface may be a video playback interface, a slideshow interface, or other dynamic interfaces. If the first interface is a video playback interface, when the user edits the second interface, the video file of the first interface continues to play, and the user can achieve the effect of editing the screenshot while watching the video.
  • In some embodiments, the method further includes: the terminal adds a layer to the first screenshot image in the second interface; when the terminal detects the input of the first operation, a second screenshot image of the first interface is acquired; and the terminal replaces the first screenshot image with the second screenshot image, so that the interface content of the second interface is the second screenshot image of the first interface together with the layer.
  • The terminal may add a layer on the first screenshot image, that is, edit the first screenshot image. After the screenshot is refreshed, the layer remains and is displayed superimposed on the second screenshot image; in other words, the user does not lose the original editing content after refreshing the screenshot image and does not need to edit the screenshot image again after obtaining the new one.
  • In some embodiments, the method further includes: when the terminal detects the input of a third operation, saving the interface content of the second interface as a target screenshot image, where the target screenshot image includes the second screenshot image and the layer.
  • When the terminal saves the screenshot image, it saves the layers together; that is, the user's editing content is retained.
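  • The refresh-and-save behaviour described above can be sketched as follows. This is an illustrative model only, not the patent's implementation: images are represented as opaque strings, and the class name `SecondInterface` is a hypothetical label.

```python
from dataclasses import dataclass, field

@dataclass
class SecondInterface:
    """Illustrative model of the second interface: a base screenshot
    plus the user's editing layers (arrows, text, and so on)."""
    base_screenshot: str
    layers: list = field(default_factory=list)

    def add_layer(self, layer: str) -> None:
        # Editing the screenshot adds a layer on top of it.
        self.layers.append(layer)

    def refresh(self, new_screenshot: str) -> None:
        # First operation: replace the screenshot, keep the user's layers.
        self.base_screenshot = new_screenshot

    def save(self) -> list:
        # Third operation: the target screenshot image is the current
        # screenshot with all layers superimposed.
        return [self.base_screenshot, *self.layers]

win = SecondInterface("frame_t0")
win.add_layer("arrow")
win.add_layer('text:"important"')
win.refresh("frame_t1")  # the video has advanced; the layers survive
assert win.save() == ["frame_t1", "arrow", 'text:"important"']
```

  • The key design point is that the layers are stored separately from the base screenshot, so replacing the base image leaves the user's edits untouched.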
  • the interface content of the first interface includes application interfaces of multiple clients.
  • In some embodiments, when the interface content of the first interface includes the application interfaces of multiple clients, the terminal displaying the first interface and the second interface on the display screen upon detecting the input second operation includes: when the terminal detects the input of a fourth operation for a first client, displaying the first interface and the second interface on the display screen, where the interface content of the second interface is a first screenshot image of the first client; the first client is one of the multiple clients, and the first screenshot image is the image displayed by the first client when the terminal detects the input fourth operation.
  • the terminal may take a screenshot of one of the clients and display it in the second interface.
  • the fourth operation for the first client acts on the display screen of the terminal.
  • In some embodiments, when the interface content of the first interface includes multiple client application interfaces, displaying the first interface and the second interface on the display screen upon detecting the input second operation includes: when the terminal detects the input second operation, acquiring a screenshot image of the first interface, and, according to an input sixth operation, selecting the application interface of the first client from the screenshot image of the first interface as the first screenshot image to be displayed in the second interface.
  • the terminal can select a screenshot image of one of the clients to display in the second interface without cropping the screenshot image again.
  • the sixth operation acts on the display screen of the terminal.
  • In some embodiments, the method further includes: when the terminal detects a fifth operation input, adjusting the display state of the second interface according to the fifth operation, where the display state of the second interface includes at least one of the following: position, size, or shape.
  • The first operation, the second operation, the third operation, the fourth operation, and the fifth operation may each be an automatic operation or a user operation.
  • User operations can be operations completed by the user through buttons, touch screen, voice, etc.; automatic operations can be operations that are automatically triggered when the terminal reaches a certain state. For example, when the display content of the first interface is updated, the first operation is automatically triggered to obtain the second screenshot image of the first interface and display it on the second interface.
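  • An automatic first operation of this kind can be sketched as a simple observer: when the first interface's display content is updated, a registered callback refreshes the second interface. The class and names below are hypothetical illustrations, not the patent's implementation.

```python
class FirstInterface:
    """Hypothetical model of the first interface (e.g. a playing video)."""
    def __init__(self):
        self.content = "frame_0"
        self.listeners = []  # callbacks fired on every content update

    def update(self, content):
        self.content = content
        for callback in self.listeners:
            callback(content)

refresh_log = []
first = FirstInterface()
# Automatic first operation: refresh the second interface whenever the
# display content of the first interface is updated.
first.listeners.append(lambda c: refresh_log.append(f"second interface now shows {c}"))
first.update("frame_1")
assert refresh_log == ["second interface now shows frame_1"]
```

  • A user operation would instead invoke the same refresh callback from a button press, touch gesture, or voice command handler.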
  • In another aspect, this application provides a graphical user interface on a terminal, the terminal having a display screen, a memory, and one or more processors for executing one or more programs stored in the memory.
  • the graphical user interface includes: a first interface and a second interface displayed on the display screen currently output by the system, wherein:
  • in response to a detected second operation, the first interface and the second interface are displayed on the display screen, where the interface content of the second interface is the first screenshot image of the first interface, and the first interface and the second interface have no overlapping area;
  • in response to a detected first operation, a second screenshot image of the first interface is acquired, where the second screenshot image is the image displayed by the first interface when the terminal detects the input of the first operation.
  • In some embodiments, the graphical user interface further includes: in response to the terminal adding a layer to the first screenshot image of the second interface; in response to the terminal detecting the input of the first operation, acquiring the second screenshot image of the first interface; and in response to the terminal replacing the first screenshot image with the second screenshot image, the interface content of the second interface being the second screenshot image of the first interface together with the layer.
  • the graphical user interface further includes: in response to the terminal detecting the input of the third operation, saving the interface content of the second interface as a target screenshot image, the target screenshot image including The second screenshot image and the layer.
  • the graphical user interface further includes: the interface content of the first interface includes application interfaces of multiple clients.
  • In some embodiments, the graphical user interface further includes: when the terminal detects the input of a fourth operation for the first client, displaying the first interface and the second interface on the display screen, where the interface content of the second interface is a first screenshot image of the first client, the first client is one of the multiple clients, and the first screenshot image is the image displayed by the first client when the terminal detects the input fourth operation.
  • In some embodiments, the graphical user interface further includes: in response to the terminal detecting a fifth operation input, adjusting the display state of the second interface according to the fifth operation, where the display state of the second interface includes at least one of the following: position, size, or shape.
  • In another aspect, the present application provides a terminal, including one or more processors and one or more memories; the one or more memories are coupled with the one or more processors and are used to store computer program code, the computer program code including computer instructions; when the one or more processors execute the computer instructions, the terminal executes the method for processing screenshots in full-screen video display provided in the first aspect.
  • In another aspect, the present application provides a computer storage medium including computer instructions which, when executed on a terminal, cause the terminal to execute the method for processing screenshots in full-screen video display as provided in the first aspect.
  • With the technical solutions of this application, when the terminal displays the video playback interface in full screen, the second interface can be displayed floating in response to an operation, the display content of the second interface can be switched, and the terminal can also quickly switch between multi-window display and full-screen display in response to operations. Throughout this process, the terminal continues to play the video.
  • Figure 1 is an application interface diagram of a screenshot processing method provided by this application.
  • Figure 2a is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 2b is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 2c is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 3 is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 4 is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 5a is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 5b is an application interface diagram of another screenshot processing method provided by this application.
  • Figure 6 is a schematic structural diagram of a terminal provided by this application.
  • Figure 7 is a block diagram of the software structure of the terminal provided by this application.
  • Figure 8 is a schematic flowchart of a screenshot processing method provided by this application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise specified, "plurality" means two or more.
  • With the screenshot processing method provided by this application, the user can edit a screenshot image while watching a video; after the video content is updated, the screenshot image can be refreshed, and the refreshed screenshot image is combined with the content edited by the user to form the final image.
  • a specific application scenario is taken as an example to describe the method of screenshot processing in this application.
  • the user is watching a video and wants to take a screenshot of a certain frame of image in the video, while editing the screenshot, and wants to keep the video window displayed.
  • the video may be a video provided by a video application or a portal website, or a video saved in the terminal 100, or a video received by the terminal 100 from other devices, and there is no limitation here.
  • the video interface can be a video that is currently playing or a video that is paused.
  • the display screen 10 of the terminal 100 displays the first interface 20 currently output by the system, and the first interface 20 may be an interface provided by a video application.
  • the display screen 10 is configured with a touch panel, which can be used to receive a user's touch operation.
  • the touch operation refers to the operation of the user's hand, elbow, or stylus touching the display screen 10.
  • the full-screen display of the video interface means that only the video interface is displayed on the display screen 10, and no other content is displayed.
  • the video interface may occupy all the display area of the display screen 10, for example, as shown in the right drawing of FIG. 1.
  • The video interface may also occupy only part of the display area of the display screen 10. For example, when the display screen 10 is a notch screen, the video interface is displayed in the middle part of the notch screen; when one or both side edges are partially black, this can still be regarded as the display screen 10 displaying the video interface in full screen.
  • the full-screen display of the video interface may mean that while the video interface is displayed on the display screen, system-level interface elements, such as status bar, floating shortcut menu (such as Apple's Assistive Touch), etc., can also be displayed.
  • the status bar may include the name of the operator (for example, China Mobile), time, WiFi icon, signal strength, and current remaining power.
  • the video interface may include a video screen, and may also include a video progress bar, a virtual button for adjusting the volume, a virtual button for playing/pausing the video, and so on.
  • the terminal 100 displays the second interface 30 while displaying the first interface 20 on the display screen 10.
  • the display content in the second interface 30 is a screenshot image obtained by taking a screenshot of the display interface in the first interface 20.
  • In the following, stating that the terminal 100 displays the second interface 30 means that the terminal 100 displays the second interface 30 while displaying the first interface 20 on the display screen 10.
  • the operation of triggering the terminal 100 to take a screenshot on the display screen 10 may be referred to as the second operation.
  • In response to the user's screen capture operation, the terminal 100 simultaneously displays the second interface 30 on the display screen 10.
  • the first interface 20 and the second interface 30 belong to two different clients, and the interface contents of the first interface 20 and the second interface 30 do not overlap.
  • the second interface 30 is rectangular.
  • the size of the second interface 30 may be similar to the size of the first interface 20, so that the user can view the interface content in the first interface 20 and the second interface 30 at the same time.
  • the positions and sizes of the second interface 30 and the first interface 20 can be dragged and adjusted according to user requirements.
  • the second interface 30 may also display one or more function icons.
  • the second interface 30 may display icons such as sending 21, editing 22, and deleting 23. These icons can be used to receive user touch operations and output corresponding interfaces based on the received touch operations. For example, the user can click any icon in the second interface 30 to operate the function described by the icon.
  • When the user triggers the send icon 21, the screenshot image in the second interface 30 can be sent to other clients; when the user triggers the edit icon 22, a layer, which may be a user-defined layer, can be added to the screenshot image in the second interface 30; when the user triggers the delete icon 23, the screenshot image in the second interface 30 is deleted, and since there is then no displayable content in the second interface 30, the second interface 30 is no longer displayed on the display screen 10.
  • At this time, the first interface 20 is restored to the position and size it had before the screenshot.
  • the display screen 10 of the terminal 100 is a foldable display screen, and the state of the display screen 10 is divided into a folded state and an expanded state.
  • In one state, the display screen 10 displays the first interface in full screen; in the other state, as shown in the figure, the display screen 10 displays the first interface 20 and the second interface 30.
  • the display condition on the display screen 10 may also be as shown in FIG. 2c.
  • the video playback interface in the first interface 20 may or may not be interrupted.
  • the user can edit the screenshot image in the second interface 30 while continuously watching the video.
  • In response to the user triggering the edit icon 22, the second interface 30 can display icons such as mark 24, refresh 25, and save 26. These icons can be used to receive user touch operations and output corresponding interfaces based on the received touch operations. For example, the user can click any icon in the second interface 30 to operate the function described by the icon.
  • the marking method includes adding a custom layer to the screenshot image in the second interface 30 through tools such as brushes, text, and painting.
  • the layer 40 includes an arrow and the text "important", that is, the arrow can be drawn by the brush tool, and the "important" can be edited by the text tool.
  • When the user triggers the save icon 26, the interface content in the second interface 30 (including the screenshot image and the layer 40 added via the mark icon 24) is saved.
  • When the user triggers the refresh icon 25, the screenshot image in the second interface 30 is refreshed, but the layer added by the user is retained; the refreshed screenshot image and the layer are superimposed to form a new picture. The refreshed screenshot image is the interface content displayed on the first interface 20 at the moment the user triggers the refresh icon 25. How the screenshot image is refreshed is detailed below.
  • the video playback interface in the first interface 20 may not be interrupted. That is, the user can continue to watch the video while editing the screenshot image in the second interface 30.
  • the video playback interface in the first interface 20 continues to play, and part of the content 50 is updated.
  • When the user triggers the refresh icon 25, as shown in the lower drawing of FIG. 4, the screenshot image in the second interface 30 becomes the current display interface of the first interface 20, and the layer 40 is still superimposed and displayed on the refreshed screenshot image.
  • The refreshed screenshot image and the layer 40 in the second interface 30 can then be saved. That is, after the user takes a screenshot, the user can edit the screenshot image while using the current client, and after the client interface is refreshed, the edited content matches the current interface.
  • the operation of triggering the terminal 100 to refresh on the display screen 10 may be referred to as the first operation.
  • the operation of triggering the terminal 100 to save the interface content of the second interface may be referred to as a third operation.
  • a user when watching a video, a user can open the second interface through a screenshot, view the screenshot image in the second interface, and edit the screenshot image while watching the video.
  • After the video content is updated, the second interface can be refreshed, and the refreshed screenshot image can be edited again.
  • Such a screenshot processing method does not affect the playback of the video, and the screenshot image can be edited while watching the video.
  • When the terminal is equipped with a large display screen (for example, a display screen of 9 inches, 10 inches, or more), a folding screen, or a dual screen, the advantages of the large screen can be fully utilized to present the first interface and the second interface.
  • the first interface 20 may also be other display interfaces.
  • The first interface 20 currently output by the system and displayed on the display screen 10 of the terminal 100 can be any one of the following: a client application interface displayed in portrait orientation, a desktop displaying application icons, a split-screen interface displaying the application interfaces of multiple clients at the same time, and so on.
  • the horizontal screen display means that the width of the interface content displayed by the terminal is greater than the height
  • the vertical screen display means that the height of the interface content displayed by the terminal is greater than the width.
  • the first interface 20 may also be other interfaces, such as a game interface, a web browsing interface, an interface for reading books, an interface for music playback, an interface for text editing, and the like.
  • the first interface 20 may also include system-level interface elements, such as floating shortcut menus, etc., which are not limited here.
  • the application interface of one of the clients may be captured, or the entire interface content of the first interface 20 may be captured.
  • If the application interface of one of the clients is selected, the interface content of the second interface 30 is the application interface of that client.
  • the interface content of the first interface 20 includes the application interfaces of the client 1 and the client 2.
  • When the terminal system receives the user's trigger of the screenshot operation for client 1, the terminal system captures the application interface of client 1.
  • the screenshot operation for the client 1 may be a touch screen operation acting on the display area of the client 1 on the display screen 10, or may be a key operation, a hovering gesture operation, and so on.
  • the operation of triggering the terminal 100 to take a screenshot of the client 1 on the display screen 10 may be referred to as the fourth operation.
  • In other embodiments, when the terminal system receives a user-triggered screenshot operation for the first interface 20, the entire interface content of the first interface 20 is captured; the user can then select the application interface of client 1 within the captured content, and when the terminal system receives the user's selection operation for client 1, the application interface of client 1 within the captured interface content of the first interface 20 is displayed on the second interface 30.
  • the operation of triggering the terminal 100 to select the application interface of the client 1 in the screenshot content may be referred to as the sixth operation.
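  • Selecting one client's application interface out of a full screenshot of the first interface amounts to cropping a sub-rectangle. A minimal sketch, with the screenshot modeled as rows of characters rather than real pixel data (all names here are illustrative assumptions):

```python
def crop(screenshot, left, top, width, height):
    """Select one client's application interface out of a full screenshot
    by cutting the rectangle (left, top, width, height) out of it."""
    return [row[left:left + width] for row in screenshot[top:top + height]]

# A 4x6 "screenshot" of the first interface: client 1 occupies
# columns 0-2, client 2 occupies columns 3-5.
full = [
    "111222",
    "111222",
    "111222",
    "111222",
]
client1 = crop(full, left=0, top=0, width=3, height=4)
assert client1 == ["111", "111", "111", "111"]
```

  • In a real terminal the selection rectangle would come from the user's touch input (the sixth operation), and the crop would operate on the screenshot bitmap rather than on strings.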
  • the terminal 100 in response to the user's screen capture operation, the terminal 100 simultaneously displays the second interface 30 on the display screen 10. Wherein, the interface contents of the first interface 20 and the second interface 30 do not overlap. The positions and sizes of the second interface 30 and the first interface 20 can be dragged and adjusted according to user requirements.
  • the display screen 10 of the terminal 100 is a foldable display screen, and the first interface 20 displays the application interfaces of two clients (client 1 and client 2) at the same time.
  • the state of the display screen 10 is divided into a folded state and an expanded state.
  • the display screen 10 displays the first interface in full screen, and the first interface displays the client 1 and the client 2 respectively.
  • the terminal 100 divides two display areas on the display screen 10, and displays the first interface 20 and the second interface 30 respectively.
  • the positions and sizes of the client 1 and the client 2 in the first interface 20 can be dragged and adjusted according to user requirements.
  • the user can adjust the display position, shape, and size of the second interface 30 according to specific needs.
  • several adjustment methods of the second interface 30 are exemplified.
  • the user can long press any area of the second interface 30, and when the terminal system detects that the long press time reaches a threshold, the second interface 30 can move with the movement of the finger.
  • the user can drag the edge position of the second interface 30 with a finger to change the display length or width of the second interface 30.
  • the user can drag the lower right corner of the second interface 30 with a finger, the upper left corner of the second interface 30 does not move, and the lower right corner of the second interface 30 expands to the edge position of the display screen 10 along with the sliding track of the finger.
  • the operation for adjusting the second interface 30 is not limited to the above examples, and other methods are also possible.
  • the user can also adjust the size of the second interface 30 by clicking on the adjustment options (for example, “enlarge the second interface”, “reduce the second interface”, etc.) or icons output on the display screen 10 of the terminal system.
  • the operation of triggering the terminal 100 to adjust the display state of the second interface may be referred to as the fifth operation.
  • the display mode of the interface elements in the second interface 30 can be adjusted adaptively. For example, after the second interface 30 is reduced, the interface elements (such as icons, etc.) in the second interface 30 may be displayed in a reduced scale, or only a part of the original interface elements of the second interface 30 may be displayed.
  • the adaptive adjustment of the display mode of the interface elements in the second interface 30 may also include adjusting the spacing between the interface elements, the display position between the interface elements, etc., which is not limited in this application.
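  • The geometry behind the fifth operation (moving or resizing the second interface) can be sketched as simple coordinate arithmetic with clamping to the display edges. The function names and screen dimensions below are illustrative assumptions, not part of the patent:

```python
def move_window(x, y, dx, dy, w, h, screen_w, screen_h):
    """Move a w x h window at (x, y) by (dx, dy), clamped so that it
    never leaves the screen_w x screen_h display."""
    nx = min(max(0, x + dx), screen_w - w)
    ny = min(max(0, y + dy), screen_h - h)
    return nx, ny

def resize_window(new_corner_x, new_corner_y, x, y):
    """Resize by dragging the lower-right corner to (new_corner_x,
    new_corner_y) while the upper-left corner (x, y) stays fixed."""
    return max(1, new_corner_x - x), max(1, new_corner_y - y)

# Dragging a 400x300 window 50 px right and 30 px up on a 1080x1920 screen:
assert move_window(100, 100, 50, -30, 400, 300, 1080, 1920) == (150, 70)
# Dragging far past the left edge clamps at the screen boundary:
assert move_window(100, 100, -500, 0, 400, 300, 1080, 1920) == (0, 100)
# Dragging the lower-right corner to (700, 600) from origin (100, 100):
assert resize_window(700, 600, 100, 100) == (600, 500)
```

  • Adaptive adjustment of the interface elements would then redraw icons and spacing from the new width and height returned by such functions.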
  • the user can also close the second interface 30.
  • closing the second interface 30 means that the corresponding program that triggers the multi-window display is closed, and the display screen 10 only displays the first interface 20.
  • the processing resources and storage resources used by the terminal 100 to display the second interface 30 are released.
  • the second interface 30 continues to run in the background of the terminal 100, but is not displayed on the display screen 10.
  • the terminal 100 may be a portable electronic device such as a mobile phone, a tablet computer, a personal digital assistant (PDA), a wearable device, and the like.
  • portable electronic devices include, but are not limited to, portable electronic devices equipped with iOS, android, microsoft or other operating systems.
  • the aforementioned portable electronic device may also be other portable electronic devices, such as a laptop computer with a touch-sensitive surface (such as a touch panel).
  • the terminal 100 may not be a portable electronic device, but a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the terminal 100 is equipped with a display screen, which can be used to display the interface content currently output by the terminal system.
• the interface content can include the interface of a running application, a system-level menu, and the like, and can be composed of the following interface elements: input interface elements, such as buttons, text input boxes, scroll bars, and menus; and output interface elements, such as windows and labels.
• the display screen of the terminal 100 is equipped with a touch panel, that is, the display screen is a touch screen, which can be used to receive a user's touch operation.
  • the touch screen can also be used to receive a user's hovering touch operation, which refers to an operation where the user's hand is hovering above the display screen and not touching the display screen.
• when the terminal 100 receives an operation for triggering the display of the second interface, it displays the second interface while displaying the first interface on the touch screen.
  • the display content in the second interface is a screenshot image obtained by taking a screenshot of the display interface of the first interface.
  • the operation for triggering the display of the second interface, the display content of the second interface, the display mode, etc. can refer to the related description of the foregoing embodiment, which will not be repeated here.
  • the position, shape, and size of the second interface can be adjusted, and reference may be made to the related description of the foregoing embodiments, which will not be repeated here.
  • the terminal 100 may also receive an operation for triggering the terminal to hide the second interface.
  • the first operation, the second operation, the third operation, the fourth operation, and the fifth operation may be automatic operations or user operations.
  • User operations can be operations completed by the user through buttons, touch screen, voice, etc.; automatic operations can be operations that are automatically triggered when the terminal reaches a certain state. For example, when the display content of the first interface is updated, the first operation is automatically triggered to obtain a second screenshot image of the first interface and display it on the second interface.
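The automatic first operation described above can be modelled as a small trigger: whenever the display content of the first interface changes, a new screenshot is taken and pushed to the second interface. This is an illustrative sketch only; the class and method names are assumptions, not the patent's API.

```python
class AutoScreenshot:
    """Model of the automatically triggered first operation: a content
    change in the first interface yields a fresh screenshot image that is
    displayed in the second interface (names are illustrative)."""

    def __init__(self, take_screenshot):
        self._take_screenshot = take_screenshot
        self._last_content = None
        self.second_interface = None   # latest screenshot image

    def on_display_update(self, content):
        if content != self._last_content:   # first-interface content changed
            self._last_content = content
            self.second_interface = self._take_screenshot(content)
```

Repeated frames with unchanged content do not retrigger the screenshot, so the second interface is only refreshed when the first interface actually updates.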
  • FIG. 6 is a structural block diagram of an implementation manner of the terminal 100.
• the terminal 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
• the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the terminal 100.
  • the terminal 100 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
• the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the terminal 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
• the memory can store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
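The cache behaviour described above can be illustrated with a toy model: a repeat access to recently used data is served from the small processor-side memory instead of going back to main memory. All names here are illustrative, not from the patent.

```python
class ProcessorCache:
    """Toy model of the processor-side cache: data the processor has just
    used is kept close by, so repeated accesses avoid main memory."""

    def __init__(self, fetch_from_main_memory):
        self._fetch = fetch_from_main_memory
        self._lines = {}
        self.main_memory_accesses = 0

    def load(self, address):
        if address not in self._lines:            # miss: slow path
            self.main_memory_accesses += 1
            self._lines[address] = self._fetch(address)
        return self._lines[address]               # hit: served from cache
```

Loading the same address twice costs only one main-memory access, which is exactly the "repeated accesses are avoided" property the passage describes.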
  • the processor 110 may include one or more interfaces.
• the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the terminal 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the terminal 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the terminal 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the terminal 100, and can also be used to transfer data between the terminal 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic description, and does not constitute a structural limitation of the terminal 100.
  • the terminal 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the terminal 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the terminal 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
• each antenna in the terminal 100 can be used to cover a single or multiple communication frequency bands, and different antennas can also be multiplexed to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the terminal 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
• the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit the processed signals to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
• the wireless communication module 160 can provide wireless communication solutions applied to the terminal 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive a signal to be sent from the processor 110, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the terminal 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the terminal 100 can communicate with the network and other devices through wireless communication technology.
• the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
• the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the terminal 100 implements a display function through a GPU, a display screen 194, and an application processor.
• the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
• the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the terminal 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the display screen 194 in FIG. 1 may be bent.
  • the above-mentioned display screen 194 can be bent means that the display screen can be bent to any angle at any position and can be maintained at that angle.
• the display screen 194 can be folded in half left and right from the middle, and can also be folded in half up and down from the middle.
  • a display screen that can be bent is referred to as a foldable display screen.
• the foldable display screen may be one screen, or a display screen formed by splicing multiple screens together, which is not limited here.
  • the foldable display screen can display content in full screen.
• when the interface content is displayed in full screen, it may occupy the entire display area of the foldable display screen, or it may occupy only part of the display area. For example, when the display screen is a notch screen, if the middle part of the notch screen displays the interface content while one or both side edges are black, the foldable display screen can still be regarded as displaying the interface content in full screen.
  • the terminal 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and transforms it into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the terminal 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the terminal 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the terminal 100 may support one or more video codecs. In this way, the terminal 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
• through the NPU, applications such as intelligent cognition of the terminal 100 can be realized, for example, image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the terminal 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the terminal 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the terminal 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
• the speaker 170A, also called the “loudspeaker”, is used to convert audio electrical signals into sound signals.
  • the terminal 100 can listen to music through the speaker 170A, or listen to a hands-free call.
• the receiver 170B, also called the “earpiece”, is used to convert audio electrical signals into sound signals.
• when the terminal 100 answers a call or a voice message, the receiver 170B can be brought close to the human ear to receive the voice.
• the microphone 170C, also called the “mic” or “sound transmitter”, is used to convert sound signals into electrical signals.
• the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the terminal 100 may be provided with at least one microphone 170C. In other embodiments, the terminal 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the terminal 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
• the earphone interface 170D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
• the pressure sensor 180A may be used to capture the pressure value generated when the user's finger part touches the display screen, and transmit the pressure value to the processor, so that the processor can recognize through which finger part the user inputs the operation.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials.
  • the terminal 100 determines the strength of the pressure according to the change in capacitance.
  • the terminal 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions.
• the pressure sensor 180A may transmit the detected capacitance value to the processor, so that the processor recognizes through which finger part (knuckle, finger pad, etc.) the user inputs the operation.
• the pressure sensor 180A may also calculate the number of touch points based on the detected signal, and transmit the calculated value to the processor, so that the processor can recognize whether the user performs a single-finger or multi-finger input operation.
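The finger-part and touch-count recognition described above can be sketched as simple classifications of the values the pressure sensor reports. The threshold value and function names are illustrative stand-ins, not the patent's algorithm.

```python
KNUCKLE_PRESSURE_N = 2.5   # illustrative threshold in newtons, not from the patent

def classify_finger_part(pressure_n):
    """Decide which finger part produced a touch from the pressure value
    derived from the capacitance change; a knuckle tap is typically harder
    than a finger-pad touch (simplified single-threshold rule)."""
    return "knuckle" if pressure_n >= KNUCKLE_PRESSURE_N else "finger_pad"

def classify_touch_count(touch_points):
    """Map the calculated number of touch points to a single-finger or
    multi-finger input operation."""
    return "multi_finger" if touch_points > 1 else "single_finger"
```

A real device would combine pressure with signal shape and contact area, but the single-threshold rule captures the decision the processor makes with the transmitted values.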
  • the gyro sensor 180B may be used to determine the movement posture of the terminal 100.
• in some embodiments, the angular velocity of the terminal 100 around three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyroscope sensor 180B detects the shake angle of the terminal 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the terminal 100 through a reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
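The anti-shake compensation above can be approximated with a simple geometric model: for a distant subject, a shake of angle θ shifts the image by roughly f·tan(θ), so the lens module moves that distance in reverse. This is a simplified sketch under that assumption, not the patent's exact compensation algorithm.

```python
import math

def lens_compensation_mm(focal_length_mm, shake_angle_deg):
    """Estimate how far the lens module must move (in the reverse direction)
    to counteract a detected shake angle: image shift ~ f * tan(theta).
    Simplified model; real OIS also accounts for subject distance."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

For a typical 4 mm phone lens, a 0.5° shake calls for roughly 0.035 mm of compensating lens travel.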
  • the air pressure sensor 180C is used to measure air pressure.
  • the terminal 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
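One common way to compute this altitude is the international barometric formula; the patent does not name a specific formula, so this is an assumed implementation.

```python
def altitude_from_pressure_m(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude (metres) from the air pressure reported by the air
    pressure sensor, using the international barometric formula with the
    standard sea-level pressure as the default reference."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At standard sea-level pressure the estimate is 0 m, and a reading of 900 hPa corresponds to roughly 1 km of altitude, which the terminal can use to assist positioning and navigation.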
  • the magnetic sensor 180D includes a Hall sensor.
• the terminal 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. When the terminal 100 is a flip phone, the terminal 100 can detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover based on the detected opening/closing state.
• the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal 100 in various directions (generally on three axes). When the terminal 100 is stationary, the magnitude and direction of gravity can be detected. The sensor can also be used to identify the posture of the electronic device, and is applied to applications such as landscape/portrait switching and pedometers. In some optional embodiments of this application, the acceleration sensor 180E may be used to capture the acceleration value generated when the user's finger part touches the display screen, and transmit the acceleration value to the processor, so that the processor can recognize through which finger part the user inputs the operation.
• the distance sensor 180F is used to measure distance. The terminal 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the terminal 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the terminal 100 emits infrared light to the outside through the light emitting diode.
  • the terminal 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the terminal 100. When insufficient reflected light is detected, the terminal 100 may determine that there is no object near the terminal 100.
  • the terminal 100 can use the proximity light sensor 180G to detect that the user holds the terminal 100 close to the ear to talk, so as to automatically turn off the display screen to save power.
• the proximity light sensor 180G can also be used in the leather case mode and the pocket mode to automatically unlock and lock the screen.
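The reflected-light decision and the in-call screen-off behaviour described above can be sketched as a small policy. The class name and the threshold are illustrative assumptions; real thresholds are device-specific and calibrated.

```python
class ProximityScreenPolicy:
    """Sketch of the proximity-light decision: enough reflected infrared
    light means an object (e.g. the ear) is near the terminal, and during a
    call the display is turned off to save power and avoid accidental touches."""

    def __init__(self, reflection_threshold):
        self.reflection_threshold = reflection_threshold

    def object_nearby(self, reflected_light):
        # sufficient reflected light detected -> object near the terminal
        return reflected_light >= self.reflection_threshold

    def should_turn_screen_off(self, reflected_light, in_call):
        return in_call and self.object_nearby(reflected_light)
```

Outside a call, a nearby object alone does not switch the screen off, matching the described use case of holding the terminal close to the ear to talk.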
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the terminal 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the terminal 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the terminal 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the terminal 100 executes to reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the terminal 100 when the temperature is lower than another threshold, the terminal 100 heats the battery 142 to avoid abnormal shutdown of the terminal 100 due to low temperature.
• when the temperature is lower than still another threshold, the terminal 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
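The temperature processing strategy above reduces to a few threshold comparisons. The threshold values and action names below are illustrative stand-ins; the patent does not specify concrete numbers.

```python
def thermal_action(temp_c, high_c=45.0, low_c=0.0, very_low_c=-10.0):
    """Threshold-based temperature processing strategy sketched from the
    passage: throttle near overheating, heat or boost the battery in the
    cold. All threshold values are assumed, not from the patent."""
    if temp_c > high_c:
        return "reduce_processor_performance"   # thermal protection
    if temp_c < very_low_c:
        return "boost_battery_voltage"          # avoid low-temp shutdown
    if temp_c < low_c:
        return "heat_battery"
    return "normal"
```

The ordering matters: the most extreme low-temperature action (boosting the battery output voltage) is checked before the milder battery-heating response.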
• the touch sensor 180K is also called a “touch panel”.
• the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touchscreen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the terminal 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be disposed in an earphone to form a bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor can parse heart rate information based on the blood pressure pulse signal obtained by the bone conduction sensor 180M, to realize a heart rate detection function.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button or a touch button.
  • the terminal 100 may receive key input, and generate key signal input related to user settings and function control of the terminal 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects of the motor 191.
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the terminal 100.
  • the terminal 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the terminal 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the terminal 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the terminal 100 and cannot be separated from the terminal 100.
  • the software system of the terminal 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the terminal 100 by way of example.
  • FIG. 7 is a block diagram of the software structure of the terminal 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • a floating launcher may also be added to the application layer to be used as a default display application in the second interface 30 mentioned above, and to provide users with an entrance to other applications.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, an activity manager, etc.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the display screen, and intercept the display screen.
  • a FloatingWindow can be extended from Android's native PhoneWindow. It is dedicated to displaying the second interface 30 mentioned above and, to distinguish it from ordinary windows, has the attribute of being displayed floating on top of the series of windows.
  • the window size can be given an appropriate value according to the size of the actual screen and the optimal display algorithm.
  • the aspect ratio of the window may default to the screen aspect ratio of a conventional mainstream mobile phone.
  • an additional close button and a minimize button can be drawn in the upper right corner.
  • the window also receives some gesture operations of the user. If a gesture defined for the second interface is matched, the window is frozen and an animation of the second interface's movement is played.
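The window-sizing rule described above (a size derived from the actual screen, with the aspect ratio defaulting to that of a mainstream phone) might be sketched as follows. This is a hypothetical helper: the function name, the 0.4 width fraction, and the 9:16 default ratio are assumptions for illustration only.

```python
def floating_window_size(screen_w, screen_h, scale=0.4, aspect=9 / 16):
    """Compute a floating window's (width, height) from the screen size.

    The window takes a fraction of the screen width and keeps a fixed
    portrait aspect ratio (defaulting to 9:16, a mainstream phone shape),
    clamped so the window never exceeds the screen height.
    """
    w = int(screen_w * scale)
    h = int(w / aspect)
    if h > screen_h:          # clamp to the screen and re-derive the width
        h = screen_h
        w = int(h * aspect)
    return w, h
```

For a 1080x2340 screen this yields a 432x768 window; on a short, wide screen the clamp keeps the window inside the display.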
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • a button view for closing, minimizing and other operations on the second interface can be added correspondingly, and bound to the FloatingWindow in the above-mentioned window manager.
  • the phone manager is used to provide the communication function of the terminal 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, and the notification can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, and so on.
  • the notification manager can also display notifications that appear in the status bar at the top of the system in the form of a chart or scroll-bar text (such as notifications of applications running in the background), or notifications that appear on the display screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt sound is played, the electronic device vibrates, or an indicator light flashes.
  • the activity manager is used to manage the running activities in the system, including process, application, service, task information, etc.
  • an activity task stack dedicated to managing the application Activity displayed in the second interface 30 can be added to the activity manager module, to ensure that the application activities and tasks in the second interface do not conflict with the application displayed in full screen on the display.
  • a motion detector can be added to the application framework layer to make logical judgments on acquired input events and identify the type of an input event. For example, based on information such as the touch coordinates included in the input event and the timestamp of the touch operation, it is determined whether the input event is a knuckle touch event or a finger-pad touch event.
  • the motion detection component can also record the trajectory of the input event, determine the gesture pattern of the input event, and respond to different operations according to different gestures.
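As a rough illustration of the logical judgment described above (deciding between a knuckle touch and a finger-pad touch from an event's attributes), one might write the following. This is a simplified simulation: real detection relies on richer kernel-level signals, and the attribute names and thresholds here are assumptions.

```python
def classify_touch(touch_area, z_accel):
    """Classify a touch event as 'knuckle' or 'finger_pad'.

    A knuckle tap typically shows a small contact area but a sharp
    acceleration spike; a finger pad gives a larger, softer contact.
    Thresholds are illustrative only.
    """
    if touch_area < 0.2 and z_accel > 5.0:
        return "knuckle"
    return "finger_pad"
```

A gesture recognizer would then accumulate the classified events' coordinates over time to match a trajectory (for example, an approximately rectangular knuckle stroke) against the registered gesture patterns.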
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: input manager (input manager), input dispatcher (input dispatcher), surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (such as OpenGL ES), 2D graphics engine (such as : SGL) and so on.
  • the input manager is responsible for obtaining event data from the underlying input driver, parsing and encapsulating it, and passing it to the input dispatcher.
  • the input dispatcher is used to store window information. After receiving an input event from the input manager, it looks for a suitable window among the windows it keeps and dispatches the event to that window.
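The dispatch step (searching the stored windows for one that contains the event's coordinates) can be sketched as below. The data structures are hypothetical: the real dispatcher also considers window order, focus, and input features beyond plain coordinate hit-testing.

```python
from dataclasses import dataclass


@dataclass
class Window:
    """Minimal stand-in for the window info the dispatcher keeps."""
    name: str
    x: int
    y: int
    w: int
    h: int


def dispatch(windows, ev_x, ev_y):
    """Return the name of the topmost window containing the event, or None.

    `windows` is ordered top-to-bottom, like a window stack, so a floating
    window listed first wins over the full-screen window beneath it.
    """
    for win in windows:
        if win.x <= ev_x < win.x + win.w and win.y <= ev_y < win.y + win.h:
            return win.name
    return None
```

With a floating second-interface window stacked above a full-screen client, a tap inside the floating bounds is routed to it, and all other taps fall through to the full-screen window.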
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • when a touch operation or key operation is received, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the touch operation or key operation into an original input event (including information such as the touch coordinates, the timestamp of the touch operation, or the key function).
  • the original input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Take as an example a key operation consisting of the volume-down key plus the power key, where the corresponding control is that of the screenshot application: the terminal calls the interface of the application framework layer to start the screenshot application, and then calls the kernel layer to start the camera driver and obtain the screenshot image.
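The raw-event-to-control mapping in this example (volume-down and power pressed together trigger the screenshot application) could be simulated as follows. The key names, event format, and time window are illustrative assumptions, not the actual framework API.

```python
def match_screenshot_chord(events, window_ms=300):
    """Return True if 'volume_down' and 'power' are pressed within window_ms.

    `events` is a list of (key_name, timestamp_ms) tuples, standing in for
    the original input events the kernel layer encapsulates from hardware
    interrupts (each carrying a key function and a timestamp).
    """
    downs = {key: t for key, t in events if key in ("volume_down", "power")}
    if len(downs) < 2:
        return False  # one of the two chord keys was never pressed
    return abs(downs["volume_down"] - downs["power"]) <= window_ms
```

When the chord matches, the framework layer would route the event to the screenshot application's control rather than to the ordinary volume or power handlers.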
  • FIG. 8 is a schematic flowchart of a screenshot processing method according to an embodiment of the present invention. As shown in Figure 8, the method includes:
  • the terminal displays the first interface on the display screen.
  • the terminal displays the first interface in portrait or landscape orientation on the display screen.
  • the first interface may be a video interface, a game interface, a web browsing interface, an interface for reading books, an interface for music playback, an interface for text editing, and so on.
  • the first interface may also be a client application interface, a desktop displaying application icons, a split screen interface that simultaneously displays application interfaces of multiple clients, and so on.
  • the first interface displayed on the display screen of the terminal may be: the display screen of the terminal only displays the video interface as shown in the right figure of FIG. 1, and does not display other content.
  • when the video playback interface is displayed in full screen, it may occupy the entire display area of the display screen.
  • the video playback interface may also occupy only part of the display area of the display screen. For example, when the display screen is a notch screen and the video playback interface is displayed in the middle of the notch screen with one or both side edges partially black, this can also be regarded as full-screen display of the video playback interface on the display screen.
  • the full-screen display of the video playback interface of the terminal can also be: while the video playback interface is displayed on the display screen, system-level interface elements, such as status bar, floating shortcut menu, etc., can also be displayed.
  • the video playback interface may include not only video images, but also a progress bar of the video, a virtual button for adjusting the volume, a virtual button for playing/pausing the video, and so on.
  • the terminal detects an input second operation, where the second operation is used to take a screenshot of the displayed content of the first interface.
  • when the terminal detects the input of the second operation, it displays the first interface and the second interface on the display screen.
  • the second operation may be a gesture that acts on the display screen of the terminal, or may be a button or voice command.
  • the second operation may be a gesture of sliding the user's knuckles on the terminal display screen to draw the first figure.
  • the first figure may be a rectangle, a triangle, a square, a circle, or the like.
  • the first figure drawn by the user on the display screen through the knuckles may not be a standard shape in a geometric sense, as long as it is similar.
  • the second operation may also be: a click operation of the user's finger on the terminal display screen.
  • the click operation can be performed by one or more knuckles, finger pads, fingertips, stylus, etc.
  • the second operation may also be: the user's finger is in the first hovering posture on the display screen.
  • the first hovering posture may mean that the fingers hovering above the display screen are in an extended state, a bent state, or the like.
  • the display content of the second interface is the first screenshot image of the first interface.
  • when the terminal detects the input second operation, it takes a screenshot of the content displayed on the first interface to obtain the first screenshot image.
  • the first screenshot image is an image displayed on the first interface when the terminal detects the input second operation.
  • the first interface displays application interfaces of multiple clients.
  • the first interface includes application interfaces of client 1 and client 2 as an example.
  • when the terminal detects the input of the second operation, it takes a screenshot of the content displayed on the first interface to obtain a screenshot image; then, according to an input sixth operation, the application interface of client 1 or client 2 can be selected from the screenshot image as the first screenshot image.
  • the interface content of the second interface is the first screenshot image.
  • the first interface displays application interfaces of multiple clients, and when the terminal detects the input fourth operation, it acquires the first screenshot image of the first client.
  • the first client is one of multiple clients in the first interface, and the fourth operation is a screenshot operation for the first client.
  • the second interface displays a first screenshot image, and the first screenshot image is a screenshot image of the first client.
  • the fourth operation may be a touch screen operation acting on the display screen of the terminal, or a preset key operation, or an operation such as a hovering gesture, a voice command, and the like.
  • the fourth operation is a click operation on the display area of the application interface of the first client on the first interface.
  • the click operation can be performed by one or more knuckles, finger pads, fingertips, stylus, etc.
  • the fourth operation is that the user's finger is in the first hovering gesture on the display area of the application interface of the first client on the first interface.
  • the first hovering posture may mean that the fingers hovering above the display screen are in an extended state, a bent state, or the like.
  • through the SystemUI control, the mWindowMap component of the Window Manager Service (WMS) in the Framework process, which stores application window information, is called to obtain information such as the application name (for example, the first client) and the window coordinates of the client whose first screenshot image is currently selected to be placed on the second interface.
  • the shape, position, and size of the second interface may be set by default by the terminal system, or set by the user according to his own usage habits, or may be determined in real time according to the operation.
  • the shape, position, and size of the first window are determined according to the operation, that is, the shape, position, and size of the first window are related to the fifth operation.
  • the terminal adds a layer on the first screenshot image of the second interface.
  • the terminal displays the first interface and the second interface on the display screen, and the interface content of the second interface is the first screenshot image.
  • the terminal is triggered to add a layer on the first screenshot image.
  • the method of adding layers includes marking the first screenshot image with tools such as brushes, text, and coloring.
  • the first interface is a video playback interface.
  • the video on the first interface can continue to play, achieving the effect of taking notes while watching the video.
  • the first operation is a refresh operation
  • the first operation may be a touch screen operation acting on the display screen of the terminal, or may be a voice command, a key operation, and the like.
  • while the terminal adds a layer to the first screenshot image on the second interface, the video on the first interface can continue to play, so by the time the user finishes taking notes on the first screenshot image, the video content has been refreshed. As shown in FIG. 4, when the user wants to refresh the current first screenshot image, he can click the refresh icon 25.
  • when the terminal detects the input first operation, it refreshes the window coordinate information of the client just marked (for example, the first client). Then, according to the application name of the client of the first screenshot image (for example, the first client), it takes a screenshot of the application interface of the first client in the first interface, obtaining the second screenshot image of the first client through the SurfaceControl.screenshot function.
  • the interface content of the second interface is a screenshot image of the application interface of the first client.
  • the first interface includes the application interfaces of multiple clients (including the first client).
  • the terminal identifies the first client among the multiple clients on the first interface according to the application name of the client of the first screenshot image (for example, the first client) and the window coordinates of the first client, and obtains the second screenshot image of the first client, which is a screenshot image of that client.
  • the terminal replaces the first screenshot image with the second screenshot image.
  • after obtaining the second screenshot image of the first interface, the terminal replaces the first screenshot image with the second screenshot image; that is, the second screenshot image is displayed on the second interface.
  • the layer added in step S103 is retained in the second interface and displayed on the second screenshot image.
  • the third operation may be a click operation acting on the icon save 26 on the display screen of the terminal, or a gesture acting on the display screen of the terminal, or a voice command, key operation, etc.
  • the interface content of the second interface is saved as the target screenshot image.
  • the interface content of the second interface includes the second screenshot image and the layers. That is, the refreshed screenshot image is combined with the content (layers) edited by the user and saved as the final image, achieving the effect that the edited content still matches the current interface after the client interface is refreshed.
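The save step described above (compositing the refreshed screenshot with the user's retained annotation layer) can be sketched with a toy pixel model. This is purely illustrative: a real implementation would composite bitmaps, and the transparent-cell convention here is an assumption.

```python
def composite(base, layer, transparent=None):
    """Overlay `layer` onto `base`; both are 2-D lists of pixel values.

    Cells of `layer` equal to `transparent` let the base show through,
    mirroring how annotation strokes sit over the refreshed screenshot
    while the untouched areas display the new screenshot content.
    """
    return [
        [l if l != transparent else b for b, l in zip(brow, lrow)]
        for brow, lrow in zip(base, layer)
    ]
```

Because the layer is kept separate from the screenshot until saving, replacing the first screenshot image with the second one leaves the annotations intact; only the final save flattens the two into the target screenshot image.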
  • Stopping the display of the second interface may include the following two situations: hiding the second interface and closing the second interface. Among them, the hidden second interface can continue to run in the background of the terminal. Wherein, after the second interface is closed, processing resources and storage resources used by the terminal to display the second interface may be released.
  • the terminal displays the prompt identifier of the second interface.
  • the prompt mark can be a graphic mark (for example, a prompt bar, a second interface icon, an arrow, etc.), a text, and the like.
  • the prompt mark can be a prompt bar or a floating window, and the user can redisplay the second interface through the prompt mark.
  • the first operation, the second operation, the third operation, and the fourth operation are not limited to the default settings of the terminal at the factory, and may also be set by the user.
  • the user can select an operation as the first operation, the second operation, the third operation, and the fourth operation from a setting menu containing multiple operations, and the user can also customize the operation according to his own habits.
  • when the interface content of the first interface is a video playback interface, the terminal continues to display the video playback interface and the video continues to play. That is, the present application may display the second interface while the video continues to play, with the interface content of the second interface being a screenshot image of the interface content of the first interface.
  • this application can also edit the second interface; when the video content is updated, the screenshot image displayed in the second interface is updated accordingly, the original editing content is retained, and the updated screenshot image is combined with the editing content to form the final screenshot image.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from a website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a screenshot processing method, a graphical user interface, and a terminal. The method may comprise: upon acquiring a first screenshot image of a first interface, displaying, by a terminal, the first screenshot image in a second interface, a user being able to view the first interface and the second interface simultaneously; and when the terminal detects an input first operation (such as a refresh operation), acquiring, by the terminal, a second screenshot image of the first interface according to the interface content currently displayed in the first interface, and replacing the interface content of the second interface with the second screenshot image. By implementing the present application, the effect of refreshing a screenshot image is achieved, and when the display content of the first interface changes, the user can replace the display content of the second interface at any time.
PCT/CN2021/082531 2020-03-24 2021-03-24 Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal WO2021190524A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010214656.9A CN113448658A (zh) 2020-03-24 2020-03-24 截屏处理的方法、图形用户接口及终端
CN202010214656.9 2020-03-24

Publications (1)

Publication Number Publication Date
WO2021190524A1 true WO2021190524A1 (fr) 2021-09-30

Family

ID=77806682

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/082531 WO2021190524A1 (fr) 2020-03-24 2021-03-24 Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal

Country Status (2)

Country Link
CN (1) CN113448658A (fr)
WO (1) WO2021190524A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116088832B (zh) * 2022-07-14 2024-06-18 荣耀终端有限公司 界面处理方法和装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309560A (zh) * 2013-06-08 2013-09-18 东莞宇龙通信科技有限公司 多界面显示信息的方法及终端
CN107861681A (zh) * 2017-10-26 2018-03-30 深圳市万普拉斯科技有限公司 截屏处理方法、装置、计算机设备和存储介质
CN109597556A (zh) * 2018-12-12 2019-04-09 维沃移动通信有限公司 一种截屏方法及终端
US20190147026A1 (en) * 2017-05-16 2019-05-16 Apple Inc. Device, Method, and Graphical User Interface for Editing Screenshot Images
CN110231905A (zh) * 2019-05-07 2019-09-13 华为技术有限公司 一种截屏方法及电子设备
CN110737386A (zh) * 2019-09-06 2020-01-31 华为技术有限公司 一种屏幕截取方法及相关设备

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150177953A1 (en) * 2013-12-23 2015-06-25 Thomson Licensing User interface displaying scene dependent attributes
CN105611327A (zh) * 2015-12-22 2016-05-25 北京邦天信息技术有限公司 获取直播图片元数据并刷新的方法及***及转换设备
CN107239208B (zh) * 2017-05-27 2020-03-27 努比亚技术有限公司 处理屏幕截图的方法、设备及计算机可读存储介质
CN107390972B (zh) * 2017-07-06 2021-09-07 努比亚技术有限公司 一种终端录屏方法、装置及计算机可读存储介质
CN107896279A (zh) * 2017-11-16 2018-04-10 维沃移动通信有限公司 一种移动终端的截屏处理方法、装置及移动终端
CN109151546A (zh) * 2018-08-28 2019-01-04 维沃移动通信有限公司 一种视频处理方法、终端及计算机可读存储介质
CN109388304B (zh) * 2018-09-28 2021-05-11 维沃移动通信有限公司 一种截屏方法及终端设备
CN110012154A (zh) * 2019-02-22 2019-07-12 华为技术有限公司 一种具有折叠屏的电子设备的控制方法及电子设备
CN110401766B (zh) * 2019-05-22 2021-12-21 华为技术有限公司 一种拍摄方法及终端
CN110647274A (zh) * 2019-08-15 2020-01-03 华为技术有限公司 一种界面显示方法及设备


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114594894A (zh) * 2022-02-25 2022-06-07 青岛海信移动通信技术股份有限公司 界面元素的标记方法及终端设备、存储介质
CN115237299A (zh) * 2022-06-29 2022-10-25 北京优酷科技有限公司 播放页面切换方法及终端设备
CN115237299B (zh) * 2022-06-29 2024-03-22 北京优酷科技有限公司 播放页面切换方法及终端设备

Also Published As

Publication number Publication date
CN113448658A (zh) 2021-09-28

Similar Documents

Publication Publication Date Title
WO2021013158A1 (fr) Procédé d'affichage et appareil associé
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
JP7142783B2 (ja) 音声制御方法及び電子装置
WO2021027747A1 (fr) Procédé et dispositif d'affichage d'interface
WO2021103981A1 (fr) Procédé et appareil de traitement d'affichage à écran divisé, et dispositif électronique
WO2021129326A1 (fr) Procédé d'affichage d'écran et dispositif électronique
WO2021036571A1 (fr) Procédé d'édition de bureau et dispositif électronique
WO2020052529A1 (fr) Procédé pour régler rapidement une petite fenêtre lors d'un affichage plein écran pendant une vidéo, interface utilisateur graphique et terminal
WO2021139768A1 (fr) Procédé d'interaction pour traitement de tâches inter-appareils, et dispositif électronique et support de stockage
WO2021000881A1 (fr) Procédé de division d'écran et dispositif électronique
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
WO2020108356A1 (fr) Procédé d'affichage d'application et dispositif électronique
WO2020062294A1 (fr) Procédé de commande d'affichage pour une barre de navigation de système, interface utilisateur graphique et dispositif électronique
WO2020253758A1 (fr) Procédé de disposition d'interface utilisateur et dispositif électronique
US11687235B2 (en) Split-screen method and electronic device
WO2021082835A1 (fr) Procédé d'activation de fonction et dispositif électronique
WO2021000804A1 (fr) Procédé et appareil d'affichage dans un état verrouillé
WO2020221063A1 (fr) Procédé de commutation entre une page parent et une sous-page, et dispositif associé
WO2021036770A1 (fr) Procédé de traitement d'écran partagé et dispositif terminal
WO2021063237A1 (fr) Procédé de commande de dispositif électronique et dispositif électronique
WO2022068483A1 (fr) Procédé et appareil de démarrage d'application, et dispositif électronique
WO2021169399A1 (fr) Procédé de mise en cache d'une interface d'application, et appareil électronique
WO2021078032A1 (fr) Procédé d'affichage d'interface utilisateur et dispositif électronique
WO2021190524A1 (fr) Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal
WO2022068819A1 (fr) Procédé d'affichage d'interface et appareil associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21776278

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21776278

Country of ref document: EP

Kind code of ref document: A1