CN117406874A - Content sharing method, graphical interface and related device

Info

Publication number
CN117406874A
Authority
CN
China
Prior art keywords
electronic device
window
interface element
sharing
application
Prior art date
Legal status
Pending
Application number
CN202210801011.4A
Other languages
Chinese (zh)
Inventor
毕晟
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210801011.4A
Priority to PCT/CN2023/105191 (published as WO2024008017A1)
Publication of CN117406874A

Classifications

    All entries fall under G (Physics) > G06 (Computing; Calculating or Counting) > G06F (Electric Digital Data Processing) > G06F3/00 (Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit) > G06F3/01 (Input arrangements or combined input and output arrangements for interaction between user and computer):
    • G06F3/0354: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of 2D relative movements between the device (or an operating part thereof) and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547: Touch pads, in which fingers can move on a surface
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/04845: GUI techniques for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486: Drag-and-drop
    • G06F3/0488: GUI techniques using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science
  • General Engineering & Computer Science
  • Theoretical Computer Science
  • Human Computer Interaction
  • Physics & Mathematics
  • General Physics & Mathematics
  • User Interface Of Digital Computer

Abstract

The application discloses a content sharing method, a graphical interface and a related device. In the method, after an electronic device enters a sharing mode, the response behaviors of one or more interface elements in a window can be changed, so that the electronic device can detect a drag operation performed by a user on an interface element and trigger sharing of that element. A developer therefore does not need to declare in advance which applications or interface elements support sharing, or what content is transmitted during sharing: the response behavior of the interface elements is changed automatically once the electronic device enters the sharing mode. This reduces the developer's workload and broadens the application scenarios for content sharing.

Description

Content sharing method, graphical interface and related device
Technical Field
The application relates to the technical field of terminals, in particular to a content sharing method, a graphical interface and a related device.
Background
With the development of the internet and the popularization of electronic devices, electronic devices carry more and more applications, and users often need a sharing operation to send the content of one application elsewhere. For example, a user may drag a file from one folder to another, thereby sharing the file between the two folders.
However, in application scenarios where content is shared through a drag operation, only some designated applications support the drag operation and allow their content to be shared elsewhere. This is because a developer has to adapt the application in advance, declaring which content can be dragged as well as the content that is displayed and transmitted during the drag, before the application can respond to a user's drag operation and share its content.
Therefore, to enable content sharing for an application, a developer can only adapt the application manually in advance, which both increases the developer's workload and limits the set of applications that can support content sharing. How to better implement content sharing while reducing the developer's workload is thus a problem to be solved.
Disclosure of Invention
The application provides a content sharing method, a graphical interface and a related device. With this method, a developer can enable content sharing for an application without manually adapting the application for drag sharing in advance, reducing the developer's workload.
In a first aspect, the present application provides a content sharing method, including: a first device displays a first window, the first window comprising one or more interface elements; the first device detects a first operation on the first window; after the first device detects the first operation, the first device detects a drag operation acting on a first interface element of the one or more interface elements; in response to the drag operation, the first device displays a second interface element corresponding to the first interface element in a second window, or the first device sends the transmission content of the first interface element to a second device; before the first device detects the first operation, the drag operation on the first interface element is not used to trigger the first device to display the second interface element in the second window or to send the transmission content to the second device.
In this content sharing method, a sharing mode is provided. After the electronic device detects the first operation, the electronic device may enter the sharing mode, in which it can automatically change the response behavior of interface elements, so that an interface element can be shared from one window to another through a user's drag operation, or the transmission content of the interface element can be shared to another device. This reduces the developer's workload of manually adapting applications for drag sharing, expands the application scenarios of content sharing, and improves the user experience.
In some embodiments, the electronic device may display the second window at the same time as the first window; in this case, the electronic device can directly share the transmission content of an interface element between the two simultaneously displayed windows.
In other embodiments, the electronic device does not display the second window while the first window is displayed, and the display of the second window is triggered after the electronic device detects the drag operation. In this way, the electronic device implements cross-window sharing of content together with window switching.
With reference to the first aspect, in some embodiments, before the first device displays, in the second window, the second interface element corresponding to the first interface element, or before the first device sends the transmission content corresponding to the first interface element to the second device, the method further includes: the first device displays a screenshot of the first interface element that moves along the movement trajectory of the drag operation.
That is, while the user drags the interface element, the electronic device can display a screenshot of the interface element that follows the movement track of the user's finger, making drag sharing more engaging.
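The embodiments do not tie this effect to a particular platform. As a hedged sketch only, on an Android-style view system the screenshot that follows the finger could be produced with the standard drag-shadow mechanism; the helper class below is an illustrative assumption, not the claimed implementation.

```java
// Sketch: start a drag whose shadow is a screenshot of the dragged element.
import android.content.ClipData;
import android.view.View;

public final class DragShadowHelper {
    private DragShadowHelper() {}

    // "element" is an interface element (a View) that sharing mode marked draggable.
    public static void startElementDrag(View element, ClipData transmissionContent) {
        // DragShadowBuilder renders a snapshot of the view as the drag shadow,
        // so a screenshot of the element follows the drag trajectory.
        View.DragShadowBuilder shadow = new View.DragShadowBuilder(element);
        element.startDragAndDrop(transmissionContent, shadow, /* localState */ element,
                View.DRAG_FLAG_GLOBAL); // allow drops in other windows or apps
    }
}
```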
With reference to the first aspect, in some embodiments, the first window and the second window belong to the same application or different applications.
The electronic device can realize content sharing among different applications, namely, the content of one application is dragged and shared into another application, or the electronic device can realize content sharing among different pages in the same application, namely, the content of one page of the application is dragged and shared into another page.
With reference to the first aspect, in some embodiments, the second interface element includes: the transmission content or an identifier of the transmission content.
With reference to the first aspect, in some embodiments, the identifier is a screenshot of the first interface element.
With reference to the first aspect, in some embodiments, the first operation is used to trigger the first device to enter the first mode, and after the first device detects the first operation acting on the first window, the method further includes:
the first device displays prompt information in the first window, wherein the prompt information is used for indicating that the first device enters a first mode.
With reference to the first aspect, in some embodiments, the first prompt information is used to highlight the first interface element.
Illustratively, the first prompt information may be expressed as a change to the display effect of the first interface element or as added additional information. For example, the display effect may include static effects such as position, size, color, brightness, transparency, saturation and shadow, and dynamic effects such as shaking of the interface element; the additional information may be expressed as a border around the interface element, an icon in its upper right corner, and the like. In this way, the user can learn from the first prompt information which interface elements in the currently displayed window can be dragged.
With reference to the first aspect, in some embodiments, the first prompt information includes a border around the first interface element.
With reference to the first aspect, in some embodiments, after the first device detects the first operation acting on the first window, the method further includes: the first device stops displaying a third interface element in the first window.
The electronic device may place only some interface elements into the sharing mode, that is, only some interface elements support drag sharing. In this case, in the sharing mode, the interface elements that do not support drag sharing may fall outside the scope of the first prompt information: their animation or display effects are not changed, and no additional information is added to them. Alternatively, in the sharing mode, the electronic device may simply not display the interface elements that do not support drag sharing. This prevents applications that hold private information from leaking the user's privacy through content sharing, and prevents unimportant interface elements from interfering with the interface elements on which the user actually wants to perform drag sharing.
With reference to the first aspect, in some embodiments, the drag operation is specifically a sliding operation from the position of the first interface element to a designated position.
With reference to the first aspect, in some embodiments, the first window is a window of a first application, the second window is a window of a second application, and the designated position may be the position of an icon of the second application; alternatively, the designated position may be the position of an icon of the second device, or of an icon of a first contact, where the second device is a device used by the first contact.
Further, the icon of the second application may be displayed in a list comprising icons of a plurality of applications, the icon of the second device may be displayed in a list comprising a plurality of device icons, or the icon of the first contact may be displayed in a list comprising a plurality of contact icons. In this way, the user can share the transmission content to a designated application, device or contact as needed, improving operability.
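As a hedged illustration of how such an icon list could receive the drop, each icon might be registered as a drag target, again assuming an Android-style view system; ShareTargetBinder and ShareSink are invented names for this sketch.

```java
// Sketch: make an application/device/contact icon accept a dropped element.
import android.content.ClipData;
import android.view.DragEvent;
import android.view.View;

public final class ShareTargetBinder {
    private ShareTargetBinder() {}

    /** Receives the transmission content when the user drops on a target icon. */
    public interface ShareSink {
        void deliver(ClipData transmissionContent);
    }

    public static void bind(View targetIcon, ShareSink sink) {
        targetIcon.setOnDragListener((view, event) -> {
            if (event.getAction() == DragEvent.ACTION_DRAG_STARTED) {
                return true; // declare interest so later drag events arrive
            }
            if (event.getAction() == DragEvent.ACTION_DROP) {
                sink.deliver(event.getClipData()); // forward content to the target
                return true;
            }
            return true; // enter/exit/ended events are ignored in this sketch
        });
    }
}
```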
With reference to the first aspect, in some embodiments, the second window is displayed in a first user interface, and the first user interface further includes the first window therein.
That is, after sharing the transmission content to the second window, the electronic device may display the first window and the second window simultaneously. Thus, after content sharing succeeds, the user can view the display content of the sharing party and the receiving party at the same time.
With reference to the first aspect, in some embodiments, the first interface element comprises one or more interface elements.
That is, the electronic device can implement drag sharing of one or more interface elements through a single drag operation. When one drag operation shares multiple interface elements, the user can quickly share multiple pieces of content, making the operation more convenient.
With reference to the first aspect, in some embodiments, the first interface element includes N interface elements, N is greater than or equal to 2, and N is a positive integer, and before the first device detects a drag operation acting on the first interface element of the one or more interface elements, the method further includes: the first device detects a selection operation acting on the N interface elements.
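A minimal sketch of how the N selected elements might be tracked and merged into one drag payload, under the same Android-style assumption; the tracker class and the extractor callback are illustrative, not part of the claims.

```java
// Sketch: collect N selected elements, then bundle their content for one drag.
import android.content.ClipData;
import android.view.View;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public final class MultiSelectTracker {
    private final List<View> selected = new ArrayList<>();

    // Called once per interface element as the selection operation marks it.
    public void select(View element) {
        selected.add(element);
    }

    // Merges the transmission content of all N selected elements into a single
    // payload, so one drag operation can share all of them together.
    public ClipData buildPayload(Function<View, ClipData> extractor) {
        ClipData payload = null;
        for (View element : selected) {
            ClipData content = extractor.apply(element);
            if (content == null) continue;             // element yielded no content
            if (payload == null) {
                payload = content;                     // first item starts the payload
            } else {
                payload.addItem(content.getItemAt(0)); // later items are appended
            }
        }
        return payload;
    }
}
```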
With reference to the first aspect, in some embodiments, after the first device detects the first operation acting on the first window, the method further includes: the first device changes or creates the response behaviors of M interface elements in the first window, so that the first device can respond to a drag operation on a first interface element among the M interface elements by displaying the identifier in the second window or sending the transmission content to the second device, where M is greater than or equal to 1 and M is a positive integer.
That is, the electronic device can automatically change the response behavior of interface elements in the sharing mode, sparing the developer the trouble of manually declaring which applications or interface elements support sharing and thereby reducing the developer's workload.
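A rough sketch of how a framework-level component might change or create these response behaviors, assuming an Android-style view tree; the shareability test and the extractor callback are assumptions made for illustration.

```java
// Sketch: walk the first window's view tree and give each shareable element
// a drag-starting response behavior that the application never declared.
import android.content.ClipData;
import android.view.View;
import android.view.ViewGroup;
import java.util.function.Function;

public final class SharingModeInstaller {
    private SharingModeInstaller() {}

    public static void enterSharingMode(View root, Function<View, ClipData> extractor) {
        if (root instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) root;
            for (int i = 0; i < group.getChildCount(); i++) {
                enterSharingMode(group.getChildAt(i), extractor); // recurse into layouts
            }
            return;
        }
        if (!isShareable(root)) {
            return;
        }
        // Created response behavior: pressing the element now starts a drag
        // whose shadow is a screenshot of the element.
        root.setOnLongClickListener(v -> {
            ClipData content = extractor.apply(v);
            v.startDragAndDrop(content, new View.DragShadowBuilder(v), v,
                    View.DRAG_FLAG_GLOBAL);
            return true;
        });
    }

    // Hypothetical policy for picking the M shareable elements; a real system
    // could also consult element types or privacy flags.
    private static boolean isShareable(View v) {
        return v.isShown();
    }
}
```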
With reference to the first aspect, in some embodiments, after the first device detects a drag operation acting on a first interface element of the one or more interface elements, the method further includes:
The first device obtains the transmission content from the information of the first interface element.
That is, in the sharing mode, the electronic device can automatically determine the transmission content required during sharing from the information of the interface element, sparing the developer the trouble of manually declaring the transmission content and thereby reducing the developer's workload.
With reference to the first aspect, in some embodiments, the method is performed by a system unit of the first device, the system unit and the application to which the first window belongs being different modules of the first device.
In some embodiments, the system unit may be located at the framework layer of the first device. That is, the sharing mode can be defined as a system-level drag-sharing mode, so that any application running under the system can respond to the first operation and enter the sharing mode, thereby realizing content sharing for that application, expanding the application scenarios of content sharing, and improving the user's drag-sharing experience.
In a second aspect, embodiments of the present application provide an electronic device, including a memory, one or more processors, and one or more programs; the one or more processors, when executing the one or more programs, cause the electronic device to implement the method as described in the first aspect or any implementation of the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium comprising instructions, characterized in that the instructions, when run on an electronic device, cause the electronic device to implement a method as described in the first aspect or any implementation of the first aspect.
Drawings
Fig. 1 is a schematic hardware structure of an electronic device 100 according to an embodiment of the present application;
FIGS. 2A-2G, 3A-3F, and 4A-4D are some of the user interfaces provided by embodiments of the present application;
fig. 5 is a schematic software structure of the electronic device 100 according to the embodiment of the present application;
fig. 6 is a flowchart of interaction between internal modules in a software structure of the electronic device 100 according to the embodiment of the present application;
FIG. 7 is a schematic diagram of a tree structure of windows, controls, and layouts provided in embodiments of the present application;
FIG. 8 is a schematic diagram of a portion of the layout and controls in the user interface 10 provided by embodiments of the present application;
fig. 9 is a flowchart of a content sharing method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone.
The terms "first", "second" and the like below are used for descriptive purposes only and shall not be construed as implying relative importance or implicitly indicating the number of technical features indicated. A feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise indicated, "plural" means two or more.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and an acceptable form of the user. The user interface is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, and the interface source code is analyzed and rendered on the electronic equipment to finally be presented as content which can be identified by a user. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be a visual interface element of text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc., displayed in a display of the electronic device.
The embodiments of the application provide a content sharing method, which includes: after the electronic device enters the sharing mode, the electronic device can detect a drag operation of the user on a target interface element among one or more interface elements, determine the transmission content according to the target interface element, and either share the transmission content into a second window, displaying in the second window the transmission content or an identifier corresponding to it, or share the transmission content to another device, thereby realizing content sharing.
Here, interface elements refer to a series of elements in the user interface that meet user-interaction requirements, including controls such as pictures, text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars and widgets, or combinations of these controls.
After entering the sharing mode, the electronic device can change or create the response behavior of one or more currently displayed interface elements, so that after entering the sharing mode the electronic device can detect a sharing operation acting on an interface element and trigger sharing of that interface element. Correspondingly, before the electronic device enters the sharing mode, the same operation on the interface element may trigger no behavior at all, or may trigger the electronic device to perform some other behavior on the interface element that is different from sharing it. For example, suppose the interface element is a picture: before the electronic device enters the sharing mode, a click operation on the picture may trigger display of the picture's high-definition image, while after the electronic device enters the sharing mode, a click operation on the picture may trigger sharing of the picture. In this way, a developer does not need to manually declare which content in the application can be dragged; by detecting whether it has entered the sharing mode, the electronic device can automatically adjust the response behavior of each interface element, so that it can detect sharing operations acting on interface elements and realize their sharing.
After the electronic device detects a sharing operation acting on the target interface element, it can automatically determine the transmission content according to the target interface element and realize sharing of the target interface element based on that transmission content.
The transmission content may include text, pictures, voice, forms, video, files, and the like. For example, when the interface element is a text control, the transmission content may be a piece of text; that text may be the text displayed in the text control, or may include other text besides it. For another example, when the interface element is a picture control, the transmission content may be the high-definition image corresponding to the picture control; the content displayed in the picture control may be only part of the high-definition image, and its definition may be lower than that of the high-definition image. For another example, when the interface element is a file icon, the transmission content may be a file. The electronic device can automatically acquire the content corresponding to the interface element as the transmission content, so a developer does not need to manually declare the transmission content for the sharing process; the electronic device automatically takes the content contained in the interface element selected by the user as the transmission content.
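Under the same Android-style assumption, the mapping from element type to transmission content could look like the following sketch; using the view's tag to reach the full-resolution image source is an assumption made purely for illustration.

```java
// Sketch: derive the transmission content from the element itself, so the
// developer never has to declare it explicitly.
import android.content.ClipData;
import android.net.Uri;
import android.view.View;
import android.widget.ImageView;
import android.widget.TextView;

public final class ContentExtractor {
    private ContentExtractor() {}

    public static ClipData extract(View element) {
        if (element instanceof TextView) {
            // Text control: the displayed text becomes the payload.
            CharSequence text = ((TextView) element).getText();
            return ClipData.newPlainText("text", text);
        }
        if (element instanceof ImageView) {
            // Picture control: share the full-resolution source image rather
            // than the on-screen thumbnail. Storing the source Uri in the
            // view's tag is an assumption of this sketch.
            Object source = element.getTag();
            if (source instanceof Uri) {
                return ClipData.newRawUri("image", (Uri) source);
            }
        }
        return null; // no transmission content could be derived
    }
}
```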
In addition, after the electronic device shares the transmission content to the second window, the electronic device may display in the second window the transmission content or an identifier corresponding to it. The identifier may be a display form of the transmission content; for example, it may be a screenshot of the target interface element or a preset icon. The embodiments of the present application do not limit the identifier.
It can be seen that the content sharing method provided by the embodiments of the application provides a sharing mode in which the electronic device can automatically change the response behavior of an application's interface elements, so that the application adapts itself to drag sharing. The content of any application can be shared in the sharing mode: a developer does not need to declare in advance which applications or interface elements support sharing, nor the content transmitted during sharing, and the user can still share the application's content. This reduces the developer's workload, expands the application scenarios of content sharing, and improves the user experience.
Fig. 1 shows a schematic hardware configuration of an electronic device 100.
The electronic device 100 may be a cell phone, tablet computer, desktop computer, laptop computer, handheld computer, notebook computer, ultra-mobile personal computer (UMPC), netbook, cellular telephone, personal digital assistant (PDA), augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, wearable device, vehicle-mounted device, smart home device and/or smart city device; the embodiments of the present application do not particularly limit the specific type of the electronic device.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments, the processor 110 may be configured to alter or create a response behavior of one or more interface elements currently displayed after the electronic device 100 enters the sharing mode, find a target interface element from the one or more interface elements according to a user operation, and determine a transmission content according to the target interface element. In particular with respect to altering or creating the response behavior of the interface element, determining the target interface element, determining the description of the transmission may be referred to in the subsequent method embodiments, which are not first developed here.
A memory may also be provided in the processor 110 for storing instructions and data.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel.
In some embodiments, the display 194 may be used to display user interfaces including a first user interface, a first window, a second window, etc., related to content sharing, and details regarding such user interfaces may be found in subsequent figures 2A-2G, 3A-3F, and 4A-4D, which are not first expanded herein.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene.
The camera 193 is used to capture still images or video.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals. The electronic device 100 can play music or conduct hands-free calls through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into it. The electronic device 100 may be provided with at least one microphone 170C.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip cover using the magnetic sensor 180D.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the attitude of the electronic device, and is applied in landscape/portrait switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. Using the proximity light sensor 180G, the electronic device 100 can detect that the user is holding it close to the ear and automatically turn off the screen to save power.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194.
In some embodiments, the touch sensor 180K may be configured to detect a sharing operation by a user and communicate the user operation to the processor 110, such that the processor 110 triggers the electronic device 100 to enter a sharing mode, or triggers the electronic device 100 to share an interface element, or the like.
The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback.
The indicator 192 may be an indicator light, and may be used to indicate the charging status, battery level changes, messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100.
Some user interfaces provided by embodiments of the present application are described below in conjunction with fig. 2A-2G, 3A-3F, and 4A-4D.
Fig. 2A-2G illustrate some of the user interfaces involved in content sharing between applications by electronic device 100.
Fig. 2A illustrates the user interface 10 provided by an entertainment-class application after the electronic device 100 has launched it. The entertainment-class application may be used to provide the user with entertainment services such as socializing, chatting, watching video and listening to music; for example, the entertainment-class application may be a microblog application.
As shown in fig. 2A, the user interface 10 may include: a status bar 101, a first menu bar 102, a browsing area 103, a second menu bar 104. Wherein:
the status bar 101 may include one or more signal strength indicators of a mobile communication signal, one or more signal strength indicators of a wireless fidelity (WiFi) signal, a battery status indicator, and a time indicator.
The first menu bar 102 may include one or more options; the electronic device 100 may detect an operation acting on an option and activate the function corresponding to that option. Illustratively, the first menu bar 102 may include: a photographing option, a follow option, a recommendation option, a same-city option and a more option. The photographing option may be used to start the photographing function; the follow option may be used to trigger the electronic device 100 to display, in the browsing area 103, the entertainment content the user follows; the recommendation option may be used to trigger the electronic device 100 to display, in the browsing area 103, entertainment content recommended by the entertainment application; the same-city option may be used to trigger the electronic device 100 to display, in the browsing area 103, entertainment content published on the entertainment application by other users in the user's city; and the more option may be used to trigger the electronic device 100 to display other, more hidden functions, such as publishing entertainment content in the form of text, pictures, videos and the like.
The browsing area 103 may be used to present entertainment content published by different users on the entertainment-class application. For a piece of entertainment content posted by a user, the browsing area 103 may display the user's avatar, name, the published entertainment content, and sharing, comment and like options for that content. As shown in fig. 2A, the browsing area 103 may include a picture control 103A displayed as a thumbnail of a picture. The electronic device 100 may detect a user operation, such as a click, acting on the picture control 103A and trigger display of the picture corresponding to it; that picture may be a high-definition large image whose displayed content exceeds that of the thumbnail in the picture control 103A and/or whose definition is higher than that of the thumbnail.
The second menu bar 104 may include one or more options; the electronic device 100 may detect an operation on an option and activate the corresponding function. Illustratively, the second menu bar 104 may include: a home page option, a discovery option, a message option and a local option. These options may be used to trigger the electronic device 100 to display different pages provided by the entertainment application in the user interface 10: when the home page option, discovery option, message option or local option is in the selected state, the electronic device 100 displays the home page, discovery page, message page or local page, respectively, in the user interface 10. Illustratively, the user interface 10 shown in fig. 2A may be what the electronic device 100 displays when the home page option is in the selected state.
For example, the electronic device 100 may trigger display of the user interface 10 shown in fig. 2A upon detecting a user operation, such as a click, on the application icon of the entertainment-class application. Alternatively, the electronic device 100 may trigger display of the user interface 10 shown in fig. 2A upon detecting a voice command from the user to start the entertainment application. The embodiments of the present application do not limit the manner in which the electronic device 100 is triggered to display the user interface 10 shown in fig. 2A.
It should be understood that the embodiments of the present application describe the content sharing method by taking the electronic device 100 starting an entertainment-class application as an example; the application started by the electronic device 100 is not limited, and in other embodiments the electronic device 100 may also display a user interface provided by an application such as a music application, a chat application or an office application. In addition, the user interface 10 shown in fig. 2A is merely exemplary; the user interface provided by the entertainment-class application may include more or fewer controls, and the user interface 10 does not limit the embodiments of the present application.
As shown in fig. 2A, when the electronic device 100 detects a user operation acting on the user interface 10, such as a quick triple-click operation, the electronic device 100 enters the sharing mode in response to that operation.
In the sharing mode, the electronic device 100 may detect a sharing operation acting on the currently displayed content and trigger sharing of the interface element on which the sharing operation acts, for example sharing it to other applications, to other devices, and so on.
It can be appreciated that, when the sharing operation is a drag operation, the sharing mode may also be referred to as a drag-sharing mode; the embodiments of the present application do not limit the name of the sharing mode.
In the embodiments below, when the "sharing mode" of an electronic device such as a smart phone is turned on and the electronic device detects a user's sharing operation, the electronic device may trigger sharing of the interface element on which the sharing operation acts. The "sharing mode" may be a service and capability provided by the electronic device 100, and may support the electronic device 100 in switching applications, thereby implementing data sharing between applications or between devices.
Optionally, after the electronic device 100 enters the sharing mode, the electronic device 100 may display a border around each currently displayed interface element. The border may be used to prompt the user that the electronic device 100 has entered the sharing mode, and the electronic device 100 may detect an operation acting on an interface element surrounded by a border and trigger sharing of that interface element. Specifically, the electronic device 100 may traverse the interface elements contained in the current display content, determine the size (e.g., length and height) and position information of each interface element, and draw a border around it according to that size and position information.
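A sketch of this traversal-and-border pass under the same Android-style assumption; the border style and the leaf-only policy are illustrative choices, not the patent's.

```java
// Sketch: traverse the interface elements in the current display content and
// draw a frame around each one according to its size and position.
import android.graphics.Color;
import android.graphics.drawable.GradientDrawable;
import android.view.View;
import android.view.ViewGroup;

public final class SharingModeHighlighter {
    private SharingModeHighlighter() {}

    public static void drawBorders(View root) {
        if (root instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) root;
            for (int i = 0; i < group.getChildCount(); i++) {
                drawBorders(group.getChildAt(i));
            }
            return; // this sketch outlines leaf elements only
        }
        GradientDrawable frame = new GradientDrawable();
        frame.setStroke(4, Color.BLUE); // stroke width and color are illustrative
        // The overlay uses the element's own coordinate space, so the frame's
        // bounds are simply the element's measured size.
        frame.setBounds(0, 0, root.getWidth(), root.getHeight());
        root.getOverlay().add(frame);
    }
}
```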
Illustratively, after the electronic device 100 enters the sharing mode, the electronic device 100 may display the user interface 10 as shown in fig. 2B. Panels (a), (b) and (c) of fig. 2B respectively show three user interfaces 10 that the electronic device 100 may display after entering the sharing mode.
As shown in (a) of fig. 2B, compared with the user interface 10 shown in fig. 2A, a rectangular border is displayed around each interface element displayed in the user interface 10. The interface elements may include the first menu bar 102 and the photographing, follow, recommendation, same-city and more options within it; the users' avatars, names and published entertainment content in the browsing area 103, along with the sharing, comment and like options for that content; and the second menu bar 104 and the home page, discovery, message and local options within it.
It should be noted that, in the embodiments of the present application, an interface element may be a single control, including a text control, picture control, button control, form control and so on, for example the picture control 103A shown in (a) of fig. 2B, or may be a combination of multiple controls, for example the first menu bar 102 shown in (a) of fig. 2B. The embodiments of the present application are not limited in this regard.
It will be appreciated that, besides the rectangular border shown in fig. 2B, the border around these interface elements may be a circular frame, a diamond frame, an irregular frame, or a wireframe that follows the shape of the interface element; the embodiments of the present application do not limit the shape of the border. In addition, rather than all appearing at once after the electronic device 100 enters the sharing mode, the borders may be displayed by the electronic device 100 in alternation or at intervals. For example, after the electronic device 100 enters the sharing mode, it may first display the borders in the first menu bar 102, then the borders in the browsing area 103, and finally the borders in the second menu bar 104, and then display the borders again in top-to-bottom order. The embodiments of the present application do not limit the display rule or display timing of the borders.
Optionally, after the electronic device 100 enters the sharing mode, in addition to displaying frames around the interface elements, the electronic device 100 may change the display effect of the current display content, where the display effect includes: location, size, color, brightness, transparency, saturation, shadow, and the like. In this way, the user can perceive, from the change in the display effect of the current display content, that the electronic device 100 has entered the sharing mode.
Optionally, after the electronic device 100 enters the sharing mode, the electronic device 100 may further display a prompt message, where the prompt message is used to prompt the user that the electronic device 100 has entered the sharing mode. For example, the electronic device 100 may display the following prompt message at the bottom end of the user interface 10 shown in fig. 2B: "The sharing mode has been entered. Please select the content to share."
It can be appreciated that the embodiments of the present application do not limit the manner in which the current display content is modified after the electronic device 100 enters the sharing mode.
In addition, after the electronic device 100 enters the sharing mode, for interface elements that do not support sharing, no frame may be displayed around them, or these interface elements may be hidden directly. Illustratively, assuming that the interface elements in the first menu bar 102 and the second menu bar 104 in fig. 2A do not support sharing, the electronic device 100 may display the user interface 10 shown in (b) or (c) in fig. 2B.
As shown in (b) in fig. 2B, the user interface 10 still displays the first menu bar 102, the browsing area 103, and the second menu bar 104. However, compared with (a) in fig. 2B, the electronic device 100 displays frames only in the browsing area 103. In this way, because frames are displayed only in the browsing area 103, the user can learn that only the content in the browsing area 103 supports sharing, and that the content in the first menu bar 102 and the second menu bar 104 does not.
As shown in (c) in fig. 2B, only the browsing area 103 is displayed in the user interface 10. Likewise, through the change in displayed content from fig. 2A to (c) in fig. 2B when the electronic device 100 enters the sharing mode, the user can learn that only the content in the browsing area 103 supports sharing and that the content in the first menu bar 102 and the second menu bar 104 does not.
In addition, after entering the sharing mode, the electronic device 100 may exit the sharing mode after detecting a specified operation (e.g., a left-sliding operation) of the user. For example, when the electronic device 100 detects a left-sliding operation acting on the user interface 10 shown in fig. 2B (c), the electronic device 100 exits the sharing mode and displays the user interface shown in fig. 2A.
It should be noted that, after the electronic device 100 enters the sharing mode, the electronic device 100 may also change the currently displayed user interface, for example, return to the previous level of the application's user interface, or switch to the user interface of another application. After the user interface changes, the electronic device 100 may still remain in the sharing mode, detect a sharing operation on an interface element in the changed user interface, and trigger sharing of that interface element.
As shown in fig. 2C, the electronic device 100 may detect a sharing operation acting on the picture control 103A shown in (a) of fig. 2B, and trigger sharing of the picture control 103A. After detecting the sharing operation acting on the picture control 103A, the electronic device 100 may obtain the transmission content corresponding to the picture control 103A, so that the electronic device 100 can share the transmission content to another application. Illustratively, the sharing operation may appear as a drag operation from gesture 1 to gesture 2 in fig. 2C, to gesture 3 in fig. 2D, to gesture 4 in fig. 2E, and to gesture 5 in fig. 2F. As the gesture changes, the content displayed by the electronic device 100 may differ.
Optionally, in response to the sharing operation, the electronic device 100 may obtain a screenshot 103B identical to the picture control 103A by taking a screenshot of the picture control 103A. As shown in fig. 2C, after the user triggers the sharing operation, the screenshot 103B may move following the touch point of the user on the display screen. Alternatively, the picture that moves with the user's touch point on the display screen may be the picture corresponding to the picture control 103A; its clarity may be higher than that of the screenshot of the picture control 103A, and/or it may display more content than the screenshot of the picture control 103A. The embodiments of the present application do not limit the content displayed during the user's drag.
Optionally, if a border is displayed around the interface element displayed by the electronic device 100 after the electronic device 100 enters the sharing mode, the electronic device 100 may stop displaying the border when the electronic device 100 starts to detect the sharing operation.
As shown in fig. 2D, when the touch point of the user's sharing operation on the picture control 103A approaches the bottom end of the display screen, the electronic device 100 may display an application list 105 at the bottom end of the display screen as shown in fig. 2D, where the application list 105 is used to display one or more application icons. Illustratively, the application list 105 may include: a first icon 105A, a second icon 105B, and a third icon 105C. The first icon 105A may be used to trigger starting a short message application, the second icon 105B may be used to trigger starting a settings application, and the third icon 105C may be used to trigger starting a gallery application. At this point, the screenshot 103B is near the bottom of the user interface 10.
It should be noted that the application corresponding to an application icon displayed in the application list 105 may be an application frequently used on the electronic device 100, an application used on the electronic device 100 in a recent period of time, an application running in the background of the electronic device 100, or the like. The embodiments of the present application do not limit the association between the application icons displayed in the application list 105 and the electronic device 100.
It is to be appreciated that embodiments of the present application are not limited to where the application list 105 is displayed, for example, the application list 105 may be displayed on the left side of the user interface or on the right side of the user interface. For example, the electronic device 100 may trigger the display of the application list 105 on the left side of the user interface after detecting that the user's drag operation on the screenshot picture 103B moves to the left side of the user interface. In addition, the embodiment of the present application does not limit the time when the electronic device 100 triggers the display of the application list 105, for example, the electronic device 100 may trigger the display of the application list 105 in the user interface 10 after entering the sharing mode.
As shown in fig. 2E-2F, when the electronic device 100 detects that the user's drag operation on the picture control 103A reaches the area where the first icon 105A is located, for example, when the first icon 105A overlaps the screenshot picture 103B as shown in fig. 2E, the electronic device 100 displays the user interface 20 shown in (a) in fig. 2F, where the user interface 20 is a user interface provided by a short message application; or the electronic device 100 displays the user interface 30 shown in (b) in fig. 2F, where the user interface 30 includes both the user interface 10 provided by the entertainment application and the user interface 20 provided by the short message application, and the electronic device 100 may simultaneously display the content of the user interface 10 and the content of the user interface 20 in a split-screen manner. The user interface 20 may be a user interface displayed when the electronic device 100 historically opened the short message application; the embodiments of the present application do not limit the user interface displayed by the receiving application when content is shared between applications.
Alternatively, when the user's drag operation reaches the area where the first icon 105A is located, the electronic device 100 may change the display effect of the first icon 105A, where the display effect may include icon size, icon color, icon position, and the like. As can be seen by comparing fig. 2D and fig. 2E, when the user's drag operation reaches the area where the first icon 105A is located, the electronic device 100 may enlarge the icon size of the first icon 105A; through this change in size, the user is prompted that the picture corresponding to the picture control 103A can be shared to the application corresponding to the first icon 105A.
Alternatively, the electronic device 100 may display the user interface shown in (a) or (b) in fig. 2F after the user's drag operation stays within the area where the first icon 105A is located for a certain period of time, for example, 1 s.
As shown in (a) in fig. 2F, the user interface 20 may include an information presentation area 201 and an information input area 202. The information presentation area 201 is used to display messages exchanged between the user and others, that is, information communicated between the electronic device 100 and other devices, and the information input area 202 is used to trigger input of information.
As shown in (b) in fig. 2F, the user interface 30 may include an area 301 and an area 302, where the area 301 is used to display the user interface provided by the entertainment application, and the area 302 is used to display the user interface provided by the short message application. In this way, the user can view the user interfaces provided by the two applications at the same time, seeing both the source of the picture and its recipient, so that the sharing process of the picture is clearer to the user.
It is to be understood that, when the electronic device 100 displays the content provided by two applications simultaneously in a split-screen display manner, the method is not limited to the above-mentioned up-down split-screen manner, and for example, the electronic device 100 may also display the content provided by two applications in a left-right split-screen manner or in a floating window manner.
As shown in (a) in fig. 2F and fig. 2G, when the user drags the screenshot picture 103B to the information input area 202 shown in (a) in fig. 2F, the electronic device 100 may display an attachment window 203 shown in fig. 2G in the user interface 20, where the attachment window 203 may be used to display a file, picture, voice, or the like to be transmitted by the user.
As shown in fig. 2G, the attachment window 203 may include: a picture 203A and a delete icon 203B. The picture 203A may be the picture corresponding to the picture control 103A shown in fig. 2C.
That is, the electronic device 100 detects the drag operation of the user on the picture control 103A, and may trigger the electronic device 100 to share the content corresponding to the picture control 103A from the entertainment application to the short message application.
It should be noted that the content shared by the electronic device 100 may differ from the interface element on which the user's drag operation acts; the interface element displayed by the electronic device 100 is only a presentation form of the shared content. As shown in fig. 2C, when the electronic device 100 detects a drag operation on the picture control 103A, the content shared by the electronic device 100 is a picture.
It should be understood that the interface element displayed by the electronic device 100 is not necessarily of the same form as its corresponding content. For example, when the interface element displayed by the electronic device 100 is a picture, the corresponding content may be text; likewise, when the interface element displayed by the electronic device 100 is a word, the corresponding content may be a website address.
In addition, when the electronic device 100 detects a user operation acting on the transmission icon 202A in the information input area 202, in response to the operation, the electronic device 100 may transmit the picture 203A to other devices and display the picture 203A in the information presentation area 201.
In addition, when the touch point of the user's sharing operation on the picture control 103A approaches the bottom end of the display screen, the electronic device 100 may also switch directly to the user interface shown in fig. 2F, that is, the user interface provided by another application, without displaying the application list 105. This application may be the application used most frequently by the user, the application most recently used on the electronic device 100, an application running in the background of the electronic device 100, or an application preset on the electronic device 100 for displaying content shared by the user, for example, a picture browsing application. For example, the electronic device 100 switches directly from the user interface 10 shown in fig. 2C to the user interface shown in fig. 2F.
It should be understood that, in the embodiments of the present application, the sharing operation is not limited to the drag operation of fig. 2C-2F; it may also be a click operation. For example, when the electronic device 100 detects a click operation acting on the picture control 103A shown in (a) of fig. 2B, in response to the operation, the electronic device 100 may display the application list 105 shown in fig. 2D at the bottom end of the user interface 10; and when the electronic device 100 detects a click operation acting on the first icon 105A in the application list 105, in response to the operation, the electronic device 100 may share the content corresponding to the picture control 103A to the short message application, that is, display the user interface 20 shown in fig. 2G.
Fig. 3A-3F illustrate some of the user interfaces involved when the electronic device 100 shares multiple items of content.
Fig. 3A illustrates an exemplary user interface 10 displayed when the electronic device 100 enters the sharing mode; for a detailed description of this user interface, refer to the related description of (a) in fig. 2B, which is not repeated here.
As shown in fig. 3A, when the electronic device 100 detects a selection operation, e.g., a click operation, on the picture control 103A, in response to the operation, the electronic device 100 may select the picture control 103A, determine the picture control 103A as content to be shared, and change the display effect of the picture control 103A, where the display effect may include: color, size, saturation, transparency, and the like. For the changed display effect, see the picture control 103A shown in fig. 3B.
As shown in fig. 3B, the background color of the picture control 103A is darker than the background color of the picture control 103A in fig. 3A.
Optionally, the selected picture control 103A may be displayed with an animation effect, such as a shaking effect. Alternatively, the electronic device 100 may also vibrate, and so on.
As shown in fig. 3B, the user interface 10 further includes a picture control 103C. When the electronic device 100 detects a selection operation, for example, a click operation, on the picture control 103C, the electronic device 100 may select the picture control 103C, determine the picture control 103C as content to be shared, and change the display effect of the picture control 103C.
As shown in fig. 3C, the background color of the picture control 103C is darker than the background color of the picture control 103C in fig. 3B. As can be seen in fig. 3C, both picture control 103A and picture control 103C are currently in the selected state.
In the case where both the picture control 103A and the picture control 103C are in the selected state, as shown in fig. 3D, the electronic device 100 may detect a sharing operation, such as a drag operation, acting on the picture control 103A or the picture control 103C; the drag operation may be represented as the sliding process from gesture 1 to gesture 2 shown in fig. 3D. In addition, during the sliding of the gesture, the electronic device 100 may display a screenshot picture 103D, where the screenshot picture 103D may move along the sliding track of the gesture, and may be obtained by combining or overlaying screenshots of the picture control 103A and the picture control 103C.
When the electronic device 100 detects the user's sharing operation and the gesture slides to the bottom end of the user interface, for example, when gesture 2 in the sharing operation is detected, the electronic device 100 may switch to another application and display the user interface 20 shown in fig. 3E. For example, the user interface 20 may be a user interface provided by the short message application; for a description of the user interface 20, refer to the foregoing related description of fig. 2F, which is not repeated here.
As shown in fig. 3E-3F, when the user drags the screenshot picture 103D to the information input area 202 shown in fig. 3E, the electronic device 100 may display an attachment window 203 shown in fig. 3F in the user interface 20, the attachment window 203 being used to display a file, picture, voice, etc. to be transmitted by the user.
As shown in fig. 3F, the accessory window 203 may include: picture 203C, picture 203D. The picture 203C may be a picture corresponding to the picture control 103A selected in fig. 3C, and the picture 203D may be a picture corresponding to the picture control 103C selected in fig. 3D.
As can be seen from fig. 3A-3F, after entering the sharing mode, the electronic device 100 may allow the user to select multiple interface elements and share them together. Therefore, when there are multiple pieces of content to share, the user can select all of them and complete the sharing at one time, which improves sharing efficiency and makes the user's operations during content sharing more convenient.
Fig. 4A-4D illustrate some of the user interfaces involved in the sharing of content between devices by electronic device 100.
When the electronic device 100 detects a drag operation acting on the picture control 103A shown in (a) in fig. 2B, in response to the operation, the electronic device 100 may generate and display a screenshot picture 103B, where the screenshot picture 103B moves following the user's drag operation. When the touch point of the user's drag operation on the screen approaches the bottom end of the user interface, the electronic device 100 may display a device list 106 at the bottom end of the user interface as shown in fig. 4A, where the device list 106 is used to display one or more device icons. Illustratively, the device list 106 may include: a first icon 106A, a second icon 106B, and a third icon 106C. The first icon 106A may be used to trigger sending the picture corresponding to the picture control 103A to device 1, the second icon 106B may be used to trigger sending the picture to device 2, and the third icon 106C may be used to trigger sending the picture to device 3.
It should be noted that the device corresponding to a device icon displayed in the device list 106 may be a device that has established a connection (such as a wired or wireless connection) with the electronic device 100, or a device that belongs to the same account or account group as the electronic device 100, and so on. The embodiments of the present application do not limit the association between the device icons displayed in the device list 106 and the electronic device 100.
It is to be appreciated that the embodiments of the present application do not limit the display location of the device list 106; for example, the device list 106 may be displayed on the left side or the right side of the user interface. For a description of the display position of the device list 106, refer to the foregoing description of the display position of the application list 105, which is not repeated here.
As shown in fig. 4B-4C, when the electronic device 100 detects that the user's drag operation on the picture control 103A reaches the area where the second icon 106B is located, that is, when the second icon 106B overlaps the screenshot picture 103B as shown in fig. 4B, the electronic device 100 displays, in the user interface 10, prompt information 107 shown in fig. 4C, where the prompt information 107 is used to prompt the user that the picture corresponding to the picture control 103A has been sent to the device 2 corresponding to the second icon 106B.
Alternatively, when the user's drag operation reaches the area where the second icon 106B is located, the electronic device 100 may change the display effect of the second icon 106B, which may include: icon size, icon color, icon position, and the like. As can be seen by comparing fig. 4A and fig. 4B, when the user's drag operation reaches the area where the second icon 106B is located, the electronic device 100 may change the icon color of the second icon 106B; through this change in color, the user is prompted that the picture corresponding to the picture control 103A can be shared to the device 2 corresponding to the second icon 106B.
Alternatively, the electronic device 100 may share the picture corresponding to the picture control 103A to the device 2 corresponding to the second icon 106B after the user's drag operation stays within the area of the second icon 106B for a period of time, for example, 1 s, and then display the prompt information 107 shown in fig. 4C.
Illustratively, after electronic device 100 sends the picture corresponding to picture control 103A to device 2, device 2 may display user interface 40 as shown in fig. 4D.
As shown in fig. 4D, the user interface 40 may be an exemplary user interface of an application menu displayed by device 2. The user interface 40 may include a window 401, and the window 401 may be used to display the picture sent by the electronic device 100. The window 401 may include a cancel option 401A, a save option 401B, a copy option 401C, and a picture 401D. The cancel option 401A may be used to trigger canceling acquisition of the picture sent by the electronic device 100. The save option 401B may be used to trigger saving the picture sent by the electronic device 100 locally. The copy option 401C may be used to trigger copying of the picture sent by the electronic device 100; after device 2 copies the picture, device 2 may detect a paste operation of the user while displaying another input window and paste the picture into that input window. For example, device 2 may detect a long-press operation of the user while displaying a memo, trigger display of a paste option, and, after detecting the user's confirmation on the paste option, paste the copied picture into the memo. The picture 401D displays the picture sent by the electronic device 100, which may be the picture corresponding to the picture control 103A shown in fig. 4A.
It will be appreciated that display of the window 401 is not limited to the above-mentioned user interface 40; when device 2 detects a picture sent by the electronic device 100 while displaying a user interface provided by an application, device 2 may display the window 401 in that user interface.
As can be seen from fig. 2A-2G, fig. 3A-3F, and fig. 4A-4D, in the sharing mode, the electronic device 100 may detect a sharing operation performed by the user on an interface element and share the interface element to another application or another device. In addition, fig. 2A-2G and fig. 4A-4D only show the process of sharing pictures by the electronic device 100; it should be understood that the embodiments of the present application do not limit the content to be shared. For example, in the sharing mode, the electronic device 100 may detect a sharing operation of the user acting on a text control and share the content (for example, text) corresponding to the text control to another application or another device.
The electronic device may be a portable terminal device running iOS, Android, Microsoft, or another operating system, such as a mobile phone, tablet computer, or wearable device, or may be a non-portable terminal device such as a laptop computer (Laptop) with a touch-sensitive surface or touch panel, or a desktop computer with a touch-sensitive surface or touch panel. The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiments of the present application, an Android system with a layered architecture is taken as an example to illustrate the software structure of the electronic device 100.
Fig. 5 is a schematic software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 5, the application packages may include application A and application B, which may be, for example, applications such as camera, gallery, calendar, phone, map, navigation, WLAN, Bluetooth, music, video, and short message.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 5, the application framework layer may include a system window framework, a system view framework, a system drag service, and the like.
The system window framework is used to provide windows for applications.
The system view framework is used to manage and display views and manage response behavior of the controls. The display interface of the electronic device 100 may be comprised of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The system drag service is used for generating and managing a drag window and changing the display position of the window according to the sharing operation (such as drag operation) of the user.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes in detail the interaction between the modules in the software architecture of the electronic device 100, taking the user interfaces shown in fig. 2A-2G as a specific example.
Fig. 2A-2G illustrate user interfaces involved in sharing content corresponding to an interface element from one application to another application when the electronic device 100 detects a drag operation acting on the interface element in a sharing mode.
Fig. 6 is a flowchart of interaction between internal modules in the software structure of the electronic device 100 according to the embodiment of the present application.
As shown in fig. 6, the content sharing method provided in the embodiment of the present application relates to an application a, an application B, a system window frame, a system view frame, and a system drag service in a software structure of the electronic device 100. The descriptions of the system window frame, the system view frame, and the system drag service may be referred to in the foregoing related content in fig. 5, and will not be repeated here.
As shown in fig. 6, interactions between modules in the software structure may include:
Stage one (S101-S102): Starting application A
S101, detecting a starting operation by the application A.
The electronic device 100 may detect a launch operation on application A, where the launch operation may be used to trigger starting application A. For example, the launch operation may be a click operation acting on the icon of application A. By way of example, application A may be the entertainment application mentioned in the related content of fig. 2A.
It will be appreciated that the embodiments of the present application do not limit application A.
S102, the application A starts display of a window of the application A.
In response to the launch operation, the electronic device 100 may display a window of application A. It should be appreciated that the window may include one or more interface elements, which, arranged, combined, or overlaid, form a user interface displayed on the display of the electronic device 100. One user interface may include one or more windows. Interface elements can be divided into controls and layouts: a layout is a special control that can contain other layouts or controls, whereas a control cannot contain any other layout or control. The user interface of application A may be, for example, the user interface 10 shown in fig. 2A.
Fig. 7 is a schematic diagram of a tree structure of windows, controls and layouts provided in an embodiment of the present application.
The window is the root of all displayed content, and a View tree can be built in the window; the tree describes the overlay and arrangement relations of the controls and layouts contained in the user interface. As can be seen from FIG. 7, layout 1 is in the window; layout 1 may include layout 21, layout 22, layout 23, and so on; layout 21 includes control 31, layout 22 includes control 32, and layout 23 includes control 33.
Illustratively, FIG. 8 is a schematic diagram of a portion of the layout and controls in the user interface 10 provided by embodiments of the present application. As can be seen from fig. 8, control 11 and control 12 are located in layout 1.
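For illustration only, an Android layout file along the following lines would produce a nesting like the tree in fig. 7; the widget types and attributes are placeholders, not the actual interface definition.

```xml
<!-- Placeholder sketch of the tree in fig. 7: layout 1 contains
     layouts 21-23, and each inner layout contains one control. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"> <!-- layout 1 -->

    <FrameLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"> <!-- layout 21 -->
        <ImageView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content" /> <!-- control 31 -->
    </FrameLayout>

    <FrameLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"> <!-- layout 22 -->
        <TextView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content" /> <!-- control 32 -->
    </FrameLayout>

    <FrameLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"> <!-- layout 23 -->
        <Button
            android:layout_width="wrap_content"
            android:layout_height="wrap_content" /> <!-- control 33 -->
    </FrameLayout>
</LinearLayout>
```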
Stage two (S103-S107): entering sharing mode
S103, detecting the operation of entering the sharing mode by the application A.
When displaying the user interface provided by application A, the electronic device 100 may detect an operation acting on the user interface, where the operation may be used to trigger the electronic device 100 to enter the sharing mode. In the sharing mode, the electronic device 100 may change the currently displayed user interface and change the response behavior of the interface elements in the currently displayed user interface; see step S106 and step S107 below for details. This operation may be, for example, the quick three-click operation shown in fig. 2A.
It should be understood that, when the sharing operation of the user is a drag operation, the sharing mode may also be referred to as a drag mode, and the embodiment of the present application does not limit the name.
S104, the application A sends indication information of the operation to the system window framework.
In response to the operation, the electronic device 100 sends indication information of the operation to the system window framework through application A.
S105, the system window framework sends a rendering request to the system view framework according to the indication information.
The electronic device 100 may send a rendering request to the system view framework through the system window framework. The rendering request is used to trigger modification of the current display content, that is, to apply rendering in the sharing mode to the currently displayed user interface. The modification of the display content may be used to prompt the user that the electronic device 100 has entered the sharing mode.
S106, the system view framework renders the window of application A in the sharing mode.
In response to the rendering request, the electronic device 100 may apply rendering in the sharing mode to the currently displayed window through the system view framework, that is, display the window in the sharing mode. For example, the window in the sharing mode may display a border around each interface element in the currently displayed window; the border may be the border around each interface element shown in fig. 2B.
It can be understood that rendering in the sharing mode is not limited to displaying a frame around each interface element; it may also change the display effect of the interface elements, display text prompt information, and so on, which is described in detail in the related content of step S204 shown in fig. 9. In addition, the current display content may also be left unchanged; steps S105 to S106 are optional steps.
S107, the system view framework alters or creates the response behavior of the interface elements in the window of application A.
The electronic device 100 may alter or create the response behavior of the interface elements in the window of application A through the system view framework. After the response behavior of an interface element is altered or created, the electronic device 100 can trigger sharing of the interface element in response to a drag operation acting on the interface element.
Specifically, the system view framework may adjust the response behavior of an interface element to a specified operation, where the specified operation may be a drag operation, and the adjusted response may be sharing of the interface element.
It will be appreciated that the specified operation is not limited to a drag operation; this is not limited in the embodiments of the present application.
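As a loose analogy using Android's public drag-and-drop API, the sketch below shows how a specified operation on an element could be wired to a response that starts a drag carrying its transfer content; this illustrates the idea of an adjusted response behavior, not the patent's internal mechanism, and the class and method names are hypothetical.

```java
// Sketch only: give an interface element a response behavior in sharing mode
// so that the specified operation starts a drag carrying the transfer content.
// Uses Android's public drag-and-drop API as an analogy.
import android.content.ClipData;
import android.view.View;

public final class SharingModeBehavior {

    // Called for each interface element that is allowed to enter sharing mode.
    public static void enableSharingResponse(View element, String transferContent) {
        element.setOnLongClickListener(v -> {
            ClipData data = ClipData.newPlainText("shared-content", transferContent);
            // Hands the drag over to the system, which shows a drag shadow
            // that follows the touch point (compare the drag window in S115).
            v.startDragAndDrop(data, new View.DragShadowBuilder(v), null, 0);
            return true;
        });
    }
}
```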
Stage three (S108-S117): content sharing
S108, detecting the drag operation by the application A.
After entering the sharing mode, the electronic device 100 may detect a drag operation acting on an interface element in the window of application A. The drag operation may be a touch operation acting on the display screen, for example, the continuous drag operation of the user's finger on the display screen shown in fig. 2D to 2F.
It can be appreciated that steps S108 to S117 describe part of the content-sharing process between applications using a drag operation as the example of the sharing operation. The sharing operation is not limited to a drag operation; it may also be a click operation, a long-press-and-drag operation, and so on. Specifically, what form of sharing operation application A detects to trigger sharing of content is consistent with the specified operation adjusted by the system view framework in step S107.
S109, the application A generates a touch event according to the dragging operation.
The electronic device 100 may generate a touch event through application A according to the drag operation.
It can be appreciated that when the drag operation is a touch operation on the display screen, the input event generated by the application a according to the drag operation is a touch event. In some embodiments, when the drag operation is an operation triggered by the user through the mouse, the input event generated by the application a according to the drag operation is a mouse event. The embodiment of the application does not limit the type of the input event.
The input event may include information such as an event type, coordinates, and time. The event types may include a down event, a move event, and an up event, where the down event represents the start of a user gesture, the up event represents the end of a user gesture, and the move event represents the progress of a user gesture. An input event triggered by a user gesture may include a down event, a plurality of move events, and an up event. The event type in the input event indicates whether the user's operation is specifically a drag operation, a click operation, a long-press-and-drag operation, or the like. The coordinates refer to the position of the sharing operation on the display screen, and the time refers to the time at which the user triggered the sharing operation.
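For illustration, an input event as described above might carry fields like the following; this is a sketch, not the actual event class used by the framework.

```java
// Sketch only: the information an input event might carry, per the
// description above; not the framework's actual event definition.
public final class InputEventInfo {

    enum Type {
        DOWN, // start of a user gesture
        MOVE, // progress of a user gesture
        UP    // end of a user gesture
    }

    final Type type;   // distinguishes drag, click, long-press-and-drag, etc.
    final float x, y;  // position of the sharing operation on the display screen
    final long timeMs; // time at which the user triggered the sharing operation

    InputEventInfo(Type type, float x, float y, long timeMs) {
        this.type = type;
        this.x = x;
        this.y = y;
        this.timeMs = timeMs;
    }
}
```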
S110, the application A sends the touch event to the system window framework.
The electronic device 100 may send the touch event to the system window framework through application A.
S111, the system window framework sends the touch event to the system view framework.
The electronic device 100 may send the touch event to the system view framework through the system window framework.
S112, the system view framework triggers a screenshot of the target interface element according to the touch event, and determines the screenshot as the content displayed during the drag.
The target interface element is the interface element on which the drag operation acts when the electronic device 100 first detects the drag operation. For example, the target interface element may be the interface element pointed to by the touch point when the user's finger first touches the display screen to initiate the drag operation.
The electronic device 100 may trigger a screenshot of the target interface element according to the touch event through the system view framework, and determine the screenshot as the content displayed during the drag. According to the coordinate information corresponding to the down event included in the touch event, the system view framework may find the target interface element on which the drag operation acts by traversing the interface elements in the current window of application A. It may then determine, according to the response behavior adjusted for the target interface element in step S107, whether the operation currently acting on the target interface element is the specified operation, and if so, trigger execution of the adjusted response.
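A minimal sketch of such a hit test, assuming the coordinates of the down event are available: the view tree is walked, and the deepest element whose on-screen bounds contain the point is treated as the target interface element. The class name is hypothetical.

```java
// Sketch only: find the target interface element under the down event's
// coordinates by walking the view tree. Names are hypothetical.
import android.graphics.Rect;
import android.view.View;
import android.view.ViewGroup;

public final class TargetFinder {

    // Returns the deepest view whose on-screen bounds contain (x, y), or null.
    public static View findTarget(View root, int x, int y) {
        int[] location = new int[2];
        root.getLocationOnScreen(location);
        Rect bounds = new Rect(location[0], location[1],
                location[0] + root.getWidth(),
                location[1] + root.getHeight());
        if (!bounds.contains(x, y)) {
            return null;
        }
        if (root instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) root;
            // Iterate from the last child down, since later children are
            // typically drawn on top of earlier ones.
            for (int i = group.getChildCount() - 1; i >= 0; i--) {
                View hit = findTarget(group.getChildAt(i), x, y);
                if (hit != null) {
                    return hit;
                }
            }
        }
        return root; // no child contains the point: this view is the target
    }
}
```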
The electronic device 100 may display a screenshot of the target interface element during the drag after detecting a drag operation of the user on the target interface element.
Illustratively, the target interface element may refer to the picture control 103A shown in fig. 2C, that is, the interface element pointed to by the gesture 1 shown in fig. 2C.
It is understood that step S112 is an optional step.
S113, the system view framework acquires the transmission content for the target interface element.
The electronic device 100 may obtain the content of the target interface element through the system view framework, and determine the content as the content transmitted by the sharing process. For example, when the content corresponding to the target interface element is a picture, the content shared by the electronic device 100 is the picture.
The system view framework may obtain the transmission content for the target interface element in the following two ways:
1) Acquiring the transmission content from the information carried by the target interface element
Specifically, the system view framework may acquire information carried by the target interface element through an external interface of the target interface element.
Illustratively, for a Text control, a developer can set the data corresponding to the Text control through the setString interface of the Text control. When the transmission content of the Text control is acquired, the data corresponding to the Text control can be obtained through the getString interface, and this data is used as the transmission content of the Text control.
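A minimal sketch of way 1), using the setString/getString interfaces as named above; these follow the document's example and are not standard framework APIs.

```java
// Sketch only: a control that carries its own transfer content, following the
// setString/getString interfaces described above; not a standard framework API.
public class TextControl {

    private String data;

    // Way 1): the developer attaches the data corresponding to the control.
    public void setString(String data) {
        this.data = data;
    }

    // The view framework later reads the data back as the transfer content.
    public String getString() {
        return data;
    }
}
```

During sharing, the view framework would call getString() on the target control and use the returned data as the transmission content.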
2) Obtaining the transmission content from a pre-adapted configuration file
The configuration file of a control indicates information such as the attributes, layout, size, and location of the control. When configuring the configuration file of a control, the developer can write the transmission content of the control into the configuration file in advance, thereby adapting to the sharing mode in advance. In this way, when the transmission content of the control needs to be acquired, the transmission content corresponding to the control can be found in the configuration file of the control.
Some exemplary code of a configuration file for a Text control is shown below. The snippet is a sketch reconstructed around the android:dragContent attribute described next, so the exact attribute spelling and the surrounding attributes are assumptions:
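```xml
<!-- Sketch of a Text control configuration pre-adapted to the sharing mode;
     the android:dragContent attribute carries the control's transfer content. -->
<TextView xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Henry"
    android:dragContent="Henry" />
```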
Here, android:dragContent="Henry" is the code added to the configuration file in advance by the developer to adapt to the sharing mode. When the transmission content of the control needs to be acquired, the system view framework can determine, through this pre-added content, that the transmission content of the Text control is the text "Henry".
It should be understood that the transmission content is the content to which the interface element corresponds, and the interface element is a presentation form of the transmission content for the user. The content transmitted by the electronic device 100 may include: text, picture, voice, form, video, file, etc., the embodiments of the present application do not limit the content of the transmission.
S114, the system view framework sends the screenshot of the target interface element and the transmission content to the system drag service.
The electronic device 100 may send the screenshot of the target interface element and the transmission content to the system drag service through the system view framework.
It will be appreciated that when step S112 is an optional step, the system view framework may send only the transmission content to the system drag service.
S115, the system drag service generates a drag window, and a screenshot is displayed in the drag window.
During the user's drag, the electronic device 100 may display the screenshot of the target interface element following the touch point of the user on the display screen. Specifically, the electronic device 100 may generate a drag window through the system drag service and display the screenshot of the target interface element in the drag window; the system drag service may move the drag window synchronously according to the position of the touch point of the user's drag operation on the display screen, so as to achieve the effect that the screenshot of the target interface element moves with the user's drag operation. For example, referring to fig. 2C-2F, the screenshot displayed during the drag may be the screenshot picture 103B.
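A minimal sketch of the drag-window behavior described above; the DragWindow abstraction and all method names are hypothetical, and a real implementation would live inside the framework and drive an actual floating window.

```java
// Sketch only: the system drag service shows the screenshot in a floating
// drag window and moves it to follow the touch point. Names are hypothetical.
import android.graphics.Bitmap;

public final class SystemDragServiceSketch {

    // Hypothetical floating window that renders a screenshot at a position.
    static final class DragWindow {
        final Bitmap screenshot;
        int x, y;

        DragWindow(Bitmap screenshot) {
            this.screenshot = screenshot;
        }

        void moveTo(int x, int y) {
            this.x = x;
            this.y = y;
            // A real implementation would update the window's layout
            // parameters and re-draw; omitted here.
        }
    }

    private DragWindow dragWindow;

    // S115: generate the drag window and display the screenshot in it.
    public void startDrag(Bitmap screenshot) {
        dragWindow = new DragWindow(screenshot);
    }

    // Called for every move event so the screenshot follows the drag.
    public void onTouchMoved(int x, int y) {
        if (dragWindow != null) {
            dragWindow.moveTo(x, y);
        }
    }
}
```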
It is understood that step S115 is an optional step.
S116, the system drag service sends the transmission content to the application B.
The electronic device 100 may send the transmission content to the application B through the system drag service.
Application B may be a preset designated application, such as the desktop or a memo application. Alternatively, application B may be an application determined under a preset rule, for example, the application most recently opened by the user, or the application used most frequently by the user. Alternatively, application B may be an application selected by the user's sharing operation; for example, when the sharing operation is a drag operation and the electronic device 100 detects that the user drags the target interface element to the application icon of application B, the system drag service may send the transmission content to application B. The embodiments of the present application do not limit application B.
S117, displaying the transmission content or the identification of the transmission content in the window of the application B by the application B.
After the electronic device 100 sends the transmission content to the application B, the transmission content or an identification of the transmission content may be displayed in the window of the application B. In the embodiment of the present application, the transmission content and the identification of the transmission content may also be referred to as a second interface element. For a specific explanation of the interface elements, reference may be made to the foregoing.
For example, when the transmission content is a picture or text, application B may start displaying a window of application B and display the picture or text directly in the window. For another example, when the transmission content is a video, application B may start displaying a window of application B and display a playing interface of the video in the window, where a frame of the video may be displayed in the playing interface and the playing interface may include a play control used to trigger playing of the video. For another example, when the transmission content is voice, application B may start displaying a window of application B and display a play icon of the voice in the window, where the play icon may be used to trigger playing of the voice.
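As a sketch of how application B might choose a presentation form per content type, along the lines just described (the types and method names are hypothetical):

```java
// Sketch only: application B picks a presentation form for the received
// transfer content based on its type. Types and methods are hypothetical.
public final class ReceivedContentPresenter {

    enum ContentType { PICTURE, TEXT, VIDEO, VOICE }

    public void present(ContentType type) {
        switch (type) {
            case PICTURE:
            case TEXT:
                showDirectly();          // display the picture or text as-is
                break;
            case VIDEO:
                showPlaybackInterface(); // one video frame plus a play control
                break;
            case VOICE:
                showPlayIcon();          // an icon that triggers voice playback
                break;
        }
    }

    private void showDirectly() { /* omitted */ }
    private void showPlaybackInterface() { /* omitted */ }
    private void showPlayIcon() { /* omitted */ }
}
```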
It should be understood that the identifier of the transmission content is a display form of the transmission content, and the display form may be represented as a preset icon or a screenshot of a target interface element, which is not limited in the embodiment of the present application.
Illustratively, the window of the application B may refer to the user interface 20 as shown in fig. 2G, and the transmission content or the identification of the transmission content displayed in the user interface may refer to the picture 203A as shown in fig. 2G.
In addition, it should be noted that display of the window of application B may be started by application B before step S117, or, after application B acquires the transmission content, application B may first trigger display of its window in step S117 and then display the transmission content or the identifier of the transmission content in that window.
It can be appreciated that, in addition to sending the transmission content to application B, the system drag service may send the transmission content to other devices, so as to implement content sharing between devices. In addition, the foregoing steps S101-S117 implement content sharing from application A to application B; it should be understood that application A and application B may also be the same application, which is not limited in the embodiments of the present application. As can be seen from steps S101-S117, the electronic device 100 implements content sharing from application A to application B through the system window framework, the system view framework, and the system drag service in the application framework layer. The system view framework automatically adjusts the response behavior of interface elements to the drag operation and determines the content displayed and transmitted during the drag. This achieves system-level content sharing: developers do not need to adapt each application individually or manually declare which content in an application can be dragged and what is displayed and transmitted during the drag. Content sharing between applications can thus be achieved while reducing developers' workload and broadening the application scenarios of content sharing.
Fig. 9 is a flowchart of a content sharing method according to an embodiment of the present application.
As shown in fig. 9, the method includes:
s201. the electronic device 100 displays the first window.
The first window may be a window of a first application. The electronic device 100 may detect an operation of an application icon of the first application by a user, for example, a click operation, and display a first window of the first application in response to the operation. Illustratively, the display in this first window may refer to the user interface 10 shown in FIG. 2A.
One or more interface elements can be included in the first window, where an interface element refers to a series of elements in the user interface that meet user interaction requirements, and includes: pictures, text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and the like, as well as combinations of these controls.
In a specific implementation, the electronic device 100 may detect the launch operation through application A and start displaying the user interface of application A. The first application may refer to application A, and the display content in the first window may refer to the user interface of application A; for details, see the foregoing steps S101 to S102.
S202, the electronic device 100 detects an operation of entering a sharing mode, wherein the operation acts on a first window.
The operation may refer to an operation acting on the first window. Illustratively, the operation may refer to the operation shown in FIG. 2A, such as a quick three-click operation. Alternatively, the operation may also refer to an operation that acts on the pull-down menu to turn on the sharing mode, which is not limited in the embodiments of the present application. In the embodiment of the present application, this operation may also be referred to as a first operation, which is used to trigger the electronic device 100 to enter the sharing mode.
After entering the sharing mode, the electronic device 100 may adjust the response behavior of one or more interface elements in the first window, so that the electronic device 100 can detect a sharing operation acting on an interface element and trigger sharing of the interface element; see the description of step S203 below for details, which are not expanded here. In addition, after entering the sharing mode, the electronic device 100 may apply rendering in the sharing mode to the first window, that is, change the display content to visually remind the user that the sharing mode has been entered; see the description of step S204 below for details.
In other words, for the same operation, the response behavior of the electronic device 100 to the same interface element is different before entering the sharing mode and after entering the sharing mode, or the interface element does not have a response behavior before entering the sharing mode.
This operation may refer to the operation mentioned in step S103 above. See for a detailed description of step S103 above.
Further, the electronic device 100 may control only some of the interface elements in the first window to enter the sharing mode, or the electronic device 100 may not respond to the operation of entering the sharing mode at all. In this way, only a portion of the interface elements in the first window are allowed to support sharing, and the other portion is prohibited from sharing. This is intended for applications that carry security- or privacy-sensitive information: when content sharing within the application is not desired, the developer can decide which content supports or does not support sharing, thereby protecting the user's security and privacy.
The electronic device 100 may provide the following three levels of prohibited sharing:
1) Window level
The electronic device 100 prohibits the entire user interface from entering the sharing mode.
That is, after the electronic device 100 detects the operation of entering the sharing mode, the electronic device 100 does not enter the sharing mode.
2) Layout level
The electronic device 100 may prohibit a partial region of the user interface, which may include a plurality of controls, from entering the sharing mode.
That is, after the electronic device 100 detects the operation of entering the sharing mode, the electronic device 100 may control only the first area in the first window to enter the sharing mode. Then after entering the sharing mode, the electronic device 100 adjusts the response behavior of the interface element only in the first region of the first window, and only starts rendering in the sharing mode for the content of the first region in the first window.
3) Control level
The electronic device 100 may prohibit a portion of the controls in the user interface from entering the sharing mode.
That is, after the electronic device 100 detects the operation of entering the sharing mode, the electronic device 100 may only control other interface elements except the first control to enter the sharing mode. Then after entering the sharing mode, the electronic device 100 adjusts the response behavior of only the interface elements in the first window except for the first control, and initiates rendering in the sharing mode only for the content in the first window except for the first control.
Illustratively, when the developer customizes the content that is prohibited from being shared in an application, prohibited sharing at the different levels can be implemented by calling the 'enableSystemDrag' interface in the XML configuration files at each of the three levels.
Some partial exemplary code of a layout-level configuration file is shown below. The snippet is a sketch reconstructed around the android:enableSystemDrag attribute described next, so the exact attribute spelling and the surrounding attributes are assumptions:
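```xml
<!-- Sketch of a layout-level configuration that prohibits this layout, and
     the controls inside it, from entering the sharing mode. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:orientation="vertical"
    android:enableSystemDrag="false">

    <!-- controls in this layout will not support sharing -->

</LinearLayout>
```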
Here, android:enableSystemDrag="false" is the code added to the layout's configuration file to prohibit the layout from entering the sharing mode.
It can be appreciated that the execution sequence of step S201 and step S202 is not limited in this embodiment, for example, the electronic device 100 may detect the operation of entering the sharing mode first and then display the first window.
S203. the electronic device 100 alters or creates a response behavior of the interface element in the first window.
The response behavior of the interface element refers to a response performed by the electronic device 100 after detecting a specified operation acting on the interface element.
The electronic device 100 alters or creates the response behavior of the interface element in the first window, so that when the electronic device 100 detects the sharing operation acting on the interface element in the first window after entering the sharing mode, the response executed by the electronic device 100 is triggering the sharing of the interface element.
The electronic device 100 altering the response behavior of the interface element means that, before entering the sharing mode, the electronic device 100 executes a certain response when detecting an operation that acts on the interface element and is the same as the sharing operation, and after entering the sharing mode, the electronic device 100 changes that response to triggering the sharing of the interface element.
The electronic device 100 creating the response behavior of the interface element means that, before entering the sharing mode, the electronic device 100 does not execute any response when detecting an operation that acts on the interface element and is the same as the sharing operation, and after entering the sharing mode, the electronic device 100 sets the response of the interface element to triggering the sharing of the interface element.
In particular implementations, the electronic device 100 may alter or create the response behavior of the interface element in the first window through the system view framework. The details of step S107 are specifically referred to in the foregoing description, and will not be repeated here.
It should be appreciated that the electronic device 100 may alter or create the response behavior of only a portion of the interface elements in the first window. The portion of the interface elements may be interface elements set by the developer that allow entry into the sharing mode. After entering the sharing mode, the electronic device 100 may adjust the response behaviors of the M interface elements in the first window, so that the electronic device 100 may trigger sharing of the target interface element in response to the sharing operation of the target interface element acting on the M interface elements. Wherein M is more than or equal to 1, and M is a positive integer. The description of entering the sharing mode with only part of the interface elements can be referred to the relevant content of step S202, and will not be repeated here.
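As a non-authoritative sketch of step S203 on Android (the enterSharingMode function, the isShareable predicate, and the choice of a long press as the sharing trigger are illustrative assumptions; the view-tree walk, ClipData, and startDragAndDrop, available from API level 24, are real platform calls), a system view framework might alter or create the response behaviors roughly as follows:

    import android.content.ClipData
    import android.view.View
    import android.view.ViewGroup

    // Sketch: walk the window's view tree and, for every interface element that
    // is allowed to enter the sharing mode, alter (or create) its long-press
    // response so that it triggers a system drag carrying the element's content.
    fun enterSharingMode(root: View, isShareable: (View) -> Boolean) {
        if (root is ViewGroup) {
            for (i in 0 until root.childCount) {
                enterSharingMode(root.getChildAt(i), isShareable)
            }
        }
        if (isShareable(root)) {
            root.setOnLongClickListener { view ->
                // A plain-text ClipData stands in for the transmission content
                // that step S206 would actually acquire for this element.
                val clip = ClipData.newPlainText("share", view.contentDescription ?: "")
                view.startDragAndDrop(clip, View.DragShadowBuilder(view), null, 0)
            }
        }
    }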
S204, the electronic device 100 displays a first window in the sharing mode.
In response to the operation, the electronic device 100 enters a sharing mode, and starts rendering in the sharing mode on the first window, that is, displays the first window in the sharing mode. The first window in the sharing mode is different from the first window before entering the sharing mode. Specifically, after entering the sharing mode, the electronic device 100 may display a prompt (e.g., a first prompt) in the first window, where the prompt is used to indicate that the electronic device 100 has entered the sharing mode.
The prompt message may be expressed as:
1) Newly added information
That is, in response to the operation of entering the sharing mode, the electronic device 100 may add information in the first window. The change in the information displayed in the first window before and after entering the sharing mode reminds the user that the sharing mode has now been entered.
For example, the prompt information may include a frame newly added around each interface element in the first window after entering the sharing mode. As shown in (a) of fig. 2B, the interface elements of the photographing option, the focus option, the recommendation option, the city option, and the more option in the first menu bar 102 of the user interface 10; the head portrait, the name, the posted entertainment content, and the sharing option, comment option, and like option for the entertainment content of a user in the browsing area 103; and the home page option, the discovery option, the message option, and the local option in the second menu bar 104 are each surrounded by a rectangular border.
In a specific implementation, after the electronic device 100 enters the sharing mode, it may traverse the interface elements in the first window, determine the size and position information of each interface element, determine the size and position of the frame of each interface element accordingly, and display the frame of each interface element in the first window. For example, assuming that the electronic device 100 obtains the length L and the width W of the first interface element and the position of the first interface element in the first window, the electronic device 100 may display a rectangular frame of length L and width W at that position after entering the sharing mode.
The frame may be used to prompt the user that the electronic device 100 has entered the sharing mode; further, the frame may also prompt the user that the electronic device 100 can detect a sharing operation acting on the interface element surrounded by the frame and trigger sharing of that interface element.
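A minimal sketch of adding such a frame, using the view's foreground drawable so the border is automatically sized to the element's own bounds (the stroke width and color are illustrative choices; setting a foreground on arbitrary views requires API level 23 or later):

    import android.graphics.Color
    import android.graphics.drawable.GradientDrawable
    import android.view.View

    // Sketch: draw a rectangular border over an interface element to mark it
    // as shareable after the sharing mode is entered.
    fun addShareFrame(element: View) {
        element.foreground = GradientDrawable().apply {
            setStroke(4, Color.BLUE)       // 4 px border, illustrative color
            setColor(Color.TRANSPARENT)    // keep the element's content visible
        }
    }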
The electronic device 100 may determine the size and location information of each interface element in the following two ways:
a) Obtaining size and position information of each interface element from configuration file of interface element
In this case, the configuration file of the interface element may include size and position information of the interface element. The electronic device 100 may obtain the size and location information of the interface element from the configuration file of each interface element in the process of traversing the interface element in the first window.
b) Calculating the size and position information of each interface element according to the layout of each interface element in the first window
In this case, the electronic device 100 needs to calculate the size and position information of each interface element according to the placement and arrangement of the interface elements in the first window and the relative relationships (such as the positional relationship, the size relationship, and the like) between them. For example, assume that there are two interface elements, a first control and a second control: the length of the first control is X, and the length of the second control is Y times the length of the first control, so the length of the second control can be calculated as XY from the length relationship between the two controls. For another example, if three controls are known to be side by side in a trisected layout, the size and position of the three controls can be determined relative to the size and position of the window.
It is to be understood that the manner in which the electronic device 100 determines the size and position information of each interface element is not limited to the two ways described above, for example, the electronic device 100 may combine the two ways described above, a part of the interface elements may directly obtain the size and position information of the interface element from the configuration file, and another part of the interface elements may combine the layout of each interface element to calculate the size and position information of the interface element.
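A minimal sketch of collecting such size and position information directly from the laid-out views (real platform calls; drawing the frames themselves is omitted):

    import android.graphics.Rect
    import android.view.View
    import android.view.ViewGroup

    // Sketch: collect a bounding rectangle, in window coordinates, for every
    // interface element in the first window, so that a frame of matching size
    // can later be displayed at the element's position.
    fun collectElementBounds(root: View, out: MutableList<Rect>) {
        val loc = IntArray(2)
        root.getLocationInWindow(loc)   // position of the element in the window
        out.add(Rect(loc[0], loc[1], loc[0] + root.width, loc[1] + root.height))
        if (root is ViewGroup) {
            for (i in 0 until root.childCount) {
                collectElementBounds(root.getChildAt(i), out)
            }
        }
    }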
It can be appreciated that the embodiment of the application does not limit the expression form of the first prompt information, and the first prompt information may be represented as a graphic, an icon, a text, or the like. For example, after entering the sharing mode, the electronic device 100 may display a text prompt "the sharing mode has been entered; please select the content to be shared" at the bottom of the first window. For another example, after entering the sharing mode, the electronic device 100 may display a sharing icon on each interface element of the first window.
2) Changed display effect of interface elements
That is, the electronic device 100 may change the display effect of each interface element in the first window after entering the sharing mode.
The display effects may include static effects such as position, size, color, transparency, shading, saturation, and brightness, as well as dynamic effects such as shaking. For example, after entering the sharing mode, the electronic device 100 may reduce the saturation of the interface elements in the first window. For another example, after entering the sharing mode, the electronic device 100 may display the interface elements in the first window with a shaking animation.
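A minimal sketch of such a dynamic effect, implemented as a short horizontal shake with the platform animator (the amplitude and duration are illustrative):

    import android.animation.ObjectAnimator
    import android.view.View

    // Sketch: play a brief shake on an interface element to signal that the
    // sharing mode has been entered.
    fun shake(element: View) {
        ObjectAnimator.ofFloat(element, View.TRANSLATION_X, 0f, 8f, -8f, 6f, -6f, 0f).apply {
            duration = 400L   // milliseconds
            start()
        }
    }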
It is to be understood that the difference between the first window in the sharing mode and the first window before entering the sharing mode is not limited to the above two types, which is not limited in the embodiment of the present application. In a specific implementation, the electronic device 100 may apply rendering in the sharing mode to the currently displayed first window through the system view framework; for details, refer to the related content of the foregoing step S106, which is not described herein again.
In addition, when the first window includes an interface element that does not support sharing, the electronic device 100 may display the first window in the sharing mode in one or both of the following manners:
1) The electronic device 100 renders only the interface elements supporting sharing in the sharing mode
That is, after the electronic device 100 enters the sharing mode, the electronic device 100 may display the prompt information only in the areas of the interface elements that support sharing, for example, display the border only around the interface elements that support sharing, or change only the display effect of the interface elements that support sharing.
Illustratively, referring to (b) of fig. 2B, the interface elements in the first menu bar 102 and the second menu bar 104 do not include a border, while the interface elements in the browsing area 103 include a border.
It can be seen that, in addition to prompting the user that the electronic device 100 has entered the sharing mode, the prompt information displayed in the first window may also be used to highlight the interface elements in the first window that are allowed to enter the sharing mode.
2) The electronic device 100 displays only interface elements that support sharing
That is, after the electronic device 100 enters the sharing mode, the electronic device 100 may display only the interface elements supporting sharing, and stop displaying the interface elements not supporting sharing, such as the third interface element.
For example, referring to (a) in fig. 2B, the electronic device 100 displays only the content in the browsing area 103.
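A minimal sketch of this second manner (the isShareable predicate is again an illustrative assumption):

    import android.view.View
    import android.view.ViewGroup

    // Sketch: stop displaying every interface element that does not support
    // sharing while the sharing mode is active.
    fun hideNonShareable(root: View, isShareable: (View) -> Boolean) {
        if (!isShareable(root)) {
            root.visibility = View.INVISIBLE   // keep the layout, stop displaying
            return
        }
        if (root is ViewGroup) {
            for (i in 0 until root.childCount) {
                hideNonShareable(root.getChildAt(i), isShareable)
            }
        }
    }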
It will be appreciated that step S204 is optional.
S205, the electronic device 100 detects a sharing operation acting on a first interface element in a first window.
Since the electronic device 100 has set the response behavior of each interface element in the first window for the sharing operation, when the electronic device 100 detects a sharing operation acting on the first interface element in the first window, the electronic device 100 may trigger sharing of the first interface element in response to the operation.
The sharing operation may be a drag operation. Illustratively, the first interface element may refer to the screenshot 103B shown in fig. 2C, and the sharing operation may refer to the drag operation shown in fig. 2C-2F, which may include the drag operation from gesture 1 to gesture 2 shown in fig. 2C, the drag operation from gesture 2 to gesture 3 shown in fig. 2D, the drag operation from gesture 3 to gesture 4 shown in fig. 2E, and the end of the drag operation (a lift operation) at gesture 5 shown in fig. 2F. The sharing operation may also refer to the drag operation shown in fig. 4A-4B, which may include the drag operation from gesture 1 to gesture 2 shown in fig. 4A and the drag operation from gesture 2 to gesture 3 shown in fig. 4B. Further, the drag operation may refer to a sliding operation from the position where the first interface element is located to a specified position.
Further, the first interface element may include one or more interface elements. When the first interface element includes a plurality of interface elements, the sharing operation may be used to trigger sharing of the plurality of interface elements. In this case, the sharing operation may further include a selection operation acting on the interface elements in addition to the drag operation. Illustratively, the first interface element may include the picture control 103A and the picture control 103C shown in fig. 3B, and the sharing operation may include a click operation on the picture control 103A as shown in fig. 3A, a click operation on the picture control 103C as shown in fig. 3B, and a drag operation from gesture 1 to gesture 2 as shown in fig. 3D.
It can be appreciated that the sharing operation is not limited in the embodiment of the present application, and the sharing operation may be one operation or a series of operations, and the sharing operation may be a drag operation, a click operation, or a long-press and drag operation.
S206, the electronic device 100 obtains the transmission content corresponding to the first interface element.
After the electronic device 100 detects the sharing operation acting on the first interface element, the electronic device 100 may acquire the transmission content for the first interface element.
The electronic device 100 may obtain the transmission content of the first interface element in two ways: 1) Acquiring the transmission content from information carried by the first interface element, and 2) acquiring the transmission content from a pre-adapted configuration file. For a detailed description of these two acquisition modes, reference may be made to the content related to step S113 in fig. 6, and details are not repeated here.
In a specific implementation, the electronic device 100 may acquire the transmission content for the first interface element through a system view framework. The details of step S113 may be referred to in the foregoing description, and will not be described herein.
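A minimal sketch of the first acquisition manner (the tag-based Uri convention and the fallbacks are illustrative assumptions rather than a mandated mechanism; ClipData stands in for the transmission content):

    import android.content.ClipData
    import android.net.Uri
    import android.view.View
    import android.widget.TextView

    // Sketch: derive the transmission content from information the interface
    // element already carries. A hypothetical convention: image-like elements
    // stash a content Uri in their tag.
    fun transmissionContentOf(element: View): ClipData = when {
        element.tag is Uri -> ClipData.newRawUri("shared-uri", element.tag as Uri)
        element is TextView -> ClipData.newPlainText("shared-text", element.text)
        else -> ClipData.newPlainText("shared-text", element.contentDescription ?: "")
    }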
S207, the electronic device 100 triggers sharing of the transmission content.
The sharing of the transmission content by the electronic device 100 may include the following two cases:
1) Electronic device 100 performs inter-window sharing of transmission content
That is, the electronic device 100 may share the transmission content from one window to another window. Specifically, the electronic device 100 may display, in response to the sharing operation, a second interface element corresponding to the first interface element in the second window. Wherein the second interface element comprises: the transmission content or an identification of the transmission content. The identifier of the transmission content may be an icon or a screenshot of the first interface element, and the description of the identifier may refer to the related content and is not repeated herein.
For example, the transmission content or the identification of the transmission content may refer to the picture 203A, or the transmission content or the identification of the transmission content may refer to the picture 203C and the picture 203D.
Further, sharing among windows can be divided into two types:
a) Sharing within applications
At this time, the first window and the second window belong to the same application. For example, the first window and the second window may display the content of different pages of the application.
b) Sharing among applications
At this time, the first window and the second window belong to windows of different applications. For example, the first window is a window of a first application, and the second window is a window of a second application.
Thus, the electronic device 100 can realize drag sharing of content not only within an application but also between applications, which improves the flexibility of content sharing.
In addition, it should be noted that the electronic device 100 may not display the first window when displaying the second window. In this way, the electronic device 100 completes window switching while completing the content sharing. Illustratively, the second window may refer to the user interface 20 shown in fig. 2G. Alternatively, the electronic device 100 may still display the first window on the same interface while displaying the second window. In this way, the electronic device 100 can display the content sharing party and the content receiving party simultaneously when realizing content sharing between windows. Illustratively, the user interface may refer to the user interface 30 shown in (b) of fig. 2F, with the first window displayed in region 301 and the second window displayed in region 302 of the user interface 30.
2) Electronic device 100 performs inter-device sharing of transmission content
That is, the electronic device 100 may share the transmission content from the electronic device 100 to another device (e.g., a second device). The other device may be a device that has a connection relationship with the electronic device 100, or a device that belongs to the same account or group as the electronic device 100.
In some embodiments, the sharing operation may be a drag operation, and before the electronic device 100 shares the transmission content corresponding to the first interface element to another window or other devices, the electronic device 100 may display a screenshot of the first interface element that is moved according to the drag operation.
Specifically, when the electronic device 100 detects the sharing operation acting on the first interface element, the electronic device 100 may perform a screenshot on the first interface element to obtain a screenshot of the first interface element, and then, the electronic device 100 may display the screenshot, and the screenshot may be moved according to a movement track of the sharing operation of the user. The screenshot may be, for example, screenshot 103B as shown in fig. 2D-2E, screenshot 103D as shown in fig. 3D, or screenshot 103B as shown in fig. 4B.
In some embodiments, when the electronic device 100 detects the sharing operation, the electronic device 100 may trigger display of a plurality of application or device icons for receiving the transmission content, and after detecting the user's selection operation on a target application or target device icon, the electronic device 100 shares the transmission content to the target application or target device. In this way, the user can autonomously select the target application or target device that receives the transmission content according to the user's own needs, which improves operability for the user.
Further, when the sharing operation is a drag operation and the electronic device 100 performs sharing between applications, the electronic device 100 may trigger display of a receiver of the transmission content upon detecting that the drag operation has moved from the position of the first interface element to a specified position (e.g., the bottom of the display screen), or upon detecting a specified movement track (e.g., a downward movement) of the drag operation. Taking the receiver being an application list that includes a plurality of application icons as an example, when the drag operation is specifically from the position of the first interface element to the position of one of the application icons in the application list, sharing of the transmission content corresponding to the first interface element into that application is triggered. Illustratively, the application list may refer to the application list 105 shown in fig. 2D or fig. 2E, and the device icons may refer to the device list 106 shown in fig. 4A.
It is to be understood that the recipient of the transmission content displayed by the electronic device 100 is not limited to the above-mentioned application list, and the electronic device 100 may also trigger display of the recipient of the transmission content upon detecting the operation of entering the sharing mode.
The recipient may also be a list of devices, for example. The device list may display icons of a plurality of devices, and when the drag operation is specifically from the position of the first interface element to the position of one of the device icons in the device list, the electronic device 100 may trigger sharing the transmission content corresponding to the first interface element to the device corresponding to the device icon. At this time, the electronic device 100 may be referred to as a first device, and the device corresponding to the device icon may be referred to as a second device.
For another example, the recipient may also be a contact list. The electronic device 100 may trigger sharing the transmission content corresponding to the first interface element to the device used by the contact when the drag operation is specifically from the position of the first interface element to the position of one of the contact icons in the contact list. At this time, the electronic device 100 may be referred to as a first device, and the device used by the contact may be referred to as a second device. The contacts may refer to phone contacts pre-stored in the electronic device 100, or contacts in a designated application (e.g., a WeChat application), or contacts in an account group, and the embodiment of the present application does not limit the contacts. The embodiment of the application does not limit the expression form of the receiver.
When the drag operation is specifically a sliding operation from the position of the first interface element to a designated position, the designated position may be the position of one of the application icons in the above application list, in which case the electronic device 100 may trigger inter-window sharing of the transmission content and display the transmission content or the identifier of the transmission content in the window of that application; alternatively, the designated position may be the position of one of the device icons in the above device list, or the position of one of the contact icons in the contact list, in which case the electronic device 100 may trigger sending of the transmission content corresponding to the first interface element to the other device. A sketch of the receiving side is given below.
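This sketch shows how a receiver icon might accept the dropped content using the platform drag-and-drop listener (View.OnDragListener and DragEvent are real Android APIs; the handleSharedClip callback is a placeholder for receiver-specific handling, such as displaying the content in an application window or sending it to a second device):

    import android.content.ClipData
    import android.view.DragEvent
    import android.view.View

    // Sketch: a receiver icon (an application, device, or contact icon) accepts
    // the dragged transmission content when the drag operation ends on top of it.
    fun makeDropTarget(icon: View, handleSharedClip: (ClipData) -> Unit) {
        icon.setOnDragListener { _, event ->
            when (event.action) {
                DragEvent.ACTION_DRAG_STARTED -> true   // declare interest in the drag
                DragEvent.ACTION_DROP -> {
                    handleSharedClip(event.clipData)    // forward content to the receiver
                    true
                }
                else -> true
            }
        }
    }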
In addition, the first interface element may include one or more interface elements. In this way, the electronic device 100 may complete sharing of one or more interface elements through one sharing operation. When the first interface element includes a plurality of interface elements (e.g., N interface elements, where N ≥ 2 and N is a positive integer), the electronic device 100 may detect a selection operation acting on the plurality of interface elements before detecting the sharing operation acting on them. Illustratively, the selection operation may refer to the selection operation on the picture control 103A shown in fig. 3A and the selection operation on the picture control 103C shown in fig. 3B, where the first interface element includes the picture control 103A and the picture control 103C. According to the content sharing method provided above, by quickly entering the sharing mode, the user can quickly complete the sharing of content between different applications or different devices, which enlarges the application scenarios of content sharing and facilitates the user's operation.
In the embodiment of the present application, the above steps S201 to S207 may be performed by a system unit of the electronic device 100, which may be located at the framework layer of the electronic device 100; in other words, the system unit and the application to which the first window belongs are different modules of the electronic device 100. That is, the sharing mode may be defined as a system-level drag sharing mode, so that any application under the system can respond to the first operation and enter the sharing mode, thereby realizing content sharing for that application, enlarging the application scenarios of content sharing, and improving the user's experience of drag sharing. For a description of the framework layer of the electronic device 100, refer to the foregoing related content of fig. 5, which is not repeated here.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor or in a combination of hardware and software modules in a processor.
The application also provides an electronic device, which may include: memory and a processor. Wherein the memory is operable to store a computer program; the processor may be configured to invoke a computer program in the memory to cause the electronic device to perform the method performed by the electronic device 100 in any of the embodiments described above.
The present application also provides a chip system comprising at least one processor for implementing the functions involved in the method performed by the electronic device 100 in any of the above embodiments.
In one possible design, the system on a chip further includes a memory to hold program instructions and data, the memory being located either within the processor or external to the processor.
The chip system may be formed of a chip or may include a chip and other discrete devices.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, the memory in the system-on-chip may be one or more. The memory may be integrated with the processor or may be separate from the processor, and embodiments of the present application are not limited. For example, the memory may be a non-transitory processor, such as a ROM, which may be integrated on the same chip as the processor, or may be separately disposed on different chips, and the type of memory and the manner of disposing the memory and the processor in the embodiments of the present application are not specifically limited.
Illustratively, the system-on-chip may be a field programmable gate array (field programmable gate array, FPGA), an application specific integrated chip (application specific integrated circuit, ASIC), a system on chip (SoC), a central processing unit (central processor unit, CPU), a network processor (network processor, NP), a digital signal processing circuit (digital signal processor, DSP), a microcontroller (micro controller unit, MCU), a programmable logic device (programmable logic device, PLD), or another integrated chip.
The present application also provides a computer program product comprising: a computer program (which may also be referred to as code, or instructions), which when executed, causes a computer to perform the method performed by any of the electronic devices 100 in any of the embodiments described above.
The present application also provides a computer-readable storage medium storing a computer program (which may also be referred to as code, or instructions). The computer program, when executed, causes a computer to perform the method performed by any of the electronic devices 100 in any of the embodiments described above.
It should be appreciated that the processor in the embodiments of the present application may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method embodiments may be implemented by integrated logic circuits of hardware in a processor or by instructions in software form. The processor may be a general purpose processor, a digital signal processor (digital signal processor, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, a flash memory, a read only memory, a programmable read only memory, an electrically erasable programmable memory, a register, or another storage medium well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method.
In addition, the embodiment of the application also provides a device. Specifically, the device may be a component or a module, and may comprise one or more processors and a memory coupled thereto. Wherein the memory is for storing a computer program. The computer program, when executed by the one or more processors, causes the device to perform the methods in the method embodiments described above.
Wherein the device, computer-readable storage medium, computer program product, or chip provided by the embodiments of the present application are each configured to perform the corresponding method provided above. Therefore, for the beneficial effects achievable thereby, reference may be made to the beneficial effects of the corresponding method provided above, which are not described herein again.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.
In summary, the foregoing description is only exemplary embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made according to the disclosure of the present invention should be included in the protection scope of the present invention.

Claims (19)

1. A method of content sharing, the method comprising:
the first device displays a first window, the first window comprising one or more interface elements;
the first device detecting a first operation on the first window;
after the first device detects the first operation, the first device detects a drag operation acting on a first interface element of the one or more interface elements;
in response to the drag operation, the first device displays a second interface element corresponding to the first interface element in a second window, or the first device sends transmission content of the first interface element to a second device;
before the first device detects the first operation, the drag operation acting on the first interface element is not used for triggering the first device to display the second interface element in the second window or send the transmission content to the second device.
2. The method of claim 1, wherein before the first device displays the second interface element corresponding to the first interface element in the second window, or before the first device sends the transmission content corresponding to the first interface element to the second device, the method further comprises:
the first device displays a screenshot of the first interface element, the screenshot moving along the movement track of the drag operation.
3. A method according to claim 1 or 2, wherein the first window and the second window belong to the same application or to different applications.
4. A method according to any one of claims 1-3, wherein the second interface element comprises: the transmission content or an identification of the transmission content.
5. The method of claim 4, wherein the identification is an icon, or a screenshot of the first interface element.
6. The method of any of claims 1-5, wherein the first operation is to trigger the first device to enter a first mode, the method further comprising, after the first device detects the first operation acting on the first window:
the first device displays first prompt information in the first window, wherein the first prompt information is used for indicating that the first device enters the first mode.
7. The method of claim 6, wherein the first hint information is used to highlight the first interface element.
8. The method of claim 7, wherein the first prompt includes a border of the first interface element.
9. The method of any of claims 1-8, wherein after the first device detects a first operation on the first window, the method further comprises:
the first device stops displaying the third interface element in the first window.
10. The method according to any one of claims 1-9, wherein the drag operation is specifically: a sliding operation from the position of the first interface element to a designated position.
11. The method of claim 10, wherein,
the first window is a window of a first application, the second window is a window of a second application, the appointed position is the position of an icon of the second application,
or,
the appointed position is the position of the icon of the second device or the position of the icon of the first contact, wherein the second device is the device used by the first contact.
12. The method of any of claims 1-11, wherein the second window is displayed in a first user interface, the first user interface further comprising the first window.
13. The method of any one of claims 1-12, wherein the first interface element comprises one or more interface elements.
14. The method of any one of claims 1-13, wherein the first interface element comprises N interface elements, N ≥ 2, and N is a positive integer, and wherein before the first device detects the drag operation acting on the first interface element of the one or more interface elements, the method further comprises:
the first device detects a selection operation acting on the N interface elements.
15. The method of any of claims 1-14, wherein after the first device detects a first operation on the first window, the method further comprises:
the first device alters or creates response behaviors of M interface elements in the first window, so that the first device can, in response to a drag operation acting on the first interface element among the M interface elements, display the identification in the second window or send the transmission content to the second device, wherein M ≥ 1 and M is a positive integer.
16. The method of any of claims 1-15, wherein after the first device detects a drag operation on a first interface element of the one or more interface elements, the method further comprises:
the first device obtains the transmission content from the information of the first interface element.
17. The method of any of claims 1-16, wherein the method is performed by a system unit of the first device, the system unit and an application to which the first window belongs being different modules of the first device.
18. An electronic device comprising a memory, one or more processors, and one or more programs; the one or more processors, when executing the one or more programs, cause the electronic device to implement the method of any of claims 1-17.
19. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 17.
CN202210801011.4A 2022-07-08 2022-07-08 Content sharing method, graphical interface and related device Pending CN117406874A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210801011.4A CN117406874A (en) 2022-07-08 2022-07-08 Content sharing method, graphical interface and related device
PCT/CN2023/105191 WO2024008017A1 (en) 2022-07-08 2023-06-30 Content sharing method, and graphical interface and related apparatus


Publications (1)

Publication Number Publication Date
CN117406874A true CN117406874A (en) 2024-01-16

Family

ID=89454378


Country Status (2)

Country Link
CN (1) CN117406874A (en)
WO (1) WO2024008017A1 (en)


Also Published As

Publication number Publication date
WO2024008017A1 (en) 2024-01-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination