CN115529487A - Video sharing method, electronic device and storage medium

Info

Publication number
CN115529487A
Authority
CN
China
Prior art keywords
video
playing
user
electronic device
progress
Prior art date
Legal status
Pending
Application number
CN202110704189.2A
Other languages
Chinese (zh)
Inventor
杨婉艺
胡凯
张丽
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202110704189.2A
Priority to PCT/CN2022/086789 (WO2022267640A1)
Publication of CN115529487A

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/4104 - Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 - Peripherals receiving signals from specially adapted client devices; additional display device, e.g. video projector
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; client middleware
    • H04N21/433 - Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/437 - Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application relates to the technical field of intelligent terminals, and in particular to a video sharing method, an electronic device and a storage medium. The method includes: a first electronic device sends video access information of a video to a second electronic device through a first application, where the video access information is used for accessing and playing the video; the second electronic device plays the video in response to a playing operation of a user on the video access information; the second electronic device shares video playing data generated by playing the video with the first electronic device; and the first electronic device obtains the video playing state of the second electronic device according to the obtained video playing data. Based on the shared video access information and the synchronized video playing data, the progress at which each user is watching the video is displayed on the video playing interface, and function options such as switching to another user's progress or to another user's viewing angle are added to the video playing interface, which promotes interaction among users watching the video together and improves their viewing experience.

Description

Video sharing method, electronic device and storage medium
Technical Field
The invention relates to the technical field of intelligent terminals, in particular to a video sharing method, electronic equipment and a storage medium.
Background
In the current information-sharing era, users have become accustomed to communicating and sharing data through instant messaging applications (apps) installed on their terminals; for example, a user can invite family members or friends to watch a video together in a chat group.
Currently, some instant messaging applications provide a group-chat function for inviting members to watch videos together. Referring to the chat application 101 shown in fig. 1A, the user opens the chat application 101 from the desktop applications of the mobile phone 100 and then opens the chat group 102; referring to the group chat interface 103 shown in fig. 1B, the group name of the chat group 102 is, for example, "chase all to exchange all". The user clicks a more-functions button 1032 below an input box 1031 on the group chat interface 103, so that a function menu bar 1033 containing a plurality of group chat functions pops up, and selects the watch-together function button 104 in the function menu bar 1033 to open the selection interface 105 shown in fig. 1C. On the selection interface 105, the user selects a video application 1051 to be opened from the displayed video application list and selects the name of a video to watch, so as to enter the play option interface 106 shown in fig. 1D. The user may click button 1061 to choose a single viewing invitation and send a link for watching the video together to the group chat interface 103; other users may then click the link to enter the video playing interface 107 shown in fig. 1E and watch the video together. Alternatively, the user may click button 1062 on the play option interface 106 to choose to book a hall and invite everyone to watch; after paying a fee, the user sends the invitation link for watching the video together to the group chat interface 103, and other users can directly enter the video playing interface 107 shown in fig. 1E to watch the video together by clicking the link. It can be understood that the video streams watched by the users watching the video together come from the same source, so on the video playing interface 107 shown in fig. 1E only the user who initiated the watch-together invitation can control the video playing progress. If that user needs to temporarily quit the watch-together function, the user can click the end button 1071 in the upper right corner of the video playing interface 107 to end watching the video together, and at this time the other users watching the video together can only quit as well and cannot continue watching the video. In addition, the chat box 1072 and the emoticon button 1073 displayed on the video playing interface 107 only support the users in sending text and emoticons for communication, and the interface 107 has no function button for supporting playback on another device.
In this group-chat watch-together scenario provided by the instant messaging application, when a plurality of users use the function to watch a video together, no user can independently control his or her own playing progress; the users can only play the video on a mobile phone or tablet computer on which the chat application 101 is installed and cannot watch on other devices, for example on a smart screen; in addition, the users cannot communicate through a voice call or a video call while watching the video together.
Disclosure of Invention
Embodiments of the application provide a video sharing method, an electronic device and a storage medium. Video access information that can open a video for playing is shared through a message application, so that the video can be shared on a group chat interface or on several one-to-one chat interfaces at the same time, and the video access information supports playing the video on a plurality of devices, so that a user can switch playing devices at will. A viewing-angle switching function and a progress switching function are added to the video playing interface, so that a user can conveniently switch to another user's viewing angle or to another user's viewing progress to watch a highlight together, which adds to the fun of sharing a video. The bullet-screen comment messages displayed on the video playing interface are associated with the playing progress, so as to mark the playing progress at which a user made a comment; a user can thus click a bullet-screen comment to jump to the highlight at the playing progress corresponding to that comment, which livens up the group chat, adds to the fun of sharing, and improves the interactive experience of the users watching the video together. A comment message displayed in bullet-screen form can also be shared in the message application as a session message, so that other members of a chat group in the message application, or contacts in the message application, can quickly open the video and jump to the corresponding playing progress to watch the highlight by clicking the session message. In addition, an audio and video call function is added to the video playing interface, through which a user can directly initiate a video call or voice call on the video playing interface to other members of the chat group or to friends in the address book, which makes it convenient for the group chat members watching the video together to communicate with each other.
In a first aspect, an embodiment of the present application provides a video sharing method, which is applied to a first electronic device and a second electronic device installed with a first application, and the method includes: the method comprises the steps that a first electronic device sends video access information of a video to a second electronic device through a first application, wherein the video access information is used for accessing and playing the video; the second electronic equipment responds to the playing operation of the user for the video access information, and plays the video; the second electronic equipment shares video playing data generated by playing the video to the first electronic equipment; the video playing data comprises data used for marking the playing state and the playing progress of the video; the first electronic equipment acquires video playing data and acquires the video playing state of the second electronic equipment according to the video playing data.
The first electronic device shares, with the second electronic device, video access information that can open a video for playing. When the user of the second electronic device clicks the video access information to open the video for playing, the video playing data generated by playing the video is synchronously shared with the first electronic device. The first electronic device can then determine, according to the video playing data shared by the second electronic device, the state in which the video is being played on the second electronic device, for example the progress of playing the video on the second electronic device, the video content already played, or whether playback on the second electronic device is paused or being fast-forwarded, which is not limited herein.
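As an illustration only, the video playing data described above can be thought of as a small record carrying the playing state and progress reported by the playing device. The Kotlin sketch below models it that way; the type and every field name are assumptions made for this example and are not defined in the application.

```kotlin
// Illustrative sketch only; the structure and field names are assumptions, not part of the application.
enum class PlayState { PLAYING, PAUSED, FAST_FORWARD }

data class VideoPlaybackData(
    val videoId: String,    // identifies the shared video
    val userId: String,     // user of the reporting (second) electronic device
    val deviceId: String,   // device actually rendering the video (may be a collaborating device)
    val state: PlayState,   // playing / paused / fast-forward
    val positionMs: Long,   // current playing progress in milliseconds
    val reportedAtMs: Long  // wall-clock time of the report
)

fun main() {
    // The second electronic device would produce records like this while playing:
    val sample = VideoPlaybackData(
        videoId = "ep01", userId = "userA", deviceId = "phone-201-1",
        state = PlayState.PLAYING, positionMs = 125_000L, reportedAtMs = System.currentTimeMillis()
    )
    println("User ${sample.userId} is at ${sample.positionMs / 1000}s (${sample.state})")
}
```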
The video access information may be, for example, a service card described in the following embodiments, and in other embodiments, the video access information may also correspond to an applet used for providing a video playing function on the first application, which is not limited herein.
In a possible implementation of the first aspect, the obtaining, by the first electronic device, the video playing status of the second electronic device according to the video playing data includes at least one of the following presentation modes: displaying the progress of video playing of the second electronic equipment in an interface of the first electronic equipment for playing the video; jumping to the progress corresponding to the video playing data in the interface of the first electronic device for playing the video; and displaying the content of the video played by the second electronic equipment on the interface of the video played by the first electronic equipment.
That is, the state of the second electronic device playing the video, which the first electronic device obtains from the video playing data, may be displayed in multiple presentation manners. For example, the progress of the second electronic device playing the video may be displayed on the interface of the first electronic device playing the video, for instance on the progress bar of the playing interface; in other embodiments, the progress of multiple second electronic devices playing the video may be displayed on the interface of the first electronic device playing the video, which is not limited herein. In this way, the user can know at any time, while watching the video, the progress at which the others are watching (for example, when watching a movie together).
In other embodiments, the presentation manner may be based on an operation instruction of the user on the video playing interface of the first electronic device (i.e., the video playing window described in the following embodiments), for example an operation instruction of clicking the icon of the second electronic device displayed on the progress bar in order to jump to the progress at which the second electronic device is playing the video; the interface of the first electronic device then jumps to that progress and continues playing the video from there. It can be understood that, after the interface of the first electronic device playing the video jumps to the progress of the second electronic device, the original progress of the first electronic device is replaced by the progress of the second electronic device, which becomes the current progress of the first electronic device. In this way, while watching a video together with others, the user can switch to the highlight someone else is watching and share the moment, which improves the viewing experience.
In other embodiments, the presentation manner may be based on an operation instruction of the user on the video playing interface of the first electronic device, for example an operation instruction of clicking the icon of the second electronic device displayed on the progress bar in order to switch to the interface on which the second electronic device is playing the video, so that the content of the video played by the second electronic device is displayed on the interface of the first electronic device. It can be understood that the content played by the second electronic device may be displayed by replacing the content shown on the interface of the first electronic device; or a part of the display area may be split off from the interface of the first electronic device to display the content played by the second electronic device, that is, the commonly known picture-in-picture form; in other embodiments, the content of the video played by the second electronic device may also be displayed in a floating window on the interface of the first electronic device, which is not limited herein. In addition, it can be understood that displaying the content of the video played by the second electronic device may also include displaying its playing progress, but the progress of the video played by the first electronic device remains independent of the progress of the video played by the second electronic device, and the user of the first electronic device can close the displayed content of the second electronic device and continue watching the video at the first electronic device's own progress.
That is, with the video sharing method provided by the present application, the user of the first electronic device can switch the viewing angle to that of the user of the second electronic device. In this way, while several users watch the video in different places at the same time, they can share the wonderful pictures or segments they are watching, which improves the viewing experience.
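For a concrete picture of this presentation logic, the sketch below shows one way a playing interface could place a peer's avatar marker on its progress bar and decide, on a tap, between jumping to the peer's progress and opening the peer's view; all types and function names are hypothetical and are not taken from the application.

```kotlin
// Illustrative sketch; not the application's actual UI code. All names are hypothetical.
data class PeerProgress(val userId: String, val positionMs: Long)

sealed class MarkerAction {
    data class JumpToPeerProgress(val positionMs: Long) : MarkerAction() // replace own progress
    data class ShowPeerView(val userId: String) : MarkerAction()         // picture-in-picture / floating window
}

/** Fraction (0.0..1.0) along the progress bar at which the peer's avatar marker is drawn. */
fun markerFraction(peer: PeerProgress, videoDurationMs: Long): Double =
    (peer.positionMs.toDouble() / videoDurationMs).coerceIn(0.0, 1.0)

/** Decide what a tap on a peer marker does, e.g. based on a user preference. */
fun onMarkerTapped(peer: PeerProgress, switchProgressInsteadOfView: Boolean): MarkerAction =
    if (switchProgressInsteadOfView) MarkerAction.JumpToPeerProgress(peer.positionMs)
    else MarkerAction.ShowPeerView(peer.userId)

fun main() {
    val peer = PeerProgress(userId = "userA", positionMs = 540_000L)
    println("Draw avatar at ${"%.2f".format(markerFraction(peer, 2_700_000L))} of the bar")
    println(onMarkerTapped(peer, switchProgressInsteadOfView = true))
}
```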
In one possible implementation of the first aspect, the video access information includes a link for accessing and playing a video, and multimedia content related to the video; and the first application is a messaging application via which the first electronic device sends a message to the second electronic device, the message comprising video access information.
That is, the first electronic device shares the video access information with the second electronic device through the message application. The video access information presented in the message application can display multimedia content related to the video, such as a text introduction and pictures, and the video access information can be sent to the second electronic device through the message application as a session message (i.e., the message).
For example, a first electronic device (e.g., a mobile phone) sends video access information of the TV series "beautiful nange" to a second electronic device (e.g., a mobile phone) through an instant messaging application such as the smooth connection application, and the video access information presented in the smooth connection application may include multimedia content such as a plot introduction, a rating, an introduction of the leading actors, and stills of "beautiful nange". In this way, the user can conveniently and quickly learn the details of the video shared in the message application and judge whether it is of interest, whether to open it and watch it together with family members, and so on.
In one possible implementation of the foregoing first aspect, a method for a first electronic device to send video access information of a video to a second electronic device via a first application includes: the first electronic device sends video access information to the second electronic device via a session interface of a message application; the conversation interface comprises a first user identification and a second user identification, wherein the first user identification is used for identifying a first user operating the first electronic equipment, and the second user identification is used for identifying a second user operating the second electronic equipment.
That is, the first electronic device may send the video access information to a session interface of the message application, for example a group chat interface in the message application, so that each member of the chat group corresponding to the group chat interface can click the video access information to open the video for watching. In other embodiments, the session interface may also be a one-to-one chat interface in the message application, that is, the user of the first electronic device may send the video access information on a one-to-one chat interface with the user of the second electronic device. It can be understood that the user of the first electronic device may also select multiple friends in the address book of the message application (for example, friends in the smooth connection contact list) and send the video access information to multiple people at the same time through multiple one-to-one chat interfaces. In this way, the user can conveniently share the video with multiple people at once.
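A minimal sketch of this sharing step is given below, assuming hypothetical message and session abstractions (the application does not specify any API); it only illustrates sending the same video access card to a group chat session or to several one-to-one sessions.

```kotlin
// Illustrative sketch; the message/session APIs here are hypothetical, not the message application's real API.
data class VideoAccessCard(val title: String, val coverUrl: String, val playLink: String)

data class SessionMessage(val sessionId: String, val senderId: String, val card: VideoAccessCard)

interface MessageSender {                       // assumed abstraction over the message application
    fun send(message: SessionMessage)
}

/** Share one card to a group chat session, or to several one-to-one sessions at once. */
fun shareVideoAccessInfo(
    sender: MessageSender, senderId: String, card: VideoAccessCard, targetSessionIds: List<String>
) {
    targetSessionIds.forEach { sessionId ->
        sender.send(SessionMessage(sessionId, senderId, card))
    }
}

fun main() {
    val log = object : MessageSender {
        override fun send(message: SessionMessage) = println("-> ${message.sessionId}: ${message.card.title}")
    }
    val card = VideoAccessCard("Example Series Ep. 1", "https://example.com/cover.jpg", "https://example.com/play/ep01")
    shareVideoAccessInfo(log, "userB", card, listOf("group:happy-family"))        // group chat
    shareVideoAccessInfo(log, "userB", card, listOf("chat:userA", "chat:userC"))  // several 1:1 chats
}
```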
In a possible implementation of the first aspect, the obtaining, by the first electronic device, a video playing status of the second electronic device according to the video playing data includes: displaying the progress of the video playing of the second electronic equipment and a second user identification in an interface of the video playing of the first electronic equipment; and the second user identification is used for jumping to the progress corresponding to the video playing data or displaying the content of the video played by the second electronic equipment.
For example, continuing the foregoing example, the icon of the second electronic device displayed on the progress bar of the video playing interface of the first electronic device may be the second user identifier, where the second user identifier is, for example, the user name, avatar or similar identifier used to log in to an instant messaging application such as the smooth connection application; for instance, the nickname and avatar of a smooth connection friend may be displayed at the corresponding position on the progress bar of the video playing interface of the first electronic device. In this way, the user can easily see how far the friends watching the video together have watched, which helps the users find topics to chat about, for example the highlights at different points of progress, and improves the viewing experience.
In one possible implementation of the first aspect described above, the video access information comprises a card for accessing the video, the card comprising multimedia content related to the video and a link for accessing the video.
The card comprises a first option and a second option, wherein the first option is used for triggering the playing of the video; the second option is used for triggering the video to be played through a third electronic device which is cooperated with the second electronic device; the first electronic device obtains the state of the second electronic device playing the video according to the video playing data, and the method further comprises the following steps: and displaying the progress of the third electronic equipment in playing the video and an equipment icon of the third electronic equipment in an interface of the first electronic equipment in playing the video.
The second electronic equipment responds to the playing operation of the user for the video access information, and plays the video, and the method comprises the following steps: the second electronic equipment responds to the operation that the user clicks the first option of the card, and triggers the playing of the video; or the second electronic equipment responds to the operation that the user clicks the second option of the card, and the video is triggered to be played on the third electronic equipment.
For example, the card included in the video access information is, for example, the session message presenting the related content of a video that is obtained through the "@ service card" described in the following embodiments; refer to the related description and interface diagrams of searching for a video keyword with the "@ service card" in the following embodiments. For example, as shown in fig. 5C, the user may click the play button on the session message (i.e., the first option of the card) to open the video directly in the message application (e.g., the smooth connection application) for playing, or may click the smart screen play button on the session message (i.e., the second option of the card) to play the video through a smart screen cooperating with the mobile phone. In this way, the user can choose, based on the scenario, on which of the currently interconnected or cooperating devices to play the video, which improves the viewing experience.
It can be understood that when the user of the second electronic device selects the second option (e.g., the smart screen play button shown in fig. 5C below) to play the shared video, the play progress of the second electronic device displayed on the video play interface of the first electronic device is the progress of playing the video on the third electronic device, and a device icon of the third electronic device currently playing the video, such as the device icon 516b shown in fig. 7 below, can also be displayed at the corresponding progress.
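The following sketch illustrates how the two options of the card could be dispatched to either the local device or a collaborating device such as a smart screen; the option types and the hand-off abstraction are assumptions made purely for illustration.

```kotlin
// Illustrative sketch; the option handling and device hand-off shown here are simplified assumptions.
sealed class CardOption {
    object PlayLocally : CardOption()                            // first option: play in the message application
    data class PlayOnDevice(val deviceId: String) : CardOption() // second option: e.g. a collaborating smart screen
}

data class PlaybackTarget(val deviceId: String, val playLink: String)

/** Resolve which device should actually render the video when a card option is tapped. */
fun resolvePlaybackTarget(option: CardOption, localDeviceId: String, playLink: String): PlaybackTarget =
    when (option) {
        is CardOption.PlayLocally -> PlaybackTarget(localDeviceId, playLink)
        is CardOption.PlayOnDevice -> PlaybackTarget(option.deviceId, playLink)
    }

fun main() {
    val link = "https://example.com/play/ep01"
    println(resolvePlaybackTarget(CardOption.PlayLocally, "phone-201-2", link))
    println(resolvePlaybackTarget(CardOption.PlayOnDevice("smart-screen-1"), "phone-201-2", link))
    // In either case the second electronic device still reports playback data, so the first
    // electronic device can show which device (phone or smart screen) is playing and at what progress.
}
```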
In a possible implementation of the first aspect, the sharing, by the second electronic device, video playing data generated by playing a video to the first electronic device includes: the second electronic equipment uploads the video playing data to a server in communication connection with the first electronic equipment and the second electronic equipment; the first electronic equipment acquires video playing data from the server. The server is a server for running the first application.
That is, the video playing data is synchronized between the first electronic device and the second electronic device through the server, and both the first application on the first electronic device and the first application on the second electronic device run through this server. It can be understood that, in the embodiments of the application, the user logged in to the first application on the first electronic device and the user logged in to the first application on the second electronic device are friends of each other, or are both members of the same chat group. For example, user A using the mobile phone 201-1 and user B using the mobile phone 201-2 described in the embodiments below are friends of each other, or are both members of the chat group "Happy family".
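The server-mediated synchronization described above could look roughly like the sketch below, where an in-memory object stands in for the server's data synchronization module; the interface shown is an assumption, not the first application's real API.

```kotlin
// Illustrative sketch of server-mediated synchronization; the server API shown here is an assumption.
import java.util.concurrent.ConcurrentHashMap

data class PlaybackReport(val videoId: String, val userId: String, val positionMs: Long)

interface SyncServer {                                    // stands in for the server running the first application
    fun upload(report: PlaybackReport)                    // called by the device that is playing the video
    fun latestFor(videoId: String): List<PlaybackReport>  // fetched by (or pushed to) the other co-watching devices
}

/** In-memory stand-in for the server's data synchronization module. */
class InMemorySyncServer : SyncServer {
    private val latest = ConcurrentHashMap<Pair<String, String>, PlaybackReport>()
    override fun upload(report: PlaybackReport) { latest[report.videoId to report.userId] = report }
    override fun latestFor(videoId: String) = latest.filterKeys { it.first == videoId }.values.toList()
}

fun main() {
    val server: SyncServer = InMemorySyncServer()
    // The second electronic device uploads its state while playing:
    server.upload(PlaybackReport("ep01", "userA", 600_000L))
    // The first electronic device obtains the others' states from the server:
    server.latestFor("ep01").forEach { println("${it.userId} is at ${it.positionMs / 1000}s") }
}
```

Whether the first electronic device polls the server or receives pushed updates is a design choice the application leaves open; the sketch simply exposes the latest state per user per video.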
In one possible implementation of the first aspect, the video playing data includes a comment message that is edited and published by a user on an interface of the second electronic device for playing the video, and the comment message is associated with a playing progress of the video; the comment message can be displayed together with corresponding playing progress information on a playing interface of the video; the comment message is at least used for triggering jumping to the corresponding playing progress.
Namely, a comment message, also called a bullet screen, issued by a user through the second electronic device may be displayed on the interface of the first electronic device for playing the video, the comment message is displayed in association with the corresponding video playing progress, and the user using the first electronic device may click the bullet screen on the video playing interface to select to jump to the playing progress corresponding to the bullet screen comment. In other embodiments, the user of the other device that watches the video together may click the bullet screen on the video playing interface to select to jump to the corresponding playing progress, which is not limited herein.
For example, in the following embodiments, user A using the mobile phone 201-1 may click, on the interface on which the mobile phone 201-1 is playing the video, a bullet-screen comment published through the mobile phone 201-2, and choose to jump to the playing progress corresponding to that bullet-screen comment. In this way, while watching the video together, the users can switch at will to the corresponding highlight according to the bullet-screen content, which improves the viewing experience.
In a possible implementation of the first aspect, the corresponding play progress corresponds to a publishing time of the comment message; or the corresponding play progress information indicates a time when the comment message starts to be edited, or the corresponding play progress information indicates a time when the comment message is sent out.
For example, in the following embodiment, the playing progress corresponding to the barrage comment displayed on the video playing interface may be the playing progress corresponding to the time when the user a clicks the comment button on the video playing interface to call out the input method to start editing the comment message, or may be the playing progress corresponding to the time when the user a clicks the send button after completing editing the comment message in the comment publishing interface (refer to a message sending interface shown in fig. 12 below) displayed on the video playing interface, which is not limited herein.
In a possible implementation of the first aspect, the comment message can be shared as a session message to other electronic devices via the first application, and the session message is at least used to trigger playing of the video and jumping to a corresponding playing progress.
That is, when a comment message published by a user on the video playing interface is sent via the message application, it can be converted into a session message (i.e., a message in the message application); the session message converted from the comment message can also serve as a message for sharing the video, and users of other devices can open the video by clicking the session message displayed on the session interface and jump to the playing progress corresponding to the comment message. In this way, all members of the chat group can open the video and jump directly to the corresponding highlight through the session message on the group chat interface, which makes it convenient for the members of the chat group to join in watching the video together and share their own viewing experience.
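To make the association between a bullet-screen comment and the playing progress concrete, the sketch below models a comment carrying its progress and its conversion into a session message that opens the video and seeks to that progress; all structures and names are hypothetical.

```kotlin
// Illustrative sketch; the comment/session message structures are assumptions for explanation only.
data class BulletComment(
    val videoId: String,
    val authorId: String,
    val text: String,
    val progressMs: Long      // playing progress the comment is associated with
)

/** A session message that, when tapped, opens the video and seeks to the comment's progress. */
data class VideoJumpMessage(val videoId: String, val seekToMs: Long, val preview: String)

fun toSessionMessage(comment: BulletComment): VideoJumpMessage =
    VideoJumpMessage(comment.videoId, comment.progressMs, "${comment.authorId}: ${comment.text}")

/** What a receiving device would do when the converted session message is tapped. */
fun onSessionMessageTapped(msg: VideoJumpMessage, openAndSeek: (videoId: String, positionMs: Long) -> Unit) {
    openAndSeek(msg.videoId, msg.seekToMs)
}

fun main() {
    val comment = BulletComment("ep01", "userA", "This scene is great!", progressMs = 845_000L)
    val msg = toSessionMessage(comment)
    onSessionMessageTapped(msg) { videoId, pos -> println("Open $videoId and seek to ${pos / 1000}s") }
}
```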
In a possible implementation of the first aspect, the method further includes: the second electronic equipment starts audio and video communication with the first electronic equipment in the process of playing the video; the first electronic equipment displays an interface of the audio and video call on an interface for playing the video.
That is, the user of the first electronic device and the user of the second electronic device can hold an audio or video call through the video playing interface. For example, in the following embodiments, user B may click the audio/video call button in the video playing window to initiate an audio/video call to any number of members of the chat group, such as user A. For example, while user B and user A are in an audio/video call, if user B clicks the avatar of user A on the progress bar in the video playing window and chooses to switch to user A's viewing angle, user B will see, after the switch, the call interface between user A and user B as shown on the video playing window of the mobile phone 201-1. In other embodiments, if user A is in a video call with another user C in the chat group who is also watching the video, user B can likewise see the call interface between user A and user C on the video playing window of the mobile phone 201-1 after switching to user A's viewing angle.
It can be understood that, since the content of an audio/video call may involve the user's personal privacy, when the viewing angle is switched to that of the peer device, the peer device may ask its own user, in the form of a pop-up window, to authorize displaying the audio/video call interface. In the above example, especially when user A is in a one-to-one video call with another user (for example user C), for privacy protection, even if user A authorizes user B to switch to user A's viewing angle, the interface of the mobile phone 201-1 seen by user B either does not display the call interface with user C or displays only user C's avatar, and at this time user B cannot obtain the content of the call between user A and user C. In other embodiments, if users A, B and C are in a group call initiated through the above audio/video call button, user B, after switching to user A's viewing angle, can obtain the content of the call between users A and C, which is not limited herein.
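The privacy rule described above can be summarized as a small decision function, sketched below under the assumption that the viewer sees the full call interface only when the viewer participates in the call (including a group call the viewer belongs to), and otherwise at most the peer's avatar; the types and names are illustrative only.

```kotlin
// Illustrative sketch of the privacy rule discussed above; the decision logic is a simplified assumption.
data class CallContext(val participants: Set<String>)

enum class CallVisibility { FULL_CALL_INTERFACE, AVATAR_ONLY, HIDDEN }

/**
 * Decide what the viewer sees of the watched user's ongoing call after switching to that viewpoint:
 * the full call interface only if the viewer is a participant of that call; otherwise at most
 * the other party's avatar, never the call content.
 */
fun callVisibilityFor(viewerId: String, call: CallContext?): CallVisibility = when {
    call == null -> CallVisibility.HIDDEN
    viewerId in call.participants -> CallVisibility.FULL_CALL_INTERFACE
    else -> CallVisibility.AVATAR_ONLY
}

fun main() {
    println(callVisibilityFor("userB", CallContext(setOf("userA", "userC"))))          // AVATAR_ONLY
    println(callVisibilityFor("userB", CallContext(setOf("userA", "userB", "userC")))) // FULL_CALL_INTERFACE
}
```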
In a second aspect, an embodiment of the present application provides a video sharing method, which is applied to a first electronic device, and the method includes: sending video access information of a video by a first application, wherein the video access information is used for accessing and playing the video; acquiring video playing data, and acquiring the video playing state of the second electronic equipment according to the video playing data; the video playing data is data which is generated by the second electronic equipment playing the video and used for marking the playing state and the playing progress of the video.
In a possible implementation of the second aspect, the obtaining, according to the video playing data, a state of the second electronic device playing the video includes at least one of the following presentation modes: displaying the progress of the second electronic equipment in the video playing interface; jumping to the progress corresponding to the video playing data in the interface for playing the video; and displaying the content of the video played by the second electronic equipment on the interface for playing the video.
In one possible implementation of the second aspect, the first application is a messaging application, and sending video access information of the video via the first application includes: sending video access information via a session interface of a messaging application; the conversation interface comprises a first user identification and a second user identification, wherein the first user identification is used for identifying a first user operating the first electronic equipment, and the second user identification is used for identifying a second user operating the second electronic equipment.
In a possible implementation of the second aspect, the obtaining, according to the video playing data, a state of the second electronic device playing the video includes: displaying the progress of the second electronic equipment for playing the video and a second user identifier in an interface for playing the video; and the second user identification is used for jumping to the progress corresponding to the video playing data or displaying the content of the video played by the second electronic equipment.
In one possible implementation of the second aspect, the acquiring video playing data includes: the method comprises the steps that first electronic equipment obtains video playing data from a server; wherein the server is a server for running the first application.
In one possible implementation of the second aspect, the video playing data includes a comment message that is edited and published by a user on a playing interface of the second electronic device for playing the video, and the comment message is associated with the playing progress of the video; the comment message and the corresponding playing progress can be displayed together in an interface of the first electronic device for playing the video, and the comment message is at least used for triggering skipping to the corresponding playing progress.
In a possible implementation of the second aspect, the first electronic device can obtain a comment message shared to the first application as a session message, and the session message is at least used for triggering video playing and jumping to a corresponding playing progress.
In a third aspect, an embodiment of the present application provides a video sharing method, which is applied to a second electronic device, and the method includes: responding to the playing operation of the user for the video access information of the video, and playing the video; the video access information is information sent by the first electronic equipment through the first application, and is used for accessing and playing videos; sharing video playing data generated by playing the video; the video playing data is used for the first electronic device to acquire the state of the second electronic device for playing the video, and the video playing data comprises data used for marking the playing state and the playing progress of the video.
In one possible implementation of the third aspect, the video access information includes a link for accessing and playing a video, and multimedia content associated with the video.
In one possible implementation of the third aspect, the video access information includes a card for accessing the video, the card including multimedia content related to the video and a link for accessing the video; the card comprises a first option and a second option, wherein the first option is used for triggering the playing of the video; the second option is used for triggering the video to be played through a third electronic device which is cooperated with the second electronic device; the first electronic device obtains the state of the second electronic device playing the video according to the video playing data, and the method further includes: and displaying the progress of the third electronic equipment in playing the video and an equipment icon of the third electronic equipment in an interface of the first electronic equipment in playing the video.
In one possible implementation of the third aspect, playing the video in response to a playing operation of the video access information for the video by the user includes: responding to the operation of clicking the first option of the card by the user, and triggering the playing of the video; or responding to the operation that the user clicks the second option of the card, and triggering the video to be played on the third electronic equipment.
In a possible implementation of the third aspect, the sharing video playing data generated by playing the video includes: uploading the video playing data to a server, and sharing the video playing data through the server; the server is in communication connection with the first electronic device and the second electronic device, and the server is used for running the first application.
In a possible implementation of the third aspect, the video playing data includes a comment message that is edited and published by a user on a playing interface of the video, and the comment message is associated with a playing progress of the video; and the comment message can be displayed together with the corresponding playing progress on the playing interface of the video.
In a possible implementation of the third aspect, the comment message can be shared to the first application as a session message, and the session message is at least used to trigger playing the video and jumping to a corresponding playing progress.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the video sharing method described above.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, on which instructions are stored, and when executed on a computer, the instructions cause the computer to execute the video sharing method.
In a sixth aspect, embodiments of the present application provide a computer program product, which includes a computer program/instruction, and when executed by a processor, implements the video sharing method described above.
Drawings
Fig. 1A to 1E are schematic diagrams illustrating UI interfaces for watching videos together in group chat provided in the prior art.
Fig. 2 is a schematic diagram illustrating components of a communication system according to an embodiment of the present application.
Fig. 3 is a schematic diagram of the software architecture of the communication system shown in fig. 2.
Fig. 4 is a flowchart illustrating a specific implementation of a video sharing method according to an embodiment of the present application.
Fig. 5A to 5D are schematic diagrams of a mobile phone interface in an implementation process of a video sharing method according to an embodiment of the present application.
Fig. 6A to 6B are schematic diagrams of a mobile phone interface in an implementation process of a video sharing method according to an embodiment of the present application.
Fig. 6C shows an enlarged schematic view of the small device window in fig. 6A and 6B.
Fig. 7 is a schematic interface diagram of a video playing window in an implementation process of a video sharing method according to an embodiment of the present application.
Fig. 8 is a schematic flowchart illustrating an implementation of a video sharing method according to a second embodiment of the present application.
Fig. 9 is a schematic interface diagram of a video playback window provided in the second embodiment of the present application.
Fig. 10A to 10C are schematic diagrams illustrating a video playing window interface corresponding to a user initiating an audio/video call according to a second embodiment of the present application.
Fig. 11 is a schematic flowchart illustrating a specific implementation flow of a video sharing method according to a third embodiment of the present application.
Fig. 12 is a schematic view of a video playback window interface provided in the third embodiment of the present application.
Fig. 13A to 13B are schematic views of video playing window interfaces corresponding to before and after a user clicks a bullet screen comment to jump to a corresponding playing progress, provided in the third embodiment of the present application.
Fig. 14 is a schematic flowchart illustrating an implementation of a video sharing method according to a fourth embodiment of the present application.
Fig. 15 is a schematic interface diagram of a video playback window according to the fourth embodiment of the present application.
Fig. 16 is an interface schematic diagram illustrating that a user clicks a group chat publishing comment and then jumps to a group chat interface in the fourth embodiment of the present application.
Fig. 17 is a schematic interface diagram of a video playing window in an embodiment of a video sharing method provided in the second embodiment and a video sharing method provided in the fourth embodiment in the same application scene.
Fig. 18 is a schematic interface diagram of a video playing window in an embodiment of a video sharing method provided in the second embodiment and a video sharing method provided in the fourth embodiment in the same application scenario.
Fig. 19 is a schematic structural diagram of a mobile phone 100 according to an embodiment of the present application.
Fig. 20 is a block diagram illustrating a software structure of a mobile phone 100 according to an embodiment of the present disclosure.
Detailed Description
The present application is further described with reference to the following detailed description and the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. In addition, for convenience of description, only a part of structures or processes related to the present application, not all of them, is illustrated in the drawings. It should be noted that in this specification, like reference numerals and letters refer to like items in the following drawings.
In view of the scenarios shown in fig. 1A to 1E, in order to solve the problem that it is inconvenient for users to interact and communicate with each other when watching videos together, the present application provides a video sharing method, which improves the experience of watching videos together in a group chat through the following functional schemes:
First, the method provided by the application shares video access information on a session interface of a message application (such as an instant messaging application), so that a video can be shared on a group chat interface or on several one-to-one chat interfaces at the same time; for example, video access information corresponding to a searched keyword is shared by entering "@ service card + search keyword" in the chat input box of the session interface. The video access information includes multimedia data such as a text introduction and pictures, as well as a play link, which makes it convenient for a user to invite the group to watch a video together; in the content pushed by the service card the user can also choose to play the video on the current group chat device or on a smart screen, that is, watching on another device is supported.
Second, the method provided by the application adds a viewing-angle switching function and a progress switching function to the video playing interface watched by multiple users, so that a user can switch to another user's viewing angle or to another user's viewing progress to watch a highlight together, which adds to the fun of sharing; each user watching the video together can control his or her own viewing progress, and another user quitting the shared viewing does not prevent the remaining users from continuing to watch.
Third, the method provided by the application associates the bullet-screen comment messages displayed on the video playing interface with the corresponding playing progress, so as to mark the playing progress at which a user made a comment; a user can thus click a bullet-screen comment to jump to the highlight at the corresponding playing progress. Meanwhile, a "synchronize" function option is added to the displayed bullet screen, so that when publishing a comment the user can choose to synchronize it into the group chat; members of the group chat who are not watching the video together can then click the comment to open the corresponding video at the corresponding progress and watch the highlight together, or click the program label below the comment to open the program video corresponding to the comment and start watching, which helps liven up the group chat, adds to the fun of sharing and improves the user experience.
Fourth, the method provided by the application adds a video call/voice call function button to the video playing interface, through which a user can initiate a video call or voice call to other members, which makes it convenient for the group chat members watching the video together to communicate with each other.
The video sharing method provided by the present application can be applied to the instant messaging system 200 shown in fig. 2. As shown in fig. 2, the instant messaging system 200 may include electronic devices 201-1, 201-2, …, 201-N (N is an integer greater than 1), where the N electronic devices may be interconnected through a communication network 202.
As an example, the communication network 202 may be a wired network or a wireless network. For example, the communication network 202 may be a local area network (LAN) or a wide area network (WAN), such as the Internet. The communication network may be implemented using any known network communication protocol, which may be any wired or wireless communication protocol, such as Ethernet, Universal Serial Bus (USB), FireWire, Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time-Division Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Bluetooth, Wireless Fidelity (Wi-Fi), Near Field Communication (NFC), Voice over Internet Protocol (VoIP), or any other suitable communication protocol for a wired or wireless network.
It is to be appreciated that in some embodiments, each electronic device in the communication system 200 may log in to its respective account (e.g., a Huawei account) and the devices may then be interconnected through one or more servers, which may also be implemented in the cloud. In other embodiments, a Wi-Fi connection may be established between the electronic devices in the communication system 200 via the Wi-Fi protocol, which is not limited herein.
It is to be understood that the electronic devices described above may include, without limitation, cell phones, smart screens, desktop computers, tablet computers, laptop computers, wearable devices, head-mounted displays, mobile email devices, portable game consoles, portable music players, reader devices, personal digital assistants, virtual reality or augmented reality devices, television sets with one or more processors embedded or coupled therein, and other electronic devices.
The following describes a specific implementation process of the video sharing method in detail, taking the electronic devices 201-1 and 201-2 as mobile phones and the communication network 202 as a server as an example.
Fig. 3 shows a schematic diagram of the software architecture of the communication system shown in fig. 2.
As shown in fig. 3, the operating system of the mobile phone 201-1 includes a video playing module 2011a, a message display module 2012a, an audio/video call module 2013a, a message sending module 2014a, and a session management module 2015a, and the operating system of the mobile phone 201-2 includes a video playing module 2011b, a message display module 2012b, an audio/video call module 2013b, a message sending module 2014b, and a session management module 2015b; the mobile phones 201-1 and 201-2 are interconnected through the server 202, and a service system of the server 202 includes a data synchronization module 2021, a video service module 2022, and a group and message management module 2023.
The video playing module 2011a on the mobile phone 201-1 or the video playing module 2011b on the mobile phone 201-2 acquires video content from the video service module 2022 on the server 202 for playing, and uploads the playing state and progress to the data synchronization module 2021, and the video playing module 2011a synchronizes the playing state and progress with the video playing module 2011b through the data synchronization module 2021 of the server 202; the session management module 2015a on handset 201-1 synchronizes the group members and session messages with the session management module 2015b on handset 201-2 through the group and message management module 2023 on server 202.
On the handset 201-1: after detecting that the user initiates a call on the video playing interface, the video playing module 2011a sends a call initiation instruction to the audio/video call module 2013a, and the audio/video call module 2013a receives or sends call information through the session management module 2015a. After detecting that the user sends information and/or a comment, the video playing module 2011a sends the information and/or comment content together with a sending instruction to the message sending module 2014a; the message sending module 2014a sends the information and/or comment content in the form of a session message through the session management module 2015a, and the session management module 2015a displays the received information through the message display module 2012a. A comment published by the user can be forwarded by the message display module 2012a to the video playing module 2011a for synchronized display. If the user clicks a piece of information or a comment to open the video at the corresponding playing progress, the message display module 2012a sends a video playing instruction to the video playing module 2011a, and the video playing module 2011a plays the video and jumps directly to the corresponding progress.
On the handset 201-2: the information transfer and module interaction process between the video playing module 2011b, the session management module 2015b, the audio/video call module 2013b, the message sending module 2014b and the message display module 2012b is the same as the information transfer and interaction process between the modules in the mobile phone 201-1, and is not described herein again.
Specifically, the functions of the modules on the handset 201-1 or the handset 201-2 can be summarized as follows:
the video playing module 2011a or 2011b is configured to obtain corresponding video content from the video service module 2022 of the server 202, play the video content to the user, manage a playing state, a playing progress, and the like, and upload the playing state, the playing progress, and the like to the data synchronization module 2021 of the server 202;
the message display module 2012a or 2012b is used for displaying chat information sent by the user, shared video related information, comment messages published when the user watches videos, and the like;
the audio and video call module 2013a or 2013b is used for processing related messages in the audio and video call process, triggering events and related flows of the audio and video call, and the like;
a message sending module 2014a or 2014b, configured to send chat information edited and input by a user, select a shared video, edit an input comment message, and the like;
the session management module 2015a or 2015b is used for managing and synchronously updating information such as member lists of the chat groups, managing chat group session messages and session messages among contacts, and the like.
In the server 202, a data synchronization module 2021 receives video playing states and progress uploaded by playing devices such as the mobile phone 201-1 and the mobile phone 201-2, and performs data synchronization among the devices; the video service module 2022 may respond to requests such as video query and play on playing devices such as the mobile phone 201-1 and the mobile phone 201-2, and return video related data content to each device for playing by each device, and for the request of querying/acquiring video information, the video service module 2022 sends related video information to each device in a session message form through the group and message management module 2023; the group and message management module 2023 manages and synchronizes the group members and manages and synchronizes the session messages through the session management module on the devices such as the mobile phone 201-1 and the mobile phone 201-2.
Specifically, the data synchronization module 2021 is configured to synchronize video playing information of a video played by some or all members in the chat group through the video access information in the chat group, where the video playing information includes a playing state, a playing progress, and the like of the corresponding video;
the video service module 2022 is configured to provide video related information query service and provide related data services such as video playing content;
the group and message management module 2023 is configured to manage addition, deletion, and the like of members in the chat group, and manage saving, transceiving, and synchronizing of instant messages in the chat group.
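Correspondingly, the server-side modules of the server 202 can be sketched as follows. Again, this is a minimal Kotlin sketch under assumed names, not an implementation prescribed by the embodiment.

```kotlin
// Hypothetical sketch of the server 202 modules; all names are assumptions.

data class VideoInfo(val videoId: String, val title: String, val durationSec: Int, val posterUrl: String)
data class PlaybackState(val memberId: String, val videoId: String, val positionSec: Int, val status: String)

interface DataSynchronizationModule {        // 2021
    fun report(state: PlaybackState)                              // upload from a playing device
    fun statesOf(groupId: String): Map<String, PlaybackState>     // read back for the other members
}

interface VideoServiceModule {               // 2022
    fun query(keyword: String): List<VideoInfo>                   // video information query service
    fun streamUrl(videoId: String): String                        // video playing content service
}

interface GroupAndMessageManagementModule {  // 2023
    fun members(groupId: String): List<String>                    // member management/synchronization
    fun broadcast(groupId: String, sessionMessage: String)        // instant message saving/transceiving
}
```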
It can be understood that the video playing module, the message display module, the audio/video call module, the message sending module and the session management module may be disposed at an application layer of the operating systems of the mobile phones 201-1 and 201-2; for example, an instant messaging application configured or installed on the mobile phones 201-1 and 201-2 includes the above modules. In other embodiments, for example when the operating systems of the mobile phones 201-1 and 201-2 are the Harmony operating system (HarmonyOS), the video playing module, the message display module, the audio/video call module, the message sending module, and the session management module may be set as atomized function modules of the framework layer, which are provided for each application program of the application layer to call; for example, the smooth connection application and the video application on the mobile phones 201-1 and 201-2 may call each module, so as to achieve the effect that the video sharing method of the present application can achieve, which is not limited herein.
As an example, the following describes the present application by taking an instant messaging application installed on the mobile phones 201-1 and 201-2 as a smooth connection application, where an account of the user a is logged in the mobile phone 201-1, an account of the user B is logged in the mobile phone 201-2, and the user a and the user B are members of the same chat group.
Based on the communication system shown in fig. 2 and the system software architecture shown in fig. 3, the implementation procedure of the video sharing method of the present application is described below with reference to the application scenarios of the first to fourth embodiments.
Example one
The embodiment of the application introduces the implementation process of the video sharing method in detail based on the scene that the user shares the video information in the group chat.
Fig. 4 shows a specific implementation flow of the video sharing method provided in the embodiment of the present application. The method comprises the following steps:
401: the handset 201-2 detects the operation of querying the video information in the chat group by the user B.
Specifically, in a chat group displayed on the mobile phone 201-2, the user B inputs, according to a predetermined format, a command for calling out a video service card together with a query keyword, so as to query the video information corresponding to the keyword.
Referring to fig. 5A to 5C, the user B may click a smooth connection application icon 501b among the desktop application icons of the mobile phone 201-2 shown in fig. 5A to open a smooth connection interface 502b shown in fig. 5B. As shown in fig. 5B, the recent contact list displayed on the interface 502b includes a chat group 503b, which is named, for example, "happy family". The user B clicks the chat group 503b on the smooth connection interface 502b to open a group chat interface 504b shown in fig. 5C, and the user B may edit and input "@ video service card and query keyword" in a chat box 5041b, for example, query related video information according to the format of "@ my smart screen Jinxiu Nange", where "my smart screen" is a device-type service card supporting sharing through the smooth connection application and may be stored in the form of a smooth connection application "contact" for the user to invoke. After the mobile phone 201-2 is connected to a smart screen device, the smart screen device can be saved as a "contact" in the smooth connection application running on the mobile phone 201-2, and the user may also define a contact name for the smart screen device, for example, "my smart screen" or "smart screen of XXX", which is not limited herein.
Referring to fig. 5C, the user inputs "@ my smart screen" on the group chat interface 504b to wake up the video playing service provided by "my smart screen"; "Jinxiu Nange" is the keyword for querying a program name. After the video playing service corresponding to "my smart screen" is woken up, video content named "Jinxiu Nange" is searched for according to the keyword, the related program content is queried, and video access information is generated and shared to the group chat interface 504b. In other embodiments, the user may further input "@ my smart screen currently playing" to share video access information related to the video currently playing on the smart screen device to the group chat interface 504b, where the presentation form of the video access information may refer to the session message 510b shown in fig. 5C; the interface shown in fig. 5C is described in detail below and is not repeated here. The members of the chat group 503b may click the video access information and choose to play the video directly on a mobile phone running the smooth connection application or on a smart screen device interconnected with the mobile phone, which is not limited herein.
It can be understood that any group member in the chat group 503b can input the @ video service card and the query keyword of the video desired to be shared in the chat box of the group chat interface and send the input to query the video related information corresponding to the query keyword.
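As a rough illustration of how such a chat-box input might be interpreted on the client, the following Kotlin sketch parses an "@ service card + query keyword" string. The format details beyond what is described above, and all function and type names, are assumptions.

```kotlin
// Hypothetical parser for chat-box input such as "@my smart screen jinxiu nange".
// Names and format details are assumptions for illustration only.

data class VideoQuery(val serviceCard: String, val keyword: String)

fun parseQuery(input: String, knownCards: List<String>): VideoQuery? {
    if (!input.startsWith("@")) return null                 // not a service-card invocation
    val body = input.removePrefix("@").trim()
    // Match the longest known card name that prefixes the input.
    val card = knownCards.sortedByDescending { it.length }
        .firstOrNull { body.startsWith(it) } ?: return null
    val keyword = body.removePrefix(card).trim()
    return if (keyword.isEmpty()) null else VideoQuery(card, keyword)
}

fun main() {
    val cards = listOf("my smart screen")
    println(parseQuery("@my smart screen jinxiu nange", cards))
    // VideoQuery(serviceCard=my smart screen, keyword=jinxiu nange)
}
```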
402: the handset 201-2 responds to the operation instruction of the user B and sends a video information query instruction to the server 202.
Specifically, after the processor of the mobile phone 201-2 processes the "@ video service card and query keyword" operation instruction input by the user B on the group chat interface 504B of the smooth connection application, the processor may send the query instruction to the server 202 through the wireless communication module.
403: the server 202 queries the corresponding video information based on the received query instruction, and stores the queried related video information.
Specifically, following the foregoing example, the group and message management module 2023 on the server 202 receives the query instruction message generated when the user B wakes up the video service card on the smooth connection application and queries the program-name keyword, for example, the query instruction message corresponding to the user B inputting "@ my smart screen Jinxiu Nange". The group and message management module 2023 identifies the keyword "Jinxiu Nange" in the query instruction message, and queries and acquires the related video information corresponding to the query keyword from the video service module 2022, where the video information may include, but is not limited to, information such as a video identifier, a video introduction, a video duration, and a poster.
404: the server 202 sends the queried relevant video information to the chat group.
Specifically, for the foregoing example, the group and message management module 2023 on the server 202 sends the obtained related video information of "Jinxiu Nange" to the corresponding chat group in the smooth connection application, and both the session management module 2015a on the mobile phone 201-1 used by the user A and the session management module 2015b on the mobile phone 201-2 used by the user B can receive the video information; that is, the smooth connection application running on the mobile phone 201-1 or the mobile phone 201-2 receives the related video information of "Jinxiu Nange".
405: the mobile phones 201-1 and 201-2 convert the video information received in the chat group into conversation messages to be displayed.
Specifically, the session management module 2015a on the mobile phone 201-1 converts the received related video information in the chat group into the form of a session message, and displays the session message, through the message display module 2012a, in the chat interface of the chat group opened in the smooth connection application; the session management module 2015b on the handset 201-2 converts the received related video information in the chat group into the form of a session message, and displays the session message, through the message display module 2012b, in the chat interface of the chat group opened in the smooth connection application. The session message converted by the session management module 2015a or 2015b includes the related video information.
For example, referring to fig. 5C, in the group chat interface 504b opened on the mobile phone 201-2, a session message 510b containing the video information related to "Jinxiu Nange" is displayed. The program name, such as "Jinxiu Nange" shown in fig. 5C, may be displayed on the session message 510b, and a play button 511b, a smart screen play button 512b, a shared status bar 513b displaying the members of the chat group 503b who are watching the video together and their number, and the like may also be displayed in the session message 510b. The interface on the mobile phone 201-1 for displaying the session message containing the video information related to "Jinxiu Nange" is similar to that shown in fig. 5C and is not described herein again.
In other embodiments, the session message 510b may further include a plurality of programs. As shown in fig. 5D, if the query information edited and input by the user B in step 401 is "@ my smart screen recent hot broadcast", the server 202 queries the related video information and then sends the acquired related video information of recently hot-broadcast programs to the chat group 503b. The session message 510b containing the recently hot-broadcast video information displayed on the mobile phone 201-2 may include a plurality of video program names, such as the program name 1, the program name 2, the program name 3, and the program name 4 shown in fig. 5D, and the popularity of each program, for example play frequency information, may be displayed below each program name. The session message 510b may further include the above-mentioned play button 511b corresponding to each program name, and a selection button, such as the view more button 514b shown in fig. 5D, may be provided below the session message 510b, so that the user may open a list of more programs by clicking the button 514b.
It can be understood that, when an instant messaging application having the modules of the mobile phone 201-1 or 201-2 shown in fig. 3 is run on an electronic device, such as a mobile phone, used by each group member in the chat group 503b, the related video information obtained by querying the server 202 can be requested and received through the above steps 401 to 405, converted into the session message 510b shown in fig. 5C or 5D, and displayed on the group chat interface 504b shown in fig. 5C or 5D, which is not described again here.
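To make step 405 concrete, the following Kotlin sketch shows one way the received video information could be wrapped into a session message for display; the data shapes and names are assumptions for illustration only.

```kotlin
// Hypothetical conversion of related video information into a session message (step 405).
// All type and field names are assumptions.

data class VideoInfo(val videoId: String, val title: String, val durationSec: Int, val posterUrl: String)

data class SessionMessage(
    val groupId: String,
    val kind: String,          // e.g. a video card as opposed to plain chat text
    val payload: VideoInfo?    // carries the video identifier, introduction, duration, poster
)

fun toSessionMessage(groupId: String, info: VideoInfo): SessionMessage =
    SessionMessage(groupId = groupId, kind = "video-card", payload = info)

fun main() {
    val info = VideoInfo("v001", "Jinxiu Nange", 45 * 60, "poster/v001.jpg")
    println(toSessionMessage("happy-family", info))   // rendered like session message 510b
}
```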
406: the handset 201-1 detects the operation that the user a clicks the session message containing the related video information in the chat group to play the video.
Specifically, referring to fig. 6A, the user A may click a play button 511a in a session message 510a containing the related video information in a group chat interface 504a displayed in the smooth connection application running on the mobile phone 201-1 to play the video directly on the mobile phone 201-1, referring to operation (1) shown in fig. 6A. Alternatively, in other embodiments, the user A may click a smart screen play button 512a on the group chat interface 504a to play the video on a smart screen, referring to operation (2) shown in fig. 6A, where the smart screen capable of playing the video may be interconnected with the mobile phone 201-1 of the user A through the system shown in fig. 2. It can be understood that, by clicking the shared status bar 513a on the group chat interface 504a, the user A can also learn which group members are watching the video content corresponding to the session message 510a.
It is understood that other group members in the chat group 503a shown in fig. 6A may click the session message to open the related video, which is not described herein again.
In addition, referring to fig. 6A, a video playing widget 515a may be further displayed in the upper right corner of the group chat interface 504a, and the number of people watching the video in real time may also be displayed below the video playing widget 515a through the shared status bar 513a. In other embodiments, referring to fig. 6B, if the video information queried by the user B in step 401 includes a plurality of programs, for example the recently hot-broadcast video information, the video playing widget 515a displayed on the group chat interface 504a may take the form shown in fig. 6B, where the video playing widget 515a shows that "program name 1" has 1 person watching, "program name 2" has 3 people watching, and "program name 3" has 2 people watching, and the shared status bar 513a below the video playing widget 515a shows how many people are watching videos together, for example 6 people. It can be understood that, if the members of the chat group are watching more programs, for example 4 or more, the user can also slide to the left on the video playing widget 515a (refer to operation (3) shown in fig. 6B) to view the number of people watching the other programs together at the same time.
It is understood that the video playback widget 515a may be a floating window, and in other embodiments, the video playback widget 515a may be displayed in other positions or in other forms, which is not limited herein.
407: the mobile phone 201-1 responds to the operation instruction of the user a, opens the related video information, and plays the video.
Specifically, after the mobile phone 201-1 receives the operation instruction of the user A clicking the play button 511a, the video playing module 2011a requests the corresponding video data from the video service module 2022 in the server 202 based on the video identifier in the related video information included in the session message 510a, for example, an identification number (ID) of the related video, and starts playing the corresponding video. In other embodiments, after the mobile phone 201-1 receives an operation instruction of the user A clicking the smart screen play button 512a, the mobile phone 201-1 may, based on the interconnection relationship between the mobile phone 201-1 and the smart screen, invoke the video playing module on the smart screen to request the corresponding video data from the video service module 2022 in the server 202 and start playing the corresponding video. At this time, referring to fig. 6A, a device icon 516a may be displayed on the avatar of the user A, and an enlarged schematic view of the device icon 516a is shown in fig. 6C. In other embodiments, the device icon 516a may be displayed in other positions or in other forms, which is not limited herein.
408: the handset 201-1 synchronously transmits video playing data to the server 202. The video playing data includes data such as video playing state and playing progress.
Specifically, the video playing module 2011a on the mobile phone 201-1 collects real-time playing data after the user A starts watching the video through the smooth connection application running on the mobile phone 201-1, that is, the video playing data including the video playing state, the playing progress information, and the like, and the video playing module 2011a synchronously reports the collected video playing data to the data synchronization module 2021 of the server 202. It can be understood that the video playing data uploaded by the mobile phone 201-1 can be sent to the data synchronization module 2021 together with the smooth connection application data.
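A minimal Kotlin sketch of the reporting described in step 408 follows, assuming a hypothetical PlaybackState shape and a SyncUplink stand-in for the data synchronization module 2021; the names and the reporting trigger are assumptions.

```kotlin
// Hypothetical periodic report of video playing data (step 408); names are assumptions.

data class PlaybackState(
    val memberId: String,
    val videoId: String,
    val positionSec: Int,      // playing progress
    val status: String         // e.g. "playing", "paused", "fast-forward", "2x"
)

interface SyncUplink {                       // stands in for the data synchronization module 2021
    fun report(state: PlaybackState)
}

class PlaybackReporter(private val uplink: SyncUplink) {
    // Called by the player at a regular interval or whenever the playing state changes.
    fun onTick(memberId: String, videoId: String, positionSec: Int, status: String) {
        uplink.report(PlaybackState(memberId, videoId, positionSec, status))
    }
}

fun main() {
    val uplink = object : SyncUplink {
        override fun report(state: PlaybackState) = println("sync -> $state")
    }
    PlaybackReporter(uplink).onTick("userA", "v001", 95, "playing")
}
```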
409: the server 202 receives and stores video playing data on the handset 201-1.
Specifically, the data synchronization module 2021 on the server 202 receives and stores the video playing data in the smooth connection application data sent by the mobile phone 201-1. It can be understood that the data synchronization module 2021 may also receive synchronization data uploaded by other electronic devices, and store the corresponding data in association with the instant messaging application data, such as the smooth connection application data, logged in on different devices by different users, which is not limited herein.
410: the server 202 synchronously sends the stored video playing data on the handset 201-1 to the handset 201-2.
Specifically, the data synchronization module 2021 on the server 202 sends, in response to a user request, the stored video playing data synchronized from the smooth connection application running on the mobile phone 201-1 to the smooth connection application running on the mobile phone 201-2, so as to synchronize the video playing data from the mobile phone 201-1 between the mobile phone 201-1 and the mobile phone 201-2.
It can be understood that if the user B also uses the mobile phone 201-2 or the smart screen interconnected with the mobile phone 201-2 to play the related video, the mobile phone 201-1 of the user a can also obtain video playing data on the mobile phone 201-2, and the implementation of the synchronization process refers to the above step 406 to this step 410, which is not described herein again.
It is understood that the data synchronization module 2021 in the server 202 can synchronously send the real-time data of the video played on the electronic devices such as the mobile phone or the smart screen used by the members of the chat group who watch the video together to the electronic devices such as the mobile phone or the smart screen of the other members.
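The server-side storage and fan-out described in steps 409 and 410 could be sketched as follows; the in-memory store and all names here are assumptions, not the actual implementation of the data synchronization module 2021.

```kotlin
// Hypothetical server-side fan-out of playback data to the other group members (steps 409-410).

data class PlaybackState(val memberId: String, val videoId: String, val positionSec: Int, val status: String)

class DataSync {
    private val latest = mutableMapOf<String, PlaybackState>()     // keyed by member id

    fun report(groupId: String, state: PlaybackState,
               membersOf: (String) -> List<String>,
               push: (String, PlaybackState) -> Unit) {
        latest[state.memberId] = state                             // store (step 409)
        membersOf(groupId)                                         // synchronize (step 410)
            .filter { it != state.memberId }
            .forEach { other -> push(other, state) }
    }
}

fun main() {
    val sync = DataSync()
    sync.report("happy-family",
        PlaybackState("userA", "v001", 95, "playing"),
        membersOf = { listOf("userA", "userB", "userC") },
        push = { member, state -> println("push to $member: $state") })
}
```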
411: the mobile phone 201-2 receives and displays the video playing data on the mobile phone 201-1.
Specifically, the video playing module 2011b on the mobile phone 201-2 receives the video playing data from the mobile phone 201-1 that is synchronously stored by the data synchronization module 2021 on the server 202, and displays the video playing data on a video playing window displayed in the smooth connection application run by the mobile phone 201-2, for example, displays information such as the video playing state and the video playing progress in the video playing data, where the video playing state includes, but is not limited to, whether the mobile phone 201-1 is playing the related video, and whether the related video on the mobile phone 201-1 is in a playing, paused, fast-forward or double-speed playing state.
Referring to fig. 7, in a video playing window 701b displayed on the smooth connection application run by the mobile phone 201-2, a progress bar 702b displays a total video duration 7021 and an already played duration 7022, and the progress positions at which the other group members watching the video together are watching are also displayed above the progress bar 702b, where the already played duration 7022 matches the progress position, for example, the progress position B 7023 at which the local user B is watching the video shown in fig. 7. If the user B is watching the video through a smart screen interconnected with the mobile phone 201-2, a device icon 516b may also be displayed on the avatar of the user B, for example, the device icon 516b displays a smart screen icon, which is not limited herein. In addition, the video playing window 701b further includes an exit button 703b and an exit watching-together button 704b, where the user B can close the video playing window 701b by clicking the exit button 703b; if the user B clicks the exit watching-together button 704b, the user B continues watching the video, but the information of the playing status and the playing progress of the other members watching the video together is no longer displayed on the video playing window 701b.
In other embodiments, the video playing window 701b may also display the paused or playing status of the video played by other members; for example, as shown in fig. 7, the paused status of the video playing on the electronic device of the user C may be displayed by a status bubble 7026 above the progress position C 7025. The video playing data of the mobile phone 201-1 and of the electronic devices of other members may also be displayed on the mobile phone 201-2 in other forms, which is not limited herein.
It can be understood that, when the mobile phone 201-2 displays the video playing data from the mobile phone 201-1, the members shown as watching the corresponding video in the video playing widget 515b on the group chat interface 504b displayed by the mobile phone 201-2 include the user A corresponding to the mobile phone 201-1.
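As a small illustration of how the progress positions of the members could be laid out above the progress bar (as in the window of fig. 7), the following Kotlin sketch maps each member's progress to a fraction of the bar width; the names are assumptions.

```kotlin
// Hypothetical layout helper: place each member's avatar above the progress bar
// in proportion to that member's playing progress.

data class Marker(val memberId: String, val fraction: Double)   // 0.0 .. 1.0 along the bar

fun markers(progressSecByMember: Map<String, Int>, totalDurationSec: Int): List<Marker> =
    progressSecByMember.map { (member, sec) ->
        Marker(member, (sec.toDouble() / totalDurationSec).coerceIn(0.0, 1.0))
    }

fun main() {
    val m = markers(mapOf("userA" to 1800, "userB" to 900, "userC" to 2700), totalDurationSec = 3600)
    println(m)   // userA at 0.5, userB at 0.25, userC at 0.75 of the bar width
}
```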
This embodiment of the application introduces the implementation process of the video sharing method in a scene of sharing videos in a group chat; through the video sharing method provided by this embodiment, a user can quickly and conveniently share videos in the group chat and watch the videos together with other members in the chat group. Another implementation procedure of the video sharing method of the present application is described below with reference to another embodiment and its application scenario.
Example two
This embodiment of the application introduces the implementation process of the video sharing method in detail based on a scene in which a user, while watching a video together with others in a group chat, switches to the viewing angle of another member or switches to the playing progress of another member.
Fig. 8 shows a specific implementation flow of a video sharing method provided in an embodiment of the present application. The method comprises the following steps:
steps 801 to 810 are identical to steps 401 to 410 described in fig. 4 in the first embodiment, and are not described again here.
Step 811 is different from step 411 in the first embodiment in that, in step 811 in the embodiment of the present application, when the mobile phone 201-2 receives and displays the video playing data on the mobile phone 201-1, the interface of the displayed video playing window is different from that shown in fig. 7 in the first embodiment.
Fig. 9 shows a video playing window interface diagram provided in an embodiment of the present application.
As shown in fig. 9, in a video playing window 901b displayed on the display screen of the mobile phone 201-2 of the user B in this embodiment of the present application, a progress bar 902b displays a total video duration 9021 and an already played duration 9022, and the progress positions at which the other group members watching the video together are watching are also displayed above the progress bar 902b, where the already played duration 9022 matches the progress position, for example, the progress position B 9023 at which the local user B is watching the video, the progress position A 9024 at which the user A is watching the video, and the progress position C 9025 at which the user C is watching the video shown in fig. 9; thus, in the viewing angle of the user B shown in fig. 9, the already played duration 9022 corresponds to the progress position B 9023 of the user B. In addition, the video playing window 901b also displays an exit button 903b, an exit watching-together button 904b and an audio/video call button 905b, where, by clicking the audio/video call button 905b, the user B can initiate an audio/video call to any member in the chat group 503b; it can be understood that, when clicking the audio/video call button 905b, the user B can also select a plurality of members in the chat group 503b to initiate a group voice call or a group video call.
As shown in fig. 9, in the video playing window 901b, a view switching button and a progress switching button may also be displayed above the avatars of other members; the user may click the avatar of the target member whose viewing angle or progress the user wants to switch to, so as to pop up the two buttons. For example, if the user B clicks the avatar of the user A (refer to operation (4) shown in fig. 9), the view switching button 906b and the progress switching button 907b are displayed above the avatar of the user A. As an illustration, the view switching button 906b may be displayed as "switch view to TA", and the progress switching button 907b may be displayed as "synchronize progress to TA"; in other embodiments, the display forms and display contents of the view switching button and the progress switching button may be different, which is not limited herein.
Referring to the software architecture of the communication system shown in fig. 3 and to fig. 9, the progress at which the user A is watching the video is displayed at the progress position A on the video playing window 901b of the user B. When the user B clicks the avatar of the user A at the progress position A and clicks the view switching button 906b, the video playing module 2011b on the mobile phone 201-2 acquires the video playing data of the mobile phone 201-1 through the data synchronization module 2021 of the server 202, and the smooth connection application running on the mobile phone 201-2 can respond to the operation of the user B by displaying the video playing window of the viewing angle of the user A at this time (refer to 901a shown in fig. 10C). The video playing data includes data such as the video playing progress and the comment barrages issued by all users in the video playing process; the video playing module 2011b on the mobile phone 201-2 can also obtain the status data of the audio/video call of the user A, so that the audio/video call window of the user A is synchronously displayed in the process of switching the viewing angle.
It can be understood that, in other embodiments, in the process of switching the viewing angle, the user A may also be asked, through a pop-up window notification, whether the user A agrees to let the user B switch to the viewing angle of the user A to watch the video, whether the user B is authorized to obtain the state data of the audio/video call on the mobile phone 201-1 after switching the viewing angle, and the like, which is not described herein again.
Similarly, when the user B clicks the avatar of the user a at the progress position a and clicks the progress switching button 907B, the video playing module 2011B on the mobile phone 201-2 skips to the playing progress of the user a watching the video at this time based on the obtained video playing data of the user a to continue watching the video.
It can be understood that in the process of switching the user B to the viewing angle of the user a or to the progress of the user a, the user a and the user B can keep the audio and video conversation, perform communication interaction, discuss highlights and the like in the video.
The above process is described in detail in steps 812 to 815, and will not be described again.
812: the handset 201-2 detects that the user B initiates an audio/video call.
Specifically, the user B clicks the audio/video call button 905B in the video playback window 901B displayed on the smooth connection application run by the mobile phone 201-2, and after selecting a call object, clicks to determine to initiate a voice call or a video call.
Referring to fig. 10A, when the user B clicks the audio/video call button 905b, the mobile phone 201-2 displays a group member list 1001 of the chat group 503b; the user B clicks the check boxes 1002 behind the user A and the user C and clicks the confirm button 1003 at the upper right of the group member list, so that a voice call or a video call can be initiated to the user A and the user C.
813: the handset 201-2 initiates a voice call or a video call to the handset 201-1 through the server 202 in response to an operation of the user B.
Specifically, the video playing module 2011b on the mobile phone 201-2 sends an instruction for initiating a call to the audio/video call module 2013b in response to the operation of initiating an audio/video call by the user B; the audio/video call module 2013b initiates a voice or video call and sends or receives call messages through the session management module 2015b, and the session management module 2015b on the mobile phone 201-2 synchronizes the session messages with the session management module 2015a on the mobile phone 201-1 through the group and message management module 2023 on the server 202, so as to implement the audio/video call between the mobile phone 201-2 and the mobile phone 201-1, that is, the audio/video call between the user B and the user A. Similarly, the process in which the user B initiates an audio/video call to the user C is the same as the above process and is not described herein again.
Referring to fig. 10B, after the user B successfully initiates an audio/video call to the user A and the user C, an audio/video call window 1004 is displayed in the video playing window 901b displayed on the mobile phone 201-2 under the viewing angle of the user B, and a video window 1005 of the user A, a video window 1006 of the user C, and a call window 1007 of the user B are displayed on the audio/video call window 1004. It can be understood that the picture displayed in the video window 1005 is image data acquired by the camera of the mobile phone 201-1 of the user A and synchronized by the data synchronization module 2021 on the server 202; similarly, the video window 1006 and the call window 1007 display image data acquired by the camera on the electronic device used by the user C and the camera on the mobile phone 201-2 of the user B, respectively.
Returning to fig. 10B, an exit/switch to voice button 1008 is also displayed on the video call window 1004, and the user B can click the button 1008 to switch the ongoing video call to a voice call or to exit the currently ongoing video call. As shown in fig. 10B, the video window 1005 of user a and the video window 1006 of user C may be displayed as large windows or as front windows under the viewing angle of user B, which is in accordance with the conversation habits of the users. In other embodiments, the audio/video call window 1004 may also be displayed in other forms, which are not limited herein.
814: the mobile phone 201-2 detects the operation of switching to the progress of another person by the user B, and switches to the playing progress of the user a to continue playing the video.
Specifically, the user B clicks the avatar of the user A in the video playing window 901b displayed on the smooth connection application run by the mobile phone 201-2 (refer to operation (4) shown in fig. 9 above); after the view switching button 906b and the progress switching button 907b are displayed above the avatar of the user A, the user B clicks the progress switching button 907b (refer to operation (5) shown in fig. 10B) and selects to switch to the playing progress of the user A. In response to the operation of the user B, the video playing module 2011b on the mobile phone 201-2 adjusts the progress at which the corresponding video is currently played on the mobile phone 201-2 to the video playing progress of the user A, based on the video playing data of the user A synchronized by the data synchronization module 2021 on the server 202, such as the video playing progress and the comment barrages issued by all users in the video playing process; that is, the progress at which the corresponding video is played on the mobile phone 201-2 of the user B is switched to the playing progress of the user A. It can be understood that, after the user B switches to the playing progress of the user A, the progress at which the user B watched the video before is replaced by the playing progress of the user A, and the previous playing progress of the user B is cleared.
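A minimal Kotlin sketch of the progress switching described in step 814 follows: the local player seeks to the synchronized progress of the selected member, and the previous local progress is simply replaced. All names are assumptions.

```kotlin
// Hypothetical handling of the "switch progress" button 907b; names are assumptions.

data class PlaybackState(val memberId: String, val videoId: String, val positionSec: Int, val status: String)

interface Player { fun seekTo(positionSec: Int) }

fun switchToMemberProgress(target: PlaybackState, player: Player) {
    // The previous local progress is not kept; it is replaced by the target member's progress.
    player.seekTo(target.positionSec)
}

fun main() {
    val player = object : Player {
        override fun seekTo(positionSec: Int) = println("seek to $positionSec s")
    }
    switchToMemberProgress(PlaybackState("userA", "v001", 1200, "playing"), player)
}
```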
It can be appreciated that the function of the switch progress button 907B is beneficial for users to share the same picture in the video content, for example, user B communicates the video content during a video call with user a, and for facilitating the communication, user B switches the current video playing progress to the playing progress of user a.
815: the mobile phone 201-2 detects that the user B switches to another visual angle, and switches to the visual angle of the user a to watch the video.
Specifically, the user B clicks the avatar of the user A in the video playing window 901b displayed on the smooth connection application run by the mobile phone 201-2 (refer to operation (4) shown in fig. 9 above); after the view switching button 906b and the progress switching button 907b are displayed above the avatar of the user A, the user B clicks the view switching button 906b (refer to operation (6) shown in fig. 10B) and selects to switch to the viewing angle of the user A. In response to the operation of the user B, the video playing module 2011b on the mobile phone 201-2 acquires the video playing data of the user A synchronized by the data synchronization module 2021 on the server 202, for example, data such as the video playing progress and the comment barrages issued by all users in the video playing process, and displays, as the picture of the current mobile phone 201-2, the video playing window 901a displayed on the mobile phone 201-1 of the user A. The video playing module 2011b can also acquire data of the audio/video call performed by the user A through the smooth connection application, for example, data such as the window state of the audio/video call currently performed by the user A, and the video playing module 2011b switches the video playing window 901b on the mobile phone 201-2 to the video playing window 901a of the viewing angle of the user A displayed on the mobile phone 201-1 based on the acquired video playing data, the window state of the audio/video call, and the like. Referring to fig. 10C, after switching to the video playing window 901a of the viewing angle of the user A, the positions of the video window 1005 of the user A and the call window 1007 of the user B in the audio/video call window 1004 are different from those shown in fig. 10B; in addition, the progress position A 9024 shown below the avatar of the user A is highlighted, and the already played duration 9022 is also shown as the playing progress of the user A. At this time, an exit button 1009 may also be displayed on the video playing window 901b displayed on the mobile phone 201-2, and the user B may exit the viewing angle of the user A and switch back to his own viewing angle by clicking the button 1009; that is, after the user B clicks the button 1009, the picture displayed on the mobile phone 201-2 changes from the picture shown in fig. 10C back to the picture shown in fig. 10B.
It is understood that the function of the view angle switching button 906B is beneficial for users to share the same picture in the video content, for example, user B communicates the video content during a video call with user a, and for facilitating the communication, user B switches the current display picture of the mobile phone to the view angle of user a to communicate with a discussion.
It can be understood that when the user B performs the operation of switching to the view angle of the user a, the mobile phone 201-2 may send a confirmation notification of "whether to approve switching the view angle" to the mobile phone 201-1 through the server 202, and after the user a clicks "confirmation" or "approve" on the mobile phone 201-1, the mobile phone 201-2 of the user B can switch to the view angle of the user a to continue playing the video.
In other embodiments, the operation mode in which the user switches to another viewing angle, and the manner in which the picture changes on the electronic device, such as a mobile phone, used by the user, may be different, which is not limited herein.
It is to be understood that there is no requirement for the execution sequence among the steps 812 to 813, the step 814 and the step 815, and in other embodiments, the steps 812 to 813, the step 815 and then the step 814 may be executed first; alternatively, step 814 or step 815 is performed first, and then steps 812 to 813 are performed, which is not limited herein.
This embodiment of the application introduces the implementation process of the video sharing method in a scene in which, while watching a video together in a group chat, a user switches to the viewing angle of another member or switches to the playing progress of another member. Another implementation procedure of the video sharing method of the present application is described below with reference to another embodiment and its application scenario.
EXAMPLE III
This embodiment of the application introduces the implementation process of the video sharing method in detail based on a scene in which a user, while watching a video together with others in a group chat, clicks a comment on the video playing window to jump to the corresponding playing progress.
Fig. 11 shows a specific implementation flow of the video sharing method provided in the embodiment of the present application, including the following steps:
steps 1101 to 1110 are the same as steps 401 to 410 described in fig. 4 in the first embodiment, and are not repeated herein.
Step 1111 differs from step 411 in the first embodiment described above in that, in step 1111 of the embodiment of the present application, when the mobile phone 201-2 receives and displays the video playing data on the mobile phone 201-1, the interface of the displayed video playing window is different from that shown in fig. 7 in the first embodiment described above.
Fig. 12 shows a video playing window interface diagram provided in an embodiment of the present application.
As shown in fig. 12, in the video playing window 121b displayed on the display screen of the mobile phone 201-2 in this embodiment of the present application, the progress bar 122b displays a total video duration 1221 and an already played duration 1222, and the progress positions at which the other group members watching the video together are watching are further displayed above the progress bar 122b, where the already played duration 1222 matches the progress position, for example, the progress position B 1223 at which the local user B is watching the video, the progress position A 1224 at which the user A is watching the video, and the progress position C 1225 at which the user C is watching the video shown in fig. 12; in the video playing window 121b in the viewing angle of the user B shown in fig. 12, the already played duration 1222 corresponds to the progress position B 1223 of the user B. In addition, the video playing window 121b further displays an exit button 123b, an exit watching-together button 124b, and a comment sending button 126, where the user B can initiate an audio/video call to any member in the chat group 503b by clicking the audio/video call button 905b, which is described with reference to fig. 10 and the related description in the second embodiment and is not repeated herein. When the user B clicks the comment sending button 126, the comment message posted by the user B may be presented on the video playing window 121b in the form of a bullet screen 1261b; similarly, the comment message posted by the user A may be displayed as a bullet screen 1261a shown in fig. 12, and the comment message posted by the user C may be displayed as a bullet screen 1261c shown in fig. 12. A view more comments button 1262 may also be displayed on the video playing window 121b, and by clicking this button 1262 the user B may view more comment messages posted by himself or by other members for the played video.
As shown in fig. 12, the barrages 1261a, 1261b, and 1261c displayed in the video playback window 121b include not only the comment message 1270 but also the playback progress information 1271 corresponding to the time when the user posted the comment message.
Referring to the communication system architecture shown in fig. 3 and to fig. 12, when detecting that the user B triggers the comment sending button 126 in the video playing window 121b displayed in the smooth connection application run by the mobile phone 201-2 to publish a comment message, the video playing module 2011b of the mobile phone 201-2 fuses, in sequence through the message sending module 2014b, the session management module 2015b and the message display module 2012b, the comment message published by the user B with the video playing progress information at the corresponding moment, and then displays the fused comment message in the video playing window 121b in the form of a bullet screen, that is, the bullet screen 1261b shown in fig. 12. When other users click the bullet screen 1261b, they can choose to jump to the corresponding video playing progress to play the video. Similarly, the process in which the comments of the user A and the user C are displayed as the bullet screens 1261a and 1261c refers to the display process of the bullet screen 1261b. The above bullet screen display process is described in detail in the following steps 1112 to 1114 and is not repeated here.
1112: the cell phone 201-2 detects an operation of the user B to make a comment.
Specifically, the user B clicks the comment sending button 126 in the video playing window 121b displayed on the smooth connection application run by the mobile phone 201-2, and, after editing and inputting the comment information, clicks to send the comment.
Referring to operation (7) shown in fig. 12, when the user B clicks the comment sending button 126, the mobile phone 201-2 displays the input method interface 128 for the user to edit and input the comment information, and after finishing the editing and input, the user clicks the publishing button 129 to publish the comment.
1113: the handset 201-2 responds to the operation of the user B, and synchronizes the comment message published by the user B through the server 202.
Specifically, in response to the operation of the user B publishing a comment, the video playing module 2011b of the mobile phone 201-2 obtains the comment information edited and input by the user through the input method interface invoked by the video playing window 121b on the smooth connection application. The video playing module 2011b sends the comment information and the video playing progress information corresponding to the moment at which the user published the comment (for example, the video playing progress corresponding to the moment at which the user B clicked the comment sending button 126 on the video playing window 121b) to the session management module 2015b via the message sending module 2014b, and the session management module 2015b integrates and converts the received comment information and the corresponding progress information into a comment message with progress information. The session management module 2015b on the mobile phone 201-2 synchronizes the comment message with progress information with the session management module 2015a on the mobile phone 201-1 through the group and message management module 2023 on the server 202. After receiving the comment message with progress information synchronized by the server 202, the session management module 2015a on the mobile phone 201-1 sends the comment message to the message display module 2012a, the message display module 2012a sends the comment message to be displayed to the video playing module 2011a, and the video playing module 2011a on the mobile phone 201-1 displays the comment message and the historical comment messages on the video playing window displayed on the mobile phone 201-1. Meanwhile, the comment message and the historical comment messages are also displayed on the video playing window 121b of the mobile phone 201-2; the display process is the same as that of the comment message displayed on the mobile phone 201-1 and is not described herein again.
In other embodiments, the video playing progress information corresponding to the comment may instead be the video playing progress at the moment when the user, after editing the comment information, clicks the publishing button 129.
Similarly, the process of the user a and the user C for posting comments is the same as the above process, and is not described again here. Referring to the above publishing and displaying process of the comment message, that is, the comment message with progress information is synchronously displayed on the display screens of the mobile phone 201-2 and the mobile phone 201-1, refer to the barrage 1261a, 1261b, and 1261c shown in fig. 12.
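To illustrate step 1113, the following Kotlin sketch fuses a comment with the playing progress at the moment of publishing; the field names are assumptions, and the fused object is what would then be synchronized through the server and rendered as a bullet screen.

```kotlin
// Hypothetical fusion of a comment with the playing progress at the moment of posting (step 1113).

data class CommentWithProgress(
    val memberId: String,
    val text: String,          // the comment message 1270
    val progressSec: Int       // the playing progress information 1271
)

fun postComment(memberId: String, text: String, currentPositionSec: Int): CommentWithProgress =
    CommentWithProgress(memberId, text, currentPositionSec)

fun main() {
    val barrage = postComment("userB", "Great scene!", currentPositionSec = 1850)
    println(barrage)   // synchronized through the server and rendered as a bullet screen
}
```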
1114: the mobile phone 201-2 detects that the user B clicks the operation of switching the play progress of the bullet screen, and switches to the play progress corresponding to the bullet screen to continue playing the video.
Specifically, referring to fig. 13A, the user B clicks the bullet screen 1261a in the video playing window 121b displayed on the smooth connection application run by the mobile phone 201-2 (refer to operation (8) shown in fig. 13A), and a comment approval button 1301, a comment adding button 1302, and a switch-to-progress button 1303 may be displayed above the bullet screen 1261a. The user B clicks the comment approval button 1301 to approve the comment message displayed on the bullet screen 1261a, and the user B clicks the comment adding button 1302 to copy the comment message displayed on the bullet screen 1261a as the content of a comment to be published by the user B. In addition, if the user B clicks the switch-to-progress button 1303, the video playing module 2011b on the mobile phone 201-2 may switch the progress of the currently watched video to the video playing progress corresponding to the moment at which the comment was published; the video playing window 121b after the switching may refer to fig. 13B, at which time the already played duration 1222 corresponding to the progress position B 1223 of the user B is the playing progress carried by the bullet screen 1261a.
It can be understood that, in the video playing window 121b shown in fig. 13A, the user B may also click the bullet screen 1261b or 1261c and switch to the playing progress corresponding to that bullet screen, and other users may likewise click a bullet screen displayed on the electronic devices they use and switch to the playing progress corresponding to that bullet screen, which is not limited herein. The process in which another user, for example the user A, clicks a bullet screen on the mobile phone 201-1 to jump to the playing progress corresponding to the bullet screen comment message refers to the above process and is not described herein again.
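A minimal Kotlin sketch of the three actions offered on a tapped bullet screen in step 1114 (approve, add as one's own comment, or switch to the carried progress) follows; the action names and callbacks are assumptions.

```kotlin
// Hypothetical handling of a tap on a bullet screen (step 1114); names are assumptions.

data class CommentWithProgress(val memberId: String, val text: String, val progressSec: Int)

sealed interface BarrageAction
object Approve : BarrageAction           // button 1301
object AddAsOwnComment : BarrageAction   // button 1302
object SwitchToProgress : BarrageAction  // button 1303

fun onBarrageAction(action: BarrageAction, barrage: CommentWithProgress,
                    seekTo: (Int) -> Unit, repost: (String) -> Unit, approve: () -> Unit) {
    when (action) {
        Approve -> approve()
        AddAsOwnComment -> repost(barrage.text)        // copy the text as the user's own comment
        SwitchToProgress -> seekTo(barrage.progressSec)
    }
}

fun main() {
    val b = CommentWithProgress("userA", "Great scene!", 1850)
    onBarrageAction(SwitchToProgress, b,
        seekTo = { println("seek to $it s") },
        repost = { println("repost: $it") },
        approve = { println("approved") })
}
```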
This embodiment of the application introduces the implementation process of the video sharing method in a scene in which a user clicks a comment to jump to the playing progress while watching a video together in a group chat. Through the video sharing method provided by this embodiment, a user can click a comment message sent by another person on the bullet screen to perform operations such as approving it or adding one to it as his own comment, and can also choose to switch to the video playing progress at which the comment was published, which increases the interactive fun of all members in the chat group watching the video and improves the user experience. Another implementation procedure of the video sharing method of the present application is described below with reference to another embodiment and its application scenario.
Example four
On the basis of the third embodiment, this embodiment adds, to the comment publishing function, an option of publishing the comment information to the group chat interface, and is applicable to a use scene in which a user opens a video and jumps to the corresponding progress through the comment information displayed in the group chat; the implementation process of the video sharing method provided by the present application is described in detail below for this scene.
Fig. 14 shows a specific implementation flow of the video sharing method provided in the embodiment of the present application, including the following steps:
steps 1401 to 1410 are the same as steps 1101 to 1110 described in fig. 11 in the third embodiment, and are not described again here.
Step 1411 is different from step 1111 of the third embodiment in that, in step 1411 of the embodiment of the present application, when the mobile phone 201-2 receives and displays the video playing data on the mobile phone 201-1, the interface of the displayed video playing window is different from that shown in fig. 12 of the third embodiment.
Fig. 15 shows a video playback window interface diagram provided in an embodiment of the present application.
As shown in fig. 15, in a video playing window 151b displayed on the display screen of the mobile phone 201-2 according to this embodiment of the present application, a progress bar 152b displays a total video duration 1521 and an already played duration 1522, and the progress positions at which the other group members watching the video together are watching are also displayed above the progress bar 152b, where the already played duration 1522 matches the progress position, for example, the progress position B 1523 at which the local user B is watching the video, the progress position A 1524 at which the user A is watching the video, and the progress position C 1525 at which the user C is watching the video shown in fig. 15; in the video playing window 151b at the viewing angle of the user B shown in fig. 15, the already played duration 1522 corresponding to the progress position B 1523 of the user B is displayed as "01:02:19". In addition, the video playing window 151b further displays an exit button 153b, an exit watching-together button 154b, and a message sending button 156, where the user B can initiate an audio/video call to any member in the chat group 503b by clicking the audio/video call button 905b, which is described with reference to fig. 10 and the related description in the second embodiment and is not repeated herein. The user B clicks the message sending button 156 (refer to operation (7) shown in fig. 12) to pop up a message posting interface 1501 for publishing a comment message; at the same time, on the message posting interface 1501, the user B can choose to synchronize the comment message to be published to the group chat (refer to operation (9) shown in fig. 15). The comment message published by the user B can be displayed on the video playing window 151b in the form of a bullet screen 1561b, and can also be displayed on the group chat interface 504b in the form of a session message, which will be described in detail in the following steps and is not repeated here. Likewise, the comment message published by the user A may be displayed as a bullet screen 1561a shown in fig. 15, and the comment message published by the user C may be displayed as a bullet screen 1561c shown in fig. 15. A view more comments button 1562 may also be displayed on the video playing window 151b, and the user B can view more comment messages published by himself or by other members for the played video by clicking this button 1562.
As shown in fig. 15, a message posting interface 1501 displayed in the video playing window 151B includes a synchronize to group chat option 1502 and a check box 1503, and when the user B clicks the check box 1503 and displays a check mark, the comment message posted by the user B can be synchronized to the group chat interface 504B.
Referring to the communication system architecture shown in fig. 3 and to fig. 15, when detecting that the user B triggers the message sending button 156 in the video playing window 151b displayed on the smooth connection application run by the mobile phone 201-2, the video playing module 2011b of the mobile phone 201-2 fuses, in sequence through the message sending module 2014b, the session management module 2015b and the message display module 2012b, the comment message published by the user B with the video playing progress information at the corresponding moment, and displays the fused comment message in the video playing window 151b in the form of a bullet screen, that is, the bullet screen 1561b. When the user B chooses, on the message posting interface 1501, to synchronize the comment message to be published to the group chat, the comment message published by the user B is converted into the form of a session message by the session management module 2015b and then displayed in the corresponding group chat interface 504b by the message display module 2012b. It can be understood that the comment message displayed in the group chat interface 504b in the form of a session message also carries the playing progress information at the corresponding moment, so that other group chat members can click the session message to open the video and jump to the corresponding playing progress to watch the video. Similarly, the processes in which the user A and the user C publish comment bullet screens and synchronize the comments to the group chat are described in detail in the following steps 1512 to 1513 and are not repeated here.
1512: the cell phone 201-2 detects an operation of the user B to make a comment.
Specifically, the user B clicks the message sending button 156 in the video playing window 151b displayed on the smooth connection application run by the mobile phone 201-2, and, after editing and inputting the comment information, clicks to send the comment.
Referring to fig. 15, after the user B finishes editing the comment message on the message posting interface 1501, the user B may select the check box 1503 in front of the synchronize to group chat option 1502, so that the check box 1503 displays a check mark, such as the "√" shown in fig. 15; when the user B clicks the send button 159 to publish the comment message, the comment message is synchronously published to the group chat interface 504b. If the user B chooses not to check the synchronize to group chat option 1502, that is, the check box 1503 does not display a check mark, then when the user clicks the send button 159 the comment message is displayed only on the video playing window 151b.
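The branch controlled by the synchronize to group chat check box 1503 could be sketched as follows in Kotlin; the session-message shape and all names are assumptions.

```kotlin
// Hypothetical publishing branch (steps 1512-1513): the comment is always shown as a bullet
// screen, and is additionally converted into a session message only when the box is checked.

data class CommentWithProgress(val memberId: String, val text: String, val progressSec: Int)
data class GroupSessionMessage(val groupId: String, val text: String,
                               val sourceVideoId: String, val sourceProgressSec: Int)

fun publish(comment: CommentWithProgress, videoId: String, groupId: String, syncToGroupChat: Boolean,
            showBarrage: (CommentWithProgress) -> Unit, sendToGroup: (GroupSessionMessage) -> Unit) {
    showBarrage(comment)
    if (syncToGroupChat) {
        // The session message keeps the source video and progress so that other members
        // can open the video from the group chat and jump to the corresponding progress.
        sendToGroup(GroupSessionMessage(groupId, comment.text, videoId, comment.progressSec))
    }
}

fun main() {
    publish(CommentWithProgress("userB", "Great scene!", 1850), "v001", "happy-family",
            syncToGroupChat = true,
            showBarrage = { println("barrage: $it") },
            sendToGroup = { println("group chat: $it") })
}
```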
Fig. 16 shows an interface diagram of the group chat interface after the user chooses to synchronize to the group chat, publishes the comment, and jumps to the group chat interface.
As shown in fig. 16, a comment posted by the user B is displayed on the group chat interface 504B in the form of a conversation message 1601B, and a comment source information box 1602 may be displayed below the conversation message 1601B to distinguish from other chat information on the group chat interface 504B.
1513: the mobile phone 201-2 responds to the operation of the user B, synchronizes the comment message published by the user B through the server 202, and simultaneously the mobile phone 201-2 converts the comment message synchronized into the group chat into a conversation message for display.
Specifically, for the process of the mobile phone 201-2 synchronizing the comment message issued by the user B through the server 202, reference may be made to step 1113 in the third embodiment, which is not described herein again.
If the user B clicks the synchronize to group chat option 1502 when publishing the comment, the session management module 2015b on the mobile phone 201-2 may further convert the received comment message with progress information into a session message, and display the session message through the message display module 2012b. As shown in fig. 16 above, the comment source information box 1602 is also displayed below the message content displayed by the session message 1601b converted by the session management module 2015b, so as to indicate the source of the comment message. For example, the session message 1601b shown in fig. 16 displays the comment content published by the user B together with the corresponding playing progress and the program name 1, which indicates that the comment was published by the user B during the playback of the video corresponding to the program name 1.
It can be understood that the other members in the chat group 503b corresponding to the group chat interface 504b can see the session message 1601b synchronized to the group chat by the user B, and furthermore, the other members in the chat group 503b can click the comment source information box 1602 to directly open the video corresponding to the comment and jump to the playing progress corresponding to the comment to watch the video.
It can be understood that the user B may also click the comment source information box 1602 below the session message 1601b on the group chat interface 504b displayed by the mobile phone 201-2 to open the video corresponding to the comment and jump to the playing progress corresponding to the comment. It can also be understood that, after the user B clicks the comment source information box 1602 and switches to the playing progress corresponding to the comment, the progress at which the user B previously watched the video is replaced by the playing progress corresponding to the comment.
It can be understood that the application scenarios to which the video sharing method of the present application described in the first to fourth embodiments applies, and the various functions implemented in the video sharing process, can also be combined in the same application scenario. For example, the audio/video call function in the video sharing process described in the second embodiment can be combined with the function of posting comments to a group chat described in the fourth embodiment. In this way, a user can hold an audio/video call with other members while watching a video, can jump to the playing progress corresponding to a comment through a bullet-screen comment (barrage) on the video playing interface, and can click a comment message synchronized to the group chat interface to open the video at the progress corresponding to the comment and watch it together with family members, thereby facilitating interaction and communication between the members watching the video together. In this case, the video playing interface displayed on an electronic device such as a mobile phone can be as shown in fig. 17; for the specific implementation process, reference is made to the related descriptions of the second embodiment and the fourth embodiment, which are not repeated here.
In other embodiments, the view switching function described in the second embodiment may also be combined with the function of posting comments to a group chat described in the fourth embodiment. In this way, the user can switch to another view while watching the video, can jump to the playing progress corresponding to a comment through a bullet-screen comment on the video playing interface, and can click a comment message synchronized to the group chat interface to open the video at the progress corresponding to the comment and watch it together with family members, thereby facilitating interaction and communication between the members watching the video together. In this case, the video playing interface displayed on an electronic device such as a mobile phone can be as shown in fig. 18; for the specific implementation process, reference is made to the related descriptions of the second embodiment and the fourth embodiment, which are not repeated here.
Fig. 19 shows a schematic structural diagram of the handset 201-1 according to an embodiment.
The handset 201-1 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the handset 201-1. In other embodiments of the present application, the handset 201-1 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In this embodiment, the processor 110 may control, through the controller and the like, the modules, such as the video playing module, on the mobile phone 201-1 to cooperate with each other, so as to execute the video sharing method of this application.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 201-1, and may also be used to transmit data between the mobile phone 201-1 and peripheral devices. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules shown in the embodiment of the present invention is only illustrative, and does not limit the structure of the mobile phone 201-1. In other embodiments of the present application, the mobile phone 201-1 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 201-1. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the handset 201-1 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 201-1 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 can provide a solution for wireless communication, including 2G/3G/4G/5G and the like, applied to the handset 201-1. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the mobile phone 201-1, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 201-1 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the handset 201-1 can communicate with networks and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile phone 201-1 realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the handset 201-1 may include 1 or N display screens 194, N being a positive integer greater than 1. In the embodiment of the present application, the display screen 194 is used for displaying a chat interface or a group chat interface of the aforementioned instant messaging application (or called messaging application) such as the open link, and a video playing interface for playing a shared video.
The mobile phone 201-1 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like. In the embodiment of this application, the image data displayed on the call interface during a video call can be collected by the camera.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the handset 201-1 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 201-1 performs frequency bin selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Handset 201-1 may support one or more video codecs. Thus, the handset 201-1 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
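As a brief, hedged illustration of such decoding and playback on Android, an application could hand an MPEG4 file to the platform MediaPlayer as sketched below; the function name playSharedVideo and the assumption that the path points to a locally available file are illustrative and not part of the embodiment.

    import android.media.MediaPlayer

    // Illustrative sketch only: the file path is a hypothetical placeholder.
    fun playSharedVideo(path: String) {
        val player = MediaPlayer()
        player.setDataSource(path)                  // e.g. an MPEG4 file of the shared video
        player.setOnPreparedListener { it.start() } // start playback once prepared
        player.prepareAsync()                       // decoding is handled by the device's video codec
    }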
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize the applications of intelligent cognition and the like of the mobile phone 201-1, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 201-1. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area can store data (such as audio data and a phone book) created during the use of the mobile phone 201-1. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 executes various functional applications and data processing of the handset 201-1 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The handset 201-1 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The cellular phone 201-1 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the handset 201-1 receives a call or voice message, it can receive voice by placing the receiver 170B close to the ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The handset 201-1 may be provided with at least one microphone 170C. In other embodiments, the handset 201-1 may be provided with two microphones 170C to achieve noise reduction functions in addition to collecting sound signals. In other embodiments, the mobile phone 201-1 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the handset 201-1 determines the intensity of the pressure based on the change in capacitance. When a touch operation is applied to the display screen 194, the mobile phone 201-1 detects the intensity of the touch operation according to the pressure sensor 180A. The cellular phone 201-1 can also calculate the touched position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that are applied to the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
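A minimal sketch of the pressure-threshold dispatch described above, assuming a normalised pressure value such as the one reported by a touch event; the threshold value and the handler names (FIRST_PRESSURE_THRESHOLD, viewShortMessage, composeShortMessage) are illustrative assumptions rather than values used by the handset 201-1.

    // Illustrative sketch only: the threshold and handlers are assumptions.
    const val FIRST_PRESSURE_THRESHOLD = 0.5f   // hypothetical normalised pressure value

    fun onShortMessageIconTouched(pressure: Float) {
        if (pressure < FIRST_PRESSURE_THRESHOLD) {
            viewShortMessage()      // lighter press: execute the "view the short message" instruction
        } else {
            composeShortMessage()   // firmer press: execute the "create a new short message" instruction
        }
    }

    fun viewShortMessage() { /* open the short message list */ }
    fun composeShortMessage() { /* open the new-message editor */ }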
The gyro sensor 180B may be used to determine the motion gesture of the handset 201-1.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the handset 201-1 calculates altitude from the barometric pressure measured by the air pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. In some embodiments, when the handset 201-1 is a flip phone, the handset 201-1 may detect the opening and closing of the flip cover according to the magnetic sensor 180D.
The acceleration sensor 180E can detect the magnitude of acceleration of the cellular phone 201-1 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the handset 201-1 is at rest.
The distance sensor 180F is used for measuring a distance. The handset 201-1 may measure distance by infrared or laser. In some embodiments, when shooting a scene, the handset 201-1 may use the distance sensor 180F to measure the distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The mobile phone 201-1 can use the proximity light sensor 180G to detect that the mobile phone 201-1 held by the user is close to the ear during a call, so that the screen is automatically turned off to save power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The handset 201-1 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the mobile phone 201-1 is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 201-1 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 201-1 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K can be disposed on the surface of the handset 201-1 at a different location from the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The handset 201-1 may receive a key input and generate a key signal input related to user settings and function control of the handset 201-1.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the cellular phone 201-1 by being inserted into the SIM card interface 195 or being pulled out from the SIM card interface 195. The handset 201-1 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc.
Fig. 20 is a block diagram of a software configuration of the cellular phone 201-1 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 20, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 20, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the mobile phone 201-1, for example, management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like. The notification manager enables the application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, or an indicator light flashes.
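As a brief sketch of how an application-layer component could ask the notification manager to show such a status-bar message, the public Android NotificationChannel and NotificationCompat APIs could be used roughly as follows; the channel id, texts and notification id are illustrative assumptions and are not part of the embodiment.

    import android.app.NotificationChannel
    import android.app.NotificationManager
    import android.content.Context
    import androidx.core.app.NotificationCompat
    import androidx.core.app.NotificationManagerCompat

    // Illustrative sketch only: channel id, texts and notification id are assumptions.
    fun notifyDownloadComplete(context: Context) {
        val channelId = "downloads"
        val channel = NotificationChannel(channelId, "Downloads", NotificationManager.IMPORTANCE_DEFAULT)
        context.getSystemService(NotificationManager::class.java).createNotificationChannel(channel)

        val notification = NotificationCompat.Builder(context, channelId)
            .setSmallIcon(android.R.drawable.stat_sys_download_done)
            .setContentTitle("Download complete")
            .setContentText("The shared video has finished downloading.")
            .setAutoCancel(true)                // dismissed without further user interaction
            .build()

        // On Android 13 and later this additionally requires the POST_NOTIFICATIONS runtime permission.
        NotificationManagerCompat.from(context).notify(1, notification)
    }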
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
It is understood that the structure of the handset 201-2 can be the same as the structure of the handset 201-1 shown in fig. 19 and 20, and will not be described herein again.
It is to be understood that the embodiments of this application describe various aspects of the illustrative embodiments using terms commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. It will be apparent, however, to one skilled in the art that some alternative embodiments may be practiced using only part of the described features. For purposes of explanation, specific numbers and configurations are set forth in order to provide a more thorough understanding of the illustrative embodiments. It will also be apparent to one skilled in the art that alternative embodiments may be practiced without the specific details. In some other instances, well-known features are omitted or simplified in order not to obscure the illustrative embodiments of the present application.
In the drawings, some features may be shown in a particular arrangement and/or order. However, it should be understood that such specific arrangement and/or ordering is not required. Rather, in some embodiments, these features may be arranged in a manner and/or order different from that shown in the illustrative figures. Additionally, the inclusion of structural features in a particular figure does not imply that all embodiments need include such features; in some embodiments these features may not be included, or may be combined with other features.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one example embodiment or technology disclosed herein. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
The present disclosure also relates to an apparatus for performing the method. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each may be coupled to a computer system bus. Further, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform one or more method steps. The structure for a variety of these systems is discussed in the description that follows. In addition, any particular programming language sufficient to implement the techniques and embodiments disclosed herein may be used. Various programming languages may be used to implement the present disclosure as discussed herein.
Moreover, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope of the concepts discussed herein.

Claims (30)

1. A video sharing method is applied to a first electronic device and a second electronic device which are provided with a first application, and is characterized by comprising the following steps:
the first electronic equipment sends video access information of a video to the second electronic equipment through a first application, wherein the video access information is used for accessing and playing the video;
the second electronic equipment responds to the playing operation of the user for the video access information, and plays the video;
the second electronic equipment shares video playing data generated by playing the video to the first electronic equipment; the video playing data comprises data used for marking the playing state and the playing progress of the video;
and the first electronic equipment acquires the video playing data and acquires the state of the second electronic equipment playing the video according to the video playing data.
2. The method according to claim 1, wherein the first electronic device obtains the video playing status of the second electronic device according to the video playing data, and the video playing status includes at least one of the following presentation modes:
displaying the progress of the second electronic equipment in playing the video in an interface of the first electronic equipment in playing the video;
jumping to the progress corresponding to the video playing data in an interface of the first electronic device for playing the video;
and displaying the content of the video played by the second electronic equipment on an interface of the first electronic equipment for playing the video.
3. The method of claim 2, wherein the video access information includes a link for accessing and playing the video, and multimedia content related to the video; and
The first application is a message application, and the first electronic device sends a message to the second electronic device via the message application, wherein the message includes the video access information.
4. The method of claim 3, wherein sending, by the first electronic device, video access information for a video to the second electronic device via a first application comprises:
the first electronic device sends the video access information to the second electronic device via a session interface of the message application; wherein
The conversation interface comprises a first user identification and a second user identification, wherein the first user identification is used for identifying a first user operating the first electronic equipment, and the second user identification is used for identifying a second user operating the second electronic equipment.
5. The method according to claim 4, wherein the obtaining, by the first electronic device, the state of the second electronic device playing the video according to the video playing data includes:
displaying the progress of the second electronic equipment for playing the video and the second user identification in an interface of the first electronic equipment for playing the video; wherein
the second user identification is used for jumping to the progress corresponding to the video playing data, or
for displaying the content of the video played by the second electronic equipment.
6. The method according to claim 1, wherein the video access information comprises a card for accessing the video, the card comprising multimedia content related to the video and a link for accessing the video.
7. The method of claim 6, wherein the card includes a first option and a second option, wherein,
the first option is used for triggering the playing of the video;
the second option is used for triggering the video to be played through a third electronic device which is cooperated with the second electronic device;
the first electronic device obtains the state of the second electronic device playing the video according to the video playing data, and the method further includes:
and displaying the progress of the third electronic equipment for playing the video and an equipment icon of the third electronic equipment in an interface of the first electronic equipment for playing the video.
8. The method of claim 7, wherein the second electronic device plays the video in response to a user operation to play the video access information, comprising:
the second electronic equipment responds to the operation that a user clicks a first option of the card, and triggers the playing of the video; or
And the second electronic equipment responds to the operation that the user clicks the second option of the card, and triggers the video to be played on the third electronic equipment.
9. The method according to any one of claims 1 to 8, wherein the second electronic device shares video playing data generated by playing the video with the first electronic device, including:
the second electronic equipment uploads the video playing data to a server in communication connection with the first electronic equipment and the second electronic equipment;
and the first electronic equipment acquires the video playing data from the server.
10. The method of claim 9, wherein the server is a server for running the first application.
11. The method of claim 1, wherein the video playing data comprises a comment message posted by a user editing on an interface of the second electronic device playing the video, and wherein the comment message is associated with a progress of the video playing; the comment message can be displayed on a playing interface of the video together with corresponding playing progress information;
the comment message is at least used for triggering skipping to the corresponding playing progress.
12. The method of claim 11, wherein the corresponding play progress corresponds to a posting time of the comment message; or
The corresponding play progress information indicates a time when the comment message starts to be edited, or
The corresponding play progress information indicates a time when the comment message is sent out.
13. The method according to any one of claims 11 to 12,
the comment message can be shared to other electronic devices as a session message via the first application, and the session message is at least used for triggering playing of the video and jumping to the corresponding playing progress.
14. The method according to any one of claims 1 to 13, further comprising:
the second electronic equipment starts audio and video communication with the first electronic equipment in the process of playing the video;
and the first electronic equipment displays the interface of the audio and video call on the interface for playing the video.
15. A video sharing method applied to a first electronic device is characterized by comprising the following steps:
sending video access information of a video via a first application, wherein the video access information is used for accessing and playing the video;
acquiring video playing data, and acquiring the state of the second electronic equipment for playing the video according to the video playing data; the video playing data is data which is generated by the second electronic equipment playing the video and used for marking the playing state and the playing progress of the video.
16. The method according to claim 15, wherein the obtaining the state of the second electronic device playing the video according to the video playing data includes at least one of the following presentation manners:
displaying the progress of the second electronic equipment in the video playing interface;
jumping to the progress corresponding to the video playing data in the interface for playing the video;
and displaying the content of the video played by the second electronic equipment on an interface for playing the video.
17. The method of claim 16, wherein the first application is a messaging application, and wherein sending video access information for the video via the first application comprises:
sending the video access information via a session interface of the messaging application; wherein
The conversation interface comprises a first user identification and a second user identification, wherein the first user identification is used for identifying a first user operating the first electronic equipment, and the second user identification is used for identifying a second user operating the second electronic equipment.
18. The method according to claim 17, wherein the obtaining the state of the second electronic device playing the video according to the video playing data comprises:
displaying the progress of the second electronic equipment in playing the video and the second user identification in an interface for playing the video; wherein
the second user identification is used for jumping to the progress corresponding to the video playing data, or
for displaying the content of the video played by the second electronic equipment.
19. The method according to any one of claims 15 to 18, wherein the obtaining video playing data comprises: the first electronic equipment acquires the video playing data from a server; wherein
the server is a server for running the first application.
20. The method of claim 19, wherein the video playback data includes a comment message posted by a user editing on a playback interface of the second electronic device playing the video, and wherein the comment message is associated with a playback progress of the video; wherein
the comment message and the corresponding playing progress can be displayed together in an interface of the first electronic device for playing the video, and the comment message is at least used for triggering skipping to the corresponding playing progress.
21. The method of claim 20, wherein the first electronic device is capable of obtaining the comment message shared to the first application as a session message; wherein
the session message is at least used for triggering playing of the video and jumping to the corresponding playing progress.
22. A video sharing method applied to a second electronic device is characterized by comprising the following steps:
responding to the playing operation of a user on video access information of a video, and playing the video; the video access information is information sent by first electronic equipment through a first application, and the video access information is used for accessing and playing the video;
sharing video playing data generated by playing the video; the video playing data is used for the first electronic device to acquire the state of the second electronic device for playing the video, and the video playing data includes data used for marking the playing state and the playing progress of the video.
23. The method of claim 22, wherein the video access information comprises a link for accessing and playing the video, and multimedia content associated with the video.
24. The method of claim 23, wherein the video access information comprises a card for accessing the video, the card comprising multimedia content related to the video and a link for accessing the video; and the card includes a first option and a second option, wherein,
the first option is used for triggering the playing of the video;
the second option is used for triggering the video to be played through a third electronic device which is cooperated with the second electronic device;
the first electronic device obtains the state of the second electronic device playing the video according to the video playing data, and the method further comprises the following steps:
and displaying the progress of the third electronic equipment in playing the video and an equipment icon of the third electronic equipment in an interface of the first electronic equipment in playing the video.
25. The method according to claim 24, wherein the playing the video in response to the playing operation of the video access information of the video by the user comprises:
responding to an operation of clicking a first option of the card by a user, and triggering the playing of the video; or
And triggering the video to be played on the third electronic equipment in response to the operation that the user clicks the second option of the card.
26. The method according to any one of claims 22 to 25, wherein the sharing video playing data generated by playing the video comprises:
uploading the video playing data to a server, and sharing the video playing data through the server; wherein
the server is in communication connection with the first electronic device and the second electronic device, and the server is used for running the first application.
27. The method of claim 26, wherein the video playback data comprises a comment message posted by a user editing on a playback interface of the video, and wherein the comment message is associated with a playback progress of the video; wherein
the comment message can be displayed on the video playing interface together with the corresponding playing progress.
28. The method of claim 27, wherein the comment message can be shared to the first application as a conversation message, and wherein the conversation message is at least used to trigger playing the video and jumping to the corresponding play progress.
29. An electronic device, characterized in that the electronic device comprises: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the video sharing method of any of claims 1-28.
30. A computer-readable storage medium having instructions stored thereon, which when executed on a computer cause the computer to perform the video sharing method of any one of claims 1 to 28.
CN202110704189.2A 2021-06-24 2021-06-24 Video sharing method, electronic device and storage medium Pending CN115529487A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110704189.2A CN115529487A (en) 2021-06-24 2021-06-24 Video sharing method, electronic device and storage medium
PCT/CN2022/086789 WO2022267640A1 (en) 2021-06-24 2022-04-14 Video sharing method, and electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110704189.2A CN115529487A (en) 2021-06-24 2021-06-24 Video sharing method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN115529487A true CN115529487A (en) 2022-12-27

Family

ID=84545205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110704189.2A Pending CN115529487A (en) 2021-06-24 2021-06-24 Video sharing method, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN115529487A (en)
WO (1) WO2022267640A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117041225A (en) * 2023-09-28 2023-11-10 中科融信科技有限公司 Multi-party audio and video communication method and system based on 5G

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107241622A (en) * 2016-03-29 2017-10-10 北京三星通信技术研究有限公司 video location processing method, terminal device and cloud server
CN108282673A (en) * 2018-01-29 2018-07-13 优酷网络技术(北京)有限公司 A kind of update method, server and client playing record
CN112187619A (en) * 2020-05-26 2021-01-05 华为技术有限公司 Instant messaging method and equipment
CN112423087A (en) * 2020-11-17 2021-02-26 北京字跳网络技术有限公司 Video interaction information display method and terminal equipment
CN112969097A (en) * 2021-02-19 2021-06-15 腾讯科技(深圳)有限公司 Content playing method and device, and content commenting method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101600891B1 (en) * 2014-10-17 2016-03-09 쿨사인 주식회사 Synchronization method and system for audio and video of a plurality terminal
CN110784771B (en) * 2019-10-30 2022-02-08 维沃移动通信有限公司 Video sharing method and electronic equipment


Also Published As

Publication number Publication date
WO2022267640A1 (en) 2022-12-29

Similar Documents

Publication Publication Date Title
CN114467297B (en) Video call display method and related device applied to electronic equipment
CN110109636B (en) Screen projection method, electronic device and system
CN111345010B (en) Multimedia content synchronization method, electronic equipment and storage medium
CN113542839B (en) Screen projection method of electronic equipment and electronic equipment
US11941323B2 (en) Meme creation method and apparatus
WO2020078299A1 (en) Method for processing video file, and electronic device
CN111182145A (en) Display method and related product
CN114040242B (en) Screen projection method, electronic equipment and storage medium
CN109981885B (en) Method for presenting video by electronic equipment in incoming call and electronic equipment
JP7416519B2 (en) Multi-terminal multimedia data communication method and system
CN113496426A (en) Service recommendation method, electronic device and system
CN114173000B (en) Method, electronic equipment and system for replying message and storage medium
US20230421900A1 (en) Target User Focus Tracking Photographing Method, Electronic Device, and Storage Medium
CN110198362B (en) Method and system for adding intelligent household equipment into contact
CN114710640A (en) Video call method, device and terminal based on virtual image
WO2022042769A2 (en) Multi-screen interaction system and method, apparatus, and medium
CN112527093A (en) Gesture input method and electronic equipment
CN113835649A (en) Screen projection method and terminal
CN110955373A (en) Display element display method and electronic equipment
WO2024045801A1 (en) Method for screenshotting, and electronic device, medium and program product
CN112383664A (en) Equipment control method, first terminal equipment and second terminal equipment
WO2022267640A1 (en) Video sharing method, and electronic device and storage medium
CN114756785A (en) Page display method and device, electronic equipment and readable storage medium
WO2023045597A1 (en) Cross-device transfer control method and apparatus for large-screen service
CN112532508A (en) Video communication method and video communication device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination