CN113099130A - Collaborative video processing method and device, electronic equipment and storage medium - Google Patents

Collaborative video processing method and device, electronic equipment and storage medium

Info

Publication number: CN113099130A
Application number: CN202110407804.3A
Authority: CN (China)
Other languages: Chinese (zh)
Legal status: Pending
Prior art keywords: editing, video, log, terminal device, processed
Inventors: 熊名男, 辛彦哲, 宋毓韬
Applicant and current assignee: Beijing ByteDance Network Technology Co., Ltd.
Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the present disclosure provide a collaborative video processing method and device, an electronic device, and a storage medium. In response to a first editing instruction, a video to be processed displayed on a first terminal device is edited to obtain a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction; the first editing log is then sent to a second terminal device, wherein the first editing log is used for the second terminal device to display the editing content corresponding to the first editing log on the video to be processed. While editing the video to be processed in response to the first editing instruction, the first terminal device sends the correspondingly generated first editing log to the second terminal device, so that the second terminal device can reproduce the editing content corresponding to the first editing instruction. This realizes collaborative processing of the video to be processed, improves the production efficiency of the director video, and improves the quality of the director video.

Description

Collaborative video processing method and device, electronic equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of computer and network communication, and in particular, to a collaborative video processing method and apparatus, an electronic device, and a storage medium.
Background
For large-scale live programs, staff such as the program director and broadcast director need to arrange the program content and effects through a directing (broadcast-directing) platform, so as to achieve a better program effect and meet the viewing needs of the audience.
At present, a cloud director platform can perform audio and video processing on recorded video content, for example, adding video special effects and arranging video content. After the processing is completed, the processed video is distributed and pushed to a video platform for playback, thereby improving the video presentation effect.
However, in an actual video directing process, the different links of video processing are usually handled by several staff members respectively. Because the links affect one another, operations can only be performed sequentially in a fixed order, and problems that arise during video processing cannot be handled collaboratively in time, which affects both the efficiency of video directing and the quality of the director video.
Disclosure of Invention
The embodiments of the present disclosure provide a collaborative video processing method and device, an electronic device, and a storage medium, so as to overcome the problem that the efficiency of video directing and the quality of the director video are affected because problems arising during video processing cannot be handled collaboratively in time.
In a first aspect, an embodiment of the present disclosure provides a collaborative video processing method, which is applied to a first terminal device, where the first terminal device and a second terminal device both display videos to be processed, and the method includes:
responding to a first editing instruction, editing the video to be processed displayed on the first terminal device to obtain a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction; and sending the first editing log to a second terminal device, wherein the first editing log is used for the second terminal device to display the editing content corresponding to the first editing log on the video to be processed.
In a second aspect, an embodiment of the present disclosure provides a collaborative video processing method, which is applied to a second terminal device, where both a first terminal device and the second terminal device display videos to be processed, and the method includes:
receiving a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on a first editing instruction, and the first editing instruction, when responded to, is used for editing the video to be processed displayed on the first terminal device; and displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
In a third aspect, an embodiment of the present disclosure provides a collaborative video processing method applied to a server, where the method includes: receiving a first editing log sent by a first terminal device, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on a first editing instruction, and the first editing instruction, when responded to, is used for editing the video to be processed displayed on the first terminal device; and sending the first editing log to a second terminal device, so that the second terminal device displays the editing content corresponding to the first editing log on the video to be processed.
In a fourth aspect, an embodiment of the present disclosure provides a collaborative video processing apparatus, which is applied to a first terminal device, where the first terminal device and a second terminal device both display videos to be processed, and the apparatus includes:
the display module is used for responding to a first editing instruction, editing the video to be processed displayed on the first terminal device, and obtaining a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction;
and the transceiver module is used for sending the first editing log to a second terminal device, wherein the first editing log is used for the second terminal device to display the editing content corresponding to the first editing log on the video to be processed.
In a fifth aspect, an embodiment of the present disclosure provides a collaborative video processing apparatus, which is applied to a second terminal device, where both a first terminal device and the second terminal device display videos to be processed, and the apparatus includes:
a transceiver module, configured to receive a first editing log, where the first editing log is used to indicate the editing content and/or the editing process of editing the video to be processed based on a first editing instruction, and the first editing instruction, when responded to, is used to edit the video to be processed displayed on the first terminal device;
and the display module is used for displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
In a sixth aspect, an embodiment of the present disclosure provides a collaborative video processing apparatus, which is applied to a server, and the apparatus includes:
a receiving module, configured to receive a first editing log sent by a first terminal device, where the first editing log is used to indicate the editing content and/or the editing process of editing the video to be processed based on a first editing instruction, and the first editing instruction, when responded to, is used to edit the video to be processed displayed on the first terminal device;
and the sending module is used for sending the first editing log to second terminal equipment so as to enable the second terminal equipment to display the editing content corresponding to the first editing log on the video to be processed.
In a seventh aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor and memory; the memory stores computer-executable instructions; the at least one processor executes computer-executable instructions stored in the memory to cause the at least one processor to perform the collaborative video processing method according to the first aspect and various possible designs of the first aspect, or to cause the at least one processor to perform the collaborative video processing method according to the second aspect and various possible designs of the second aspect, or to cause the at least one processor to perform the collaborative video processing method according to the third aspect and various possible designs of the third aspect.
In an eighth aspect, an embodiment of the present disclosure provides a computer-readable storage medium, where a computer-executable instruction is stored, and when a processor executes the computer-executable instruction, the collaborative video processing method according to the first aspect and various possible designs of the first aspect is implemented, or the collaborative video processing method according to the second aspect and various possible designs of the second aspect is implemented, or the collaborative video processing method according to the third aspect and various possible designs of the third aspect is implemented.
In a ninth aspect, an embodiment of the present disclosure provides a computer program product, which includes a computer program, and when executed by a processor, implements the collaborative video processing method according to the first aspect and various possible designs of the first aspect, or implements the collaborative video processing method according to the second aspect and various possible designs of the second aspect, or implements the collaborative video processing method according to the third aspect and various possible designs of the third aspect.
In the collaborative video processing method and apparatus, the electronic device, and the storage medium provided in this embodiment, a video to be processed displayed on the first terminal device is edited in response to a first editing instruction to obtain a first editing log, where the first editing log is used to indicate the editing content and/or the editing process of editing the video to be processed based on the first editing instruction; and the first editing log is sent to a second terminal device, where the first editing log is used for the second terminal device to display the editing content corresponding to the first editing log on the video to be processed. While editing the video to be processed in response to the first editing instruction, the first terminal device sends the correspondingly generated first editing log to the second terminal device, so that the second terminal device can reproduce the editing content corresponding to the first editing instruction on its own copy of the video to be processed through the first editing log. The user on the second terminal device side can thus learn in real time how the user on the first terminal device side is editing the video to be processed and edit the video accordingly, thereby realizing collaborative processing of the video to be processed, improving the production efficiency of the director video, and improving the quality of the director video.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present disclosure, and for those skilled in the art, other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a prior-art process for processing a director video;
Fig. 3 is a first schematic flowchart of a collaborative video processing method according to an embodiment of the present disclosure;
Fig. 4 is a schematic diagram of an interaction process between a first terminal device and a second terminal device according to an embodiment of the present disclosure;
Fig. 5 is a second schematic flowchart of a collaborative video processing method according to an embodiment of the present disclosure;
Fig. 6 is a third schematic flowchart of a collaborative video processing method according to an embodiment of the present disclosure;
Fig. 7 is a schematic diagram of an interaction process between a first terminal device and a second terminal device according to another embodiment of the present disclosure;
Fig. 8 is a schematic flowchart of step S305 in the embodiment shown in Fig. 6;
Fig. 9 is a signaling diagram of a collaborative video processing method according to an embodiment of the present disclosure;
Fig. 10 is a fourth schematic flowchart of a collaborative video processing method according to an embodiment of the present disclosure;
Fig. 11 is a signaling diagram of another collaborative video processing method according to an embodiment of the present disclosure;
Fig. 12 is a block diagram illustrating a structure of a collaborative video processing apparatus according to an embodiment of the present disclosure;
Fig. 13 is a block diagram illustrating a structure of another collaborative video processing apparatus according to an embodiment of the present disclosure;
Fig. 14 is a block diagram illustrating a structure of still another collaborative video processing apparatus according to an embodiment of the present disclosure;
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
Fig. 16 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only some, but not all, of the embodiments of the present disclosure. All other embodiments obtained by a person skilled in the art from the embodiments disclosed herein without creative effort shall fall within the protection scope of the present disclosure.
Fig. 1 is a schematic view of an application scenario provided by an embodiment of the present disclosure. Referring to Fig. 1, the collaborative video processing method provided by the embodiments of the present disclosure may be applied in a video directing scenario. Specifically, as shown in Fig. 1, a camera unit collects video data and sends it to a cloud server. Director users connect to the cloud server through terminal devices on which a cloud director platform client is installed, obtain the video data, and edit and process it, for example, adjusting video combinations and clips, adding video special effects, and the like. Finally, the edited output video is distributed to a live broadcast platform for live broadcasting, so that users of the live broadcast platform can access the platform to view the output video.
Fig. 2 is a schematic diagram of a prior-art process for processing a director video. In an actual directing process, the different links of video processing are usually handled by several staff members respectively, and because the links affect one another, operations can only be performed sequentially in a fixed order. As shown in Fig. 2, after a video to be processed enters the video processing link, it is edited in sequence by director user A, director user B, and director user C before an output video can finally be generated. If a modification on the director user C side requires cooperation from director user A, the video can only be returned to the director user A side and then processed and confirmed by director user A, director user B, and director user C in sequence again. Because problems arising during video processing cannot be handled collaboratively in time, the efficiency and quality of video directing are affected.
Fig. 3 is a first schematic flowchart of a collaborative video processing method according to an embodiment of the present disclosure. Exemplarily, the method of this embodiment may be applied to a first terminal device, where the first terminal device is in communication connection with a second terminal device, and both the first terminal device and the second terminal device display a video to be processed; the first terminal device and the second terminal device are, for example, personal computers. The collaborative video processing method includes:
S101: And responding to the first editing instruction, editing the video to be processed displayed on the first terminal device to obtain a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction.
Illustratively, the first editing instruction is an instruction by which the director user on the first terminal device side edits the video to be processed, for example, an instruction to add a special effect to the video to be processed. After receiving the first editing instruction, the first terminal device responds to it, edits the video to be processed accordingly, and displays the content through the interactive interface; this is a conventional process in which a terminal device responds to a user operation to edit and display a video, and is not described in detail here. At the same time, a first editing log is generated for the editing content and/or the editing process applied to the video to be processed by the first editing instruction. For example, the first editing log includes the trigger time of the first editing instruction, the identifier of the first terminal device, and specific information related to the editing content, such as the editing position and the editing type, and/or operation information for editing the video to be processed, which is not described in detail here. According to the first editing log, the editing process and the editing result of the first editing instruction on the video to be processed can be reproduced.
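As a rough illustration only (the disclosure does not prescribe any concrete log format), the following TypeScript sketch shows one possible shape for such an editing log; all field names and example values are assumptions made for illustration.

```typescript
// Hypothetical shape of a first/second editing log; the field names are
// illustrative assumptions and are not specified by the disclosure.
interface EditLog {
  triggerTime: number;      // trigger time of the editing instruction (ms since epoch)
  terminalId: string;       // identifier of the terminal device that produced the log
  editType: "add_effect" | "add_text" | "annotate" | "adjust_clip" | "delete_element";
  targetElementId?: string; // video element the edit applies to, if any
  position?: { x: number; y: number; timecodeMs: number }; // editing position in frame/time
  payload?: Record<string, unknown>; // edit parameters, e.g. effect type or annotation text
}

// Example: the log produced when a special effect is added to the video to be processed.
const firstEditLog: EditLog = {
  triggerTime: Date.now(),
  terminalId: "terminal-A",
  editType: "add_effect",
  position: { x: 320, y: 180, timecodeMs: 12_000 },
  payload: { effectType: "sparkle" },
};
```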
S102: And sending the first editing log to the second terminal device, wherein the first editing log is used for the second terminal device to display the editing content corresponding to the first editing log on the video to be processed.
For example, the first terminal device and the second terminal device are directly connected through a local area network or the like and exchange data within the same local area network through the software clients installed on each of them, so that the first terminal device can send the first editing log to the second terminal device. For another example, the first terminal device and the second terminal device are each connected, through their respective installed software clients, to a server in the cloud; the first terminal device sends the first editing log to the cloud server, which forwards it to the second terminal device. Details are not repeated here.
Further, since the first editing log sent by the first terminal device indicates the editing content of editing the video to be processed based on the first editing instruction, the second terminal device can, after receiving the first editing log, reproduce the editing process and the editing result of the first editing instruction on the video to be processed according to the first editing log. More specifically, for example, the first editing log may include the specific operation information and parameters corresponding to the first editing instruction, such as the position at which a special effect is added to the video to be processed and the type of the special effect, so that after receiving the first editing log, the second terminal device can display the editing result corresponding to the first editing log on its own copy of the video to be processed. For another example, the first editing log may include the editing content obtained by editing the video to be processed based on the first editing instruction, description information of newly added or modified video elements, and the like, so that after receiving the first editing log, the second terminal device can display the corresponding editing result on its copy of the video to be processed. In this way, the processing of the video to be processed on the first terminal device side is synchronized to the second terminal device side in real time.
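A minimal sketch of how a receiving terminal could replay such a log (reusing the hypothetical EditLog shape sketched above) is shown below; the VideoCanvas interface and applyEditLog function are assumptions introduced only to illustrate the reproduction step.

```typescript
// Hypothetical rendering surface on the second terminal device.
interface VideoCanvas {
  addEffect(effectType: string, pos: { x: number; y: number; timecodeMs: number }): void;
  addText(text: string, pos: { x: number; y: number; timecodeMs: number }): void;
  removeElement(elementId: string): void;
}

// Replays the editing content described by a received edit log so that the
// second terminal device shows the same result as the first terminal device.
function applyEditLog(log: EditLog, canvas: VideoCanvas): void {
  const pos = log.position ?? { x: 0, y: 0, timecodeMs: 0 };
  switch (log.editType) {
    case "add_effect":
      canvas.addEffect(String(log.payload?.effectType ?? "default"), pos);
      break;
    case "add_text":
    case "annotate":
      canvas.addText(String(log.payload?.text ?? ""), pos);
      break;
    case "delete_element":
      if (log.targetElementId) canvas.removeElement(log.targetElementId);
      break;
    default:
      // Clip adjustments and other edit types would be handled analogously.
      break;
  }
}
```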
Fig. 4 is a schematic view of an interaction process between a first terminal device and a second terminal device provided in an embodiment of the present disclosure. As shown in Fig. 4, after the first terminal device responds to the first editing instruction, the editing content of the video to be processed is displayed on the first terminal device through a first interactive interface, for example, an "advertisement" is added to the video to be processed. At the same time, the corresponding first editing log is sent to the second terminal device, and after receiving it, the second terminal device synchronously displays the editing content corresponding to the first editing log through a second interactive interface. The director user on the second terminal device side can therefore edit the video to be processed while observing the editing performed by the director user on the first terminal device side, thereby realizing collaborative processing of the video to be processed by the first terminal device and the second terminal device and improving the generation efficiency of the director video.
Fig. 5 is a second schematic flowchart of a collaborative video processing method according to an embodiment of the present disclosure. Exemplarily, the method of this embodiment may be applied to a second terminal device, where the first terminal device and the second terminal device are in communication connection and both display a video to be processed; the first terminal device and the second terminal device are, for example, personal computers. The collaborative video processing method includes:
S201: And receiving a first editing log, wherein the first editing log is used for indicating the editing content of editing the video to be processed based on a first editing instruction, and the first editing instruction, when responded to, is used for editing the video to be processed displayed on the first terminal device.
S202: and displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
Illustratively, the first editing log sent by the first terminal device indicates the editing content of editing the video to be processed based on the first editing instruction. After receiving the first editing log, the second terminal device can reproduce the editing process and the editing result of the first editing instruction on the video to be processed according to the first editing log. More specifically, the first editing log enables the second terminal device to display the editing process and the editing result corresponding to the first editing log on its own copy of the video to be processed, so that the processing of the video to be processed on the first terminal device side is synchronized to the second terminal device side in real time.
In the embodiment shown in fig. 3, a specific implementation method of the second terminal device side is already described, and is not described here again.
In this embodiment, while editing the video to be processed in response to the first editing instruction, the first terminal device sends the correspondingly generated first editing log to the second terminal device, so that the second terminal device can reproduce the editing content corresponding to the first editing instruction on its own copy of the video to be processed through the first editing log. The user on the second terminal device side can thus learn in real time how the user on the first terminal device side is editing the video to be processed and edit the video accordingly, thereby realizing collaborative processing of the video to be processed, improving the production efficiency of the director video, and improving the quality of the director video.
Fig. 6 is a third schematic flowchart of a collaborative video processing method provided in an embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 3, this embodiment adds the steps of receiving a second editing log, sending a lock request, and performing conflict detection. As shown in Fig. 6, the collaborative video processing method provided by this embodiment includes:
S301: And responding to the first editing instruction, editing the video to be processed displayed on the first terminal device to obtain a first editing log, wherein the first editing log is used for indicating the editing content of editing the video to be processed based on the first editing instruction.
In a possible implementation manner, the first editing instruction includes annotation information, and the first editing instruction is used for instructing to add the annotation information in the video to be processed, where the annotation information includes annotation characters and/or annotation identifications.
Specifically, for example, through the first editing instruction, a certain element in the video to be processed is marked with an annotation identifier and described with annotation characters. Fig. 7 is a schematic diagram of annotation information provided in an embodiment of the present disclosure. As shown in Fig. 7, the annotation information includes annotation characters and an annotation identifier: through the annotation information in the first editing instruction, the director user on the first terminal device side marks a "billboard" in the video to be processed and attaches the annotation text "code for the billboard". The first editing log generated from the first editing instruction containing the annotation information is sent to the second terminal device, so that the annotation information can be reproduced in the video to be processed displayed by the second terminal device. The director user on the second terminal device side thus receives the modification intention of the director user on the first terminal device side, which realizes real-time collaborative processing of the video to be processed, improves the processing efficiency of the director video, and reduces repeated modification.
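For instance, the annotation described above could be carried in an edit log roughly as follows; this reuses the hypothetical EditLog shape from earlier, and the marker shape, coordinates, and text values are illustrative assumptions.

```typescript
// Hypothetical annotation edit log: mark the billboard region in the frame and
// attach the annotation text shown to the other director user.
const annotationLog: EditLog = {
  triggerTime: Date.now(),
  terminalId: "terminal-A",
  editType: "annotate",
  position: { x: 480, y: 96, timecodeMs: 34_500 },
  payload: {
    markerShape: "rectangle",       // annotation identifier drawn around the billboard
    text: "code for the billboard", // annotation characters describing the modification intent
  },
};
```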
In a possible implementation manner, the annotation information further includes a cursor, and through movement of the cursor, more real-time communication between different terminal devices can be achieved.
S302: and sending cooperation request information to the server to establish cooperation connection with at least one second terminal device, wherein the cooperation request information is used for the first terminal device to establish cooperation connection with the at least one second terminal device through the server.
S303: and sending the first editing log to the second terminal equipment which establishes the cooperative connection through the server.
Illustratively, before starting the collaborative processing procedure, the first terminal device, acting as the initiator, needs to send a request to the server to inform it which terminal devices the collaborative processing is to be established with, that is, to establish a collaborative connection with at least one second terminal device.
The cooperation request information includes identification information of the first terminal device and identification information of the second terminal device, where the identification information of the first terminal device is used by the server to determine whether to respond to the cooperation request information. More specifically, the server stores permission identification information and uses the identification information of each terminal device to determine that device's permission to establish a collaborative connection, that is, whether it is qualified to initiate a collaboration request. After receiving the cooperation request information, the server determines, according to the identification information of the first terminal device, whether the first terminal device has permission to initiate a collaborative connection. If it does, the server establishes a collaborative connection between the first terminal device and the at least one second terminal device according to the identifier of the at least one second terminal device in the received cooperation request information, so that the subsequent collaborative processing can proceed. In this embodiment, authenticating the first terminal device according to the cooperation request information improves the security of the subsequent collaborative processing.
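A simplified sketch of this server-side check is given below; the permission table, type names, and function name are assumptions used only to illustrate the authentication step.

```typescript
// Hypothetical permission table: terminal identifiers allowed to initiate collaboration.
const initiatorPermissions = new Set<string>(["terminal-A"]);

interface CollaborationRequest {
  initiatorId: string;      // identification information of the first terminal device
  participantIds: string[]; // identification information of the second terminal device(s)
}

// Returns the terminals to connect, or null if the initiator lacks permission,
// in which case the server does not respond to the collaboration request.
function handleCollaborationRequest(req: CollaborationRequest): string[] | null {
  if (!initiatorPermissions.has(req.initiatorId)) {
    return null;
  }
  // Establish a collaborative connection between the initiator and each participant.
  return [req.initiatorId, ...req.participantIds];
}
```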
S304: and receiving a second editing log, wherein the second editing log is used for indicating the editing content for editing the video to be processed based on a second editing instruction, and the second editing instruction is used for editing the video to be processed displayed on the second terminal equipment when being responded.
S305: and displaying the editing content corresponding to the second editing log on the video to be processed according to the second editing log.
Illustratively, the second editing log is similar in content to the first editing log and is information characterizing the editing content applied to the video to be processed; it records the editing content applied to the video to be processed when the second editing instruction is responded to on the second terminal device. The first terminal device can reproduce and display the editing content corresponding to the second editing instruction according to the second editing log. In this way, the director user on the first terminal device side can learn in real time how the director user on the second terminal device side is editing the video to be processed, realizing collaborative processing among multiple terminal devices.
Fig. 7 is another schematic view of an interaction process between a first terminal device and a second terminal device according to an embodiment of the present disclosure. As shown in Fig. 7, after the first terminal device responds to the first editing instruction, the editing content of the video to be processed is displayed on the first terminal device through the first interactive interface, for example, an "advertisement" is added to the video to be processed, and the corresponding first editing log is sent to the second terminal device. The first terminal device also receives a second editing log sent by the second terminal device and, according to the second editing log, displays on its interactive interface the corresponding editing content applied to the video to be processed, for example, a special effect added to the video. The user of the first terminal device can thus edit the video to be processed while observing the editing performed by the director user on the second terminal device side, realizing bidirectional collaborative processing of the video to be processed by the first terminal device and the second terminal device and improving the generation efficiency of the director video.
Optionally, as shown in Fig. 8, step S305 includes two specific implementation steps S3051 and S3052:
S3051: And editing the video to be processed according to the second editing log to generate an edited video, wherein the video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed.
S3052: and displaying the edited video.
Exemplarily, editing the video to be processed according to the second editing log may include changing video parameters of the video to be processed, for example, its size and dimensions; if the video to be processed includes a plurality of videos, the video parameters further include information such as the positional relationship and the playing order of those videos. Editing the video to be processed according to the second editing log may also include adding video elements to the video to be processed, where a video element is, for example, text, a special effect, a video, or a picture. The edited video may be obtained by editing the video to be processed in any manner known in the prior art; the specific content of the edited video is not limited here, and no further examples are given.
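One possible way to model the edited video's state, with its changed parameters and added elements, is the sketch below; the type and field names are illustrative assumptions, not a format defined by the disclosure.

```typescript
// Hypothetical state of the edited video: changed parameters plus added elements.
interface VideoElement {
  id: string;
  kind: "text" | "effect" | "video" | "picture";
  data: Record<string, unknown>;
}

interface EditedVideo {
  width: number;            // video parameters such as size/dimensions
  height: number;
  playOrder: string[];      // playing order when the video to be processed contains several clips
  elements: VideoElement[]; // video elements added relative to the video to be processed
}

// Applies an element addition described by an edit log, as in step S3051.
function addElement(video: EditedVideo, el: VideoElement): EditedVideo {
  return { ...video, elements: [...video.elements, el] };
}
```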
It should be noted that, because the first editing log and the second editing log are similar in content and both describe a process and a result of editing the video to be processed, the way the second terminal device edits the video to be processed according to the first editing log to generate an edited video is similar to the process in the foregoing embodiment and is not described again here.
Optionally, after or before step S305, the method further includes:
S306: And sending a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and determining a first editing log and/or a second editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
Further and optionally, after the first terminal device sends the cooperation request information to the server and the server authenticates the first terminal device according to its identifier, the first terminal device may also send a video locking request to finalize the editing content of the video to be processed. The video locking request may be sent by the first terminal device to the server; the server then locks the edited video, no longer allows the first terminal device or the second terminal device to modify the video to be processed by way of synchronized editing logs, and uses the last received second editing log and/or first editing log as the input log to generate the final output video. More specifically, if only the first terminal device edited the video to be processed during the editing process, the first editing log sent by the first terminal device is used as the input log to generate the output video, and the output video only includes the editing content corresponding to the first editing log. If the video to be processed was edited by the first terminal device and at least one second terminal device during the editing process, the first editing log sent by the first terminal device and the corresponding second editing log(s) sent by the at least one second terminal device are used as input logs to generate the output video, and the output video includes the editing content corresponding to the first editing log and the editing content corresponding to the at least one second editing log.
In this embodiment, the first terminal device acts as the decision maker in the whole collaborative video processing procedure and determines the final editing content of the video to be processed. When the first terminal device determines that the editing of the video to be processed by each terminal device is complete, it generates the final output video by sending a video locking request and distributes it to the live broadcast platform for playback, which improves management efficiency during collaborative processing. The decision-maker authority of the first terminal device is determined by the server according to the identification information of the first terminal device.
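The locking step might look roughly like the sketch below; the session object and function names are assumptions, and the sketch only illustrates the idea of rejecting further edit logs and keeping the last-received logs as the input logs for the output video.

```typescript
// Hypothetical lock state kept by the server for one collaboration session.
interface LockState {
  locked: boolean;
  inputLogs: EditLog[]; // the edit logs that will be used to generate the output video
}

const session: LockState = { locked: false, inputLogs: [] };

// Called when the decision-making first terminal device sends a video locking request.
function lockEditedVideo(): void {
  session.locked = true; // further edits to the edited video are no longer accepted
}

// Accepts an edit log only while the video is unlocked; accepted logs become input logs.
function acceptEditLog(log: EditLog): boolean {
  if (session.locked) return false;
  session.inputLogs.push(log);
  return true;
}
```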
Optionally, the method further comprises:
S307: And according to the second editing log, performing conflict detection on the first editing instruction, and if the editing content corresponding to the first editing instruction conflicts with the editing content corresponding to the second editing log, outputting prompt information.
For example, after the first terminal device obtains the second editing log, it may display the corresponding editing content on the video to be processed. If the first terminal device then needs to respond to a first editing instruction input by the director user and continue editing the video to be processed, it may first check the later-received first editing instruction against the second editing log to determine whether the editing contents conflict. For example, if the editing content represented by the second editing log deletes a video element A from the video to be processed, and the first editing instruction that the first terminal device subsequently responds to includes a modification of the attributes of video element A, then the editing contents conflict because video element A has been deleted. Similarly, editing-content conflicts also include a position conflict, a size conflict, or the like between a video element added by the first terminal device and a video element added by the second terminal device, which are not described in detail here.
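One way such a check could be implemented is sketched below; the deleted-element rule mirrors the example above, the helper names are assumptions, and position/size overlap checks are only indicated by a comment.

```typescript
// Tracks video elements deleted by edit logs already applied from the other terminal.
const deletedElementIds = new Set<string>();

function recordAppliedLog(log: EditLog): void {
  if (log.editType === "delete_element" && log.targetElementId) {
    deletedElementIds.add(log.targetElementId);
  }
}

// Returns prompt information if the new editing instruction conflicts with the
// editing content already applied from the received edit log, otherwise null.
function detectConflict(instruction: EditLog): string | null {
  if (instruction.targetElementId && deletedElementIds.has(instruction.targetElementId)) {
    return `Element ${instruction.targetElementId} has been deleted by another collaborator.`;
  }
  // Position and size conflicts between newly added elements would be checked here.
  return null;
}
```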
In this embodiment, performing conflict detection on the first editing instruction against the second editing log enables real-time conflict detection for collaborative processing among multiple terminal devices, prevents editing conflicts from affecting the editing quality and effect of the director video, and improves the efficiency of collaborative video editing.
Fig. 9 is a signaling diagram of a cooperative video processing method according to an embodiment of the present disclosure, and as shown in fig. 9, the cooperative video processing method according to the embodiment of the present disclosure includes:
S401: The first terminal device responds to the first editing instruction, edits the video to be processed displayed on the first terminal device, and obtains a first editing log, wherein the first editing log is used for indicating the editing content of editing the video to be processed based on the first editing instruction.
Optionally, the first editing instruction includes annotation information, and the first editing instruction is used to instruct to add the annotation information in the video to be processed, where the annotation information includes annotation characters and/or annotation identifiers.
S402: and the first terminal equipment sends the first editing log to the second terminal equipment.
S403: the second terminal device receives the first edit log.
S404: and the second terminal equipment displays the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
Optionally, displaying, according to the first editing log, the editing content corresponding to the first editing log on the video to be processed, where the displaying includes:
editing the video to be processed according to the first editing log to generate an edited video, wherein video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
S405: and the second terminal equipment responds to the second editing instruction, edits the video to be processed displayed on the second terminal equipment, and obtains a second editing log, wherein the second editing log is used for indicating the editing content for editing the video to be processed based on the second editing instruction.
Optionally, after step S405, the method may further include:
S406: And the second terminal equipment sends a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and a first editing log and/or a second editing log corresponding to the edited video are/is determined as input logs, and the input logs are used for generating output videos.
Optionally, after step S405, the method may further include:
S407: And the second terminal equipment performs conflict detection on the second editing instruction according to the first editing log, and outputs prompt information if the editing content corresponding to the second editing instruction conflicts with the editing content corresponding to the first editing log.
Steps S406 and S407 may be executed sequentially or individually, and the execution order of steps S406 and S407 is not limited here.
S408: and the second terminal equipment sends the second editing log to the first terminal equipment.
S409: the first terminal device receives the second edit log.
S410: and the first terminal equipment displays the editing content corresponding to the second editing log on the video to be processed according to the second editing log.
In this embodiment, the processes of the second terminal device receiving the first editing log, displaying the editing content of the first editing log, triggering the second editing instruction, and sending the second editing log are similar in process and effect to the first terminal device receiving the second editing log, displaying the editing content of the second editing log, triggering the first editing instruction, and sending the first editing log; reference may be made to the description in the above embodiments, and details are not repeated here.
In this implementation, the first terminal device and the second terminal device respond to the first editing instruction and the second editing instruction respectively, generate the corresponding first editing log and second editing log, and send them to each other. Both devices can therefore receive the editing log sent by the other side and display, on their respective interactive interfaces, the other side's editing of the video to be processed, which achieves the purpose of collaborative processing. It should also be noted that the first terminal device and/or the second terminal device may each include one or more terminal devices; the implementation principle and process are consistent with those described in the foregoing embodiments and are not repeated here.
Fig. 10 is a fourth schematic flowchart of a collaborative video processing method provided by an embodiment of the present disclosure. Exemplarily, the method of this embodiment may be applied to a server that is in communication connection with a first terminal device and a second terminal device. As shown in Fig. 10, the collaborative video processing method provided by the embodiment of the present disclosure includes:
S501: And receiving a first editing log sent by the first terminal device, wherein the first editing log is used for indicating the editing content of editing the video to be processed based on a first editing instruction, and the first editing instruction, when responded to, is used for editing the video to be processed displayed on the first terminal device.
S502: and sending the first editing log to the second terminal equipment so that the second terminal equipment displays the editing content corresponding to the first editing log on the video to be processed.
Illustratively, the server is in communication connection with the first terminal device and the second terminal device. The server receives the first editing log sent by the first terminal device and forwards it to the second terminal device in real time; likewise, the server receives the second editing log sent by the second terminal device and forwards it to the first terminal device in real time, thereby realizing real-time synchronization of editing content among multiple terminal devices. Data exchange between the server and the clients can be implemented in a client-server manner, which is not described again here.
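A minimal, library-agnostic sketch of this relay behaviour is shown below; the connection registry and function names are assumptions, and a real implementation would sit on top of whatever client-server transport the platform uses.

```typescript
// Hypothetical relay: the server forwards each received edit log to every other
// terminal in the same collaboration session, in real time.
type SendFn = (log: EditLog) => void;

const connectedTerminals = new Map<string, SendFn>();

function onTerminalConnected(terminalId: string, send: SendFn): void {
  connectedTerminals.set(terminalId, send);
}

function onEditLogReceived(log: EditLog): void {
  for (const [terminalId, send] of connectedTerminals) {
    if (terminalId !== log.terminalId) {
      send(log); // forward to the other terminal device(s)
    }
  }
}
```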
Optionally, the method provided in this embodiment further includes:
storing a first editing log sent by first terminal equipment and/or a second editing log sent by second terminal equipment as historical data; receiving a historical data request instruction sent by first terminal equipment or second terminal equipment; and responding to a historical data request instruction sent by the first terminal equipment, and sending the historical data to the first terminal equipment, or responding to a historical data request instruction sent by the second terminal equipment, and sending the historical data to the second terminal equipment.
In this embodiment, after receiving the first editing log sent by the first terminal device and/or the second editing log sent by the second terminal device, the server saves the first editing log and/or the second editing log as historical data in real time. Because the first editing log and the second editing log are time-sensitive (after a new editing instruction is responded to, the terminal device generates a new editing log that replaces the previous one), the server, by saving every first editing log and second editing log it receives, can record the whole process of editing the video to be processed. Furthermore, when the first terminal device or the second terminal device needs to retrieve the editing log of a certain time node, it can obtain the corresponding historical data by sending a historical-data request instruction to the server. This realizes fine-grained management of the editing process of the video to be processed and improves collaborative processing efficiency.
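A sketch of how the server might keep every received log as historical data and answer a historical-data request follows; the storage and query shown are assumptions for illustration.

```typescript
// Every received first/second editing log is appended, so the full editing history
// remains recoverable even though each new log replaces the previous one on the terminal.
const history: EditLog[] = [];

function saveAsHistory(log: EditLog): void {
  history.push(log);
}

// Serves a historical-data request: return all logs up to the requested time node.
function getHistory(untilTime: number): EditLog[] {
  return history.filter((log) => log.triggerTime <= untilTime);
}
```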
Fig. 11 is a signaling diagram of another cooperative video processing method provided in the embodiment of the present disclosure, and as shown in fig. 11, the cooperative video processing method provided in the embodiment of the present disclosure includes:
S601: And the first terminal equipment sends the collaboration request information to the server.
S602A: and the server sends the coordination processing identifier to the first terminal equipment.
S602B: and the server sends the coordination processing identification to at least one second terminal device.
S603: the first terminal device sends the response information to the server.
S604: the second terminal device transmits the response information to the server.
S605: and the server receives response information sent by the first terminal equipment and at least one second terminal equipment, and determines that the first terminal equipment and the second terminal equipment are in a cooperative state.
S606: the server receives a first editing log sent by the first terminal device in the collaborative state.
S607: and the server sends the first editing log to the second terminal equipment in the coordination state.
S608: and the server receives a second editing log sent by the second terminal equipment in the collaborative state.
S609: and the server sends the second editing log to the first terminal equipment in the coordination state.
In this embodiment, the first terminal device and the second terminal device are connected through the server, which realizes collaborative video processing when the video to be processed is edited between the first terminal device and the second terminal device. Each device can adjust its editing of the video to be processed in real time according to the content edited by the other side, which improves the efficiency of collaboratively editing the video to be processed and improves the overall efficiency and quality of the video directing process. The specific implementation of each step performed by the first terminal device, the second terminal device, and the server has been described in detail in the embodiments shown in Fig. 3 to Fig. 10 and is not repeated here.
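The signaling sequence S601 to S609 could be prototyped roughly as in the sketch below; the session object, field names, and the two functions are illustrative assumptions, not an API defined by the disclosure.

```typescript
// Hypothetical collaboration session tracking the handshake of Fig. 11.
interface CollaborationSession {
  initiatorId: string;
  participantIds: string[];
  acknowledged: Set<string>; // terminals that have returned response information
  collaborative: boolean;    // true once all terminals are in the collaborative state
}

// S601-S602: after receiving the collaboration request, the server records the session
// and sends a collaboration-processing identifier to the initiator and each participant.
function createSession(initiatorId: string, participantIds: string[]): CollaborationSession {
  return { initiatorId, participantIds, acknowledged: new Set(), collaborative: false };
}

// S603-S605: each terminal returns response information; once all have responded, the
// server marks the session collaborative and starts relaying edit logs (S606-S609).
function onResponse(session: CollaborationSession, terminalId: string): void {
  session.acknowledged.add(terminalId);
  const everyone = [session.initiatorId, ...session.participantIds];
  session.collaborative = everyone.every((id) => session.acknowledged.has(id));
}
```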
Corresponding to the cooperative video processing method in the foregoing embodiment, fig. 12 is a block diagram of a cooperative video processing apparatus provided in the embodiment of the present disclosure, and is applied to a first terminal device. For ease of illustration, only portions that are relevant to embodiments of the present disclosure are shown. Referring to fig. 12, the cooperative video processing apparatus 6 includes:
the display module 61 is configured to, in response to a first editing instruction, edit a to-be-processed video displayed on the first terminal device to obtain a first editing log, where the first editing log is used to indicate an editing content for editing the to-be-processed video based on the first editing instruction;
the transceiver module 62 is configured to send the first editing log to the second terminal device, where the first editing log is used for the second terminal device to display the editing content corresponding to the first editing log on the to-be-processed video.
In one possible implementation, the transceiver module 62 is specifically configured to: sending collaboration request information to a server to establish collaboration connection with at least one second terminal device, wherein the collaboration request information is used for the first terminal device to establish collaboration connection with the at least one second terminal device through the server; and sending the first editing log to the second terminal equipment which establishes the cooperative connection through the server.
In a possible implementation manner, the cooperation request information includes identification information of the first terminal device and identification information of the second terminal device, where the identification information of the first terminal device is used for enabling the server to determine whether to respond to the cooperation request information.
In a possible implementation manner, the first editing instruction includes annotation information, and the first editing instruction is used for instructing to add the annotation information in the video to be processed, where the annotation information includes annotation characters and/or annotation identifications.
In one possible implementation, the transceiver module 62 is further configured to: receiving a second editing log, wherein the second editing log is used for indicating editing contents for editing the video to be processed based on a second editing instruction, and the second editing instruction is used for editing the video to be processed displayed on the second terminal equipment when being responded; a display module 61, further configured to: and displaying the editing content corresponding to the second editing log on the video to be processed according to the second editing log.
In a possible implementation manner, when displaying, according to the second editing log, the editing content corresponding to the second editing log on the video to be processed, the display module 61 is specifically configured to: editing the video to be processed according to the second editing log to generate an edited video, wherein the video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
In one possible implementation, the transceiver module 62 is further configured to: and sending a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and determining a first editing log and/or a second editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
In a possible implementation manner, the display module 61 is further configured to: and according to the second editing log, performing conflict detection on the first editing instruction, and if the editing content corresponding to the first editing instruction conflicts with the editing content corresponding to the second editing log, outputting prompt information.
The device provided in this embodiment may be used to implement the technical solution of the above method embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
Fig. 13 is a block diagram of another cooperative video processing apparatus according to an embodiment of the present disclosure, which is applied to a second terminal device. For ease of illustration, only portions that are relevant to embodiments of the present disclosure are shown. Referring to fig. 13, the cooperative video processing apparatus 7 includes:
a transceiver module 71, configured to receive a first editing log, where the first editing log is used to indicate an editing content for editing a to-be-processed video based on a first editing instruction, and the first editing instruction is used to edit the to-be-processed video displayed on a first terminal device when the first editing instruction is responded;
and the display module 72 is configured to display, according to the first editing log, the editing content corresponding to the first editing log on the video to be processed.
In one possible implementation, the display module 72 is specifically configured to: editing the video to be processed according to the first editing log to generate an edited video, wherein video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
In one possible implementation, the display module 72 is further configured to: edit, in response to a second editing instruction, the video to be processed displayed on the second terminal device; and perform conflict detection on the second editing instruction according to the first editing log, and output prompt information if the editing content corresponding to the second editing instruction conflicts with the editing content corresponding to the first editing log.
In a possible implementation manner, the transceiver module 71 is further configured to: and sending a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and determining a first editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
Fig. 14 is a block diagram of a configuration of another cooperative video processing apparatus according to an embodiment of the present disclosure, which is applied to a server. For ease of illustration, only portions that are relevant to embodiments of the present disclosure are shown. Referring to fig. 14, the cooperative video processing apparatus 8 includes:
the receiving module 81 is configured to receive a first editing log sent by a first terminal device, where the first editing log is used to indicate an editing content for editing a to-be-processed video based on a first editing instruction, where the first editing instruction is used to edit the to-be-processed video displayed on the first terminal device when the first editing instruction is responded;
the sending module 82 is configured to send the first editing log to the second terminal device, so that the second terminal device displays the editing content corresponding to the first editing log on the to-be-processed video.
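On the server side, the relaying behaviour described by the receiving and sending modules might look roughly like the sketch below; the channel objects and the `peers_of` topology map are assumptions used only to make the forwarding step concrete.

```python
from typing import Dict, List

class RelayServer:
    """Minimal sketch of the server: receive a first editing log and forward it to peers."""

    def __init__(self) -> None:
        self.channels: Dict[str, object] = {}        # terminal id -> object with .send(bytes)
        self.peers_of: Dict[str, List[str]] = {}     # collaboration topology

    def on_edit_log(self, sender_id: str, raw_log: bytes) -> None:
        # Forward the first editing log to every second terminal device collaborating
        # with the sender, so each of them can display the corresponding editing content.
        for peer_id in self.peers_of.get(sender_id, []):
            channel = self.channels.get(peer_id)
            if channel is not None:
                channel.send(raw_log)
```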
In a possible implementation manner, the sending module 82 is further configured to: sending a coordination processing identifier to the first terminal equipment and at least one second terminal equipment; the receiving module 81 is further configured to: and receiving response information sent by the first terminal equipment and at least one second terminal equipment, and determining that the first terminal equipment and the second terminal equipment are in a cooperative state.
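The cooperative-state handshake could be tracked with something like the following sketch; the idea (assumed, not specified) is that the coordination processing identifier goes out to every terminal, and a terminal enters the cooperative state once its response information has come back.

```python
from typing import Iterable, Set

class CoordinationState:
    """Sketch of the handshake driven by the coordination processing identifier."""

    def __init__(self, session_id: str, terminal_ids: Iterable[str]) -> None:
        self.session_id = session_id
        self.pending: Set[str] = set(terminal_ids)   # terminals that have not responded yet
        self.cooperative: Set[str] = set()           # terminals confirmed in the cooperative state

    def on_response(self, terminal_id: str) -> bool:
        if terminal_id in self.pending:
            self.pending.discard(terminal_id)
            self.cooperative.add(terminal_id)
        # The session is fully cooperative once every terminal has answered.
        return not self.pending
```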
In a possible implementation manner, the receiving module 81, when receiving the first editing log sent by the first terminal device, is specifically configured to: receiving a first editing log sent by a first terminal device in a collaborative state; when sending the first edit log to the second terminal device, the sending module 82 is specifically configured to: and sending the first editing log to the second terminal equipment in the coordination state.
In a possible implementation manner, the cooperative video processing apparatus 8 further includes:
the control module 83 is configured to store a first editing log sent by the first terminal device and/or a second editing log sent by the second terminal device as historical data;
the receiving module 81 is further configured to: receiving a historical data request instruction sent by first terminal equipment or second terminal equipment;
the sending module 82 is further configured to: and responding to a historical data request instruction sent by the first terminal equipment, and sending the historical data to the first terminal equipment, or responding to a historical data request instruction sent by the second terminal equipment, and sending the historical data to the second terminal equipment.
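Storing and replaying edit logs as historical data could be as simple as the sketch below; keying by video identifier is an assumption made only so the request/response pair has something concrete to operate on.

```python
from collections import defaultdict
from typing import Dict, List

class EditHistoryStore:
    """Sketch of the server keeping every received editing log as historical data."""

    def __init__(self) -> None:
        self._history: Dict[str, List[dict]] = defaultdict(list)   # keyed by video id

    def store(self, video_id: str, log: dict) -> None:
        self._history[video_id].append(log)

    def on_history_request(self, video_id: str) -> List[dict]:
        # Returned to whichever terminal (first or second) sent the historical data request.
        return list(self._history[video_id])
```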
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, and as shown in fig. 15, the electronic device 9 includes at least one processor 91 and a memory 92;
memory 92 stores computer-executable instructions;
the at least one processor 91 executes the computer-executable instructions stored in the memory 92, so that the at least one processor 91 performs the collaborative video processing method in the embodiments shown in fig. 3 to fig. 10.
The processor 91 and the memory 92 are connected by a bus 93.
For a relevant description, reference may be made to the corresponding descriptions and effects of the steps in the embodiments corresponding to fig. 3 to fig. 11, and details are not repeated here.
Referring to fig. 16, a schematic structural diagram of an electronic device 900 suitable for implementing the embodiments of the present disclosure is shown, where the electronic device 900 may be a terminal device or a server. The terminal device may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a personal digital assistant (PDA), a tablet computer (PAD), a portable multimedia player (PMP), and an in-vehicle terminal (e.g., a car navigation terminal), as well as fixed terminals such as a digital TV and a desktop computer. The electronic device shown in fig. 16 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 16, the electronic device 900 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 901, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage means 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for the operation of the electronic device 900 are also stored. The processing apparatus 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
Generally, the following devices may be connected to the I/O interface 905: input devices 906 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 907 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 908 including, for example, magnetic tape, hard disk, etc.; and a communication device 909. The communication device 909 may allow the electronic apparatus 900 to perform wireless or wired communication with other apparatuses to exchange data. While fig. 16 illustrates an electronic device 900 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 909, or installed from the storage device 908, or installed from the ROM 902. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing apparatus 901.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods shown in the above embodiments.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In a first aspect, according to one or more embodiments of the present disclosure, there is provided a collaborative video processing method applied to a first terminal device, where the first terminal device and a second terminal device both display videos to be processed, the method including:
responding to a first editing instruction, editing the video to be processed displayed on the first terminal device to obtain a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction; and sending the first editing log to a second terminal device, wherein the first editing log is used for displaying the editing content corresponding to the first editing log on the video to be processed by the second terminal device.
According to one or more embodiments of the present disclosure, sending the first edit log to a second terminal device includes: sending cooperation request information to a server to establish cooperation connection with at least one second terminal device, wherein the cooperation request information is used for the first terminal device to establish cooperation connection with the at least one second terminal device through the server; and sending the first editing log to the second terminal equipment which has established the cooperative connection.
According to one or more embodiments of the present disclosure, the coordination request information includes identification information of the first terminal device and identification information of the second terminal device, where the identification information of the first terminal device is used to enable the server to determine whether to respond to the coordination request information.
According to one or more embodiments of the present disclosure, the first editing instruction includes annotation information, and the first editing instruction is used to instruct to add the annotation information to the video to be processed, where the annotation information includes annotation characters and/or annotation marks.
According to one or more embodiments of the present disclosure, the method further comprises: receiving a second editing log, wherein the second editing log is used for indicating editing content and/or an editing process for editing the video to be processed based on a second editing instruction, and the second editing instruction is used for editing the video to be processed displayed on the second terminal equipment when being responded; and displaying the editing content corresponding to the second editing log on the video to be processed according to the second editing log.
According to one or more embodiments of the present disclosure, displaying, according to the second editing log, an editing content corresponding to the second editing log on the video to be processed includes: editing the video to be processed according to the second editing log to generate an edited video, wherein video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
According to one or more embodiments of the present disclosure, the method further comprises: and sending a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and determining a first editing log and/or a second editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
According to one or more embodiments of the present disclosure, the method further comprises: and according to the second editing log, performing conflict detection on a first editing instruction, and if the editing content corresponding to the first editing instruction conflicts with the editing content corresponding to the second editing log, outputting prompt information.
In a second aspect, according to one or more embodiments of the present disclosure, there is provided a collaborative video processing method applied to a second terminal device, where a first terminal device and the second terminal device both display videos to be processed, the method including:
receiving a first editing log, wherein the first editing log is used for indicating editing content and/or an editing process for editing the video to be processed based on the first editing instruction, and the first editing instruction is used for editing the video to be processed displayed on the first terminal equipment when being responded; and displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
According to one or more embodiments of the present disclosure, displaying, according to the first editing log, an editing content corresponding to the first editing log on the video to be processed includes: editing the video to be processed according to the first editing log to generate an edited video, wherein video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
According to one or more embodiments of the present disclosure, the method further comprises: editing, in response to a second editing instruction, the video to be processed displayed on the second terminal device; and performing conflict detection on the second editing instruction according to the first editing log, and outputting prompt information if the editing content corresponding to the second editing instruction conflicts with the editing content corresponding to the first editing log.
According to one or more embodiments of the present disclosure, the method further comprises: sending a video locking request, wherein the video locking request is used for indicating that editing of the edited video is forbidden, and determining a first editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
In a third aspect, according to one or more embodiments of the present disclosure, there is provided a collaborative video processing method applied to a server, the method including:
receiving a first editing log sent by a first terminal device, wherein the first editing log is used for indicating editing content and/or an editing process for editing the video to be processed based on the first editing instruction, and the first editing instruction is used for editing the video to be processed displayed on the first terminal device when being responded; and sending the first editing log to second terminal equipment so that the second terminal equipment displays the editing content corresponding to the first editing log on the video to be processed.
According to one or more embodiments of the present disclosure, the method further comprises: sending a coordination processing identifier to the first terminal device and at least one second terminal device; receiving response information sent by the first terminal equipment and at least one second terminal equipment, and determining that the first terminal equipment and the second terminal equipment are in a cooperative state; the receiving of the first edit log sent by the first terminal device includes: receiving a first editing log sent by a first terminal device in a collaborative state; sending the first editing log to a second terminal device, including: and sending the first editing log to the second terminal equipment in the coordination state.
According to one or more embodiments of the present disclosure, the method further comprises: storing a first editing log sent by the first terminal device and/or a second editing log sent by the second terminal device as historical data; receiving a historical data request instruction sent by the first terminal device or the second terminal device; responding to a historical data request instruction sent by the first terminal device, sending the historical data to the first terminal device, or responding to a historical data request instruction sent by the second terminal device, and sending the historical data to the second terminal device.
In a fourth aspect, an embodiment of the present disclosure provides a collaborative video processing apparatus, which is applied to a first terminal device, where the first terminal device and a second terminal device both display videos to be processed, and the apparatus includes:
the display module is used for responding to a first editing instruction, editing the video to be processed displayed on the first terminal device, and obtaining a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction;
and the transceiver module is used for sending the first editing log to a second terminal device, wherein the first editing log is used for displaying the editing content corresponding to the first editing log on the video to be processed by the second terminal device.
According to one or more embodiments of the present disclosure, a transceiver module is specifically configured to: sending cooperation request information to a server to establish cooperation connection with at least one second terminal device, wherein the cooperation request information is used for the first terminal device to establish cooperation connection with the at least one second terminal device through the server; and sending the first editing log to the second terminal equipment which has established the cooperative connection.
According to one or more embodiments of the present disclosure, the coordination request information includes identification information of the first terminal device and identification information of the second terminal device, where the identification information of the first terminal device is used to enable the server to determine whether to respond to the coordination request information.
According to one or more embodiments of the present disclosure, the first editing instruction includes annotation information, and the first editing instruction is used to instruct to add the annotation information to the video to be processed, where the annotation information includes annotation characters and/or annotation marks.
According to one or more embodiments of the present disclosure, the transceiver module is further configured to: receiving a second editing log, wherein the second editing log is used for indicating editing content and/or an editing process for editing the video to be processed based on a second editing instruction, and the second editing instruction is used for editing the video to be processed displayed on the second terminal equipment when being responded; a display module further to: and displaying the editing content corresponding to the second editing log on the video to be processed according to the second editing log.
According to one or more embodiments of the present disclosure, when displaying, according to the second editing log, the editing content corresponding to the second editing log on the video to be processed, the display module is specifically configured to: editing the video to be processed according to the second editing log to generate an edited video, wherein video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
According to one or more embodiments of the present disclosure, the transceiver module is further configured to: and sending a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and determining a first editing log and/or a second editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
According to one or more embodiments of the present disclosure, the display module is further configured to: and according to the second editing log, performing conflict detection on a first editing instruction, and if the editing content corresponding to the first editing instruction conflicts with the editing content corresponding to the second editing log, outputting prompt information.
In a fifth aspect, an embodiment of the present disclosure provides a collaborative video processing apparatus, which is applied to a second terminal device, where both a first terminal device and the second terminal device display videos to be processed, and the apparatus includes:
a transceiver module, configured to receive a first editing log, where the first editing log is used to indicate an editing content and/or an editing process for editing the video to be processed based on the first editing instruction, and the first editing instruction is used to edit the video to be processed displayed on the first terminal device when being responded;
and the display module is used for displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
According to one or more embodiments of the present disclosure, the display module is specifically configured to: editing the video to be processed according to the first editing log to generate an edited video, wherein video parameters of the edited video relative to the video to be processed are changed, and/or at least one video element is added to the edited video relative to the video to be processed; and displaying the edited video.
According to one or more embodiments of the present disclosure, the display module is further configured to: edit, in response to a second editing instruction, the video to be processed displayed on the second terminal device; and perform conflict detection on the second editing instruction according to the first editing log, and output prompt information if the editing content corresponding to the second editing instruction conflicts with the editing content corresponding to the first editing log.
According to one or more embodiments of the present disclosure, the transceiver module is further configured to: sending a video locking request, wherein the video locking request is used for indicating that editing of the edited video is forbidden, and determining a first editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
In a sixth aspect, an embodiment of the present disclosure provides a collaborative video processing apparatus, which is applied to a server, and the apparatus includes:
a receiving module, configured to receive a first editing log sent by a first terminal device, where the first editing log is used to indicate an editing content and/or an editing process for editing the video to be processed based on the first editing instruction, where the first editing instruction is used to edit the video to be processed displayed on the first terminal device when the first editing instruction is responded;
and the sending module is used for sending the first editing log to second terminal equipment so as to enable the second terminal equipment to display the editing content corresponding to the first editing log on the video to be processed.
According to one or more embodiments of the present disclosure, the sending module is further configured to: sending a coordination processing identifier to the first terminal device and at least one second terminal device; a receiving module, further configured to: and receiving response information sent by the first terminal equipment and at least one second terminal equipment, and determining that the first terminal equipment and the second terminal equipment are in a cooperative state.
According to one or more embodiments of the present disclosure, when receiving the first edit log sent by the first terminal device, the receiving module is specifically configured to: receiving a first editing log sent by a first terminal device in a collaborative state; when the sending module sends the first editing log to the second terminal device, the sending module is specifically configured to: and sending the first editing log to the second terminal equipment in the coordination state.
According to one or more embodiments of the present disclosure, the collaborative video processing apparatus further includes:
the control module is used for storing a first editing log sent by the first terminal device and/or a second editing log sent by the second terminal device as historical data;
a receiving module, further configured to: receiving a historical data request instruction sent by the first terminal device or the second terminal device;
a sending module, further configured to: responding to a historical data request instruction sent by the first terminal device, sending the historical data to the first terminal device, or responding to a historical data request instruction sent by the second terminal device, and sending the historical data to the second terminal device.
In a seventh aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor and memory; the memory stores computer-executable instructions; the at least one processor executes computer-executable instructions stored in the memory to cause the at least one processor to perform the collaborative video processing method according to the first aspect and various possible designs of the first aspect, or to cause the at least one processor to perform the collaborative video processing method according to the second aspect and various possible designs of the second aspect, or to cause the at least one processor to perform the collaborative video processing method according to the third aspect and various possible designs of the third aspect.
In an eighth aspect, an embodiment of the present disclosure provides a computer-readable storage medium, where a computer-executable instruction is stored, and when a processor executes the computer-executable instruction, the collaborative video processing method according to the first aspect and various possible designs of the first aspect is implemented, or the collaborative video processing method according to the second aspect and various possible designs of the second aspect is implemented, or the collaborative video processing method according to the third aspect and various possible designs of the third aspect is implemented.
In a ninth aspect, an embodiment of the present disclosure provides a computer program product, which includes a computer program, and when executed by a processor, implements the collaborative video processing method according to the first aspect and various possible designs of the first aspect, or implements the collaborative video processing method according to the second aspect and various possible designs of the second aspect, or implements the collaborative video processing method according to the third aspect and various possible designs of the third aspect.
The foregoing description is merely illustrative of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (20)

1. A collaborative video processing method is applied to a first terminal device, and comprises the following steps:
responding to a first editing instruction, editing the video to be processed displayed on the first terminal device to obtain a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction;
and sending the first editing log to a second terminal device, wherein the first editing log is used for displaying the editing content corresponding to the first editing log on the video to be processed by the second terminal device.
2. The method of claim 1, wherein sending the first edit log to a second terminal device comprises:
sending cooperation request information to a server to establish cooperation connection with at least one second terminal device, wherein the cooperation request information is used for the first terminal device to establish cooperation connection with the at least one second terminal device through the server;
and sending the first editing log to the second terminal equipment which has established the cooperative connection.
3. The method according to claim 2, wherein the cooperation request information includes identification information of the first terminal device and identification information of the second terminal device, and wherein the identification information of the first terminal device is used for the server to determine whether to respond to the cooperation request information.
4. The method according to claim 1, wherein the first editing instruction comprises annotation information, and the first editing instruction is used for instructing to add the annotation information in the video to be processed, wherein the annotation information comprises annotation characters and/or annotation marks.
5. The method according to any one of claims 1-4, further comprising:
receiving a second editing log, wherein the second editing log is used for indicating editing content and/or an editing process for editing the video to be processed based on a second editing instruction, and the second editing instruction is used for editing the video to be processed displayed on the second terminal equipment when being responded;
and displaying the editing content corresponding to the second editing log on the video to be processed according to the second editing log.
6. The method according to claim 5, wherein displaying, according to the second editing log, the edited content corresponding to the second editing log on the video to be processed comprises:
editing the video to be processed according to the second editing log to generate an edited video;
and displaying the edited video.
7. The method of claim 5, further comprising:
and sending a video locking request, wherein the video locking request is used for indicating that the editing of the edited video is forbidden, and determining a first editing log and/or a second editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
8. The method of claim 5, further comprising:
and according to the second editing log, performing conflict detection on a first editing instruction, and if the editing content corresponding to the first editing instruction conflicts with the editing content corresponding to the second editing log, outputting prompt information.
9. A collaborative video processing method is applied to a second terminal device, and comprises the following steps:
receiving a first editing log, wherein the first editing log is used for indicating editing content and/or an editing process for editing a video to be processed based on a first editing instruction, and the first editing instruction is used for editing the video to be processed displayed on a first terminal device when being responded;
and displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
10. The method of claim 9, further comprising:
responding to a second editing instruction, and editing the video to be processed displayed on the second terminal device;
and performing conflict detection on a second editing instruction according to the first editing log, and outputting prompt information if the editing content corresponding to the second editing instruction conflicts with the editing content corresponding to the first editing log.
11. The method according to claim 9 or 10, characterized in that the method further comprises:
sending a video locking request, wherein the video locking request is used for indicating that editing of the edited video is forbidden, and determining a first editing log corresponding to the edited video as an input log, and the input log is used for generating an output video.
12. A collaborative video processing method applied to a server includes:
receiving a first editing log sent by a first terminal device, wherein the first editing log is used for indicating editing content and/or an editing process for editing a video to be processed based on a first editing instruction, and the first editing instruction is used for editing the video to be processed displayed on the first terminal device when being responded;
and sending the first editing log to second terminal equipment so that the second terminal equipment displays the editing content corresponding to the first editing log on the video to be processed.
13. The method of claim 12, further comprising:
sending a coordination processing identifier to the first terminal device and at least one second terminal device;
receiving response information sent by the first terminal equipment and at least one second terminal equipment, and determining that the first terminal equipment and the second terminal equipment are in a cooperative state;
the receiving of the first edit log sent by the first terminal device includes: receiving a first editing log sent by a first terminal device in a collaborative state;
sending the first editing log to a second terminal device, including:
and sending the first editing log to the second terminal equipment in the coordination state.
14. The method according to claim 12 or 13, characterized in that the method further comprises:
storing a first editing log sent by the first terminal device and/or a second editing log sent by the second terminal device as historical data;
receiving a historical data request instruction sent by the first terminal device or the second terminal device;
responding to a historical data request instruction sent by the first terminal device, sending the historical data to the first terminal device, or responding to a historical data request instruction sent by the second terminal device, and sending the historical data to the second terminal device.
15. A cooperative video processing apparatus, applied to a first terminal device, the apparatus comprising:
the display module is used for responding to a first editing instruction, editing the video to be processed displayed on the first terminal device, and obtaining a first editing log, wherein the first editing log is used for indicating the editing content and/or the editing process of editing the video to be processed based on the first editing instruction;
and the transceiver module is used for sending the first editing log to a second terminal device, wherein the first editing log is used for displaying the editing content corresponding to the first editing log on the video to be processed by the second terminal device.
16. A cooperative video processing apparatus, applied to a second terminal device, the apparatus comprising:
the receiving and sending module is used for receiving a first editing log, the first editing log is used for indicating editing contents and/or editing processes for editing the video to be processed based on the first editing instruction, and the first editing instruction is used for editing the video to be processed displayed on the first terminal equipment when being responded;
and the display module is used for displaying the editing content corresponding to the first editing log on the video to be processed according to the first editing log.
17. A cooperative video processing apparatus, applied to a server, the apparatus comprising:
the video editing device comprises a receiving module, a processing module and a processing module, wherein the receiving module is used for receiving a first editing log sent by a first terminal device, the first editing log is used for indicating editing contents and/or editing processes for editing a video to be processed based on a first editing instruction, and the first editing instruction is used for editing the video to be processed displayed on the first terminal device when being responded;
and the sending module is used for sending the first editing log to second terminal equipment so as to enable the second terminal equipment to display the editing content corresponding to the first editing log on the video to be processed.
18. An electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing the computer-executable instructions stored by the memory causes the at least one processor to perform the collaborative video processing method according to any of claims 1 to 14.
19. A computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the collaborative video processing method according to any one of claims 1 to 14.
20. A computer program product comprising a computer program which, when executed by a processor, implements the collaborative video processing method of any of claims 1 to 14.
CN202110407804.3A 2021-04-15 2021-04-15 Collaborative video processing method and device, electronic equipment and storage medium Pending CN113099130A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110407804.3A CN113099130A (en) 2021-04-15 2021-04-15 Collaborative video processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113099130A true CN113099130A (en) 2021-07-09

Family

ID=76678010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110407804.3A Pending CN113099130A (en) 2021-04-15 2021-04-15 Collaborative video processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113099130A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108876275A (en) * 2017-05-16 2018-11-23 群晖科技股份有限公司 Collaboration working method and the system for using this method
CN111914520A (en) * 2019-05-09 2020-11-10 富泰华工业(深圳)有限公司 Document collaborative editing method and device, computer device and storage medium
CN112399189A (en) * 2019-08-19 2021-02-23 腾讯科技(深圳)有限公司 Delay output control method, device, system, equipment and medium
CN111277905A (en) * 2020-03-09 2020-06-12 新华智云科技有限公司 Online collaborative video editing method and device
CN112261416A (en) * 2020-10-20 2021-01-22 广州博冠信息科技有限公司 Cloud-based video processing method and device, storage medium and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114025215A (en) * 2021-11-04 2022-02-08 深圳传音控股股份有限公司 File processing method, mobile terminal and storage medium
WO2023241283A1 (en) * 2022-06-16 2023-12-21 北京字跳网络技术有限公司 Video editing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination