WO2021249168A1 - Video processing method and apparatus, electronic device, and computer-readable storage medium
- Publication number
- WO2021249168A1 (PCT/CN2021/095502)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- editing
- track
- processed
- processing
- Prior art date
Classifications
- G11B27/34—Indicating arrangements
- H04N21/4312—Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/47205—End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals on discs
- H04N21/4316—Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream
- H04N21/47217—End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
- H04N21/85—Assembly of content; Generation of multimedia applications
Definitions
- the present disclosure relates to the field of computer technology. Specifically, the present disclosure relates to a video processing method, device, electronic equipment, and computer-readable storage medium.
- this summary is provided to introduce concepts in a simplified form; these concepts are described in detail in the specific embodiments that follow.
- this summary is not intended to identify key or essential features of the claimed technical solution, nor is it intended to limit the scope of the claimed technical solution.
- embodiments of the present disclosure provide a video processing method, including:
- when a trigger operation for any processing function is received, the preview of the processed video is displayed in the video preview area, and the editing identifier corresponding to the processing function is displayed in the track editing area, where the editing identifier and the editing track of the video to be processed are displayed superimposed in the track editing area.
- a video processing device including:
- the to-be-processed video receiving module is used to receive the to-be-processed video
- the to-be-processed video display module is used to display the preview screen of the to-be-processed video through the video preview area on the display interface, to display the editing track of the to-be-processed video through the track editing area, and to display at least one processing function through the processing function navigation area;
- the to-be-processed video processing module is used to, when a trigger operation for any processing function is received, display the preview of the processed video in the video preview area and display the editing identifier corresponding to the processing function in the track editing area, where the editing identifier and the editing track of the to-be-processed video are displayed superimposed in the track editing area.
- an embodiment of the present disclosure provides an electronic device, including a memory and a processor
- a computer program is stored in the memory
- the processor is configured to execute a computer program to implement the method provided in the embodiment of the first aspect.
- the embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored, and the computer program is executed by a processor to implement the method provided in the embodiment of the first aspect.
- the preview is displayed in the video preview area, the editing identifier is displayed in the track editing area, and the processing functions available for selection are displayed in the processing function navigation area.
- the to-be-processed video segment is processed to obtain the processed video; the preview of the processed video is displayed in the video preview area, and the editing identifier corresponding to the processing function is displayed in the track editing area.
- because the solution partitions preview, track editing, and processing function navigation during video processing and establishes a reasonable linkage mechanism among them, users can easily and conveniently access rich processing functions when using the program for video processing, improving the user experience.
- the solution also improves the functional scalability of the application, thereby satisfying user needs and enhancing the operating experience.
- FIG. 1 is a schematic flowchart of a video processing method provided by an embodiment of the disclosure
- FIG. 2a is a schematic diagram of a display interface after a user uploads a video to be processed in an embodiment of the disclosure
- FIG. 2b is a schematic diagram of the display interface after the user has made a leftward sliding operation on the track editing area in the display interface of FIG. 2a in an embodiment of the disclosure;
- FIG. 3a is a schematic diagram of a track editing area corresponding to a special effect processing function added in an example of an embodiment of the disclosure
- FIG. 3b is a schematic diagram of a track editing area corresponding to a divided video processing function in an example of an embodiment of the disclosure
- FIG. 4 is a structural block diagram of a video processing device provided by an embodiment of the disclosure.
- FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the disclosure.
- FIG. 1 is a schematic flowchart of a video processing method provided by an embodiment of the present disclosure.
- the execution body of the method may include an electronic device that installs a video processing service application. As shown in FIG. 1, the method may include:
- Step S101: Receive a video to be processed.
- the video to be processed may include an unprocessed video uploaded by a user, a processed draft video saved by an application, or an unprocessed video formed by splicing multiple pictures uploaded by the user.
- the user can choose to upload a locally stored unprocessed video through the new video add button set in the application interface, or can directly select the last processed draft video saved in the application.
- the user can also choose to upload multiple locally stored pictures through the new video add button set in the application interface, and the application can splice the received multiple pictures into a video as a video to be processed.
- the present disclosure does not limit the receiving method of the to-be-processed video.
- the application program performs related display and processing operations after receiving the to-be-processed video.
- Step S102: Display a preview of the video to be processed in the video preview area on the display interface, display the editing track of the video to be processed in the track editing area, and display at least one processing function in the processing function navigation area.
- the display interface of the application program may include three areas to display related information of the video to be processed, and the three areas respectively include: a video preview area, a track editing area, and a processing function navigation area.
- the display interface includes a video preview area 201, a track editing area 202, and a processing function navigation area 203.
- the video preview area 201 can be regarded as a playback interface for displaying preview images.
- the video preview area 201 can display preview images corresponding to each video segment of the video to be processed, or can play the entire video to be processed.
- the track editing area 202 is provided with a time axis 204 and a time axis ruler 205, and is used to display at least one editing mark.
- the editing track 206 of the video to be processed is displayed in the track editing area 202.
- the track start point of the editing track 206 of the video to be processed is aligned with the start point of the time axis 204, and each video clip of the video to be processed is displayed on the editing track 206.
- the user can perform stretching and compression operations on the editing track 206 of the video to be processed along its length direction.
- when receiving a stretching operation on the editing track 206, the application reduces the scale of the time axis 204; when receiving a compression operation on the editing track 206, the application enlarges the scale of the time axis 204.
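The stretch/compress linkage above can be sketched as a simple scale on the time axis. This is an illustrative model, not the patent's implementation; the class and method names are hypothetical.

```python
# Hypothetical sketch of the linkage between gestures on editing track 206
# and the scale of time axis 204.
class TimeAxis:
    def __init__(self, seconds_per_tick: float = 1.0):
        self.seconds_per_tick = seconds_per_tick  # scale of the time axis ruler

    def on_track_stretched(self, factor: float = 2.0):
        # Stretching the editing track shows finer detail: each ruler tick
        # covers less time, i.e. the time axis scale is reduced.
        self.seconds_per_tick /= factor

    def on_track_compressed(self, factor: float = 2.0):
        # Compressing the track makes each tick cover more time.
        self.seconds_per_tick *= factor


axis = TimeAxis(seconds_per_tick=1.0)
axis.on_track_stretched()    # user stretches editing track 206
print(axis.seconds_per_tick)  # 0.5
axis.on_track_compressed()
axis.on_track_compressed()
print(axis.seconds_per_tick)  # 2.0
```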
- the processing function navigation area 203 is used to display various processing functions for processing the to-be-processed video, for example, editing function, filter function, special effect function, etc.
- the processing function navigation area 203 may include a multi-level navigation bar, and the corresponding processing function is displayed on each level of the navigation bar.
- the user can trigger the corresponding processing function by opening the multi-level navigation bar as needed.
- by setting a multi-level navigation bar in the processing function navigation area 203, the application can provide users with a simpler, easier-to-use operation interface and more selectable processing functions; at the same time, it improves the functional scalability of the application, satisfying user needs and enhancing the operating experience.
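A multi-level navigation bar is naturally a small tree: first-level entries expand into second-level function bars. The entries below are invented examples, not functions named by the patent.

```python
# Illustrative nested structure for the multi-level navigation bar in
# processing function navigation area 203; entries are hypothetical.
NAV_BAR = {
    "edit": ["split", "speed", "volume"],
    "filter": ["retro", "film", "black & white"],
    "special effect": ["special effect type 1", "special effect type 2"],
}

def top_level_functions():
    # the first-level navigation bar shown initially
    return list(NAV_BAR)

def sub_functions(entry: str):
    # opening a first-level entry reveals its second-level bar
    return NAV_BAR.get(entry, [])

print(top_level_functions())
print(sub_functions("filter"))
```

Adding a new processing function then only requires extending the tree, which is one way to read the "functional scalability" claim above.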
- Step S103: When a trigger operation for any processing function is received, display the preview of the video processed by that processing function in the video preview area, and display the editing identifier corresponding to that processing function in the track editing area, where the editing identifier and the editing track of the to-be-processed video are displayed superimposed in the track editing area.
- the editing identifier indicates that the user has triggered the corresponding processing function.
- the editing identifier may include an editing track and an editing effect identifier.
- when the user triggers a processing function, a processing instruction to execute that processing function is issued.
- the application program uses the processing function to process the to-be-processed segment of the video to be processed and obtain the processed video.
- the application program displays the preview screen of the processed video in the video preview area 201, and displays the editing identifier corresponding to the processing function in the track editing area 202.
- the editing identifier corresponding to the processing function and the editing track of the to-be-processed video are superimposed and displayed.
- when the displayed editing identifier includes an editing track corresponding to the processing function, the application superimposes that editing track in parallel with the editing track of the video to be processed in the track editing area;
- when the editing identifier includes an editing effect identifier corresponding to the processing function, the application superimposes that editing effect identifier on the editing track of the to-be-processed video for display.
- the preview is displayed through the video preview area, the editing identifier is displayed through the track editing area, and the processing functions available for selection are displayed through the processing function navigation area.
- the video clip to be processed is processed to obtain the processed video;
- the preview of the processed video is displayed in the video preview area, and the editing identifier corresponding to the processing function is displayed in the track editing area.
- the method may further include:
- the starting editing point of the video to be processed can be selected by the user by sliding the editing track of the video to be processed in the track editing area left and right.
- the user's sliding operation on the track editing area can be understood as changing the relative position of the editing track of the video to be processed and the time axis ruler: because the time axis ruler is fixed in position (for example, at the center of the track editing area) and perpendicular to the editing track of the video to be processed, the point at which the ruler intersects the editing track can be changed by the sliding operation.
- the time point on the time axis corresponding to the intersection of the time axis ruler and the editing track of the to-be-processed video is taken as the starting editing point of the to-be-processed video.
- the application can start processing the to-be-processed video from the starting editing point. Take as an example the user sliding the editing track 206 of the video to be processed to the left in the track editing area 202 shown in FIG. 2a to obtain the situation shown in FIG. 2b.
- the time point on the time axis 204 corresponding to intersection point A of the timeline ruler 205 and the editing track 206 of the video to be processed is the starting editing point; in FIG. 2b, this starting editing point corresponds to time 00:05 on the time axis 204.
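Since the ruler is fixed and the track slides under it, the starting editing point reduces to a mapping from the sliding offset to a time. The sketch below assumes a hypothetical `pixels_per_second` rendering constant; neither name comes from the patent.

```python
# Sketch of deriving the starting editing point from the sliding offset,
# assuming the timeline ruler is fixed at the center of the track
# editing area and the track renders at pixels_per_second.
def start_edit_point(scroll_offset_px: float, pixels_per_second: float) -> float:
    """Time (seconds) at which the fixed ruler intersects the editing
    track after the track has been slid left by scroll_offset_px."""
    return scroll_offset_px / pixels_per_second

def format_time(seconds: float) -> str:
    # mm:ss label as shown on the time axis
    return f"{int(seconds) // 60:02d}:{int(seconds) % 60:02d}"

# Sliding the track 250 px left at 50 px per second places intersection
# point A at 00:05, matching the FIG. 2b example.
print(format_time(start_edit_point(250, 50)))  # 00:05
```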
- displaying, in the video preview area, the preview of the video processed by the processing function, and displaying, in the track editing area, the editing identifier corresponding to the processing function, can include:
- a processing instruction corresponding to the processing function is acquired, where the processing instruction includes the function identification and processing parameters of the processing function;
- the corresponding preview is displayed in the video preview area, and the editing identifier corresponding to the processing function is displayed in the track editing area.
- the editing identifier takes the starting editing point as its reference point and is aligned on the time axis with the editing track of the video clip to be processed.
- a processing instruction to execute the processing function is issued, and the processing instruction includes the function identification and processing parameters of the processing function.
- the function identifier is used to indicate the processing function selected by the user.
- the processing parameter includes the parameter for processing the video clip to be processed. Taking the processing function of adding media resource as an example, the processing parameter includes the corresponding media resource identifier.
- the application program first obtains the duration corresponding to the corresponding processing function according to the function identifier therein.
- the duration corresponding to the processing function can be preset.
- the duration corresponding to the sticker function can be set to 1 second, and after receiving the function identifier of the sticker function, the application can determine that the duration corresponding to the processing function is 1 second.
- the application determines the corresponding to-be-processed video clip according to the starting editing point and the duration corresponding to the processing function: specifically, the application uses the starting editing point as the start of the to-be-processed video clip and the duration corresponding to the processing function as the duration of the clip, from which the clip is determined.
- the application program uses the processing function corresponding to the function identifier and the corresponding processing parameter to process the determined to-be-processed video segment to obtain a processed video.
- the application displays the preview corresponding to the processed video in the video preview area; it may display only the preview at a certain moment (for example, the moment of the starting editing point), or play the processed video in response to a received play trigger operation.
- the editing track corresponding to the processing function and the editing track of the to-be-processed video are superimposed and displayed in parallel in the track editing area; the editing track corresponding to the processing function takes the starting editing point as its reference point and is aligned on the time axis with the editing track of the video clip to be processed.
- taking the processing function of adding a special effect as an example, suppose the duration corresponding to the added special effect is 30 seconds.
- with the starting editing point at 00:05, the application determines that the video clip to be processed is the clip from 00:05 to 00:35.
- after the application adds the special effect to the video clip between 00:05 and 00:35, the editing track corresponding to the added special effect is displayed in parallel with the editing track of the to-be-processed video in the track editing area, and the editing track corresponding to the special effect function and the to-be-processed video clip both correspond to 00:05 to 00:35 on the time axis; that is, the two are aligned on the time axis with the starting editing point as the reference point.
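The clip selection in the example above can be sketched as: the clip starts at the starting editing point and lasts for the duration preset for the triggered function. The duration table below is illustrative (the 1-second sticker value follows the text; the function names are assumptions).

```python
# Sketch of the Step S103 segment selection: clip = [start, start + duration],
# with the duration looked up via the function identifier in the
# processing instruction. Table contents are illustrative.
FUNCTION_DURATIONS = {"sticker": 1.0, "special effect": 30.0}  # seconds

def to_be_processed_clip(start_edit_point: float, function_id: str):
    duration = FUNCTION_DURATIONS[function_id]
    return (start_edit_point, start_edit_point + duration)

start, end = to_be_processed_clip(5.0, "special effect")
print(start, end)  # 5.0 35.0, i.e. 00:05 to 00:35 on the time axis
```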
- the application can also generate the reverse processing instruction of the current instruction.
- the reverse instruction of an add instruction is a delete instruction, and the delete instruction includes the identifier and processing parameters corresponding to the processing function.
- the application stores the reverse processing instruction in an undo queue and displays an undo button on the user interface; when it receives the user's click on the undo button, it executes the reverse processing instruction, that is, it cancels the processing applied by the previous processing instruction.
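The undo mechanism above can be sketched as a stack of reverse instructions. Instruction shapes and field names below are invented for illustration, not taken from the patent.

```python
# Minimal sketch: each "add" processing instruction pushes its reverse
# ("delete") instruction onto an undo queue for the undo button.
undo_queue = []

def apply_instruction(applied_effects, instruction):
    if instruction["op"] == "add":
        applied_effects.append(instruction["params"])
        # pair the add with its reverse instruction, carrying the same
        # function identifier and processing parameters
        undo_queue.append({"op": "delete", "params": instruction["params"]})
    elif instruction["op"] == "delete":
        applied_effects.remove(instruction["params"])

def on_undo_clicked(applied_effects):
    # executing the most recent reverse instruction cancels the most
    # recent processing instruction
    if undo_queue:
        apply_instruction(applied_effects, undo_queue.pop())

effects = []
apply_instruction(effects, {"op": "add", "params": {"function": "sticker", "resource_id": 7}})
print(effects)      # one sticker effect applied
on_undo_clicked(effects)
print(effects)      # [] - the add was reversed by its delete instruction
```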
- the processing function is a processing function of adding a media resource
- the corresponding processing parameter includes the media resource identifier corresponding to the processing function
- processing the to-be-processed video clip based on the processing parameter to obtain the processed video includes:
- the application obtains the corresponding media resource from the media resource library of the processing function according to the media resource identifier; the media resource library includes multiple media resources corresponding to the processing function, each with its own media resource identifier, so the corresponding media resource can be obtained by matching the identifier.
- the application program mounts the media resource on the video clip to be processed, and the processed video can be obtained.
- taking a filter function as an example, the application obtains the corresponding filter resource package from the filter resource library according to the filter resource identifier and adds the filter resource package to the video clip to be processed for rendering, obtaining the processed video.
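The lookup-and-mount flow above can be sketched as follows. The resource library contents, identifiers, and clip representation are all invented for illustration.

```python
# Sketch of resolving a media resource identifier against a resource
# library and "mounting" the resource on the to-be-processed clip.
FILTER_LIBRARY = {
    "filter_retro": {"name": "retro", "lut": "retro.cube"},
    "filter_film": {"name": "film", "lut": "film.cube"},
}

def process_clip(clip: dict, resource_id: str) -> dict:
    resource = FILTER_LIBRARY[resource_id]  # match by media resource identifier
    processed = dict(clip)                  # leave the original clip untouched
    processed.setdefault("mounted", []).append(resource)  # mount on the clip
    return processed

clip = {"start": 5.0, "end": 35.0}
out = process_clip(clip, "filter_retro")
print(out["mounted"][0]["name"])  # retro
```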
- in another case, the processing function is a processing function of adding media resources, and the corresponding processing parameters include the media resource identifier corresponding to the processing function and the content parameters to be added;
- processing the to-be-processed video clip based on the processing parameters to obtain the processed video includes:
- the processing parameters received by the application include the text content parameters to be added input by the user and the text effect resource package identifier selected by the user.
- the application obtains the corresponding text effect resource package from the text effect resource library according to the text effect resource package identifier, processes the text to be added according to that package to obtain the text with the text effect applied, and, taking it as the media resource to be added, adds the text to the video clip to be processed for rendering to obtain the processed video.
- displaying the editing identifier corresponding to the processing function in the track editing area includes:
- the editing track corresponding to the processing function is displayed in the track editing area, where the editing track corresponding to the processing function takes the starting editing point as a reference point and is aligned with the editing track of the corresponding to-be-processed video clip on the time axis and is superimposed and displayed in parallel.
- that is, the editing track corresponding to the processing function is superimposed and displayed in parallel with the editing track of the to-be-processed video, and is aligned with the corresponding to-be-processed video clip on the time axis; the reference point for this alignment is the starting editing point of the video to be processed.
- in FIG. 3a, the editing track 301 corresponding to special effect type 1 takes starting editing point B as its reference point and is aligned on the time axis 304 with the editing track of the corresponding to-be-processed video clip (that is, the portion of editing track 302 starting at the timeline ruler 303 and ending at the dotted line in editing track 302). As can be seen from FIG. 3a, the editing track 301 of special effect type 1 and the editing track 302 of the video clip to be processed are aligned and correspond to 00:05 to 00:20 on the time axis 304.
- the method may further include:
- the navigation bar corresponding to the processing function is displayed in the processing function navigation area.
- the application realizes linkage between the track editing area and the processing function navigation area by receiving the user's selection of an editing identifier. Specifically, when the user selects the editing identifier corresponding to a processing function in the track editing area, the application displays the navigation bar corresponding to that processing function in the processing function navigation area, and the navigation bar displays the processing functions associated with that processing function.
- processing function 1, for example "replace special effect";
- processing function 2, for example "copy special effect";
- processing function n, for example "delete special effect";
- and other trigger buttons for processing functions related to adding special effects.
- displaying the navigation bar corresponding to the processing function in the processing function navigation area includes:
- upon receiving the selection operation on the edit mark corresponding to the processing function, the application program updates the state information in the view model, and sends the updated state information to the navigation bar manager through the view model;
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- the view model and the navigation bar manager are both functional components within the application.
- the state information indicates the selection state of the edit mark, for example, whether the edit mark is selected or not.
- the application program updates the state information of the edit mark corresponding to the processing function in the view model to the selected state, and the view model sends the updated state information to the navigation bar manager.
- the navigation bar manager determines that the state information of the edit mark corresponding to the processing function has been updated to the selected state, then creates the navigation bar corresponding to the processing function of that edit mark and displays the navigation bar; for example, the processing functions associated with that processing function can be displayed in the navigation bar.
- the method may further include:
- the updated editing identification information is sent to the track manager through the view model, so that the track manager displays the editing identification corresponding to the processing function in the track editing area according to the updated editing identification information.
- the view model and the track manager are both functional components within the application.
- after obtaining the processed video, the application program sends a processing completion notification to the view model to inform the view model that processing of the current to-be-processed video segment has been completed.
- the view model sends the updated editing identification information to the track manager, for example, the updated editing identification information is determined according to the corresponding processing function.
- the track manager creates a corresponding edit mark according to the received updated edit mark information, and displays the created edit mark in the track edit area.
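A minimal sketch of the linkage mechanism described above, assuming a simple observer-style view model that notifies both the navigation bar manager and the track manager (all class names, state keys, and values are hypothetical, not taken from the disclosure):

```python
class ViewModel:
    """Holds UI state and notifies registered managers on each update."""
    def __init__(self):
        self._listeners = []
        self.state = {}

    def register(self, listener):
        self._listeners.append(listener)

    def update_state(self, key, value):
        self.state[key] = value
        for listener in self._listeners:
            listener.on_state_changed(key, value)

class NavigationBarManager:
    def __init__(self):
        self.navigation_bars = []

    def on_state_changed(self, key, value):
        # Create a navigation bar when an edit mark enters the selected state.
        if key.startswith("edit_mark:") and value == "selected":
            self.navigation_bars.append(f"nav_bar_for_{key.split(':', 1)[1]}")

class TrackManager:
    def __init__(self):
        self.edit_marks = []

    def on_state_changed(self, key, value):
        # Create and display an edit mark when processing completes.
        if key == "processing_done":
            self.edit_marks.append(value)

vm = ViewModel()
nav_mgr, track_mgr = NavigationBarManager(), TrackManager()
vm.register(nav_mgr)
vm.register(track_mgr)
vm.update_state("edit_mark:special_effect_1", "selected")
vm.update_state("processing_done", "effect_mark_1")
```

Selecting an edit mark thus creates a navigation bar, and a processing-completion notification causes the track manager to display a new edit mark, mirroring the two linkage directions described above.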
- before receiving a trigger operation for any processing function, the method may further include:
- the navigation bar corresponding to the processing function used to process the video clip to be processed is displayed.
- in addition to sliding the editing track of the to-be-processed video in the track editing area to select the starting edit point of the to-be-processed video, the user can also select the editing track of the to-be-processed video, or the system can automatically select the editing track of the to-be-processed video.
- the to-be-processed video may include at least one video segment, for example, the selected video segment is determined to be the to-be-processed video segment.
- the application program can also, while determining the to-be-processed video clip, display in the processing function navigation area the navigation bar corresponding to the processing function for processing the to-be-processed video clip; that is, upon receiving the user's selection operation, the application program displays the navigation bar corresponding to the processing function for processing the to-be-processed video clip in the processing function navigation area.
- displaying, in the video preview area, a preview screen of the processed video processed by the processing function, and displaying, in the track editing area, the editing logo corresponding to the processing function, includes:
- the corresponding preview screen is displayed in the video preview area, and the editing logo corresponding to the processing function is displayed in the track editing area, where the editing logo and the editing track of the to-be-processed video are overlapped and displayed in the track editing area.
- the application program has determined the corresponding segment to be processed by receiving the user's selection operation on the editing track of the to-be-processed video.
- the application program processes the to-be-processed video clip based on the function identifier and processing parameters in the processing instruction to obtain the processed video.
- the application program displays the corresponding preview screen in the video preview area, and displays the editing logo corresponding to the processing function in the track editing area.
- the editing identifier may include an editing effect identifier, and the editing effect identifier may be overlapped and displayed in the track editing area with the editing track of the video to be processed.
- the selection operation of the editing track of the video to be processed may correspond to at least one processing function of editing the video to be processed.
- upon receiving the user's selection operation on the editing track of the to-be-processed video, the application program displays a navigation bar corresponding to the processing function for processing the to-be-processed video clip in the processing function navigation area 313.
- when the application program receives the user's click operation on the video segmentation processing function in the navigation bar, and also receives the user's movement operation on the to-be-processed video segment 311 in the editing track of the to-be-processed video, the application program can obtain the processing instruction corresponding to the video segmentation function.
- the processing instruction may include the video segmentation function identifier and processing parameters (for example, the processing parameters include the current operating video segment identifier and the target location time of the video segment, etc.).
- the application program processes the to-be-processed video segment based on the processing instruction to obtain the processed video.
- the application program can also display the corresponding preview screen in the video preview area, and display the editing effect identifier 312 corresponding to the processing function in the track editing area; as shown in FIG. 3b, the editing effect identifier 312 and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
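One hedged reading of the processing instruction described above, dispatching on the function identifier with the segment identifier and target location time as parameters (the identifier strings, parameter names, and split semantics are illustrative assumptions, not taken from the disclosure):

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingInstruction:
    function_id: str          # e.g. the video segmentation function identifier
    params: dict = field(default_factory=dict)

def apply_instruction(instr: ProcessingInstruction, clips: dict) -> dict:
    # clips maps a segment identifier to its (start, end) span in seconds.
    if instr.function_id == "video_split":
        clip_id = instr.params["clip_id"]     # current operating segment
        t = instr.params["target_time"]       # target location time
        start, end = clips[clip_id]
        # Split the segment into two at the target time point.
        clips[clip_id] = (start, t)
        clips[clip_id + "_b"] = (t, end)
    return clips

# Hypothetical values: segment 311 spans 00:05-00:20, split at 00:12.
clips = {"311": (5.0, 20.0)}
instr = ProcessingInstruction("video_split",
                              {"clip_id": "311", "target_time": 12.0})
apply_instruction(instr, clips)
```

Keeping the function identifier and parameters in one instruction object lets a single dispatch point serve every processing function in the navigation bar.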
- displaying, in the processing function navigation area, the navigation bar corresponding to the processing function for processing the to-be-processed video clip includes:
- when receiving the user's selection operation on the editing track of the to-be-processed video, the application program updates the state information in the view model, and sends the updated state information to the navigation bar manager through the view model;
- the application creates a navigation bar through the navigation bar manager and displays the navigation bar in the processing function navigation area.
- the linkage between the track editing area and the processing function navigation area can be realized by selecting the editing track of the video to be processed.
- the processing function navigation area will display a navigation bar corresponding to the processing function used to process the to-be-processed video clip, and the navigation bar displays the processing functions that can be used to process the to-be-processed video clip.
- the editing identifier corresponding to the triggered processing function can be displayed in the track editing area, thereby realizing the linkage between the processing function navigation area and the track editing area.
- the application updates the state information of the editing track corresponding to the processing function in the view model to the selected state, and the view model sends the updated state information to the track manager.
- the track manager determines that the status information of the editing track corresponding to the processing function is updated to the selected state, and displays or selects the corresponding editing track.
- the application program displays the selected editing track of the to-be-processed video in the track editing area; when a trigger of the add-media-resource processing function in the processing function navigation area is received, the application program displays the editing track corresponding to the processing function in the track editing area.
- a video clip adding button is displayed on the editing track of the to-be-processed video. Therefore, the method may further include:
- the editing track of the video to be processed is updated.
- the application program updates the editing track of the video to be processed.
- the application program can determine the time point corresponding to the position of the timeline ruler, add the to-be-added video clip to the current to-be-processed video according to the position of that time point on the time axis corresponding to the current to-be-processed video, and update the editing track of the to-be-processed video.
- if the time point is located to the left of the center point of the time axis corresponding to the current to-be-processed video, the application program adds the to-be-added video clip to the current to-be-processed video starting from the start time point of the time axis; if the time point is located to the right of the center point of the time axis corresponding to the current to-be-processed video, the application program adds the to-be-added video clip starting from the end time point of the current to-be-processed video.
- the application program can also add the to-be-added video segment to the current to-be-processed video according to other implementations, which is not limited in the present disclosure.
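The left-of-center / right-of-center rule above can be sketched as follows, under the stated assumption that the current video is modeled as a list of contiguous (start, end) clip spans (a simplification invented for illustration):

```python
def add_clip(timeline, new_clip_duration, ruler_time):
    # timeline: list of (start, end) spans covering the current video.
    # If the timeline ruler's time point falls left of the center of the
    # time axis, insert the new clip at the start; otherwise append it
    # after the end time point of the current to-be-processed video.
    total = timeline[-1][1] if timeline else 0.0
    if ruler_time < total / 2:
        # Insert at the start and shift existing clips right.
        shifted = [(s + new_clip_duration, e + new_clip_duration)
                   for s, e in timeline]
        return [(0.0, new_clip_duration)] + shifted
    return timeline + [(total, total + new_clip_duration)]

head = add_clip([(0.0, 10.0)], 5.0, 2.0)   # ruler left of center
tail = add_clip([(0.0, 10.0)], 5.0, 8.0)   # ruler right of center
```

For a 10-second video, a ruler position at 2 seconds prepends the new 5-second clip, while a position at 8 seconds appends it after the end time point.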
- the trigger operation includes a click, text/voice input, touch input (for example, the movement operation of moving the to-be-processed video clip 311 in FIG. 3b), etc.; the user issues the trigger operation to execute the corresponding processing function.
- the application program can obtain the processing instruction of the corresponding processing function according to the user's trigger operation.
- the device 400 may include: a to-be-processed video receiving module 401, a to-be-processed video display module 402, and a to-be-processed video processing module 403, wherein:
- the to-be-processed video receiving module 401 is used to receive the to-be-processed video
- the to-be-processed video display module 402 is configured to display the preview screen of the to-be-processed video through the video preview area on the display interface, display the editing track of the to-be-processed video through the track editing area, and display at least one processing function through the processing function navigation area;
- the to-be-processed video processing module 403 is used to display, when a trigger operation for any processing function is received, a preview screen of the processed video processed by the any processing function in the video preview area, and to display, in the track editing area, the editing logo corresponding to the any processing function, where the editing logo and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
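A hedged composition sketch of modules 401-403 (method names and data shapes are invented for illustration; the disclosure does not specify this internal structure):

```python
class VideoProcessingDevice:
    """Sketch of device 400: a receiving module (401), a display module
    (402), and a processing module (403), here collapsed into methods."""

    def receive(self, video):
        # Module 401: receive the to-be-processed video.
        self.video = video
        return video

    def show(self):
        # Module 402: populate the three display areas.
        self.display_state = {
            "preview_area": self.video["clips"][0],
            "track_area": list(self.video["clips"]),
            "nav_area": ["special_effect", "split", "add_media"],
        }
        return self.display_state

    def process(self, function_id):
        # Module 403: apply the triggered function and superimpose the
        # corresponding editing mark on the track editing area.
        self.display_state["track_area"].append(f"mark:{function_id}")
        return {"clips": self.video["clips"], "applied": function_id}

dev = VideoProcessingDevice()
dev.receive({"clips": ["clip_a", "clip_b"]})
dev.show()
result = dev.process("special_effect")
```

The three methods mirror the receive/display/process split of modules 401-403, with the processing step writing its edit mark back into the same display state that the display step created.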
- the preview screen is displayed through the video preview area, the editing mark is displayed through the track editing area, and the processing function to be selected is displayed through the processing function navigation area.
- the video clip to be processed is processed to obtain the processed video
- the preview screen of the processed video is displayed in the video preview area
- the editing logo corresponding to the processing function is displayed in the track editing area.
- the device further includes a starting edit point determination module, configured to:
- the to-be-processed video processing module may include: a first processing instruction acquisition module, a first processed video acquisition module, and a first preview and track display module, wherein:
- the first processing instruction acquisition module is configured to acquire a processing instruction corresponding to the processing function when a trigger operation for any processing function is received, where the processing instruction includes the function identification and processing parameters of the processing function;
- the first processed video acquisition module is used to determine the video clip to be processed based on the function identifier and the starting edit point, and process the video clip to be processed based on the processing parameters to obtain the processed video;
- the first preview and track display module is used to display the corresponding preview screen in the video preview area based on the processed video, and display the editing logo corresponding to the processing function in the track editing area, where the editing logo takes the starting editing point as the reference point and is aligned with the editing track of the to-be-processed video clip on the time axis.
- the processing parameter includes a media resource identifier corresponding to the processing function
- the first processed video acquisition module is specifically configured to:
- the first preview and track display module is specifically used for:
- the editing track corresponding to the processing function is displayed in the track editing area, where the editing track corresponding to the processing function takes the starting editing point as the reference point, is aligned on the time axis with the editing track of the corresponding to-be-processed video clip, and is superimposed and displayed in parallel with it.
- the device may further include a first linkage module for:
- the navigation bar corresponding to the processing function is displayed in the processing function navigation area.
- the first linkage module is specifically used for:
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- the device further includes an edit logo creation and display module, configured to:
- the updated editing identification information is sent to the track manager through the view model, so that the track manager displays the editing identification corresponding to the processing function in the track editing area according to the updated editing identification information.
- the device further includes a second linkage module for:
- the navigation bar corresponding to the processing function used to process the to-be-processed video clip is displayed.
- the to-be-processed video processing module may include: a second processing instruction acquisition module, a second processed video acquisition module, and a second preview and track display module, wherein:
- the second processing instruction acquisition module is configured to acquire the processing instruction corresponding to the processing function when a trigger operation for any processing function is received;
- the second processed video acquisition module is configured to process the to-be-processed video segment based on the processing instruction to obtain the processed video;
- the second preview and track display module is used to display the corresponding preview screen in the video preview area based on the processed video, and display the editing logo corresponding to the processing function in the track editing area, where the editing logo and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
- the second linkage module is specifically used for:
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- a video clip adding button is displayed on the editing track of the to-be-processed video, and the device further includes an editing track update module for:
- the editing track of the video to be processed is updated.
- modules may be implemented as software components executed on one or more general-purpose processors, and may also be implemented as hardware that performs certain functions or a combination thereof, such as programmable logic devices and/or application specific integrated circuits.
- these modules may be embodied in the form of software products, and the software products may be stored in non-volatile storage media; these non-volatile storage media include several instructions to enable a computing device (which may be a personal computer, a server, a network device, a mobile terminal, etc.) to implement the methods described in the embodiments of the present disclosure.
- the aforementioned modules may also be implemented on a single device or distributed on multiple devices. The functions of these modules can be combined with each other, or can be further split into multiple sub-modules.
- the video processing devices in the foregoing embodiments may include mobile terminals, such as smart phones, palmtop computers, tablet computers, and wearable devices with display screens, and may also include computer equipment, such as desktop computers, notebook computers, and all-in-one computers.
- FIG. 5 shows a schematic structural diagram of an electronic device (for example, a terminal device or a server that executes the method shown in FIG. 1) 500 suitable for implementing the embodiments of the present disclosure.
- the electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablets), PMPs (portable multimedia players), vehicle-mounted terminals (e.g. Car navigation terminals), mobile terminals such as wearable devices, and fixed terminals such as digital TVs, desktop computers, and the like.
- the electronic device shown in FIG. 5 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present disclosure.
- the electronic device includes: a memory and a processor, where the memory is used to store programs for executing the methods described in the foregoing method embodiments; the processor is configured to execute the programs stored in the memory.
- the processor here may be referred to as the processing device 501 described below, and the memory may include at least one of a read-only memory (ROM) 502, a random access memory (RAM) 503, and a storage device 508, as specifically shown below:
- the electronic device 500 may include a processing device (such as a central processing unit or a graphics processor) 501, which can execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503.
- in the RAM 503, various programs and data necessary for the operation of the electronic device 500 are also stored.
- the processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504.
- An input/output (I/O) interface 505 is also connected to the bus 504.
- the following devices can be connected to the I/O interface 505: input devices 506 such as a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, and gyroscope; output devices 507 such as a liquid crystal display (LCD), a speaker, and a vibrator; storage devices 508 such as a magnetic tape and a hard disk; and a communication device 509.
- the communication device 509 may allow the electronic device 500 to perform wireless or wired communication with other devices to exchange data.
- although FIG. 5 shows an electronic device with various devices, it should be understood that it is not required to implement or have all of the illustrated devices; more or fewer devices may alternatively be implemented or provided.
- an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a non-transitory computer readable medium, and the computer program contains program code for executing the method shown in the flowchart.
- the computer program may be downloaded and installed from the network through the communication device 509, or installed from the storage device 508, or installed from the ROM 502.
- when the computer program is executed by the processing device 501, the above-mentioned functions defined in the method of the embodiments of the present disclosure are executed.
- the aforementioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
- the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
- a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
- a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
- the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
- the computer-readable signal medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device.
- the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
- the client and server can communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication (e.g., a communication network) in any form or medium. Examples of communication networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (for example, the Internet), and peer-to-peer networks (for example, ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
- the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or it may exist alone without being installed in the electronic device.
- the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, the electronic device:
- when a trigger operation for any processing function is received, the preview screen of the processed video processed by the processing function is displayed in the video preview area, and the editing logo corresponding to the processing function is displayed in the track editing area, where the editing logo and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
- the computer program code used to perform the operations of the present disclosure can be written in one or more programming languages or a combination thereof.
- the above-mentioned programming languages include, but are not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
- the program code can be executed entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
- the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
- each block in the flowchart or block diagram can represent a module, program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for realizing the specified logical function. The functions marked in the blocks can also occur in an order different from the order marked in the drawings. For example, two blocks shown one after another can actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved.
- each block in the block diagram and/or flowchart, and the combination of blocks in the block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
- modules or units involved in the described embodiments of the present disclosure can be implemented in software or hardware.
- the name of the module or unit does not constitute a limitation on the unit itself under certain circumstances.
- for example, the to-be-processed video receiving module can also be described as a "module for receiving the to-be-processed video".
- exemplary types of hardware logic components include: Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and so on.
- a machine-readable medium may be a tangible medium, which may contain or store a program for use by the instruction execution system, apparatus, or device or in combination with the instruction execution system, apparatus, or device.
- the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
- the machine-readable medium may include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, devices, or equipment, or any suitable combination of the foregoing.
- machine-readable storage media would include electrical connections based on one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the foregoing.
- the present disclosure provides a video processing method, including:
- the preview screen of the processed video processed by the processing function is displayed in the video preview area, and the editing logo corresponding to the processing function is displayed in the track editing area, where the editing logo and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
- before receiving a trigger operation for any processing function, the method further includes:
- displaying, in the video preview area, the preview screen of the processed video processed by the processing function, and displaying, in the track editing area, the editing logo corresponding to the processing function, includes:
- a processing instruction corresponding to the processing function is acquired, where the processing instruction includes the function identification and processing parameters of the processing function;
- the corresponding preview screen is displayed in the video preview area, and the editing logo corresponding to the processing function is displayed in the track editing area.
- the editing mark uses the starting editing point as a reference point and is aligned with the editing track of the to-be-processed video clip on the time axis.
- the processing parameter includes the processing function and the media resource identifier corresponding to the processing function
- processing the to-be-processed video segment based on the processing parameter to obtain the processed video includes:
- displaying the editing identifier corresponding to the processing function in the track editing area includes:
- the editing track corresponding to the processing function is displayed in the track editing area, where the editing track corresponding to the processing function takes the starting editing point as the reference point, is aligned on the time axis with the editing track of the corresponding to-be-processed video clip, and is superimposed and displayed in parallel with it.
- the method further includes:
- a navigation bar corresponding to the processing function is displayed in the processing function navigation area.
- displaying the navigation bar corresponding to the processing function in the processing function navigation area includes:
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- the method further includes:
- the updated editing identification information is sent to the track manager through the view model, so that the track manager displays the editing identification corresponding to the processing function in the track editing area according to the updated editing identification information.
- before receiving a trigger operation for any processing function, the method further includes:
- the navigation bar corresponding to the processing function used to process the to-be-processed video clip is displayed.
- displaying, in the video preview area, the preview screen of the processed video processed by the processing function, and displaying, in the track editing area, the editing logo corresponding to the processing function, includes:
- the corresponding preview screen is displayed in the video preview area, and the editing logo corresponding to the processing function is displayed in the track editing area, where the editing logo and the editing track of the to-be-processed video are overlapped and displayed in the track editing area.
- displaying, in the processing function navigation area, the navigation bar corresponding to the processing function for processing the to-be-processed video clip includes:
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- a video clip adding button is displayed on the edit track of the to-be-processed video, and the method further includes:
- the editing track of the video to be processed is updated.
- the present disclosure provides a video processing device, including:
- the to-be-processed video receiving module is used to receive the to-be-processed video;
- the to-be-processed video display module is used to display the preview screen of the to-be-processed video through the video preview area on the display interface, to display the editing track of the to-be-processed video through the track editing area, and to display at least one processing function through the processing function navigation area;
- the to-be-processed video processing module is used to display, when a trigger operation for any processing function is received, the preview screen of the processed video processed by the processing function in the video preview area, and to display the editing identifier corresponding to the processing function in the track editing area, where the editing identifier and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
- the device further includes a starting edit point determination module, configured to:
- the to-be-processed video processing module may include: a first processing instruction acquisition module, a first processed video acquisition module, and a first preview and track display module, wherein:
- the first processing instruction acquisition module is configured to acquire a processing instruction corresponding to the processing function when a trigger operation for any processing function is received, where the processing instruction includes the function identification and processing parameters of the processing function;
- the first processed video acquisition module is used to determine the video clip to be processed based on the function identifier and the starting edit point, and process the video clip to be processed based on the processing parameters to obtain the processed video;
- the first preview and track display module is used to display the corresponding preview screen in the video preview area based on the processed video, and to display the editing identifier corresponding to the processing function in the track editing area, where the editing identifier takes the start editing point as a reference point and is aligned with the editing track of the to-be-processed video clip on the time axis.
- the processing parameter includes a media resource identifier corresponding to the processing function
- the first processed video acquisition module is specifically configured to:
- the first preview and track display module is specifically used for:
- the editing track corresponding to the processing function is displayed in the track editing area, where the editing track corresponding to the processing function takes the start editing point as its reference point, is aligned on the time axis with the editing track of the corresponding to-be-processed video clip, and is superimposed and displayed in parallel.
- the device may further include a first linkage module for:
- the navigation bar corresponding to the processing function is displayed in the processing function navigation area.
- the first linkage module is specifically used for:
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- the device further includes an editing identifier creation and display module, which is used to:
- the updated editing identifier information is sent to the track manager through the view model, so that the track manager displays the editing identifier corresponding to the processing function in the track editing area according to the updated editing identifier information.
- the device further includes a second linkage module for:
- the navigation bar containing the processing function for processing the to-be-processed video clip is displayed in the processing function navigation area.
- the to-be-processed video processing module may include: a second processing instruction acquisition module, a second processed video acquisition module, and a second preview and track display module, where:
- the second processing instruction acquisition module is configured to acquire the processing instruction corresponding to the processing function when a trigger operation for any processing function is received;
- the second processed video acquisition module is configured to process the to-be-processed video segment based on the processing instruction to obtain the processed video;
- the second preview and track display module is used to display the corresponding preview screen in the video preview area based on the processed video, and to display the editing identifier corresponding to the processing function in the track editing area, where the editing identifier and the editing track of the to-be-processed video are overlapped and superimposed for display in the track editing area.
- the second linkage module is specifically used for:
- a navigation bar is created through the navigation bar manager and displayed in the processing function navigation area.
- a video clip adding button is displayed on the editing track of the to-be-processed video, and the device further includes an editing track update module for:
- the editing track of the video to be processed is updated.
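The linkage mechanism that recurs in the embodiments above — state updates flowing through a visual/view model to a navigation bar manager and a track manager — can be illustrated with a minimal sketch. All class, method, and field names below are hypothetical illustrations, not taken from the disclosed implementation:

```python
class NavigationBarManager:
    """Creates and shows the navigation bar for a processing function."""
    def __init__(self):
        self.visible_bar = None

    def on_state_updated(self, state):
        # Create the navigation bar for the selected processing
        # function and display it in the navigation area.
        self.visible_bar = f"navbar:{state['selected_function']}"


class TrackManager:
    """Displays editing identifiers in the track editing area."""
    def __init__(self):
        self.identifiers = {}

    def on_identifier_updated(self, info):
        # Show the editing identifier described by the updated info.
        self.identifiers[info["function_id"]] = info


class ViewModel:
    """Holds UI state and forwards updates to the managers."""
    def __init__(self, nav_manager, track_manager):
        self.nav_manager = nav_manager
        self.track_manager = track_manager
        self.state = {}

    def select_track(self, function_id):
        # Selecting a processing function's editing track updates the
        # state, which triggers creation of the matching navigation bar.
        self.state["selected_function"] = function_id
        self.nav_manager.on_state_updated(self.state)

    def processing_complete(self, function_id, start, duration):
        # After processing finishes, send the updated editing identifier
        # information so the track manager can display it on the track.
        self.track_manager.on_identifier_updated(
            {"function_id": function_id, "start": start, "duration": duration})


vm = ViewModel(NavigationBarManager(), TrackManager())
vm.select_track("filter")
vm.processing_complete("filter", start=2.0, duration=3.5)
```

The managers never talk to each other directly; both react to notifications relayed by the view model, which matches the indirection described in the embodiments.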
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Marketing (AREA)
- Business, Economics & Management (AREA)
- Television Signal Processing For Recording (AREA)
- User Interface Of Digital Computer (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
Claims (15)
- A video processing method, comprising: receiving a to-be-processed video; displaying a preview screen of the to-be-processed video through a video preview area on a display interface, displaying an editing track of the to-be-processed video through a track editing area, and displaying at least one processing function through a processing function navigation area; and upon receiving a trigger operation for any processing function, displaying, in the video preview area, a preview screen of a processed video obtained through processing by the processing function, and displaying, in the track editing area, an editing identifier corresponding to the processing function, wherein the editing identifier and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
- The method according to claim 1, wherein before receiving the trigger operation for any processing function, the method further comprises: receiving a sliding operation on the track editing area to determine a start editing point of the to-be-processed video.
- The method according to claim 2, wherein upon receiving the trigger operation for any processing function, displaying the preview screen of the processed video in the video preview area and displaying the editing identifier corresponding to the processing function in the track editing area comprises: upon receiving the trigger operation for any processing function, acquiring a processing instruction corresponding to the processing function, wherein the processing instruction comprises a function identifier and processing parameters of the processing function; determining a to-be-processed video clip based on the function identifier and the start editing point, and processing the to-be-processed video clip based on the processing parameters to obtain the processed video; and based on the processed video, displaying the corresponding preview screen in the video preview area, and displaying the editing identifier corresponding to the processing function in the track editing area, wherein the editing identifier takes the start editing point as a reference point and is aligned with the editing track of the to-be-processed video clip on the time axis.
- The method according to claim 3, wherein the processing parameters comprise a media resource identifier corresponding to the processing function, and processing the to-be-processed video clip based on the processing parameters to obtain the processed video comprises: acquiring a corresponding media resource based on the media resource identifier; and mounting the corresponding media resource on the to-be-processed video clip to obtain the processed video.
- The method according to claim 3 or 4, wherein displaying the editing identifier corresponding to the processing function in the track editing area comprises: displaying an editing track corresponding to the processing function in the track editing area, wherein the editing track corresponding to the processing function takes the start editing point as a reference point, is aligned with the editing track of the corresponding to-be-processed video clip on the time axis, and is superimposed and displayed in parallel.
- The method according to claim 5, further comprising: upon receiving a selection operation on the editing track corresponding to the processing function, displaying, in the processing function navigation area, a navigation bar corresponding to the processing function.
- The method according to claim 6, wherein displaying the navigation bar corresponding to the processing function in the processing function navigation area comprises: upon receiving the selection operation, updating state information in a visual model, and sending the updated state information to a navigation bar manager through the visual model; and in response to the updated state information, creating the navigation bar through the navigation bar manager and displaying the navigation bar in the processing function navigation area.
- The method according to any one of claims 3 to 7, wherein after the processed video is obtained, the method further comprises: sending a processing completion notification to a view model; and in response to the processing completion notification, sending updated editing identifier information to a track manager through the view model, so that the track manager displays the editing identifier corresponding to the processing function in the track editing area according to the updated editing identifier information.
- The method according to claim 1, wherein before receiving the trigger operation for any processing function, the method further comprises: receiving a selection operation on the editing track of the to-be-processed video to determine a to-be-processed video clip of the to-be-processed video; and displaying, in the processing function navigation area, a navigation bar containing the processing function for processing the to-be-processed video clip.
- The method according to claim 9, wherein upon receiving the trigger operation for any processing function, displaying the preview screen of the processed video in the video preview area and displaying the editing identifier corresponding to the processing function in the track editing area comprises: upon receiving the trigger operation for the processing function, acquiring a processing instruction corresponding to the processing function; processing the to-be-processed video clip based on the processing instruction to obtain the processed video; and based on the processed video, displaying the corresponding preview screen in the video preview area, and displaying the editing identifier corresponding to the processing function in the track editing area, wherein the editing identifier and the editing track of the to-be-processed video are overlapped and superimposed for display in the track editing area.
- The method according to claim 9 or 10, wherein displaying, in the processing function navigation area, the navigation bar containing the processing function for processing the to-be-processed video clip comprises: upon receiving the selection operation, updating state information in a visual model, and sending the updated state information to a navigation bar manager through the visual model; and in response to the updated state information, creating the navigation bar through the navigation bar manager and displaying the navigation bar in the processing function navigation area.
- The method according to any one of claims 1 to 11, wherein a video clip adding button is displayed on the editing track of the to-be-processed video, and the method further comprises: upon receiving a video adding operation through the video clip adding button, acquiring a to-be-added video clip corresponding to the video adding operation; and updating the editing track of the to-be-processed video according to the to-be-added video clip.
- A video processing apparatus, comprising: a to-be-processed video receiving module, configured to receive a to-be-processed video; a to-be-processed video display module, configured to display a preview screen of the to-be-processed video through a video preview area on a display interface, display an editing track of the to-be-processed video through a track editing area, and display at least one processing function through a processing function navigation area; and a to-be-processed video processing module, configured to, upon receiving a trigger operation for any processing function, display, in the video preview area, a preview screen of a processed video obtained through processing by the processing function, and display, in the track editing area, an editing identifier corresponding to the processing function, wherein the editing identifier and the editing track of the to-be-processed video are superimposed and displayed in the track editing area.
- An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program which, when executed, causes the electronic device to implement the method according to any one of claims 1 to 12.
- A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 12.
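The flow in claims 2 through 5 above — slide to set a start editing point, resolve the to-be-processed clip on the time axis, mount a media resource on it, and align the resulting editing identifier with the clip's track — can be sketched as below. The function and field names are illustrative assumptions, not the patented implementation:

```python
def resolve_clip(clips, start_edit_point):
    """Return the clip whose span on the time axis contains the
    start editing point set by the sliding operation."""
    for clip in clips:
        if clip["start"] <= start_edit_point < clip["start"] + clip["duration"]:
            return clip
    raise ValueError("start editing point is outside every clip")


def apply_processing(clip, media_resource_id):
    """Mount the media resource named by the processing parameters
    onto the clip, yielding the processed video clip (claim 4)."""
    return {**clip, "mounted": clip.get("mounted", []) + [media_resource_id]}


def editing_identifier_span(clip, start_edit_point):
    """The editing identifier takes the start editing point as its
    reference and stays aligned with the clip's track on the time
    axis, ending where the clip ends (claims 3 and 5)."""
    return (start_edit_point, clip["start"] + clip["duration"])


clips = [{"start": 0.0, "duration": 4.0}, {"start": 4.0, "duration": 6.0}]
clip = resolve_clip(clips, 5.0)            # the second clip contains t = 5.0
processed = apply_processing(clip, "sticker_42")
span = editing_identifier_span(clip, 5.0)  # identifier runs to the clip's end
```

Rendering the span as a second parallel track directly under the clip's track gives the superimposed, time-axis-aligned display the claims describe.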
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112022025162A BR112022025162A2 (pt) | 2020-06-10 | 2021-05-24 | Método e aparelho de processamento de vídeo, dispositivo eletrônico e meio de armazenamento legível por computador |
KR1020227043537A KR102575848B1 (ko) | 2020-06-10 | 2021-05-24 | 비디오 처리 방법 및 장치, 전자 장치, 및 컴퓨터 판독가능 저장매체 |
EP21822298.2A EP4152758A4 (en) | 2020-06-10 | 2021-05-24 | VIDEO PROCESSING METHOD AND APPARATUS, ELECTRONIC DEVICE, AND COMPUTER READABLE RECORDING MEDIUM |
JP2022576468A JP7307864B2 (ja) | 2020-06-10 | 2021-05-24 | ビデオ処理方法、装置、電子機器及びコンピュータ可読記憶媒体 |
US18/064,128 US20230107220A1 (en) | 2020-06-10 | 2022-12-09 | Video processing method and apparatus, electronic device, and computer readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010525242.8A CN111629252B (zh) | 2020-06-10 | 2020-06-10 | 视频处理方法、装置、电子设备及计算机可读存储介质 |
CN202010525242.8 | 2020-06-10 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/064,128 Continuation US20230107220A1 (en) | 2020-06-10 | 2022-12-09 | Video processing method and apparatus, electronic device, and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021249168A1 true WO2021249168A1 (zh) | 2021-12-16 |
Family
ID=72272209
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/095502 WO2021249168A1 (zh) | 2020-06-10 | 2021-05-24 | 视频处理方法、装置、电子设备及计算机可读存储介质 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20230107220A1 (zh) |
EP (1) | EP4152758A4 (zh) |
JP (1) | JP7307864B2 (zh) |
KR (1) | KR102575848B1 (zh) |
CN (1) | CN111629252B (zh) |
BR (1) | BR112022025162A2 (zh) |
WO (1) | WO2021249168A1 (zh) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111629252B (zh) * | 2020-06-10 | 2022-03-25 | 北京字节跳动网络技术有限公司 | Video processing method and apparatus, electronic device, and computer-readable storage medium |
CN112822543A (zh) * | 2020-12-30 | 2021-05-18 | 北京达佳互联信息技术有限公司 | Video processing method and apparatus, electronic device, and storage medium |
CN112804590B (zh) * | 2020-12-31 | 2023-03-28 | 上海深柯视觉艺术设计有限公司 | UE4-based video editing system |
CN113473204B (zh) * | 2021-05-31 | 2023-10-13 | 北京达佳互联信息技术有限公司 | Information display method and apparatus, electronic device, and storage medium |
CN113347479B (zh) * | 2021-05-31 | 2023-05-26 | 网易(杭州)网络有限公司 | Multimedia material editing method, apparatus, device, and storage medium |
CN113891127A (zh) * | 2021-08-31 | 2022-01-04 | 维沃移动通信有限公司 | Video editing method and apparatus, and electronic device |
CN113784165B (zh) * | 2021-09-17 | 2023-05-05 | 北京快来文化传播集团有限公司 | Short video filter overlay method and system, electronic device, and readable storage medium |
CN114253653A (zh) * | 2021-09-27 | 2022-03-29 | 北京字节跳动网络技术有限公司 | Video processing method, video processing apparatus, and computer-readable storage medium |
CN113873329A (zh) * | 2021-10-19 | 2021-12-31 | 深圳追一科技有限公司 | Video processing method and apparatus, computer storage medium, and electronic device |
CN114125181B (zh) * | 2021-11-22 | 2024-06-21 | 北京达佳互联信息技术有限公司 | Video processing method and video processing apparatus |
CN115460455B (zh) * | 2022-09-06 | 2024-02-09 | 上海硬通网络科技有限公司 | Video editing method, apparatus, device, and storage medium |
CN117453085B (zh) * | 2023-12-22 | 2024-06-25 | 荣耀终端有限公司 | Display method, electronic device, and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100086136A (ko) * | 2009-01-22 | 2010-07-30 | (주)코드엑트 | Video editing system |
CN104811629A (zh) * | 2015-04-21 | 2015-07-29 | 上海极食信息科技有限公司 | Method and system for acquiring video material and producing it within the same interface |
CN109120997A (zh) * | 2018-09-30 | 2019-01-01 | 北京微播视界科技有限公司 | Video processing method and apparatus, terminal, and medium |
CN109495791A (zh) * | 2018-11-30 | 2019-03-19 | 北京字节跳动网络技术有限公司 | Method and apparatus for adding a video sticker, electronic device, and readable medium |
CN110198486A (zh) * | 2019-05-28 | 2019-09-03 | 上海哔哩哔哩科技有限公司 | Method for previewing video material, computer device, and readable storage medium |
CN110381371A (zh) * | 2019-07-30 | 2019-10-25 | 维沃移动通信有限公司 | Video editing method and electronic device |
CN110636382A (zh) * | 2019-09-17 | 2019-12-31 | 北京达佳互联信息技术有限公司 | Method and apparatus for adding a visual object to a video, electronic device, and storage medium |
CN111629252A (zh) * | 2020-06-10 | 2020-09-04 | 北京字节跳动网络技术有限公司 | Video processing method and apparatus, electronic device, and computer-readable storage medium |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5999173A (en) * | 1992-04-03 | 1999-12-07 | Adobe Systems Incorporated | Method and apparatus for video editing with video clip representations displayed along a time line |
US9131078B2 (en) * | 2007-07-27 | 2015-09-08 | Lagavulin Limited | Apparatuses, methods, and systems for a portable, image-processing transmitter |
US8555170B2 (en) * | 2010-08-10 | 2013-10-08 | Apple Inc. | Tool for presenting and editing a storyboard representation of a composite presentation |
KR101260834B1 (ko) * | 2010-12-14 | 2013-05-06 | 삼성전자주식회사 | Touch screen control method using a timeline bar, device, recording medium recording a program therefor, and user terminal |
KR20130107863A (ko) * | 2012-03-23 | 2013-10-02 | 삼성테크윈 주식회사 | Image search apparatus |
KR101528312B1 (ko) * | 2014-02-14 | 2015-06-11 | 주식회사 케이티 | Video editing method and apparatus therefor |
CN107005675B (zh) * | 2014-09-05 | 2019-08-06 | 富士胶片株式会社 | Moving image editing apparatus, moving image editing method, and storage medium |
US20170294212A1 (en) * | 2015-04-10 | 2017-10-12 | OMiro IP LLC | Video creation, editing, and sharing for social media |
US10109319B2 (en) * | 2016-01-08 | 2018-10-23 | Gopro, Inc. | Digital media editing |
US10622021B2 (en) * | 2016-02-19 | 2020-04-14 | Avcr Bilgi Teknolojileri A.S | Method and system for video editing |
CN110582018B (zh) * | 2019-09-16 | 2022-06-10 | 腾讯科技(深圳)有限公司 | Video file processing method, related apparatus, and device |
CN111078348B (zh) * | 2019-12-25 | 2023-06-23 | 广州市百果园信息技术有限公司 | Interface management method, apparatus, device, and storage medium |
-
2020
- 2020-06-10 CN CN202010525242.8A patent/CN111629252B/zh active Active
-
2021
- 2021-05-24 EP EP21822298.2A patent/EP4152758A4/en active Pending
- 2021-05-24 BR BR112022025162A patent/BR112022025162A2/pt unknown
- 2021-05-24 KR KR1020227043537A patent/KR102575848B1/ko active IP Right Grant
- 2021-05-24 JP JP2022576468A patent/JP7307864B2/ja active Active
- 2021-05-24 WO PCT/CN2021/095502 patent/WO2021249168A1/zh unknown
-
2022
- 2022-12-09 US US18/064,128 patent/US20230107220A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP4152758A1 * |
Also Published As
Publication number | Publication date |
---|---|
KR20230016049A (ko) | 2023-01-31 |
CN111629252A (zh) | 2020-09-04 |
BR112022025162A2 (pt) | 2022-12-27 |
JP7307864B2 (ja) | 2023-07-12 |
JP2023527250A (ja) | 2023-06-27 |
CN111629252B (zh) | 2022-03-25 |
EP4152758A4 (en) | 2023-09-06 |
KR102575848B1 (ko) | 2023-09-06 |
EP4152758A1 (en) | 2023-03-22 |
US20230107220A1 (en) | 2023-04-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021249168A1 (zh) | Video processing method and apparatus, electronic device, and computer-readable storage medium | |
WO2021196903A1 (zh) | Video processing method and apparatus, readable medium, and electronic device | |
WO2021008223A1 (zh) | Information determination method and apparatus, and electronic device | |
WO2021218325A1 (zh) | Video processing method and apparatus, computer-readable medium, and electronic device | |
WO2021135626A1 (zh) | Menu item selection method and apparatus, readable medium, and electronic device | |
WO2020207085A1 (zh) | Information sharing method and apparatus, electronic device, and storage medium | |
WO2022077996A1 (zh) | Multimedia data processing method and apparatus, electronic device, and storage medium | |
CN111629151B (zh) | Video co-shooting method and apparatus, electronic device, and computer-readable medium | |
WO2021135648A1 (zh) | Live-room gift list configuration method and apparatus, medium, and electronic device | |
WO2021244480A1 (zh) | Theme video generation method and apparatus, electronic device, and readable storage medium | |
WO2022194031A1 (zh) | Video processing method and apparatus, electronic device, and storage medium | |
WO2021197024A1 (zh) | Video special-effect configuration file generation method, video rendering method, and apparatus | |
WO2022042389A1 (zh) | Search result display method and apparatus, readable medium, and electronic device | |
WO2023165515A1 (zh) | Shooting method and apparatus, electronic device, and storage medium | |
WO2021218318A1 (zh) | Video transmission method, electronic device, and computer-readable medium | |
CN110070592B (zh) | Special-effect package generation method and apparatus, and hardware apparatus | |
WO2023116479A1 (zh) | Video publishing method and apparatus, electronic device, storage medium, and program product | |
JP2023528398A (ja) | Live streaming room creation method and apparatus, electronic device, and storage medium | |
CN115278275B (zh) | Information display method and apparatus, device, storage medium, and program product | |
WO2021227953A1 (zh) | Image special-effect configuration method, image recognition method, apparatus, and electronic device | |
WO2021089002A1 (zh) | Multimedia information processing method and apparatus, electronic device, and medium | |
WO2024041568A1 (zh) | Live video processing method and apparatus, device, and medium | |
WO2021057738A1 (zh) | User interface display method and apparatus, computer-readable medium, and electronic device | |
WO2024041556A1 (zh) | Co-hosting display method and apparatus, electronic device, and computer-readable medium | |
WO2024037491A1 (zh) | Media content processing method and apparatus, device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21822298 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022576468 Country of ref document: JP Kind code of ref document: A Ref document number: 20227043537 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021822298 Country of ref document: EP Effective date: 20221212 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112022025162 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112022025162 Country of ref document: BR Kind code of ref document: A2 Effective date: 20221208 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |