CN104822087A - Processing method and apparatus of video segment

Processing method and apparatus of video segment

Info

Publication number
CN104822087A
CN104822087A
Authority
CN
China
Prior art keywords
video
frequency band
file
merging
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510219043.3A
Other languages
Chinese (zh)
Other versions
CN104822087B (en
Inventor
余合兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Tvmining Juyuan Media Technology Co Ltd
Original Assignee
Wuxi Tvmining Juyuan Media Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Tvmining Juyuan Media Technology Co Ltd filed Critical Wuxi Tvmining Juyuan Media Technology Co Ltd
Priority to CN201510219043.3A priority Critical patent/CN104822087B/en
Publication of CN104822087A publication Critical patent/CN104822087A/en
Application granted granted Critical
Publication of CN104822087B publication Critical patent/CN104822087B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23424Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention discloses a processing method and apparatus for a video segment, which protect the video data of a segment from being lost during cutting and merging. The method comprises: obtaining a video file composed of multiple TS stream fragment files; selecting a video segment from the video file and merging all the TS stream fragment files covering the segment to generate a new video file; and cutting the new video file according to the start point and end point of the segment. Because all the TS stream fragment files covering the segment to be cut are merged before the cut is made, no video data is lost during the cutting-and-merging edit, which avoids having to redo the edit after a data loss. Editing efficiency is therefore improved and the user experience is enhanced.

Description

Processing method and apparatus of a video segment
Technical field
The present invention relates to the field of Internet technologies, and in particular to a processing method and apparatus of a video segment.
Background technology
With the development of science and technology, rich and comprehensive resources such as films and television programs, news, broadcasts, chat, education and games are presented and shared in the form of video on today's television and the Internet. Video has therefore become an irreplaceable medium for people's work, study, social life and entertainment.
At present, the video obtained immediately after shooting contains both useful segments that need to be played later and useless segments that do not. A user therefore usually needs to cut out the useful segments and then merge them. A transport-stream video file is composed of multiple TS stream fragment files, each of which lasts about 10 seconds. The start point of the segment to be cut may therefore fall in the middle of one TS stream fragment file, and its end point may fall in the middle of another; if the file is cut directly at those mid-fragment positions, video data is lost. Alternatively, the segment to be cut may lie entirely within a single TS stream fragment file, that is, its start point and end point are in the same fragment file; cutting it out and merging it with another segment can likewise lose data.
Summary of the invention
The present invention provides a processing method and apparatus for a video segment, which protect the video data of a segment from being lost after cutting and merging.
According to a first aspect of the embodiments of the present invention, a processing method of a video segment is provided, comprising:
obtaining a video file composed of multiple TS stream fragment files;
selecting a video segment from the video file, and merging all the TS stream fragment files covering the segment to generate a new video file;
cutting the new video file according to the start point and end point of the segment.
In one embodiment, selecting a video segment from the video file and merging all the TS stream fragment files covering the segment to generate a new video file comprises:
selecting a video segment from the video file;
marking the start point and end point of the segment in the TS stream fragment files, and recording the TS stream fragment files in which the start point and end point respectively lie;
merging the TS stream fragment files containing the start point and end point of the segment together with all the TS stream fragment files between them, to generate a new video file.
In one embodiment, selecting a video segment from the video file and merging all the TS stream fragment files covering the segment to generate a new video file further comprises:
selecting a first merging segment and a second merging segment from the video file;
marking, in the TS stream fragment files, the start point and end point of the first merging segment and the start point and end point of the second merging segment, and recording the TS stream fragment files in which those start points and end points respectively lie;
merging the TS stream fragment files containing the start point and end point of the first merging segment, all the TS stream fragment files between them, the TS stream fragment files containing the start point and end point of the second merging segment, and all the TS stream fragment files between those, to generate a new video file.
In one embodiment, cutting the new video file according to the start point and end point of the segment comprises:
cutting the new video file at the start point and end point of the first merging segment and at the start point and end point of the second merging segment, to obtain the first merging segment and the second merging segment to be combined;
merging the first merging segment with the second merging segment.
In one embodiment, selecting a video segment from the video file and merging all the TS stream fragment files covering the segment to generate a new video file further comprises:
when the playback times of the selected first merging segment and second merging segment in the video file intersect, marking the start point of the segment that starts playing first as a first merge point, marking the end point of whichever of the first and second merging segments stops playing last as a second merge point, and recording the TS stream fragment files in which the first merge point and the second merge point lie;
merging the TS stream fragment files containing the first merge point and the second merge point together with all the TS stream fragment files between them, to generate a new video file;
and cutting the new video file according to the start point and end point of the segment further comprises:
cutting the new video file at the first merge point and the second merge point, to obtain the merged first merging segment and second merging segment.
According to a second aspect of the embodiments of the present invention, a processing apparatus of a video segment is also provided, comprising:
an acquisition module, configured to obtain a video file composed of multiple TS stream fragment files;
a merging module, configured to select a video segment from the video file and to merge all the TS stream fragment files covering the segment to generate a new video file;
a processing module, configured to cut the new video file according to the start point and end point of the segment.
In one embodiment, the merging module comprises:
a first selection submodule, configured to select a video segment from the video file;
a first marking submodule, configured to mark the start point and end point of the segment in the TS stream fragment files, and to record the TS stream fragment files in which the start point and end point respectively lie;
a first merging submodule, configured to merge the TS stream fragment files containing the start point and end point of the segment together with all the TS stream fragment files between them, to generate a new video file.
In one embodiment, the merging module further comprises:
a second selection submodule, configured to select a first merging segment and a second merging segment from the video file;
a second marking submodule, configured to mark, in the TS stream fragment files, the start point and end point of the first merging segment and the start point and end point of the second merging segment, and to record the TS stream fragment files in which those start points and end points respectively lie;
a second merging submodule, configured to merge the TS stream fragment files containing the start point and end point of the first merging segment, all the TS stream fragment files between them, the TS stream fragment files containing the start point and end point of the second merging segment, and all the TS stream fragment files between those, to generate a new video file.
In one embodiment, the processing module comprises:
a first cutting submodule, configured to cut the new video file at the start point and end point of the first merging segment and at the start point and end point of the second merging segment, to obtain the first merging segment and the second merging segment to be combined;
a segment merging submodule, configured to merge the first merging segment with the second merging segment.
In one embodiment, the merging module further comprises:
a third marking submodule, configured to, when the playback times of the selected first merging segment and second merging segment in the video file intersect, mark the start point of the segment that starts playing first as a first merge point, mark the end point of whichever of the first and second merging segments stops playing last as a second merge point, and record the TS stream fragment files in which the first merge point and the second merge point lie;
a third merging submodule, configured to merge the TS stream fragment files containing the first merge point and the second merge point together with all the TS stream fragment files between them, to generate a new video file.
The processing module further comprises:
a second cutting submodule, configured to cut the new video file at the first merge point and the second merge point, to obtain the merged first merging segment and second merging segment.
The technical solution provided by the embodiments of the present invention has the following beneficial effects: a video file composed of multiple TS stream fragment files is obtained; a video segment is selected from the video file, and all the TS stream fragment files covering the segment are merged to generate a new video file; the new video file is then cut according to the start point and end point of the segment. Because the scheme first merges all the TS stream fragment files covering the segment to be cut, and only then cuts and merges, no video data is lost during the cutting-and-merging edit. The need to redo the edit after a data loss is thus avoided, editing efficiency is improved, and the user experience is enhanced.
Other features and advantages of the present invention will be set forth in the following description, will in part become apparent from the specification, or may be learned by practicing the invention. The objects and other advantages of the invention may be realized and obtained by the structure particularly pointed out in the written specification, the claims and the accompanying drawings.
The technical solution of the present invention is described in further detail below with reference to the drawings and embodiments.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the specification. Together with the embodiments, they serve to explain the invention and are not to be construed as limiting it.
In the drawings:
Fig. 1 is a flow chart of a processing method of a video segment according to an exemplary embodiment of the present invention;
Fig. 2 is a flow chart of an implementation of step S20 in a processing method of a video segment according to an exemplary embodiment of the present invention;
Fig. 3 is a flow chart of an implementation of step S20 in another processing method of a video segment according to an exemplary embodiment of the present invention;
Fig. 4 is a flow chart of an implementation of step S30 in a processing method of a video segment according to an exemplary embodiment of the present invention;
Fig. 5 is a flow chart of an implementation of steps S20 and S30 in yet another processing method of a video segment according to an exemplary embodiment of the present invention;
Fig. 6 is a block diagram of a processing apparatus of a video segment according to an exemplary embodiment of the present invention;
Fig. 7 is a block diagram of the merging module 62 in a processing apparatus of a video segment according to an exemplary embodiment of the present invention;
Fig. 8 is a block diagram of the merging module 62 in another processing apparatus of a video segment according to an exemplary embodiment of the present invention;
Fig. 9 is a block diagram of the processing module 63 in a processing apparatus of a video segment according to an exemplary embodiment of the present invention;
Fig. 10 is a block diagram of the merging module 62 in yet another processing apparatus of a video segment according to an exemplary embodiment of the present invention;
Fig. 11 is a block diagram of the processing module 63 in yet another processing apparatus of a video segment according to an exemplary embodiment of the present invention.
Detailed description of the embodiments
The preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood that the preferred embodiments described here serve only to illustrate and explain the invention and are not intended to limit it.
The embodiments of the disclosure provide a processing method of a video segment, which protects the video data of a segment from being lost after cutting and merging. As shown in Fig. 1, the method comprises steps S10 to S30:
In step S10, a video file composed of multiple TS (Transport Stream) fragment files is obtained. That is, in the disclosure, the video to be played is composed of multiple TS stream fragment files, each of which is very short, with a duration of about 10 seconds.
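To make this fragment layout concrete, the sketch below maps a playback time to the index of the roughly-10-second fragment that contains it. This is an illustration added for this text, not part of the patent; the function name `fragment_index` and the fixed fragment duration are assumptions.

```python
FRAGMENT_SECONDS = 10.0  # assumed fixed duration of each TS stream fragment file


def fragment_index(play_time: float, frag_dur: float = FRAGMENT_SECONDS) -> int:
    """Return the 0-based index of the TS fragment containing `play_time` seconds."""
    if play_time < 0:
        raise ValueError("play_time must be non-negative")
    return int(play_time // frag_dur)


# A segment starting at 25 s begins in the middle of fragment 2 (which covers
# 20-30 s) -- exactly the mid-fragment cut point the patent says would lose
# data if that fragment were split in place.
```

Under these assumptions, a cut point loses data precisely when it does not fall on a multiple of the fragment duration, which is why the method merges whole fragments first.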
In step S20, a video segment is selected from the video file, and all the TS stream fragment files covering the segment are merged to generate a new video file. That is, a video segment is first selected from the video file and all the TS stream fragment files corresponding to it are determined; those fragment files are then all merged into a single new video file. Cutting the segment out of the new video file generated by this merge cannot lose video data.
In one embodiment, as shown in Fig. 2, step S20 may comprise:
Step S201, selecting a video segment from the video file. That is, if only one segment needs to be cut out of the video file, that segment is first selected and all the TS stream fragment files corresponding to it are determined; at the same time, the TS stream fragment files corresponding to the start point and the end point of the segment are identified.
Step S202, marking the start point and end point of the segment in the TS stream fragment files, and recording the TS stream fragment files in which the start point and end point respectively lie. After the fragment files corresponding to the start point and end point have been determined in step S201, the two points are marked in their respective fragment files, and the fragment files containing them are recorded.
Step S203, merging the TS stream fragment files containing the start point and end point of the segment together with all the TS stream fragment files between them, to generate a new video file. That is, once all the TS stream fragment files covering the segment (the fragment files containing the start point and end point, plus those between them) have been merged into a single new video file, the segment can be cut out of the new file without any loss of video data.
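A minimal sketch of step S203, under assumptions not stated in the patent: fragments have a fixed duration and are named `seg0.ts`, `seg1.ts`, and so on (both the naming scheme and the helper names are hypothetical). Complete MPEG-TS fragment files can be merged by simple byte concatenation, which is what the sketch does:

```python
from pathlib import Path


def covering_fragments(start: float, end: float, frag_dur: float = 10.0) -> list[int]:
    """Indices of every fragment from the one holding `start` through the one holding `end`."""
    if not 0 <= start < end:
        raise ValueError("need 0 <= start < end")
    first = int(start // frag_dur)
    # An end point exactly on a boundary belongs to the preceding fragment.
    last = int((end - 1e-9) // frag_dur)
    return list(range(first, last + 1))


def merge_fragments(frag_dir: Path, indices: list[int], out_path: Path) -> None:
    """Concatenate the listed TS fragment files into one new video file."""
    with out_path.open("wb") as out:
        for i in indices:
            out.write((frag_dir / f"seg{i}.ts").read_bytes())
```

For example, a segment from 25 s to 47 s with 10-second fragments covers fragments 2, 3 and 4, and the merged file then fully contains the segment, so the later cut cannot fall outside it.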
In one embodiment, as shown in Fig. 3, step S20 may also comprise:
Step S204, selecting a first merging segment and a second merging segment from the video file. That is, if two segments need to be cut out of the video file and merged after cutting, a first merging segment and a second merging segment are first selected, all the TS stream fragment files corresponding to each of them are determined, and the fragment files corresponding to the start point and end point of each segment are identified.
Step S205, marking, in the TS stream fragment files, the start points and end points of the first and second merging segments, and recording the TS stream fragment files in which those points respectively lie. After the fragment files corresponding to the start point and end point of each merging segment have been determined in step S204, the points are marked in their respective fragment files, and the fragment files containing them are recorded.
Step S206, merging the TS stream fragment files containing the start point and end point of the first merging segment, all the fragment files between them, the fragment files containing the start point and end point of the second merging segment, and all the fragment files between those, to generate a new video file. That is, once all the TS stream fragment files covering the first and second merging segments have been merged into a single new video file, the two segments can each be cut out of the new file and then merged automatically, with no loss of video data during either the cutting or the merging.
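The fragment list of step S206 can be sketched as interval arithmetic: the merge list is simply the covering fragments of the first merging segment followed by those of the second. The sketch below is self-contained and hypothetical (a fixed fragment duration and these function names are assumptions, not the patent's wording):

```python
def covering_fragments(start: float, end: float, frag_dur: float = 10.0) -> list[int]:
    """Indices of every TS fragment from the one holding `start` through the one holding `end`."""
    first = int(start // frag_dur)
    last = int((end - 1e-9) // frag_dur)  # a boundary end point stays in the previous fragment
    return list(range(first, last + 1))


def merge_list_for_two_segments(seg1: tuple[float, float],
                                seg2: tuple[float, float],
                                frag_dur: float = 10.0) -> list[int]:
    """Fragments to merge for step S206: every fragment of the first merging
    segment followed by every fragment of the second merging segment."""
    return covering_fragments(*seg1, frag_dur) + covering_fragments(*seg2, frag_dur)
```

For instance, merging segments (5 s, 18 s) and (42 s, 55 s) needs fragments 0 and 1 followed by fragments 4 and 5; the fragments between 18 s and 42 s never enter the new file.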
In step S30, the new video file is cut according to the start point and end point of the segment. That is, the segment is cut out of the new video file generated in step S20, which cannot lose video data.
In one embodiment, as shown in Fig. 4, step S30 may comprise:
Step S301, cutting the new video file at the start point and end point of the first merging segment and at the start point and end point of the second merging segment, to obtain the first and second merging segments to be combined. That is, after all the fragment files covering the two merging segments have been merged into a new video file in step S206, the first and second merging segments are each cut out of that new file, and no video data is lost during the cut.
Step S302, merging the first merging segment with the second merging segment. That is, the two segments cut out in step S301 are automatically merged in this step, and likewise no video data is lost during the merge.
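One way to see why the cut in steps S301 and S302 is now safe: after the covering fragments are merged, the segment's start and end points map to offsets inside the new file, measured from the first merged fragment, so both cut points lie strictly inside a file that fully contains the segment. The translation below is a hypothetical sketch (the function name and fixed fragment duration are assumptions):

```python
def offsets_in_merged(start: float, end: float, frag_dur: float = 10.0) -> tuple[float, float]:
    """Map a segment's (start, end) in the original file to playback times
    inside the new file produced by merging the fragments covering the segment."""
    first_fragment = int(start // frag_dur)
    base = first_fragment * frag_dur  # original time at which the merged file begins
    return start - base, end - base
```

So a segment at 25-47 s of the original file sits at 5-27 s of the merged file built from fragments 2-4. In the two-segment case, each segment is located the same way relative to the first fragment of its own run within the concatenation.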
In one embodiment, as shown in Fig. 5, step S20 may also comprise:
Step S207, when the playback times of the selected first and second merging segments in the video file intersect: the start point of the segment that starts playing first is marked as a first merge point, the end point of whichever of the two segments stops playing last is marked as a second merge point, and the TS stream fragment files in which the first and second merge points lie are recorded. That is, when the first and second merging segments selected in step S204 have intersecting playback times, the cutting approach of steps S204 to S206 no longer applies, because the playback times overlap. Since the selected first merging segment starts playing first, its start point is marked as the first merge point and used as the first cut point. The point that plays last must then be found: it is detected whether the end point of the first merging segment or that of the second merging segment comes last in playback order, and whichever end point comes last is marked as the second merge point and used as the second cut point. At the same time, the TS stream fragment files containing the first and second merge points are recorded.
Step S208, merging the TS stream fragment files containing the first merge point and the second merge point together with all the TS stream fragment files between them, to generate a new video file. Because the playback times of the first and second merging segments intersect, the video between the first merge point and the second merge point is continuous; it therefore suffices to merge the fragment files containing the two merge points and all the fragment files between them into a new video file.
In one embodiment, as shown in Fig. 5, step S30 may also comprise:
Step S303, cutting the new video file at the first merge point and the second merge point, to obtain the merged first and second merging segments. That is, after the new video file has been generated in step S208, the video between the first merge point and the second merge point is continuous, so no cut is needed between them; the new file only needs to be cut at the first merge point and at the second merge point. The first and second merging segments obtained by this cut were already continuous, so no further merging is required. No video data is lost during this cutting.
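The overlap case of steps S207 through S303 reduces to interval arithmetic: when the two merging segments' playback ranges intersect, the first merge point is the earlier start, the second merge point is the later end, and one cut at each point yields the already-merged result. A sketch under that reading (function and variable names are hypothetical):

```python
def merge_points(seg1: tuple[float, float], seg2: tuple[float, float]):
    """Return (first merge point, second merge point) if the two segments'
    playback times intersect, otherwise None (the non-overlap path of
    steps S204-S206 applies instead)."""
    (s1, e1), (s2, e2) = seg1, seg2
    if max(s1, s2) >= min(e1, e2):  # playback times do not intersect
        return None
    return min(s1, s2), max(e1, e2)
```

For example, segments (10 s, 30 s) and (25 s, 50 s) intersect on 25-30 s, so the merge points are 10 s and 50 s, and the video between them is one continuous run needing no interior cut.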
With the method provided by the embodiments of the present invention, a video file composed of multiple TS stream fragment files is obtained; a video segment is selected in the video file, and all TS stream fragment files covering that segment are merged to generate a new video file; the new video file is then cut according to the starting point and end point of the segment. Because this scheme first merges all TS stream fragment files covering the segment to be cut, and only then cuts and re-merges, no video data is lost during the cutting and merging steps of editing. This avoids having to redo the editing after data loss, improves editing efficiency, and improves the user experience.
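The merge-then-cut scheme summarized above can be sketched as below. This is an illustrative simplification under stated assumptions: cut positions are modeled as plain byte offsets, and TS files are treated as binary-concatenable. A real implementation would cut on 188-byte TS packet boundaries and repair continuity counters and timestamps; the file names here are hypothetical.

```python
# Simplified sketch of the method of claim 1: merge every TS stream
# fragment file covering the selected segment into one new file, then
# cut that file at the segment's start and end points.
import pathlib
import tempfile

def merge_fragments(fragment_paths, merged_path):
    """Concatenate the TS fragment files into a single new video file."""
    with open(merged_path, "wb") as out:
        for p in fragment_paths:
            out.write(pathlib.Path(p).read_bytes())

def cut(merged_path, start, end, out_path):
    """Cut the merged file between the segment's start and end points (byte offsets)."""
    data = pathlib.Path(merged_path).read_bytes()
    pathlib.Path(out_path).write_bytes(data[start:end])

# Usage: three fragments; the wanted segment spans offsets 2..7 of the merged file.
tmp = pathlib.Path(tempfile.mkdtemp())
for name, payload in [("a.ts", b"AAA"), ("b.ts", b"BBB"), ("c.ts", b"CCC")]:
    (tmp / name).write_bytes(payload)
merge_fragments([tmp / "a.ts", tmp / "b.ts", tmp / "c.ts"], tmp / "merged.ts")
cut(tmp / "merged.ts", 2, 7, tmp / "segment.ts")
print((tmp / "segment.ts").read_bytes())   # b'ABBBC'
```

Because the fragments are merged into one continuous file before cutting, the cut can land anywhere inside the span without dropping a whole boundary fragment, which is the loss the patent's scheme avoids.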
Corresponding to the video segment processing method provided by the embodiments of the present invention, the present invention also provides a video segment processing apparatus. As shown in Figure 6, the apparatus may comprise:
an acquisition module 61, configured to obtain a video file composed of multiple TS stream fragment files;
a merging module 62, configured to select a video segment in the video file and, after merging all TS stream fragment files covering the segment, generate a new video file;
a processing module 63, configured to cut the new video file according to the starting point and end point of the segment.
In one embodiment, as shown in Figure 7, the merging module 62 comprises:
a first selection submodule 621, configured to select a video segment in the video file;
a first marking submodule 622, configured to mark the starting point and end point of the segment in the TS stream fragment files, and to record the TS stream fragment files in which the starting point and end point respectively lie;
a first merging submodule 623, configured to merge the TS stream fragment files at the starting point and end point of the segment together with all TS stream fragment files between them, generating a new video file.
In one embodiment, as shown in Figure 8, the merging module 62 further comprises:
a second selection submodule 624, configured to select a first merging video segment and a second merging video segment in the video file;
a second marking submodule 625, configured to mark, in the TS stream fragment files, the starting points and end points of the first merging video segment and the second merging video segment, and to record the TS stream fragment files in which those starting points and end points respectively lie;
a second merging submodule 626, configured to merge the TS stream fragment files at the starting point and end point of the first merging video segment, the TS stream fragment files between them, the TS stream fragment files at the starting point and end point of the second merging video segment, and the TS stream fragment files between them, generating a new video file.
In one embodiment, as shown in Figure 9, the processing module 63 comprises:
a first cutting submodule 631, configured to cut the new video file at the starting point and end point of the first merging video segment and at the starting point and end point of the second merging video segment, obtaining the first merging video segment and second merging video segment to be combined;
a segment merging submodule 632, configured to combine the first merging video segment and the second merging video segment.
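The non-overlapping two-segment case handled by submodules 631 and 632 (cut the new file at both segments' start and end points, then combine the two cut segments) can be sketched under the same illustrative byte-offset assumption; the offsets and data are hypothetical stand-ins for time-based cut points.

```python
# Sketch of the first cutting submodule and the segment-merging submodule:
# cut the merged file at the start and end points of each of the two
# selected segments, then join the two cut segments into one result.
def cut_and_combine(merged: bytes, seg1: tuple, seg2: tuple) -> bytes:
    """seg1/seg2 are (start, end) byte offsets; returns the combined segment data."""
    (s1, e1), (s2, e2) = seg1, seg2
    return merged[s1:e1] + merged[s2:e2]

merged = b"AAABBBCCCDDD"
print(cut_and_combine(merged, (0, 3), (6, 9)))   # b'AAACCC'
```

In the overlapping case of Figures 10 and 11 the two spans form one continuous range, so only the outer two cut points are needed and no combining step follows.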
In one embodiment, as shown in Figure 10, the merging module 62 further comprises:
a third marking submodule 627, configured to, when the playback times of the selected first merging video segment and second merging video segment intersect within the video file, mark the starting point of the first merging video segment, whose playback start time comes first, as a first merge point, mark the end point of whichever of the first merging video segment or second merging video segment has the later playback end time as a second merge point, and record the TS stream fragment files at the first merge point and the second merge point;
a third merging submodule 628, configured to merge the TS stream fragment files at the first merge point and the second merge point together with all TS stream fragment files between them, generating a new video file.
In one embodiment, as shown in Figure 11, the processing module 63 further comprises:
a second cutting submodule 633, configured to cut the new video file at the first merge point and the second merge point, obtaining the merged first merging video segment and second merging video segment.
Those skilled in the art will appreciate that embodiments of the invention may be provided as a method, a system, or a computer program product. Accordingly, the invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the invention may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in that memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a series of operational steps to be performed on the computer or other programmable device to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the invention without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the invention and their technical equivalents, the invention is intended to cover them as well.

Claims (10)

1. A processing method for a video segment, characterized in that it comprises:
obtaining a video file composed of multiple TS stream fragment files;
selecting a video segment in the video file and, after merging all TS stream fragment files covering the segment, generating a new video file;
cutting the new video file according to the starting point and end point of the segment.
2. The method of claim 1, characterized in that selecting a video segment in the video file and, after merging all TS stream fragment files covering the segment, generating a new video file comprises:
selecting a video segment in the video file;
marking the starting point and end point of the segment in the TS stream fragment files, and recording the TS stream fragment files in which the starting point and end point respectively lie;
merging the TS stream fragment files at the starting point and end point of the segment together with all TS stream fragment files between them, generating a new video file.
3. The method of claim 1, characterized in that selecting a video segment in the video file and, after merging all TS stream fragment files covering the segment, generating a new video file further comprises:
selecting a first merging video segment and a second merging video segment in the video file;
marking, in the TS stream fragment files, the starting points and end points of the first merging video segment and the second merging video segment, and recording the TS stream fragment files in which those starting points and end points respectively lie;
merging the TS stream fragment files at the starting point and end point of the first merging video segment, the TS stream fragment files between them, the TS stream fragment files at the starting point and end point of the second merging video segment, and the TS stream fragment files between them, generating a new video file.
4. The method of claim 3, characterized in that cutting the new video file according to the starting point and end point of the segment comprises:
cutting the new video file at the starting point and end point of the first merging video segment and at the starting point and end point of the second merging video segment, obtaining the first merging video segment and second merging video segment to be combined;
combining the first merging video segment and the second merging video segment.
5. The method of claim 3, characterized in that selecting a video segment in the video file and, after merging all TS stream fragment files covering the segment, generating a new video file further comprises:
when the playback times of the selected first merging video segment and second merging video segment intersect within the video file, marking the starting point of the first merging video segment, whose playback start time comes first, as a first merge point, marking the end point of whichever of the first merging video segment or second merging video segment has the later playback end time as a second merge point, and recording the TS stream fragment files at the first merge point and the second merge point;
merging the TS stream fragment files at the first merge point and the second merge point together with all TS stream fragment files between them, generating a new video file;
and in that cutting the new video file according to the starting point and end point of the segment further comprises:
cutting the new video file at the first merge point and the second merge point, obtaining the merged first merging video segment and second merging video segment.
6. A processing apparatus for a video segment, characterized in that it comprises:
an acquisition module, configured to obtain a video file composed of multiple TS stream fragment files;
a merging module, configured to select a video segment in the video file and, after merging all TS stream fragment files covering the segment, generate a new video file;
a processing module, configured to cut the new video file according to the starting point and end point of the segment.
7. The apparatus of claim 6, characterized in that the merging module comprises:
a first selection submodule, configured to select a video segment in the video file;
a first marking submodule, configured to mark the starting point and end point of the segment in the TS stream fragment files, and to record the TS stream fragment files in which the starting point and end point respectively lie;
a first merging submodule, configured to merge the TS stream fragment files at the starting point and end point of the segment together with all TS stream fragment files between them, generating a new video file.
8. The apparatus of claim 6, characterized in that the merging module further comprises:
a second selection submodule, configured to select a first merging video segment and a second merging video segment in the video file;
a second marking submodule, configured to mark, in the TS stream fragment files, the starting points and end points of the first merging video segment and the second merging video segment, and to record the TS stream fragment files in which those starting points and end points respectively lie;
a second merging submodule, configured to merge the TS stream fragment files at the starting point and end point of the first merging video segment, the TS stream fragment files between them, the TS stream fragment files at the starting point and end point of the second merging video segment, and the TS stream fragment files between them, generating a new video file.
9. The apparatus of claim 8, characterized in that the processing module comprises:
a first cutting submodule, configured to cut the new video file at the starting point and end point of the first merging video segment and at the starting point and end point of the second merging video segment, obtaining the first merging video segment and second merging video segment to be combined;
a segment merging submodule, configured to combine the first merging video segment and the second merging video segment.
10. The apparatus of claim 8, characterized in that the merging module further comprises:
a third marking submodule, configured to, when the playback times of the selected first merging video segment and second merging video segment intersect within the video file, mark the starting point of the first merging video segment, whose playback start time comes first, as a first merge point, mark the end point of whichever of the first merging video segment or second merging video segment has the later playback end time as a second merge point, and record the TS stream fragment files at the first merge point and the second merge point;
a third merging submodule, configured to merge the TS stream fragment files at the first merge point and the second merge point together with all TS stream fragment files between them, generating a new video file;
and in that the processing module further comprises:
a second cutting submodule, configured to cut the new video file at the first merge point and the second merge point, obtaining the merged first merging video segment and second merging video segment.
CN201510219043.3A 2015-04-30 2015-04-30 A video segment processing method and apparatus Expired - Fee Related CN104822087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510219043.3A CN104822087B (en) 2015-04-30 2015-04-30 A video segment processing method and apparatus

Publications (2)

Publication Number Publication Date
CN104822087A true CN104822087A (en) 2015-08-05
CN104822087B CN104822087B (en) 2017-11-28

Family

ID=53732232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510219043.3A Expired - Fee Related CN104822087B (en) 2015-04-30 2015-04-30 A video segment processing method and apparatus

Country Status (1)

Country Link
CN (1) CN104822087B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101901620A (en) * 2010-07-28 2010-12-01 复旦大学 Automatic generation method and edit method of video content index file and application
CN102196008A (en) * 2010-03-08 2011-09-21 株式会社日立制作所 Peer-to-peer downloading method, video equipment and content transmission method
CN102694966A (en) * 2012-03-05 2012-09-26 天津理工大学 Construction method of full-automatic video cataloging system
CN102780878A (en) * 2011-05-09 2012-11-14 腾讯科技(深圳)有限公司 Method and system for acquiring media files
CN103096184A (en) * 2013-01-18 2013-05-08 深圳市龙视传媒有限公司 Method and device for video editing
CN103745736A (en) * 2013-12-27 2014-04-23 宇龙计算机通信科技(深圳)有限公司 Method of video editing and mobile terminal thereof
CN103763638A (en) * 2014-01-23 2014-04-30 中国联合网络通信集团有限公司 Video resource obtaining method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105828096B (en) * 2016-05-19 2020-05-15 网宿科技股份有限公司 Method and device for processing media stream file
WO2019023967A1 (en) * 2017-08-02 2019-02-07 深圳传音通讯有限公司 Video recording method and video recording device based on intelligent terminal
CN109413486A (en) * 2018-08-30 2019-03-01 安徽四创电子股份有限公司 Multimedia file based on video mark splices playback method
CN111031385A (en) * 2019-12-20 2020-04-17 北京爱奇艺科技有限公司 Video playing method and device
CN111031385B (en) * 2019-12-20 2022-03-08 北京爱奇艺科技有限公司 Video playing method and device

Also Published As

Publication number Publication date
CN104822087B (en) 2017-11-28

Similar Documents

Publication Publication Date Title
CN104822087A (en) Processing method and apparatus of video segment
CN104185088B (en) A kind of method for processing video frequency and device
CN104869477A (en) Method and device for segmented playing of video
CN104780456A (en) Video dotting and playing method and device
CN105025358A (en) Video playing method and device based on EPG
CN105472207A (en) Method and device for video audio file rendering
CN105530534B (en) A kind of method and apparatus of video clipping
CN104506920A (en) Method and device for playing omnimedia data information
CN104935975A (en) Short video record playing method and apparatus
CN104918051A (en) Video processing method and device
CN104837074B (en) A kind of method to set up and device for showing the time
CN103686343A (en) A method and an apparatus for positioning scenes in a recorded video
JP2016189613A (en) Moving image data editing device, moving image data editing method, reproducer, and program
CN104837061A (en) Method and device for modifying and managing video playlist
CN104994435A (en) Method and device for accurately dotting video resources
CN103514196B (en) Information processing method and electronic equipment
CN105578260A (en) Video editing method and device
CN105611401A (en) Video cutting method and video cutting device
CN104994434A (en) Video playing method and device
CN104683882A (en) Generation and play method and device for multiple speed file of stream medium
CN104853245A (en) Movie preview method and device thereof
CN107948720A (en) A kind of news acquisition methods and device
CN103096164A (en) Handling method and handling device of live broadcast stream
CN104703038A (en) Multimedia processing method and device
CN105049951A (en) Editing method and device for editing list

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A video segment processing method and device

Effective date of registration: 20210104

Granted publication date: 20171128

Pledgee: Inner Mongolia Huipu Energy Co.,Ltd.

Pledgor: WUXI TVMINING MEDIA SCIENCE & TECHNOLOGY Co.,Ltd.

Registration number: Y2020990001517

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20171128

Termination date: 20210430