CN105791974A - Video matching method and apparatus - Google Patents
- Publication number
- CN105791974A CN105791974A CN201410814202.XA CN201410814202A CN105791974A CN 105791974 A CN105791974 A CN 105791974A CN 201410814202 A CN201410814202 A CN 201410814202A CN 105791974 A CN105791974 A CN 105791974A
- Authority
- CN
- China
- Prior art keywords
- video
- compared
- coding
- frame data
- code stream
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The invention discloses a video matching method. The method comprises the following steps: extracting the coding identifiers of the videos to be compared, and determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same; when the coding types are the same, extracting the coding parameters of the videos to be compared; comparing the coding parameters of the videos to be compared; and when the coding parameters match, determining that the videos to be compared match. The invention further discloses a video matching apparatus. The disclosed method and apparatus improve both the efficiency and the accuracy of video matching.
Description
Technical field
The present invention relates to the field of video technology, and in particular to a video matching method and apparatus.
Background technology
With the continuous improvement of computer and digital media technology, video matching technology has become increasingly important. A video is a set of frames that are continuous in time; it is a composite medium integrating images, sound and text. Conventional video matching schemes are based primarily on video content: all video files must be browsed, different users extract different video content from their own perspectives, the extracted content is analyzed, matching is performed on that content, and the matching result is used to judge whether the video content is identical. However, because traditional video matching must browse all video files, must extract content at all time points for different user perspectives, and must match the extracted content against every video file, the matching time is long and the computation load is heavy, and the content extracted at different time points introduces a certain error into the matching process.
Summary of the invention
The main object of the present invention is to propose a video matching method and apparatus, intended to solve the technical problem of low video matching efficiency and accuracy.
To achieve the above object, the present invention provides a video matching method comprising the following steps:
extracting the coding identifier of each video to be compared, and determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same;
when the coding types of the videos to be compared are the same, extracting the coding parameters of the videos to be compared;
comparing the coding parameters of the videos to be compared;
when the coding parameters of the videos to be compared match, determining that the videos to be compared match.
Preferably, the step of extracting the coding parameters of the videos to be compared when their coding types are the same includes:
when the coding types of the videos to be compared are the same, extracting the audio code stream and the image code stream of each video to be compared;
when an audio code stream and/or an image code stream is extracted, extracting the coding parameters of the audio code stream and/or the image code stream.
Preferably, the video matching method further includes:
when the coding types of the videos to be compared differ, extracting the I-frame data at the same time points in the videos to be compared;
obtaining the image parameters corresponding to each set of I-frame data;
comparing the image parameters corresponding to the I-frame data at the same time points in the videos to be compared;
when the image parameters corresponding to the I-frame data at the same time points in the videos to be compared match, determining that the videos to be compared match.
Preferably, the step of obtaining the image parameters corresponding to each set of I-frame data includes:
obtaining the face information and the colouring information in the I-frame data;
using the extracted face information and colouring information as the image parameters corresponding to the I-frame data.
In addition, to achieve the above object, the present invention also proposes a video matching apparatus, the video matching apparatus comprising:
an extraction module, for extracting the coding identifier of each video to be compared;
a determination module, for determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same;
the extraction module being further operable to extract the coding parameters of the videos to be compared when their coding types are the same;
a comparison module, for comparing the coding parameters of the videos to be compared;
a processing module, for determining that the videos to be compared match when their coding parameters match.
Preferably, the extraction module is further operable, when the coding types of the videos to be compared are the same, to extract the audio code stream and the image code stream of each video to be compared, and, when an audio code stream and/or an image code stream is extracted, to extract the coding parameters of the audio code stream and/or the image code stream.
Preferably, the extraction module is further operable, when the coding types of the videos to be compared differ, to extract the I-frame data at the same time points in the videos to be compared. The video matching apparatus further includes an acquisition module, for obtaining the image parameters corresponding to each set of I-frame data; the comparison module is further operable to compare the image parameters corresponding to the I-frame data at the same time points in the videos to be compared; and the processing module is further operable to determine that the videos to be compared match when the image parameters corresponding to the I-frame data at the same time points match.
Preferably, the acquisition module includes:
an acquiring unit, for obtaining the face information and the colouring information in the I-frame data;
a processing unit, for using the extracted face information and colouring information as the image parameters corresponding to the I-frame data.
With the video matching method and apparatus proposed by the present invention, when the coding types of the videos to be compared are the same, whether the videos match is determined directly from their coding parameters, without extracting and comparing large amounts of content data; the method is therefore both efficient and accurate.
Accompanying drawing explanation
Fig. 1 is a schematic flowchart of a first embodiment of the video matching method of the present invention;
Fig. 2 is a schematic flowchart of a second embodiment of the video matching method of the present invention;
Fig. 3 is a functional block diagram of a first embodiment of the video matching apparatus of the present invention;
Fig. 4 is a functional block diagram of a second embodiment of the video matching apparatus of the present invention.
The realization of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the invention
It should be appreciated that the specific embodiments described herein are intended only to explain the present invention, not to limit it.
The present invention provides a kind of video matching method.
Referring to Fig. 1, Fig. 1 is a schematic flowchart of the first embodiment of the video matching method of the present invention.
This embodiment proposes a video matching method, the video matching method including the steps of:
Step S10: extracting the coding identifier of each video to be compared, and determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same;
Step S20: when the coding types of the videos to be compared are the same, extracting the coding parameters of the videos to be compared.
The coding parameters may include parameter information such as the video's Width, Height, Frame_rate, Bit_depth, Bit_rate (bitrate), Block_align (audio alignment), Sample_rate (audio sample rate), Channel (number of audio channels), Extra_data (file-header data), Extradata_size (size in bytes of Extra_data) and Metadata (video source data). When the coding parameters are extracted, the value corresponding to each piece of parameter information can be obtained. Those skilled in the art will appreciate that, because coding types differ, the parameter information contained in the coding parameters of videos with different coding types also differs.
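The coding parameters listed above can be represented as a simple record and compared field by field. A minimal sketch in Python (the class and field names are illustrative choices based on the description, not taken from the patent; actually reading these values out of a file header, e.g. with a probing tool, is outside this sketch):

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass(frozen=True)
class CodingParams:
    """One video's coding parameters, per the parameter list above."""
    width: int
    height: int
    frame_rate: float
    bit_depth: int
    bit_rate: int
    block_align: Optional[int] = None   # audio alignment; absent for video-only streams
    sample_rate: Optional[int] = None   # audio sample rate
    channels: Optional[int] = None      # number of audio channels
    extra_data_size: int = 0            # size in bytes of the file-header extra data

def params_match(a: CodingParams, b: CodingParams) -> bool:
    # Per the description, a match requires every parameter value to be equal.
    return asdict(a) == asdict(b)
```

A different coding type would populate a different subset of these fields, which is why the type check precedes the parameter comparison.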
Those skilled in the art will appreciate that a video contains an audio code stream and/or an image code stream, and that the coding parameters of the audio code stream differ from those of the image code stream; accordingly, step S20 includes:
when the coding types of the videos to be compared are the same, extracting the audio code stream and the image code stream of each video to be compared;
when an audio code stream and/or an image code stream is extracted, extracting the coding parameters of the audio code stream and/or the image code stream.
Step S30: comparing the coding parameters of the videos to be compared;
Step S40: when the coding parameters of the videos to be compared match, determining that the videos to be compared match.
When comparing the coding parameters of the videos to be compared, the values corresponding to each piece of parameter information can be compared individually; the coding parameters of the videos to be compared match when every corresponding parameter value is equal. Step S40 may further comprise: determining that the videos to be compared match or do not match, and outputting the comparison result after the determination. Those skilled in the art will appreciate that, when three or more videos are being compared, both the matching videos among them and the information of the unmatched videos can be displayed.
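The comparison logic above, including the multi-video case, could be sketched as follows (the function name and the plain-dict representation of coding parameters are assumptions for illustration; videos fall into the same group only when they share a coding type and every parameter value is equal):

```python
def group_matching_videos(videos):
    """videos: mapping of video name -> (codec_id, params_dict).

    Returns a list of groups (lists of names); all videos in one group
    match each other: same coding type and all parameter values equal.
    """
    groups = []  # list of (key, [names])
    for name, (codec, params) in videos.items():
        key = (codec, tuple(sorted(params.items())))
        for gkey, members in groups:
            if gkey == key:
                members.append(name)
                break
        else:
            groups.append((key, [name]))
    return [members for _, members in groups]
```

Singleton groups in the result identify the unmatched videos, which is the information the description says can be displayed when more than two videos are compared.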
In the video matching method proposed by this embodiment, when the coding types of the videos to be compared are the same, whether the videos match is determined directly from their coding parameters, without extracting and comparing large amounts of content data; the method is therefore both efficient and accurate.
Further, to improve the accuracy of video comparison, a second embodiment of the video matching method of the present invention is proposed on the basis of the first embodiment, with reference to Fig. 2. In this embodiment, the video matching method further includes:
Step S50: when the coding types of the videos to be compared differ, extracting the I-frame data at the same time points in the videos to be compared.
Video data includes I-frames, P-frames and B-frames. An I-frame is a base frame, usually the first frame of each GOP (group of pictures, the unit of the video compression scheme used by MPEG); it is moderately compressed, serves as a reference point for random access, and can stand alone as a complete image.
In this embodiment, I-frame data at the same time point means I-frame data whose time differences from one another are less than a preset threshold. The I-frame data may be taken at a single time point within the time period shared by the videos to be compared. For example, suppose the videos to be compared include source one, recorded from 10:00 to 12:00, and source two, recorded from 11:00 to 12:00; the shared time period of the two sources is then 11:00-12:00, and 11:30 may be taken as their common time point. Alternatively, the I-frame data may be taken at multiple time points within the shared time period: for the same two sources, 11:00, 11:10, 11:20, 11:30, 11:40, 11:50 and 12:00 may all be taken as common time points. If the image parameters corresponding to the I-frame data at any one of these time points do not match, the videos to be compared are determined not to match.
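The selection of common time points and the threshold test described above can be sketched as follows (times are expressed in minutes since midnight for simplicity; the step size and threshold values are illustrative assumptions, not values fixed by the patent):

```python
def common_sample_points(span_a, span_b, step_minutes=10):
    """Intersect two recording windows (start, end) in minutes since
    midnight, and return evenly spaced sample points in the shared window."""
    start = max(span_a[0], span_b[0])
    end = min(span_a[1], span_b[1])
    if start > end:
        return []  # the recordings do not overlap
    return list(range(start, end + 1, step_minutes))

def same_time_point(t_a, t_b, threshold=1.0):
    # Two I-frames count as "the same time point" when their time
    # difference is below the preset threshold.
    return abs(t_a - t_b) < threshold
```

Using the example from the description, source one at 10:00-12:00 (600-720 minutes) and source two at 11:00-12:00 (660-720 minutes) share the window 11:00-12:00 and yield the seven sample points 11:00 through 12:00 at ten-minute steps.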
Step S60: obtaining the image parameters corresponding to each set of I-frame data.
In this embodiment, the image parameters corresponding to the I-frame data may include face information, colouring information and the like; that is, step S60 includes: obtaining the face information and the colouring information in the I-frame data, and using the extracted face information and colouring information as the image parameters corresponding to the I-frame data.
The face information may include face features (such as shape), face textures and colour information. The face features may be computed with the AdaBoost algorithm, and the face textures with Gabor filters; both are existing algorithms, so their specific computation is not repeated here. The computation of the colouring information is likewise prior art and is not repeated here.
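As a stand-in for the colouring-information part of the image parameters, a coarse colour-histogram comparison can be sketched as below (the AdaBoost face features and Gabor textures mentioned above are existing, more involved algorithms and are not reproduced here; the bin count and tolerance are illustrative assumptions):

```python
def color_histogram(pixels, bins=4):
    """Coarse normalized RGB histogram of an I-frame's pixels.

    pixels: iterable of (r, g, b) tuples with 0-255 channel values.
    """
    hist = {}
    bin_size = 256 // bins
    n = 0
    for r, g, b in pixels:
        key = (r // bin_size, g // bin_size, b // bin_size)
        hist[key] = hist.get(key, 0) + 1
        n += 1
    return {k: v / n for k, v in hist.items()}

def histograms_match(h1, h2, tol=0.05):
    # Colouring information "matches" here when the two histograms
    # agree within a small tolerance on every bin.
    keys = set(h1) | set(h2)
    return all(abs(h1.get(k, 0) - h2.get(k, 0)) <= tol for k in keys)
```

In a full implementation this comparison would be one component of the image-parameter match, alongside the face-feature comparison.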
Step S70: comparing the image parameters corresponding to the I-frame data at the same time points in the videos to be compared;
Step S80: when the image parameters corresponding to the I-frame data at the same time points in the videos to be compared match, determining that the videos to be compared match.
Those skilled in the art will appreciate that, when the coding types of the videos to be compared differ, the videos may alternatively be regarded directly as not matching.
When the coding types of the videos to be compared differ, comparing the I-frame data at the same time points makes the comparison of the videos more accurate.
The present invention further provides a video matching apparatus.
Referring to Fig. 3, Fig. 3 is a functional block diagram of the first embodiment of the video matching apparatus of the present invention.
It should be emphasized that, to one skilled in the art, the functional block diagram shown in Fig. 3 is merely an exemplary diagram of a preferred embodiment, and those skilled in the art can easily supplement the functional modules of the video matching apparatus shown in Fig. 3 with new functional modules. The names of the functional modules are self-defined names used only to assist understanding of the program function blocks of the video matching apparatus; they are not used to limit the technical solution of the present invention, the core of which lies in the functions to be achieved by the functional modules, whatever their self-defined names.
This embodiment proposes a video matching apparatus, the video matching apparatus including:
an extraction module 10, for extracting the coding identifier of each video to be compared;
a determination module 20, for determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same;
the extraction module 10 being further operable to extract the coding parameters of the videos to be compared when their coding types are the same.
The coding parameters may include parameter information such as the video's Width, Height, Frame_rate, Bit_depth, Bit_rate (bitrate), Block_align (audio alignment), Sample_rate (audio sample rate), Channel (number of audio channels), Extra_data (file-header data), Extradata_size (size in bytes of Extra_data) and Metadata (video source data). When the coding parameters are extracted, the value corresponding to each piece of parameter information can be obtained. Those skilled in the art will appreciate that, because coding types differ, the parameter information contained in the coding parameters of videos with different coding types also differs.
Those skilled in the art will appreciate that a video contains an audio code stream and/or an image code stream, and that the coding parameters of the audio code stream differ from those of the image code stream; accordingly, the extraction module 10 is further operable, when the coding types of the videos to be compared are the same, to extract the audio code stream and the image code stream of each video to be compared, and, when an audio code stream and/or an image code stream is extracted, to extract the coding parameters of the audio code stream and/or the image code stream.
The apparatus further includes a comparison module 30, for comparing the coding parameters of the videos to be compared, and a processing module 40, for determining that the videos to be compared match when their coding parameters match.
When comparing the coding parameters of the videos to be compared, the values corresponding to each piece of parameter information can be compared individually; the coding parameters of the videos to be compared match when every corresponding parameter value is equal. The processing module 40 is further operable to determine that the videos to be compared match or do not match, and to output the comparison result after the determination. Those skilled in the art will appreciate that, when three or more videos are being compared, both the matching videos among them and the information of the unmatched videos can be displayed.
In the video matching apparatus proposed by this embodiment, when the coding types of the videos to be compared are the same, whether the videos match is determined directly from their coding parameters, without extracting and comparing large amounts of content data; the apparatus is therefore both efficient and accurate.
Further, to improve the accuracy of video comparison, a second embodiment of the video matching apparatus of the present invention is proposed on the basis of the first embodiment, with reference to Fig. 4. In this embodiment, the extraction module 10 is further operable, when the coding types of the videos to be compared differ, to extract the I-frame data at the same time points in the videos to be compared. The video matching apparatus further includes an acquisition module 50, for obtaining the image parameters corresponding to each set of I-frame data; the comparison module 30 is further operable to compare the image parameters corresponding to the I-frame data at the same time points in the videos to be compared; and the processing module 40 is further operable to determine that the videos to be compared match when the image parameters corresponding to the I-frame data at the same time points match.
Video data includes I-frames, P-frames and B-frames. An I-frame is usually the first frame of each GOP (group of pictures, the unit of the video compression scheme used by MPEG); it is moderately compressed, serves as a reference point for random access, and can stand alone as a complete image.
In this embodiment, I-frame data at the same time point means I-frame data whose time differences from one another are less than a preset threshold. The I-frame data may be taken at a single time point within the time period shared by the videos to be compared. For example, suppose the videos to be compared include source one, recorded from 10:00 to 12:00, and source two, recorded from 11:00 to 12:00; the shared time period of the two sources is then 11:00-12:00, and 11:30 may be taken as their common time point. Alternatively, the I-frame data may be taken at multiple time points within the shared time period: for the same two sources, 11:00, 11:10, 11:20, 11:30, 11:40, 11:50 and 12:00 may all be taken as common time points. If the image parameters corresponding to the I-frame data at any one of these time points do not match, the videos to be compared are determined not to match.
In this embodiment, the image parameters corresponding to the I-frame data may include face information, colouring information and the like; that is, the acquisition module 50 includes an acquiring unit, for obtaining the face information and the colouring information in the I-frame data, and a processing unit, for using the extracted face information and colouring information as the image parameters corresponding to the I-frame data. The face information may include face features (such as shape), face textures and colour information; the face features may be computed with the AdaBoost algorithm, and the face textures with Gabor filters, both of which are existing algorithms whose specific computation is not repeated here. The computation of the colouring information is likewise prior art and is not repeated here.
It should be noted that, as used herein, the terms "include", "comprise" and any variants thereof are intended to be non-exclusive, so that a process, method, article or system that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to the process, method, article or system. In the absence of further limitation, an element defined by the statement "including a ..." does not exclude the presence of other identical elements in the process, method, article or system that includes it.
The serial numbers of the above embodiments of the invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the preferable implementation. Based on such an understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, magnetic disk or optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device or the like) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit its scope of patent protection; every equivalent structure or equivalent process transformation made using the contents of the description and drawings of the present invention, or any direct or indirect use in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (8)
1. A video matching method, characterised in that the video matching method comprises the following steps:
extracting the coding identifier of each video to be compared, and determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same;
when the coding types of the videos to be compared are the same, extracting the coding parameters of the videos to be compared;
comparing the coding parameters of the videos to be compared;
when the coding parameters of the videos to be compared match, determining that the videos to be compared match.
2. The video matching method as claimed in claim 1, characterised in that the step of extracting the coding parameters of the videos to be compared when their coding types are the same includes:
when the coding types of the videos to be compared are the same, extracting the audio code stream and the image code stream of each video to be compared;
when an audio code stream and/or an image code stream is extracted, extracting the coding parameters of the audio code stream and/or the image code stream.
3. The video matching method as claimed in claim 1 or 2, characterised in that the video matching method further includes:
when the coding types of the videos to be compared differ, extracting the I-frame data at the same time points in the videos to be compared;
obtaining the image parameters corresponding to each set of I-frame data;
comparing the image parameters corresponding to the I-frame data at the same time points in the videos to be compared;
when the image parameters corresponding to the I-frame data at the same time points in the videos to be compared match, determining that the videos to be compared match.
4. The video matching method as claimed in claim 3, characterised in that the step of obtaining the image parameters corresponding to each set of I-frame data includes:
obtaining the face information and the colouring information in the I-frame data;
using the extracted face information and colouring information as the image parameters corresponding to the I-frame data.
5. A video matching apparatus, characterised in that the video matching apparatus includes:
an extraction module, for extracting the coding identifier of each video to be compared;
a determination module, for determining, based on the coding identifiers, whether the coding types of the videos to be compared are the same;
the extraction module being further operable to extract the coding parameters of the videos to be compared when their coding types are the same;
a comparison module, for comparing the coding parameters of the videos to be compared;
a processing module, for determining that the videos to be compared match when their coding parameters match.
6. The video matching apparatus as claimed in claim 5, characterised in that the extraction module is further operable, when the coding types of the videos to be compared are the same, to extract the audio code stream and the image code stream of each video to be compared, and, when an audio code stream and/or an image code stream is extracted, to extract the coding parameters of the audio code stream and/or the image code stream.
7. The video matching apparatus as claimed in claim 5 or 6, characterised in that the extraction module is further operable, when the coding types of the videos to be compared differ, to extract the I-frame data at the same time points in the videos to be compared; the video matching apparatus further includes an acquisition module, for obtaining the image parameters corresponding to each set of I-frame data; the comparison module is further operable to compare the image parameters corresponding to the I-frame data at the same time points in the videos to be compared; and the processing module is further operable to determine that the videos to be compared match when the image parameters corresponding to the I-frame data at the same time points match.
8. The video matching apparatus as claimed in claim 7, characterised in that the acquisition module includes:
an acquiring unit, for obtaining the face information and the colouring information in the I-frame data;
a processing unit, for using the extracted face information and colouring information as the image parameters corresponding to the I-frame data.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410814202.XA CN105791974B (en) | 2014-12-24 | 2014-12-24 | Video matching method and device |
PCT/CN2014/095112 WO2016101256A1 (en) | 2014-12-24 | 2014-12-26 | Video matching method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410814202.XA CN105791974B (en) | 2014-12-24 | 2014-12-24 | Video matching method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105791974A true CN105791974A (en) | 2016-07-20 |
CN105791974B CN105791974B (en) | 2018-11-02 |
Family
ID=56148984
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410814202.XA | Video matching method and device | 2014-12-24 | 2014-12-24 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN105791974B (en) |
WO (1) | WO2016101256A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101478684A (en) * | 2008-12-31 | 2009-07-08 | 杭州华三通信技术有限公司 | Method and system for detecting integrity of stored video data |
CN102802138A (en) * | 2011-05-25 | 2012-11-28 | 腾讯科技(深圳)有限公司 | Video file processing method and system, and video proxy system |
CN102938840A (en) * | 2012-11-26 | 2013-02-20 | 南京邮电大学 | Key frame quantization parameter selecting method applied to multi-viewpoint video coding system |
US8442117B2 (en) * | 2006-03-27 | 2013-05-14 | Chang Gung University | Method of block matching-based motion estimation in video coding |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102281471B (en) * | 2011-05-24 | 2014-10-01 | 深圳Tcl新技术有限公司 | TV program automatic reminding device and method thereof |
CN102222103B (en) * | 2011-06-22 | 2013-03-27 | 央视国际网络有限公司 | Method and device for processing matching relationship of video content |
2014
- 2014-12-24 CN CN201410814202.XA patent/CN105791974B/en active Active
- 2014-12-26 WO PCT/CN2014/095112 patent/WO2016101256A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2016101256A1 (en) | 2016-06-30 |
CN105791974B (en) | 2018-11-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110008797B (en) | Multi-camera multi-face video continuous acquisition method | |
CN103235956B (en) | A kind of commercial detection method and device | |
JP4725690B2 (en) | Video identifier extraction device | |
CN102611823B (en) | Method and equipment capable of selecting compression algorithm based on picture content | |
Duan et al. | Compact descriptors for visual search | |
CN110267061B (en) | News splitting method and system | |
EP3513310A1 (en) | Summarizing video content | |
CN103631932A (en) | Method for detecting repeated video | |
CN103067713B (en) | Method and system of bitmap joint photographic experts group (JPEG) compression detection | |
CN102682024A (en) | Method for recombining incomplete JPEG file fragmentation | |
CN103020138A (en) | Method and device for video retrieval | |
CN104333732B (en) | A kind of distributed video analysis method and system | |
CN110175591A (en) | A kind of method and system obtaining video similarity | |
CN102301697A (en) | Video identifier creation device | |
CN106162222B (en) | A kind of method and device of video lens cutting | |
CN109040784A (en) | Commercial detection method and device | |
CN101339662B (en) | Method and device for creating video frequency feature data | |
KR20200042979A (en) | Method and System for Non-Identification of Personal Information in Imaging Device | |
CN112434049A (en) | Table data storage method and device, storage medium and electronic device | |
CN103916677A (en) | Advertisement video identifying method and device | |
CN105791974A (en) | Video matching method and apparatus | |
Dalmia et al. | First quantization matrix estimation for double compressed JPEG images utilizing novel DCT histogram selection strategy | |
CN105989063B (en) | Video retrieval method and device | |
CN113569719B (en) | Video infringement judging method and device, storage medium and electronic equipment | |
CN104637496A (en) | Computer system and audio comparison method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||