CN111147884A - Data processing method, device, system, user side and storage medium - Google Patents

Data processing method, device, system, user side and storage medium

Info

Publication number
CN111147884A
CN111147884A (application number CN202010000660.5A)
Authority
CN
China
Prior art keywords
video
repeated
frame
video playing
rendering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010000660.5A
Other languages
Chinese (zh)
Other versions
CN111147884B (en)
Inventor
Zhou Yunpeng (周云鹏)
Current Assignee
Guangzhou Huya Technology Co Ltd
Original Assignee
Guangzhou Huya Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Huya Technology Co Ltd filed Critical Guangzhou Huya Technology Co Ltd
Priority to CN202010000660.5A priority Critical patent/CN111147884B/en
Publication of CN111147884A publication Critical patent/CN111147884A/en
Application granted granted Critical
Publication of CN111147884B publication Critical patent/CN111147884B/en
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/004 Diagnosis, testing or measuring for television systems or their details for digital television systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 Recording operations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk


Abstract

The application provides a data processing method, device, system, user side and storage medium, relating to the technical field of the Internet. A video playing picture is recorded to obtain a video playing file, and the number of frames in each group of repeated video frames in the file is counted. A rendering score of the video playing picture is then obtained from these counts, and the score is used to evaluate how uniformly the video playing picture is rendered.

Description

Data processing method, device, system, user side and storage medium
Technical Field
The present application relates to the field of internet technologies, and in particular, to a data processing method, apparatus, system, user side, and storage medium.
Background
The rendering time of the playing end is the interval at which the playing end renders successive pictures. For example, if the video frame rate (Frame rate) of the playing end is 30 FPS (Frames Per Second), the rendering time of the playing end is 33/34 ms (milliseconds), i.e. roughly 1000/30 ≈ 33.3 ms, so a picture is rendered every 33 or 34 ms.
The rendering time of the playing end affects the fluency of video playing: when the rendering intervals are not uniform, the picture the user watches appears fast at some moments and slow at others, degrading the viewing experience.
However, the fluency of video played at the playing end is currently evaluated simply by the FPS, with a higher FPS taken to mean smoother playback. This method considers only the average frame rate of the playing end, so its evaluation of playback fluency is inaccurate.
Disclosure of Invention
The application aims to provide a data processing method, a data processing device, a data processing system, a user side and a storage medium, which can improve the accuracy of evaluating the fluency of a video playing picture.
In order to achieve the above purpose, the embodiments of the present application employ the following technical solutions:
in a first aspect, an embodiment of the present application provides a data processing method, where the method includes:
recording a video playing picture to obtain a video playing file;
obtaining the number of each repeated video frame in the video playing file;
and obtaining a rendering score of the video playing picture based on the number of each repeated video frame, wherein the rendering score represents the rendering uniformity of the video playing picture.
In a second aspect, an embodiment of the present application provides a data processing apparatus, where the apparatus includes:
the processing module is used for recording a video playing picture to obtain a video playing file;
the processing module is further configured to obtain the number of each repeated video frame in the video playing file;
and the calculation module is used for obtaining a rendering score of the video playing picture based on the number of each repeated video frame, wherein the rendering score represents the rendering uniformity degree of the video playing picture.
In a third aspect, an embodiment of the present application provides a user side, where the user side includes a memory for storing one or more programs; a processor; the one or more programs, when executed by the processor, implement the data processing method described above.
In a fourth aspect, an embodiment of the present application provides a data processing system, including a server and a client as provided in the third aspect, where the server establishes communication with the client.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the data processing method described above.
With the data processing method, device, system, user side and storage medium provided here, a video playing file is obtained by recording the video playing picture, and the number of frames in each group of repeated video frames in the file is counted. A rendering score of the video playing picture is then obtained from these counts, and the score is used to evaluate the rendering uniformity of the picture. Compared with the prior art, using the per-group counts of repeated video frames improves the accuracy of evaluating the fluency of the video playing picture.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and it will be apparent to those skilled in the art that other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 shows a schematic application scenario diagram of a data processing method provided by an embodiment of the present application;
fig. 2 is a schematic structural block diagram of a user side according to an embodiment of the present application;
FIG. 3 is a schematic flow chart diagram of a data processing method provided by the embodiment of the application;
FIG. 4 is a schematic diagram of a video frame included in a video playback file;
FIG. 5 shows a schematic flow diagram of the substeps of step 204 in FIG. 3;
FIG. 6 shows another schematic flow chart of a data processing method provided by an embodiment of the present application;
fig. 7 shows a schematic block diagram of a data processing apparatus according to an embodiment of the present application.
In the figure: 100-user terminal; 101-a memory; 102-a processor; 103-a memory controller; 104-peripheral interfaces; 105-a radio frequency unit; 106-communication bus/signal line; 107-a display unit; 300-a data processing apparatus; 301-a processing module; 302-calculation module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises it.
In an application scene such as video live broadcast, rendering time of a playing end affects user experience of watching live broadcast pictures.
For example, assume the playback frame rate of the playing end is 30 FPS, i.e. the rendering time of the playing end is 33/34 ms. Under ideal conditions, if every rendering interval is 33/34 ms, the rendering time of the playing end is considered uniform.
However, in some possible application scenarios, assume the (n-1)th rendering interval is 16/17 ms, the nth is 50/51 ms, and the (n+1)th is 33/34 ms. Although the overall playback frame rate of the playing end is unchanged (the FPS of the picture has not changed, since the three intervals still average about 33.5 ms), the individual rendering intervals have changed: the time the playing end takes to render each video frame is not uniform, so the video it plays is not smooth and the picture alternates between fast and slow. Evaluating the fluency of the playing end by FPS alone is therefore inaccurate.
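To make the arithmetic concrete, the short sketch below (illustrative Python, not part of the patent; the interval values are assumptions standing in for 16/17 ms, 50/51 ms and 33/34 ms) shows that the uneven sequence still averages to about 30 FPS, while only the spread of the intervals exposes the unevenness:

```python
def mean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

def variance(xs):
    """Population variance, used here to measure interval spread."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Assumed per-frame rendering intervals in milliseconds:
uneven = [16.7, 50.5, 33.3]    # the (n-1)th, nth, (n+1)th intervals above
even = [33.5, 33.5, 33.5]      # the ideal, uniform case

# Both sequences imply the same average frame rate of roughly 30 FPS...
assert round(1000 / mean(uneven)) == round(1000 / mean(even)) == 30
# ...so FPS alone cannot distinguish them; the interval spread can.
print(variance(even))                     # 0.0
print(variance(uneven) > variance(even))  # True
```

This is exactly the gap the patent's rendering score is meant to close: an average rate hides non-uniform intervals, a dispersion measure does not.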
Therefore, based on the above defects, a possible implementation manner provided by the embodiment of the present application is as follows: the number of each repeated video frame in the video playing file is obtained by recording the video playing picture, so that the rendering score of the video playing picture can be obtained based on the number of each repeated video frame, the rendering uniformity degree of the video playing picture can be evaluated by using the rendering score, and the accuracy of evaluating the fluency of the video playing picture is improved.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic application scenario diagram of the data processing method provided by an embodiment of the present application. In an application scenario such as live webcasting, the server, the anchor end and the viewer end are all located in a wireless or wired network, through which they can exchange data. For example, the anchor end can transmit a live video code stream to the server, and the server then sends the code stream to the viewer end so that viewers can watch the webcast video.
In the embodiment of the present application, both the anchor end and the viewer end may be terminal devices such as a smart phone, a Personal Computer (PC) or a tablet computer. The anchor end and the viewer end may be the same kind of device or different kinds; for example, both may be mobile phones, or the anchor end may be a personal computer while the viewer end is a mobile phone.
The data processing method provided by the embodiment of the present application can be applied to the viewer end shown in fig. 1. An application program that cooperates with the anchor end and the server is installed in the viewer end to provide services for users; for example, the viewer end can receive and play the live video code stream sent by the server, so that the user can watch the anchor's live video.
In addition, when playing the live video code stream, the viewer end can execute various functional applications, image processing and the like through the software programs and modules installed in it, so as to implement the data processing method provided by the embodiment of the present application.
It is understood, of course, that fig. 1 is only an illustration showing one anchor end and one viewer end. In some other application scenarios, the server may establish communication with more anchor ends and viewer ends, so that different anchors can transmit their live videos to the server and different viewers can obtain the live videos of different anchors from the server for viewing.
The foregoing takes the viewer end shown in fig. 1 as the execution subject of the data processing method provided in the embodiment of the present application; in some other possible implementations, the anchor end shown in fig. 1 may also serve as the execution subject.
Referring to fig. 2, fig. 2 is a schematic block diagram illustrating a user terminal 100 according to an embodiment of the present application, where the user terminal 100 can be used as a viewer terminal in the application scenario shown in fig. 1, and can also be used as a broadcaster terminal in the application scenario shown in fig. 1.
The user terminal 100 includes a memory 101, one or more processors 102 (only one is shown), a memory controller 103, a peripheral interface 104, a radio frequency unit 105, a display unit 107, and the like. These components communicate with each other via one or more communication buses/signal lines 106.
The memory 101 may be used to store software programs and modules, such as program instructions/modules corresponding to the data processing apparatus provided in the embodiment of the present application, and the processor 102 executes various functional applications, image processing, and the like by running the software programs and modules stored in the memory 101, so as to implement the data processing method provided in the embodiment of the present application.
The Memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. The processor 102 may be a general-purpose processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), a voice processor, a video processor, and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor 102 may be any conventional processor or the like.
The peripheral interface 104 couples various input/output devices to the processor 102 as well as to the memory 101. In some embodiments, the peripheral interface 104, the processor 102, and the memory controller 103 may be implemented in a single chip. In other embodiments of the present application, they may be implemented by separate chips.
The rf unit 105 is used for receiving and transmitting electromagnetic waves, and implementing interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices.
The display unit 107 is used for providing a graphical output interface for a user and displaying image information for the user to view, for example, a video live picture.
It is understood that the structure shown in fig. 2 is only an illustration, and the user terminal 100 may also include more or less components than those shown in fig. 2, or have a different configuration than that shown in fig. 2. The components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
The data processing method provided by the embodiment of the present application is schematically described below by taking the user terminal 100 shown in fig. 2 as the viewer terminal in fig. 1 as an example.
Referring to fig. 3, fig. 3 shows a schematic flowchart of a data processing method provided by an embodiment of the present application, which may include the following steps:
step 202, recording a video playing picture to obtain a video playing file;
step 204, obtaining the number of each repeated video frame in the video playing file;
and step 206, obtaining the rendering score of the video playing picture based on the number of each repeated video frame.
In the embodiment of the application, when the display unit of the user side displays image information such as a live video image, the user side can record and store the video playing image to obtain a video playing file.
Then, the user end may traverse the video frames of the video playing file and, using a Structural Similarity Index (SSIM) calculation or the like, determine video frames whose structural similarity exceeds a set threshold to be a group of repeated video frames, so as to count the number of frames in each group of repeated video frames in the video playing file.
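As a rough sketch of the structural-similarity comparison mentioned above (not the patent's implementation): a single-window "global" SSIM over two flattened grayscale frames can be written as follows, using the conventional K1 = 0.01, K2 = 0.03 stabilising constants. Production code would use a windowed SSIM, such as scikit-image's `structural_similarity`:

```python
def global_ssim(x, y, data_range=255.0):
    """Single-window SSIM over two flattened grayscale frames.

    Simplified sketch: real SSIM slides a local window over the image;
    here one window covers the whole frame.
    """
    n = len(x)
    c1 = (0.01 * data_range) ** 2   # stabilising constants (K1 = 0.01)
    c2 = (0.03 * data_range) ** 2   # (K2 = 0.03)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

# Identical frames score 1.0; a brightened copy scores lower.
frame = [float(p % 256) for p in range(1024)]
print(round(global_ssim(frame, frame), 6))                # 1.0
print(global_ssim(frame, [p + 40 for p in frame]) < 1.0)  # True
```

Two frames would then be treated as repeats when this score exceeds the set threshold.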
Taking the video frames of the video playing file shown in fig. 4 as an example, assume there are eight video frames numbered "1" through "8". If the structural similarity of the first and second frames exceeds the set threshold, they form one group of repeated video frames; if the structural similarity among the third, fourth and fifth frames exceeds the threshold, they form a second group; and if the structural similarity among the sixth, seventh and eighth frames exceeds the threshold, they form a third group.
Then, based on the number of each repeated video frame, a rendering score of the video playing picture can be obtained, and the rendering score represents the rendering uniformity degree of the video playing picture.
For example, in some possible implementations, the higher the rendering score, the less uniform the rendering of the video playing picture; the lower the score, the more uniform the rendering. Alternatively, a score threshold may be set: when the rendering score exceeds the threshold, the video playing picture is judged to be rendered non-uniformly; when it is less than or equal to the threshold, the picture is judged to be rendered uniformly.
Therefore, based on the above design, the data processing method provided in the embodiment of the present application obtains the video playing file by recording the video playing frame, so as to obtain the number of each repeated video frame in the video playing file, and obtain the rendering score of the video playing frame based on the number of each repeated video frame, and further evaluate the rendering uniformity of the video playing frame by using the rendering score.
In step 206, the rendering score of the video playing screen may be calculated and obtained in various ways.
Illustratively, the rendering score of the video playing picture can be obtained according to the difference degree of the respective numbers of all the repeated video frames.
For example, the calculated variance or standard deviation of the respective number of all the repeated video frames can be used as the rendering score of the video playing picture.
For example, continuing the example above: the first and second frames form a group of repeated video frames whose count is 2; the third, fourth and fifth frames form a group whose count is 3; and the sixth, seventh and eighth frames form a group whose count is 3. When the variance of these counts is used as the rendering score, the rendering score of the 8 video frames is 2/9.
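The worked example can be reproduced with a minimal sketch (illustrative Python, not part of the patent):

```python
def rendering_score(group_counts):
    """Rendering score as the population variance of per-group frame counts."""
    n = len(group_counts)
    m = sum(group_counts) / n
    return sum((c - m) ** 2 for c in group_counts) / n

# The three groups of repeated video frames in Fig. 4 contain 2, 3 and 3
# frames respectively; the variance reproduces the score 2/9 from the text.
print(rendering_score([2, 3, 3]))  # ≈ 0.2222 (= 2/9)
```

Perfectly uniform groups, e.g. counts of [3, 3, 3], would score 0, the best possible value under this metric.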
At this time, the higher the rendering score, the less uniform the distribution of counts across the groups of repeated video frames in the video playing picture, i.e. the greater the difference in the number of frames contained in different groups, and the less uniform the rendering; conversely, the lower the rendering score, the more uniform the distribution of counts and the more uniform the rendering.
Of course, the above is only an illustration in which the variance or standard deviation is used to measure the difference in the per-group counts of repeated video frames and thereby obtain the rendering score of the video playing picture. In some other possible implementations of the embodiment of the present application, the rendering score may be calculated in other ways; the embodiment of the present application does not limit the calculation manner of the rendering score.
In addition, in an application scenario such as webcasting, video playing pictures at different points in time may have the same content. In that case, two frames rendered normally at those two points in time could still be treated as repeated rendering, i.e. assigned to the same group of repeated video frames. In practice, however, when two similar video frames are separated by a long time, they generally belong to two different groups; merging them into one group of repeated video frames would reduce the accuracy of the rendering score.
For this reason, when executing step 204, as a possible implementation, starting from the first video frame in the video playing file, all consecutive and similar video frames may be determined as one group of repeated video frames and the number of frames in that group counted, continuing until the last video frame in the video playing file has been traversed.
To illustrate the counting of repeated video frames, take one of the video frames of the video playing file as the target repeated video frame. Referring to fig. 5, which shows a schematic flowchart of the sub-steps of step 204 in fig. 3, step 204 may, as a possible implementation, include the following sub-steps:
step 204-1, calculating the structural similarity of the target repeated video frame and the next video frame;
step 204-2, judging whether the structural similarity is greater than a set threshold: if so, executing step 204-3; if not (less than or equal to the threshold), taking the next video frame as the new target repeated video frame and returning to step 204-1;
and step 204-3, updating the number of the target repeated video frames and deleting the next video frame.
Taking the first frame in fig. 4 as the target repeated video frame, assume its initial count is 1. When step 204 is executed, the structural similarity between the first and second frames is calculated first and compared with the set threshold. If it exceeds the threshold, the count of the target repeated video frame is updated to 2, the second frame is deleted, the third frame becomes the new next video frame, and step 204-1 is executed again: the structural similarity between the first and third frames is calculated and compared with the threshold. If it is less than or equal to the threshold, the third frame (i.e. the next video frame) becomes the new target repeated video frame and step 204-1 continues from there. In this way the count of each group of repeated video frames is obtained, until the last video frame in the video playing file has been traversed.
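The traversal described in steps 204-1 to 204-3 can be sketched as follows (illustrative Python, not part of the patent; `similar` is a hypothetical stand-in for the SSIM comparison, and "deleting" a frame is modelled by skipping it):

```python
def count_repeated_groups(frames, similar, threshold=0.95):
    """Group consecutive similar frames and return each group's size.

    `frames` is a list of decoded frames; `similar(a, b)` returns a
    structural-similarity score in [0, 1] (e.g. an SSIM value).
    """
    counts = []
    i = 0
    while i < len(frames):
        target = frames[i]      # the current target repeated video frame
        count = 1
        j = i + 1
        # Steps 204-1 / 204-2: compare the target with the next frame.
        while j < len(frames) and similar(target, frames[j]) > threshold:
            count += 1          # Step 204-3: update the count...
            j += 1              # ...and "delete" the next frame by skipping it.
        counts.append(count)
        i = j                   # the dissimilar frame becomes the new target
    return counts

# Toy example mirroring Fig. 4: equal values stand in for similar frames.
frames = ["A", "A", "B", "B", "B", "C", "C", "C"]
sim = lambda a, b: 1.0 if a == b else 0.0
print(count_repeated_groups(frames, sim))  # [2, 3, 3]
```

Because each comparison is made only against the current target, frames with the same content but far apart in time land in different groups, which is the behaviour the preceding paragraphs call for.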
It should be noted that, in the application scenario shown in fig. 1, when the viewer end serves as the execution subject of the data processing method provided in the embodiment of the present application, the video playing picture of the viewer end is formed by playing the live code stream that the server issues to the viewer end.
Therefore, as a possible implementation manner, referring to fig. 6 on the basis of fig. 3, fig. 6 shows another schematic flowchart of the data processing method provided by the embodiment of the present application, before executing step 202, the data processing method may further include the following steps:
step 201, playing the received live broadcast code stream to form a video playing picture.
In an application scenario such as webcasting, the anchor end sends the generated live code stream to the server, and the server forwards the live code stream to the viewer end. Correspondingly, the viewer end plays the received live code stream to form a video playing picture, and then executes steps 202, 204 and 206 above to obtain the rendering score of the video playing picture.
As a possible implementation manner, the user side may play the received live code stream in a full-screen mode, so that interference from non-live content is eliminated when the video playing picture is recorded.
In addition, an embodiment of the present application further provides a data processing system including a server and the user side provided by the embodiment of the present application. The server establishes communication with the user side and sends data such as a live code stream to it, so that the user side obtains a rendering score for its video playing picture by executing the steps of the data processing method provided by the embodiment of the present application.
Referring to fig. 7, fig. 7 shows a schematic block diagram of a data processing apparatus 300 according to an embodiment of the present application, where the data processing apparatus 300 includes a processing module 301 and a calculating module 302. Wherein:
the processing module 301 is configured to record a video playing picture to obtain a video playing file;
the processing module 301 is further configured to obtain the number of each repeated video frame in the video playing file;
the calculating module 302 is configured to obtain a rendering score of the video playing picture based on the number of each repeated video frame, where the rendering score represents the rendering uniformity of the video playing picture.
Optionally, as a possible implementation manner, when obtaining the rendering score of the video playing picture based on the number of each repeated video frame, the calculating module 302 is specifically configured to:
obtain the rendering score of the video playing picture according to the degree of difference among the respective numbers of all the repeated video frames.
Optionally, as a possible implementation manner, when obtaining the rendering score of the video playing picture according to the degree of difference among the respective numbers of all the repeated video frames, the calculating module 302 is specifically configured to:
use the calculated variance or standard deviation of the respective numbers of all the repeated video frames as the rendering score of the video playing picture.
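As a minimal sketch of this scoring step, assuming the per-group numbers have already been collected (the function name `rendering_score` is illustrative, not from the patent), the variance can be computed with the standard library:

```python
import statistics

def rendering_score(group_counts):
    """Use the population variance of the per-group repeated-frame
    numbers as the rendering score: perfectly uniform playback (every
    group the same size) scores 0, and larger values indicate less
    uniform rendering of the video playing picture."""
    return statistics.pvariance(group_counts)

# A uniformly rendered recording repeats each picture the same number
# of times, so the score is 0; an uneven one scores higher.
print(rendering_score([2, 2, 2, 2]))  # -> 0.0
print(rendering_score([1, 5, 1, 5]))  # -> 4.0
```

Using `statistics.pstdev` instead would give the standard-deviation variant mentioned above; either way, smaller is more uniform.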
Optionally, as a possible implementation manner, when obtaining the number of each repeated video frame in the video playing file, the processing module 301 is specifically configured to:
starting from the first video frame in the video playing file, determine all consecutive and similar video frames as a group of repeated video frames and count the number of the repeated video frames, until the last video frame in the video playing file is traversed.
Optionally, as a possible implementation manner, when counting the number of the repeated video frames, the processing module 301 is specifically configured to:
calculate the structural similarity between a target repeated video frame and the next video frame, wherein the target repeated video frame is one of all the video frames of the video playing picture;
when the structural similarity is greater than a set threshold, update the number of the target repeated video frame, delete the next video frame, and then continue to execute the step of calculating the structural similarity between the target repeated video frame and the next video frame;
when the structural similarity is less than or equal to the set threshold, take the next video frame as a new target repeated video frame, and continue to execute the step of calculating the structural similarity between the target repeated video frame and the next video frame.
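The structural-similarity check itself can be sketched as follows. This is a simplified, single-window SSIM for grayscale frames, given only for illustration: the name `global_ssim` and the test arrays are hypothetical, the constants are the conventional stabilizers for 8-bit images, and a production implementation would average the statistic over local windows (as scikit-image's `structural_similarity` does).

```python
import numpy as np

C1 = (0.01 * 255) ** 2  # stabilizing constants for 8-bit images
C2 = (0.03 * 255) ** 2

def global_ssim(x: np.ndarray, y: np.ndarray) -> float:
    """Simplified single-window structural similarity between two
    grayscale frames; 1.0 means the frames are identical."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / (
        (mx ** 2 + my ** 2 + C1) * (vx + vy + C2)
    )

# Identical frames score 1; an inverted frame scores much lower.
a = np.tile(np.arange(64, dtype=np.uint8), (64, 1))
assert abs(global_ssim(a, a) - 1.0) < 1e-9
assert global_ssim(a, 255 - a) < 0.5
```

Comparing this value against the set threshold then decides whether the next video frame joins the current group of repeated video frames.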
Optionally, as a possible implementation manner, before the processing module 301 records the video playing picture to obtain the video playing file, the processing module is further configured to:
play the received live broadcast code stream to form the video playing picture.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may also be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods, and computer program products according to embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
If the functions are implemented in the form of software functional modules and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disk.
To sum up, according to the data processing method, apparatus, system, user terminal and storage medium provided in the embodiments of the present application, a video playing file is obtained by recording a video playing picture, and the number of each repeated video frame in the video playing file is then obtained; a rendering score of the video playing picture can thus be obtained based on the number of each repeated video frame, and the rendering uniformity of the video playing picture can be evaluated by using the rendering score.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A method of data processing, the method comprising:
recording a video playing picture to obtain a video playing file;
obtaining the number of each repeated video frame in the video playing file;
and obtaining a rendering score of the video playing picture based on the number of each repeated video frame, wherein the rendering score represents the rendering uniformity of the video playing picture.
2. The method of claim 1, wherein the step of obtaining the rendering score of the video playback screen based on the number of each of the repeated video frames comprises:
and obtaining the rendering score of the video playing picture according to the difference degree of the respective number of the repeated video frames.
3. The method of claim 2, wherein the step of obtaining the rendering score of the video playback screen according to the difference degree of the respective numbers of all the repeated video frames comprises:
and taking the calculated variance or standard deviation of the respective number of all the repeated video frames as the rendering score of the video playing picture.
4. The method of claim 1, wherein the step of obtaining the number of each repeated video frame in the video play-out picture comprises:
and determining all continuous and similar video frames as a group of repeated video frames from the first video frame in the video playing picture, and counting the number of the repeated video frames until the last video frame in the video playing picture is traversed.
5. The method of claim 4, wherein the step of counting the number of repeated video frames comprises:
calculating the structural similarity of a target repeated video frame and a next video frame, wherein the target repeated video frame is one of all video frames of the video playing picture;
when the structural similarity is larger than a set threshold value, updating the number of the target repeated video frames, and after deleting the next video frame, continuing to execute the step of calculating the structural similarity between the target repeated video frames and the next video frame;
and when the structural similarity is smaller than or equal to the set threshold, taking the next video frame as a new target repeated video frame, and continuously executing the step of calculating the structural similarity between the target repeated video frame and the next video frame.
6. The method of claim 1, wherein prior to the step of recording the video playback picture to obtain the video playback file, the method further comprises:
and playing the received live broadcast code stream to form the video playing picture.
7. A data processing apparatus, characterized in that the apparatus comprises:
the processing module is used for recording a video playing picture to obtain a video playing file;
the processing module is further configured to obtain the number of each repeated video frame in the video playing file;
and the calculation module is used for obtaining a rendering score of the video playing picture based on the number of each repeated video frame, wherein the rendering score represents the rendering uniformity degree of the video playing picture.
8. A user terminal, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-6.
9. A data processing system comprising a server and a user terminal according to claim 8, said server establishing communication with said user terminal.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN202010000660.5A 2020-01-02 2020-01-02 Data processing method, device, system, user side and storage medium Active CN111147884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010000660.5A CN111147884B (en) 2020-01-02 2020-01-02 Data processing method, device, system, user side and storage medium


Publications (2)

Publication Number Publication Date
CN111147884A true CN111147884A (en) 2020-05-12
CN111147884B CN111147884B (en) 2021-12-17

Family

ID=70523263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010000660.5A Active CN111147884B (en) 2020-01-02 2020-01-02 Data processing method, device, system, user side and storage medium

Country Status (1)

Country Link
CN (1) CN111147884B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112015644A (en) * 2020-08-25 2020-12-01 百度在线网络技术(北京)有限公司 Screen fluency determination method, device, equipment and medium


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101682796A (en) * 2007-04-03 2010-03-24 英国电讯有限公司 Method and system for video quality assessment
JP2010524318A (en) * 2007-04-03 2010-07-15 ブリティッシュ・テレコミュニケーションズ・パブリック・リミテッド・カンパニー Video quality evaluation method and system
US20130265387A1 (en) * 2012-04-06 2013-10-10 Adobe Systems Incorporated Opt-Keyframe Reconstruction for Robust Video-Based Structure from Motion
US20130300846A1 (en) * 2012-05-14 2013-11-14 Intuitive Surgical Operations, Inc. Method and system for video processing
CN104702968A (en) * 2015-02-17 2015-06-10 华为技术有限公司 Frame loss method for video frame and video sending device
CN105828184A (en) * 2015-08-31 2016-08-03 维沃移动通信有限公司 Video processing method and mobile terminal
CN106792262A (en) * 2016-12-05 2017-05-31 乐视控股(北京)有限公司 Method of transmitting video data and device
CN108174191A (en) * 2017-12-29 2018-06-15 广州虎牙信息科技有限公司 Video fluency test method, computer storage media and terminal
CN110505522A (en) * 2019-09-16 2019-11-26 腾讯科技(深圳)有限公司 Processing method, device and the electronic equipment of video data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN, Jie: "Research on Lighting Simulation and Verification of Outdoor Work Sites Based on BIM Technology", China Master's Theses Full-text Database *



Similar Documents

Publication Publication Date Title
CN110519621B (en) Video recommendation method and device, electronic equipment and computer readable medium
US20150256885A1 (en) Method for determining content for a personal channel
CN107846629B (en) Method, device and server for recommending videos to users
CN116761007A (en) Method for giving virtual gift to multicast live broadcasting room and electronic equipment
CN112929678B (en) Live broadcast method, live broadcast device, server side and computer readable storage medium
CN107018440B (en) Methods, systems, and media for presenting advertisements while buffering video
CN109040830B (en) Live broadcast pause prediction method, switching method and device
CN110519645B (en) Video content playing method and device, electronic equipment and computer readable medium
CN114071179A (en) Live broadcast preview method, device, equipment, program product and medium
CN112929728A (en) Video rendering method, device and system, electronic equipment and storage medium
CN113630614A (en) Game live broadcast method, device, system, electronic equipment and readable storage medium
CN114390308A (en) Interface display method, device, equipment, medium and product in live broadcast process
CN113721807A (en) Information display method and device, electronic equipment and storage medium
CN111147884B (en) Data processing method, device, system, user side and storage medium
CN109714626B (en) Information interaction method and device, electronic equipment and computer readable storage medium
CN107948206B (en) Method and system for downloading and/or uploading multimedia data
CN114257572B (en) Data processing method, device, computer readable medium and electronic equipment
CN114817698A (en) Information pushing method and device, information display method and device, equipment and medium
CN113542856A (en) Reverse playing method, device, equipment and computer readable medium for online video
CN109194975B (en) Audio and video live broadcast stream following method and device
US12015834B2 (en) Methods, systems, and media for streaming video content using adaptive buffers
CN111263179A (en) Live broadcast room drainage method, device, system, server and storage medium
CN114827675A (en) Video data processing method and device for application program
EP3547698A1 (en) Method and device for determining inter-cut time bucket in audio/video
CN111225263B (en) Video playing control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant