CN112235604B - Rendering method and device, computer readable storage medium and electronic device - Google Patents


Info

Publication number
CN112235604B
Authority
CN
China
Prior art keywords
rendering
picture
frame number
video
image sequence
Prior art date
Legal status
Active
Application number
CN202011122501.9A
Other languages
Chinese (zh)
Other versions
CN112235604A (en)
Inventor
谢文政
王毅
Current Assignee
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202011122501.9A priority Critical patent/CN112235604B/en
Publication of CN112235604A publication Critical patent/CN112235604A/en
Application granted granted Critical
Publication of CN112235604B publication Critical patent/CN112235604B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23412 Generating or manipulating the scene composition of objects, e.g. MPEG-4 objects
    • H04N21/234336 Reformatting by media transcoding, e.g. video transformed into a slideshow of still pictures or audio converted into text
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44012 Rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • H04N21/440236 Reformatting by media transcoding, e.g. video transformed into a slideshow of still pictures, audio converted into text

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Embodiments of the invention relate to a rendering method and apparatus, a computer-readable storage medium, and an electronic device in the field of computer technology. The method comprises the following steps: acquiring a video material to be rendered, and converting the video material to be rendered into a plurality of image sequence frames; reading, from the plurality of image sequence frames, the picture whose frame number matches the target rendering frame number in the rendering queue; and assigning the picture to the object map corresponding to the picture, and rendering the object map to which the picture is assigned to obtain an output video. Embodiments of the invention can solve the problem that, when a real-time engine is used to encode video, the picture time of the externally referenced video and the picture rendering time fall out of sync because the picture rendering rate does not match the rate of the external video.

Description

Rendering method and device, computer readable storage medium and electronic device
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a rendering method, a rendering device, a computer-readable storage medium and electronic equipment.
Background
With the development of computer technology, using real-time engines for film production is becoming a trend. Real-time engines originally developed for games are beginning to be applied to film production thanks to their powerful resource management systems, together with rendering tools originally developed for building game levels.
In recent years, many game companies have begun to use real-time engines to produce video clips for content delivery. Owing to standardized resources and rich management tools, a real-time engine greatly improves film production efficiency and reduces production cost, at the price of a slight sacrifice in image quality.
However, the above approach has the following drawback: when a real-time engine is used to encode video, the picture time of the externally referenced video and the picture rendering time fall out of sync because the picture rendering rate does not match the rate of the external video.
Therefore, it is desirable to provide a new rendering method and apparatus.
It is to be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present invention, and therefore may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present invention is to provide a rendering method, a rendering apparatus, a computer-readable storage medium, and an electronic device, which overcome, at least to some extent, the problem that the picture time of an externally referenced video and the picture rendering time are not synchronized, a problem caused by the limitations and disadvantages of the related art.
According to an aspect of the present disclosure, there is provided a rendering method applied to a real-time rendering engine, the rendering method including:
acquiring a video material to be rendered, and converting the video material to be rendered to obtain a plurality of image sequence frames;
reading, from the plurality of image sequence frames, the picture whose frame number matches the target rendering frame number in the rendering queue;
and assigning the picture to the object map corresponding to the picture, and rendering the object map to which the picture is assigned to obtain an output video.
In an exemplary embodiment of the present disclosure, the rendering method further includes:
and converting the floating point number in the rendering queue to obtain the target rendering frame number.
In an exemplary embodiment of the present disclosure, reading, from the plurality of image sequence frames, the picture whose frame number matches the target rendering frame number in the rendering queue includes:
constructing a picture sequence reader, and configuring the picture sequence position and the file name of each image sequence frame in the picture sequence reader; the file name comprises a picture prefix, a picture suffix and an extension;
and reading, from the plurality of image sequence frames by using the picture sequence reader and according to the picture sequence positions and file names, the picture whose frame number matches the target rendering frame number in the rendering queue.
In an exemplary embodiment of the present disclosure, assigning the picture to the object map corresponding to the picture includes:
converting the current picture format of the picture into a preset texture format, and assigning the format-converted picture to the object map corresponding to the picture.
In an exemplary embodiment of the present disclosure, the preset texture format is a format that can be read by the real-time rendering engine.
In an exemplary embodiment of the present disclosure, the rendering method further includes:
storing the plurality of image sequence frames to a preset storage directory; wherein the preset storage directory is located outside the real-time rendering engine.
In an exemplary embodiment of the present disclosure, the rendering method further includes:
playing the output video; and the playing frame rate of the output video is consistent with the input frame rate of the video material to be rendered.
According to an aspect of the present disclosure, there is provided a rendering apparatus applied to a real-time rendering engine, the rendering apparatus including:
the video material conversion module is used for acquiring a video material to be rendered and converting the video material to be rendered to obtain a plurality of image sequence frames;
the picture reading module is used for reading, from the plurality of image sequence frames, the picture whose frame number matches the target rendering frame number in the rendering queue;
and the rendering module is used for assigning the picture to the object map corresponding to the picture, and rendering the object map to which the picture is assigned to obtain an output video.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the rendering method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the rendering method of any of the above via execution of the executable instructions.
On one hand, according to the rendering method provided by embodiments of the invention, a video material to be rendered is acquired and converted into a plurality of image sequence frames; the picture whose frame number matches the target rendering frame number in the rendering queue is read from the plurality of image sequence frames; finally, the picture is assigned to the object map corresponding to the picture, and the object map to which the picture is assigned is rendered to obtain an output video. Because pictures are read according to the target rendering frame number, the frame number of the picture read is guaranteed to match the rendering frame number, which solves the problem that, when a real-time engine is used to encode video, the picture time of the externally referenced video and the picture rendering time fall out of sync because the picture rendering rate does not match the rate of the external video. On another hand, converting the video material to be rendered into image sequence frames and fetching the corresponding picture only when rendering requires it avoids the high overall consumption of a real-time rendering engine caused in the prior art by loading the entire video material, reduces the bandwidth needed for loading pictures, and increases loading speed. On yet another hand, GPU consumption is reduced, further improving rendering speed.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 schematically shows a flow chart of a rendering method according to an exemplary embodiment of the present invention.
Fig. 2 schematically illustrates an example of a rendering process according to an example embodiment of the present invention.
Fig. 3 is a flowchart schematically illustrating a method for reading a picture having the same number of frames as a target rendering frame number from a plurality of image sequence frames according to the target rendering frame number in a rendering queue according to an exemplary embodiment of the present invention.
Fig. 4 schematically shows a flow chart of another rendering method according to an exemplary embodiment of the present invention.
Fig. 5 schematically shows a flow chart of an audio material processing method according to an exemplary embodiment of the present invention.
Fig. 6 schematically shows a block diagram of a rendering apparatus according to an exemplary embodiment of the present invention.
Fig. 7 schematically illustrates an electronic device for implementing the above-described rendering method according to an exemplary embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the invention.
Furthermore, the drawings are merely schematic illustrations of the invention and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The present exemplary embodiment first provides a rendering method, which may be run in a real-time rendering engine or the like; of course, those skilled in the art may also operate the method of the present invention on other platforms as needed, and this is not particularly limited in this exemplary embodiment. Referring to fig. 1, the rendering method may include the steps of:
s110, acquiring a video material to be rendered, and converting the video material to be rendered to obtain a plurality of image sequence frames;
s120, reading pictures with the same frame number as the target rendering frame number from the plurality of image sequence frames according to the target rendering frame number in the rendering queue;
and S130, assigning the picture to the object map corresponding to the picture, and rendering the object map to which the picture is assigned to obtain an output video.
In the rendering method, on one hand, a video material to be rendered is acquired and converted into a plurality of image sequence frames; the picture whose frame number matches the target rendering frame number in the rendering queue is read from the plurality of image sequence frames; finally, the picture is assigned to the object map corresponding to the picture, and the object map to which the picture is assigned is rendered to obtain an output video. Because pictures are read according to the target rendering frame number, the frame number of the picture read is guaranteed to match the rendering frame number, which solves the problem that, when a real-time engine is used to encode video, the picture time of the externally referenced video and the picture rendering time fall out of sync because the picture rendering rate does not match the rate of the external video. On another hand, converting the video material to be rendered into image sequence frames and fetching the corresponding picture only when rendering requires it avoids the high overall consumption of a real-time rendering engine caused in the prior art by loading the entire video material, reduces the bandwidth needed for loading pictures, and increases loading speed. On yet another hand, GPU consumption is reduced, further improving rendering speed.
Hereinafter, each step involved in the rendering method according to the exemplary embodiment of the present invention will be explained and explained in detail with reference to the drawings.
First, an application scenario of the exemplary embodiment of the present invention is explained and explained.
With the improvement of hardware performance, the rendering quality of real-time engines has gradually improved, and applying a real-time rendering engine to film production has become the choice of some film production teams. Moreover, asset management in a real-time engine is more convenient and its workflow is faster; compared with traditional film and television rendering tools, it allows films to be produced at low cost and high efficiency. However, a video-material speed-up problem arises during rendering production with a real-time engine. This patent addresses that speed-up problem in the process of converting a real-time engine into a film rendering engine.
The method solves the video-material speed-up problem when a real-time engine is used as a film and television rendering engine. Most non-real-time video rendering engines, such as C4D and Maya, have already solved this problem; however, real-time engines that do not focus on video rendering generally do not address it when the engine architecture is laid down, for business reasons (real-time engines are mostly used for gaming, live broadcast, and the like).
When a real-time engine is used to encode video, the following problem exists: when an imported VJ video is rendered, the picture plays back far too fast, and the speed differs with every run. Here, a VJ video refers to a video produced by a Visual Jockey (VJ) by splicing visual elements such as video and animation in real time, the core work being splicing and adding effects. To solve this problem, the definitions of frame and time, and the relationship between them in different situations, must first be clarified.
Specifically, a frame is the unit of a single picture in a video. The frame rate is simply the number of picture frames transmitted per second; it can also be understood as the number of times the graphics processor can refresh per second, and is generally denoted FPS (Frames Per Second).
In a real-time rendering engine, a dynamic frame rate is employed. Depending on hardware configuration, software optimization, and rendering content, the frame rate may vary between 30 and 200 FPS. Film and television video, in contrast, generally has a fixed frame rate: 24 FPS is regarded as the visual limit for smooth motion, and video programs usually use fixed rates such as 30, 59.97, or 60 FPS to guarantee absolute fluency.
In a real-time engine, video material generally has a fixed frame rate, while the real-time rendered picture has a dynamic frame number whose actual value is determined by the graphics card. To keep the two sides synchronized, the video material is generally treated as an external reference so that engine seconds stay consistent with video seconds; in this way, the picture seen each second is consistent at runtime (see Table 1: the video time is fixed at 30 frames per second, so the video material can be used as an external reference playing 30 frames per second). In Fig. 2, after one second the real-time engine has rendered 60-90 frames of pictures (for example, from frame 65 to frame 135, or from frame 135 to frame 199), and the smiling face moves from A to B. The external video, after the same second, has advanced a steady 30 frames, with the smiling face also moving from A to B.
When the real-time engine is used as a film rendering engine, the picture time and the rendering time are no longer synchronized during rendering, because the rate at which picture frames appear drops. For example, in Table 1, each frame takes 2-3 seconds to render in a film rendering system; likewise, in a frame-locked video system, reading sequence frames according to the time returned by the rendering engine also takes 2-3 seconds per frame; a real-time rendering engine, by contrast, can render 60-90 frames per second. Meanwhile, the externally referenced video keeps playing at normal speed and supplies its current frame to the rendering engine whenever a frame is needed, with the result that the rendered video plays back far too fast. Details are shown in Table 1 below:
TABLE 1
(Table 1 is published as an image in the original document and is not reproduced here.)
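The fixed-rate external-reference behavior described above can be sketched in a few lines. This is an illustrative sketch only, not engine code; the helper name and the hypothetical per-second frame counts are assumptions:

```python
# Illustrative sketch: with a fixed-rate external video, the frame shown
# at time t depends only on t, so engine time and video time stay in sync
# regardless of how many frames the engine happens to render that second.

def video_frame_at(t_seconds: float, video_fps: int = 30) -> int:
    """Frame of a fixed-rate external video that should be visible at time t."""
    return int(t_seconds * video_fps)

# After one second, the external video has advanced exactly 30 frames,
# whether the engine rendered 60 or 90 frames in that same second.
hypothetical_engine_frame_counts = [60, 90]
for _ in hypothetical_engine_frame_counts:
    assert video_frame_at(1.0) == 30
```

This is exactly why the external reference keeps the smiling face at the same position A-to-B on both sides after one second.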
Next, steps S110 to S130 will be explained and explained.
In step S110, a video material to be rendered is obtained, and the video material to be rendered is converted to obtain a plurality of image sequence frames.
In this example embodiment, first, an externally imported video material to be rendered may be obtained; the video material to be rendered may be VJ material in a 3D scene, or video material from another scene, which this example does not specially limit. Second, the video material to be rendered is converted into a plurality of image sequence frames. It should be noted that converting the video material to be rendered into image sequence frames and fetching the corresponding picture only when rendering requires it avoids the high overall consumption of a real-time rendering engine caused in the prior art by loading the entire video material, reduces the bandwidth needed for loading pictures, and increases loading speed.
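The patent does not say which tool performs the conversion; one common way to dump a video into numbered image sequence frames is an external tool such as ffmpeg. The sketch below only builds the command line (the file names and naming pattern are hypothetical), so that the zero-padded numbering matches the sequence-frame file names the method relies on later:

```python
from pathlib import Path

def build_extract_command(video_path: str, out_dir: str,
                          prefix: str = "frame", digits: int = 5) -> list:
    """Build an ffmpeg command that dumps a video into numbered PNG frames.

    The output pattern (e.g. frame_%05d.png) yields zero-padded file names
    such as frame_00001.png, matching the sequence-frame naming scheme.
    """
    pattern = str(Path(out_dir) / f"{prefix}_%0{digits}d.png")
    return ["ffmpeg", "-i", video_path, pattern]

cmd = build_extract_command("vj_material.mp4", "led_frames")
# cmd could then be executed with subprocess.run(cmd, check=True)
```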
Further, to reduce the consumption of the real-time rendering engine even more, the rendering method further includes: storing the plurality of image sequence frames to a preset storage directory, where the preset storage directory is located outside the real-time rendering engine. In this way, when a certain image sequence frame needs to be loaded, it can be loaded from the preset storage directory according to its file position and file name, which avoids the slow rendering caused by excessive engine consumption from frequently loading image sequence frames, and further improves rendering efficiency.
In step S120, the picture whose frame number matches the target rendering frame number in the rendering queue is read from the plurality of image sequence frames.
In the present exemplary embodiment, to solve the problem that, when a real-time engine is used to encode video, the picture time of the externally referenced video and the picture rendering time fall out of sync because the picture rendering rate does not match the rate of the external video, the frame number at the time of video rendering and encoding must first be obtained. To this end, a floating point number in the rendering queue is converted to obtain the target rendering frame number. Specifically, a floating point number may be registered in the rendering queue to monitor the current rendering frame number; that floating point number may then be converted to obtain the desired target rendering frame number.
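The conversion itself reduces to turning the monitored floating-point value into an integer frame number. The patent does not specify the rounding rule, so truncation is an assumption in this sketch:

```python
def target_frame_number(render_queue_value: float) -> int:
    """Convert the floating-point value monitored in the rendering queue
    into an integer target rendering frame number (truncation assumed;
    the patent does not state which rounding rule is used)."""
    return int(render_queue_value)

# A monitored value of 64.73 would request sequence frame 64.
assert target_frame_number(64.73) == 64
```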
Second, after the target rendering frame number is obtained, a picture sequence reader is used to read, from the plurality of image sequence frames, the picture whose frame number matches the target rendering frame number in the rendering queue. Specifically, referring to fig. 3, this may include step S310 and step S320. Wherein:
in step S310, a picture sequence reader is constructed, and a picture sequence position and a file name of each image sequence frame are configured in the picture sequence reader; the file name comprises a picture prefix, a picture suffix and an extension;
in step S320, a picture having the same number of frames as the target rendering frame number in the rendering queue is read from the plurality of image sequence frames by the picture sequence reader according to the picture sequence position and the file name.
Hereinafter, step S310 and step S320 will be explained and illustrated. In particular, to reduce rendering performance overhead, the video material to be rendered has already been converted into image sequence frames and placed in a directory outside the real-time rendering engine, as described above. To obtain pictures in real time during rendering, a picture sequence reader therefore needs to be constructed; its job is to fetch the picture corresponding to the floating point number (the target rendering frame number) in the rendering queue.
The picture sequence reader needs to be configured with the picture sequence position and the file name of the image sequence frames, where the file name may include a picture prefix, a picture suffix, and an extension. When rendering starts and the engine switches to the next frame picture, the reader derives the current file name (for example, layer3-LED_00001.png) at the picture sequence file position (for example, E:\UE4_Temp\LED_png_VJ\layer3-LED\). With the position and file name determined, the picture is loaded in code, yielding all of the VJ object material maps corresponding to the current rendering frame.
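The file-name derivation above can be sketched as follows; this is an illustration matching the example name layer3-LED_00001.png, and the function name, padding width, and default extension are assumptions.

```python
# Illustrative sketch: building the file name of an image sequence frame
# from the configured picture sequence position, picture prefix, frame
# number, and extension.
import os

def frame_file_path(sequence_dir, prefix, frame_number, extension=".png", digits=5):
    # The frame number is zero-padded so that file names sort in frame order.
    file_name = f"{prefix}_{frame_number:0{digits}d}{extension}"
    return os.path.join(sequence_dir, file_name)

path = frame_file_path(r"E:\UE4_Temp\LED_png_VJ\layer3-LED", "layer3-LED", 1)
print(os.path.basename(path))  # layer3-LED_00001.png
```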
Specifically, reference code for reading the picture is provided in the accompanying figures.
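The picture-loading step can be sketched as a small reader class. This is a minimal illustration, not the patent's actual code (which appears as figures); the class, method, and parameter names are invented, and a temporary directory stands in for the real picture sequence position.

```python
# Illustrative sketch of the picture sequence reader: given the target
# rendering frame number, it loads the matching picture from the preset
# storage directory outside the engine.
from pathlib import Path
import tempfile

class PictureSequenceReader:
    def __init__(self, sequence_dir, prefix, extension=".png", digits=5):
        self.sequence_dir = Path(sequence_dir)
        self.prefix = prefix
        self.extension = extension
        self.digits = digits

    def path_for_frame(self, frame_number):
        name = f"{self.prefix}_{frame_number:0{self.digits}d}{self.extension}"
        return self.sequence_dir / name

    def read(self, frame_number):
        # Only one picture is loaded per rendered frame, shifting cost
        # from CPU/GPU decoding to comparatively cheap hard disk IO.
        return self.path_for_frame(frame_number).read_bytes()

# Demonstration with a temporary directory standing in for the real one.
with tempfile.TemporaryDirectory() as d:
    reader = PictureSequenceReader(d, "layer3-LED")
    reader.path_for_frame(1).write_bytes(b"\x89PNG fake frame 1")
    data = reader.read(1)
    print(reader.path_for_frame(1).name)  # layer3-LED_00001.png
```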
in step S130, the picture is assigned to the object map corresponding to the picture, and the object map to which the picture is assigned is rendered to obtain an output video.
In the present exemplary embodiment, the picture is first assigned to the object map corresponding to it. Specifically, the current picture format of the picture is converted into a preset texture format, and the format-converted picture is assigned to the object map corresponding to the picture, where the preset texture format is a format that the real-time rendering engine can read.
Specifically, the current picture format of the picture is first converted into a texture format readable by the engine; the format-converted picture is then assigned to the object map used for the picture (that is, to the model map used for the picture), so that the picture is displayed correctly in the same frame in real time. The object here is the corresponding VJ video projection model in the three-dimensional scene; the correspondence between object and map allows the map to be projected onto the matching position of the model by UV mapping (mapping different regions of a plane image onto different surfaces of the 3D model). Further, once the object map corresponding to the picture has been assigned, the output video can be played, and the playing frame rate of the output video is consistent with the input frame rate of the video material to be rendered.
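The conversion-and-assignment step can be sketched as follows. This is an illustration only: the format conversion is stubbed out (in an engine such as UE4 it would decode the PNG into an engine texture object), and all names are invented.

```python
# Illustrative sketch of step S130: convert a picture into an
# engine-readable texture format, then assign it to the matching VJ
# object's map so picture and render frame stay in sync.
class VJObject:
    """A video projection model whose map (texture) is updated per frame."""
    def __init__(self, name):
        self.name = name
        self.object_map = None  # texture currently projected via UV mapping

def to_engine_texture(picture_bytes):
    # Stand-in for decoding the picture into the preset texture format.
    return {"format": "engine_texture", "pixels": picture_bytes}

def assign_picture(vj_object, picture_bytes):
    vj_object.object_map = to_engine_texture(picture_bytes)

screen = VJObject("layer3-LED")
assign_picture(screen, b"frame-3-pixels")
print(screen.object_map["format"])  # engine_texture
```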
Specifically, after rendering starts, the output video is normal. Moreover, when multiple video sequence-frame streams are added and assigned to screens in the scene, the load on the whole system is found to drop substantially: whereas a real-time engine conventionally has to keep each video playing, this method only loads a single picture per video per rendered frame. Overall consumption falls and shifts from CPU and GPU consumption to hard disk IO, and the disk bandwidth required for loading is very low at solid-state speeds.
Hereinafter, the rendering method according to the exemplary embodiment of the present invention will be further explained and illustrated with reference to fig. 4. Referring to fig. 4, the rendering method may include the following steps:
step S410, acquiring the frame number at which the video is rendered and encoded, and using that frame number to read the corresponding VJ video frame;
step S420, constructing a picture sequence reader, and using it to acquire the corresponding picture;
step S430, after reading the picture required by the corresponding rendering frame, assigning the picture to its object map so that picture and render frame coincide correctly;
step S440, starting rendering, with the output video normal, and playing the output video.
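Steps S410 to S440 can be sketched as a single per-frame loop. This is an end-to-end illustration only; the state object, reader stub, and labelled byte strings standing in for pictures are all invented for the sketch.

```python
# Illustrative end-to-end sketch of steps S410-S440.
class DemoState:
    def __init__(self, frame_rate):
        self.float_counter = 0.0  # floating point number in the queue
        self.frame_rate = frame_rate

def read_picture(frame_number):
    # S420: stand-in for the picture sequence reader.
    return f"picture-{frame_number:05d}".encode()

def render_sequence(frame_count, frame_rate=25):
    state = DemoState(frame_rate)
    object_maps = []
    for _ in range(frame_count):
        n = int(state.float_counter)   # S410: target rendering frame number
        picture = read_picture(n)      # S420: read the matching picture
        object_maps.append(picture)    # S430: assign to the object map
        state.float_counter += 1.0     # advance exactly one frame
    return object_maps                 # S440: frames of the output video

frames = render_sequence(3)
print(frames[-1])  # b'picture-00002'
```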
The rendering method provided by the exemplary embodiment of the present invention, on the one hand, solves the frame synchronization problem that arises when a real-time engine is used as a movie rendering engine for video material; the scheme is not limited to a platform and applies to the analogous problems encountered whenever a real-time engine renders video material. On the other hand, it can also address the audio acceleration problem when a real-time engine renders audio. Finally, it reduces the performance overhead of operation and rendering, saves GPU consumption, makes the production process smoother, and improves rendering speed; the saved performance can serve as headroom for scene optimization.
Fig. 5 schematically shows a flow chart of an audio material processing method according to an exemplary embodiment of the present invention. Referring to fig. 5, the audio material processing method may include the steps of:
step S510, obtaining an audio material to be rendered, and converting the audio material to be rendered to obtain a plurality of audio sampling sequences;
step S520, reading, from the plurality of audio sampling sequences, the target sampling sequence whose number matches the target rendering frame number in the rendering queue, using an audio sequence reader;
step S530, assigning the target sampling sequence to the audio model corresponding to it, and rendering the audio model to which the target sampling sequence has been assigned to obtain output audio.
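The audio analogue can be sketched as follows. The block layout, sample rate, and frame rate are assumptions for illustration: the audio material is split into per-frame sample blocks, and the block whose index equals the target rendering frame number is read.

```python
# Illustrative sketch of the audio material processing of fig. 5.
SAMPLE_RATE = 48000
FRAME_RATE = 25
SAMPLES_PER_FRAME = SAMPLE_RATE // FRAME_RATE  # 1920 samples per video frame

def audio_block_for_frame(samples, target_frame_number):
    # Read the sampling sequence whose index equals the target rendering
    # frame number, keeping audio and rendered picture in sync.
    start = target_frame_number * SAMPLES_PER_FRAME
    return samples[start:start + SAMPLES_PER_FRAME]

samples = list(range(SAMPLE_RATE))        # one second of fake samples
block = audio_block_for_frame(samples, 2)
print(block[0], len(block))  # 3840 1920
```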
In the exemplary embodiment schematically illustrated in fig. 5, because the target sampling sequence is read according to the target rendering frame number, the number of sequences read is guaranteed to match the rendering frame number. This solves the problem that, when a real-time engine is used to encode audio, the rendering rate of the audio differs from the rate of the externally referenced audio, so the display time of the external audio and the time of the output audio fall out of sync.
The embodiment of the invention also provides a rendering device which is applied to the real-time rendering engine. Referring to fig. 6, the rendering apparatus may include a video material conversion module 610, a picture reading module 620, and a rendering module 630. Wherein:
the video material conversion module 610 may be configured to obtain a video material to be rendered, and convert the video material to be rendered to obtain a plurality of image sequence frames;
the picture reading module 620 may be configured to read, from the plurality of image sequence frames, a picture with the same number of frames as the target rendering frame number according to the target rendering frame number in the rendering queue;
the rendering module 630 may be configured to assign the picture to the object map corresponding to the picture, and render the object map to which the picture has been assigned to obtain an output video.
In an exemplary embodiment of the present disclosure, the rendering apparatus may further include:
the floating point number conversion module may be configured to convert floating point numbers in the rendering queue to obtain the target rendering frame number.
In an exemplary embodiment of the present disclosure, the reading, from the plurality of image sequence frames, a picture having the same number of frames as the target rendering frame number according to the target rendering frame number in a rendering queue includes:
constructing a picture sequence reader, and configuring the picture sequence position and the file name of each image sequence frame in the picture sequence reader; the file name comprises a picture prefix, a picture suffix and an extension;
and reading pictures with the same frame number as the target rendering frame number in the rendering queue from the plurality of image sequence frames by using the picture sequence reader according to the picture sequence positions and the file names.
In an exemplary embodiment of the present disclosure, the assigning the picture to the object map corresponding to the picture includes:
and converting the current picture format of the picture into a preset texture format, and assigning the format-converted picture to the object map corresponding to the picture.
In an exemplary embodiment of the present disclosure, the preset texture format is a format that can be read by the real-time rendering engine.
In an exemplary embodiment of the present disclosure, the rendering apparatus further includes:
the image sequence frame storage module can be used for storing the plurality of image sequence frames to a preset storage directory; wherein the preset storage directory is not set in the real-time rendering engine.
In an exemplary embodiment of the present disclosure, the rendering apparatus further includes:
the output video playing module can be used for playing the output video; and the playing frame rate of the output video is consistent with the input frame rate of the video material to be rendered.
The specific details of each module in the rendering device have been described in detail in the corresponding rendering method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the invention. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present invention are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present invention, there is also provided an electronic device capable of implementing the above method.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may take the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 700 according to this embodiment of the invention is described below with reference to fig. 7. The electronic device 700 shown in fig. 7 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 7, electronic device 700 is embodied in the form of a general purpose computing device. The components of the electronic device 700 may include, but are not limited to: the at least one processing unit 710, the at least one memory unit 720, a bus 730 connecting different system components (including the memory unit 720 and the processing unit 710), and a display unit 740.
Wherein the storage unit stores program code that is executable by the processing unit 710 such that the processing unit 710 performs the steps according to various exemplary embodiments of the present invention as described in the above section "exemplary method" of the present specification. For example, the processing unit 710 may perform step S110 as shown in fig. 1: acquiring a video material to be rendered, and converting the video material to be rendered to obtain a plurality of image sequence frames; step S120: reading pictures with the same frame number as the target rendering frame number from the plurality of image sequence frames according to the target rendering frame number in the rendering queue; step S130: and endowing the picture to an object map corresponding to the picture, and rendering the object map endowed with the picture to obtain an output video.
The storage unit 720 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 7201 and/or a cache memory unit 7202, and may further include a read-only memory unit (ROM) 7203.
The storage unit 720 may also include a program/utility 7204 having a set (at least one) of program modules 7205, such program modules 7205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 730 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 700 may also communicate with one or more external devices 800 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 700, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 700 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 750. Also, the electronic device 700 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 760. As shown, the network adapter 760 communicates with the other modules of the electronic device 700 via the bus 730. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) execute the method according to the embodiment of the present invention.
In an exemplary embodiment of the present invention, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
According to the program product for realizing the method, the portable compact disc read only memory (CD-ROM) can be adopted, the program code is included, and the program product can be operated on terminal equipment, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.

Claims (7)

1. A rendering method applied to a real-time rendering engine, the rendering method comprising:
acquiring a video material to be rendered, and converting the video material to be rendered to obtain a plurality of image sequence frames; storing the plurality of image sequence frames to a preset storage directory; wherein the preset storage directory is not set in the real-time rendering engine;
converting the floating point number in the rendering queue to obtain a target rendering frame number;
reading pictures with the same frame number as the target rendering frame number from the plurality of image sequence frames according to the target rendering frame number in a rendering queue; the method comprises the following steps:
constructing a picture sequence reader, and configuring the picture sequence position and the file name of each image sequence frame in the picture sequence reader; the file name comprises a picture prefix, a picture suffix and an extension;
reading pictures with the same frame number as the target rendering frame number in the rendering queue from the plurality of image sequence frames by using the picture sequence reader according to the picture sequence positions and the file names;
and assigning the picture to the object map corresponding to the picture, and rendering the object map to which the picture has been assigned to obtain an output video.
2. The rendering method according to claim 1, wherein the assigning the picture to the object map corresponding to the picture comprises:
and converting the current picture format of the picture into a preset texture format, and assigning the format-converted picture to the object map corresponding to the picture.
3. The rendering method according to claim 2, wherein the predetermined texture format is a format readable by the real-time rendering engine.
4. The rendering method according to any one of claims 1 to 3, further comprising:
playing the output video; and the playing frame rate of the output video is consistent with the input frame rate of the video material to be rendered.
5. A rendering apparatus, applied to a real-time rendering engine, the rendering apparatus comprising:
the video material conversion module is used for acquiring a video material to be rendered and converting the video material to be rendered to obtain a plurality of image sequence frames;
the image sequence frame storage module is used for storing the plurality of image sequence frames to a preset storage directory; wherein the preset storage directory is not set in the real-time rendering engine;
the floating point number conversion module is used for converting the floating point number in the rendering queue to obtain a target rendering frame number;
the picture reading module is used for reading pictures with the same frame number as the target rendering frame number from the plurality of image sequence frames according to the target rendering frame number in the rendering queue; the method comprises the following steps: constructing a picture sequence reader, and configuring the picture sequence position and the file name of each image sequence frame in the picture sequence reader; the file name comprises a picture prefix, a picture suffix and an extension; reading pictures with the same frame number as the target rendering frame number in the rendering queue from the plurality of image sequence frames by using the picture sequence reader according to the picture sequence positions and the file names;
and the rendering module is used for assigning the picture to the object map corresponding to the picture, and rendering the object map to which the picture has been assigned to obtain an output video.
6. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the rendering method of any one of claims 1 to 4.
7. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the rendering method of any of claims 1-4 via execution of the executable instructions.
CN202011122501.9A 2020-10-20 2020-10-20 Rendering method and device, computer readable storage medium and electronic device Active CN112235604B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011122501.9A CN112235604B (en) 2020-10-20 2020-10-20 Rendering method and device, computer readable storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011122501.9A CN112235604B (en) 2020-10-20 2020-10-20 Rendering method and device, computer readable storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN112235604A CN112235604A (en) 2021-01-15
CN112235604B true CN112235604B (en) 2021-12-10

Family

ID=74118675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011122501.9A Active CN112235604B (en) 2020-10-20 2020-10-20 Rendering method and device, computer readable storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN112235604B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113194266A (en) * 2021-04-28 2021-07-30 深圳迪乐普数码科技有限公司 Image sequence frame real-time rendering method and device, computer equipment and storage medium
CN113316020B (en) * 2021-05-28 2023-09-15 上海曼恒数字技术股份有限公司 Rendering method, device, medium and equipment
CN113421321B (en) * 2021-07-09 2024-03-19 北京七维视觉传媒科技有限公司 Rendering method and device for animation, electronic equipment and medium
CN114860358A (en) * 2022-03-31 2022-08-05 北京达佳互联信息技术有限公司 Object processing method and device, electronic equipment and storage medium
CN114881901A (en) * 2022-04-29 2022-08-09 北京字跳网络技术有限公司 Video synthesis method, device, equipment, medium and product
CN115035228B (en) * 2022-06-08 2023-01-17 北京领为军融科技有限公司 Rendering method for generating texture by reading satellite film file in real time through asynchronous io
CN115018967B (en) * 2022-06-30 2024-05-03 联通智网科技股份有限公司 Image generation method, device, equipment and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8493399B1 (en) * 2012-01-10 2013-07-23 Google Inc. Multiprocess GPU rendering model
CN105120183B (en) * 2015-09-02 2018-09-28 广东建设职业技术学院 A kind of method and system improving material object display fluency
CN106910234A (en) * 2015-12-18 2017-06-30 普联软件股份有限公司 One kind is based on improved 3 d rendering engine Distributed Rendering Environment method and system
US10741143B2 (en) * 2017-11-28 2020-08-11 Nvidia Corporation Dynamic jitter and latency-tolerant rendering
CN108846791B (en) * 2018-06-27 2022-09-20 珠海豹趣科技有限公司 Rendering method and device of physical model and electronic equipment
US10861421B2 (en) * 2018-09-27 2020-12-08 Mediatek Inc. Adaptive control of GPU rendered frame quality
US11164496B2 (en) * 2019-01-04 2021-11-02 Channel One Holdings Inc. Interrupt-free multiple buffering methods and systems
CN111028361B (en) * 2019-11-18 2023-05-02 杭州群核信息技术有限公司 Three-dimensional model, material merging method, device, terminal, storage medium and rendering method

Also Published As

Publication number Publication date
CN112235604A (en) 2021-01-15

Similar Documents

Publication Publication Date Title
CN112235604B (en) Rendering method and device, computer readable storage medium and electronic device
CN111669623B (en) Video special effect processing method and device and electronic equipment
US10499035B2 (en) Method and system of displaying a popping-screen
US10229651B2 (en) Variable refresh rate video capture and playback
CN111899322B (en) Video processing method, animation rendering SDK, equipment and computer storage medium
CN110070496B (en) Method and device for generating image special effect and hardware device
US20160373502A1 (en) Low latency application streaming using temporal frame transformation
CN113368492A (en) Rendering method and device
CN110290398B (en) Video issuing method and device, storage medium and electronic equipment
CN109672902A (en) A kind of video takes out frame method, device, electronic equipment and storage medium
US11893770B2 (en) Method for converting a picture into a video, device, and storage medium
CN112714357A (en) Video playing method, video playing device, electronic equipment and storage medium
CN113453073B (en) Image rendering method and device, electronic equipment and storage medium
CN113411660B (en) Video data processing method and device and electronic equipment
CN110782387A (en) Image processing method and device, image processor and electronic equipment
CN112764877A (en) Method and system for communication between hardware acceleration equipment and process in docker
CN114222185B (en) Video playing method, terminal equipment and storage medium
KR20210135859A (en) Ar remote randering method for real time mr service with volumetric 3d video data
CN113411661B (en) Method, apparatus, device, storage medium and program product for recording information
CN114554269A (en) Data processing method, electronic device and computer readable storage medium
CN111367598B (en) Method and device for processing action instruction, electronic equipment and computer readable storage medium
CN112954452A (en) Video generation method, device, terminal and storage medium
US20140086550A1 (en) System, terminal device, and image capturing method
CN111435995B (en) Method, device and system for generating dynamic picture
CN114615546B (en) Video playing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant