CN111225271A - Multi-engine image capturing and screen recording method based on android set top box platform

Multi-engine image capturing and screen recording method based on android set top box platform

Info

Publication number
CN111225271A
CN111225271A
Authority
CN
China
Prior art keywords
module
engine
texture
rendering
initialization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010058323.1A
Other languages
Chinese (zh)
Inventor
洪清泉
齐培娣
陆一
何涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Wheat Interactive Enterprise Development Co ltd
Original Assignee
Shanghai Wheat Interactive Enterprise Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Wheat Interactive Enterprise Development Co ltd filed Critical Shanghai Wheat Interactive Enterprise Development Co ltd
Priority to CN202010058323.1A priority Critical patent/CN111225271A/en
Publication of CN111225271A publication Critical patent/CN111225271A/en
Pending legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 - Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 - Recording operations
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a multi-engine image capturing and screen recording method based on an android set top box platform. The method comprises an initialization module, a multi-engine rendering unified interface module, a multi-engine general texture generation module, a multi-engine rendering module, a texture video coding module and a stream pushing module; the initialization module comprises set top box android application end integrated environment initialization and server cdn receiving service start initialization. The rendering environment of each engine system is switched, using the hook opengles technique, to an egl environment created by the method itself, and the texture data rendered by each engine is copied to a unified rendering texture through fbo and pbo; finally the unified texture is rendered in the method's own egl rendering environment. All core functions of the method are implemented in C++, the interfaces are unified, and the performance is high.

Description

Multi-engine image capturing and screen recording method based on android set top box platform
Technical Field
The invention relates to the field of computers, in particular to a multi-engine picture-capturing and screen-recording method based on an android set top box platform.
Background
With the gradual popularization of smart televisions, playing games on the television has become more and more popular, and the enjoyment brought by the large-screen experience cannot be matched by a mobile phone or a computer. While playing, users want to capture highlights, record the screen and share them with others in real time. Two kinds of android picture-capturing and screen-recording methods currently exist on the market. The first relies on an application's built-in screen-recording function (for example, the recording function built into an android game) and captures and records the screen with a pure hook opengles technique; although efficient, it is incompatible with some engines (for example, unity versions after 5.2) and is therefore not universal. The second uses general-purpose android screen-capture and screen-recording software that grabs the whole screen; it requires Android 5.0 or above, and the image data must be copied line by line from the gpu rendering pipeline to a cpu cache before being encoded, so the data flows gpu-cpu-gpu and the efficiency is low. Most smart-television set top boxes run a low Android version (Android 4.4 or lower), and set top box applications without a built-in capture and recording function cannot capture or record in real time, which greatly degrades the user experience.
Disclosure of Invention
The object of the invention is to provide a multi-engine picture-capturing and screen-recording method based on an android set top box platform, in view of the deficiencies of the prior art, so as to solve the problems raised in the background art.
In order to achieve this purpose, the invention provides the following technical scheme: a multi-engine image capturing and screen recording method based on an android set top box platform comprises an initialization module, a multi-engine rendering unified interface module, a multi-engine general texture generation module, a multi-engine rendering module, a texture video coding module and a stream pushing module. The initialization module comprises set top box android application end integrated environment initialization and server cdn receiving service start initialization. The multi-engine rendering unified interface module comprises a module defining the position of the unified interface in each engine and a module defining the meaning of the unified interface parameters for each engine layer. The multi-engine general texture generation module comprises a multi-engine texture acquisition module, a multi-engine texture parameter generation module and an engine-to-opengl texture conversion module. The multi-engine rendering module comprises a multi-engine rendering initialization module, general opengles rendering environment initialization, a multi-engine rendertarget texture acquisition module and an engine-rendertarget-to-unified-texture rendering module. The texture video coding module comprises a soft/hard coding capability analysis module, a hard-coding initialization module, a soft-coding initialization module, an egl rendering environment creation module of the texture-to-coding system, a texture-to-hard-coding MediaCodec module, a texture-to-soft-coding module and a coding result output module. The stream pushing module comprises a network environment analysis module and an audio and video data sending module.
As a preferred technical solution of the present invention, the initialization of the set-top box android application integration environment includes initialization of code and picture resources and integration of sdk interfaces.
As a preferred technical solution of the present invention, the texture-to-soft-coding module includes a texture-to-cpu-cache module.
As a preferred technical solution of the present invention, the initialization module is respectively connected to a multi-engine rendering unified interface module and a multi-engine general texture generation module, both the multi-engine rendering unified interface module and the multi-engine general texture generation module are connected to the multi-engine rendering module, and the texture video coding module is respectively connected to the multi-engine rendering module and the stream pushing module.
As a preferred technical scheme of the invention, the initialization module is used for starting to receive audio and video streams, initializing an environment space and automatically starting a monitoring program.
As a preferred technical solution of the present invention, the multi-engine rendering unified interface module unifies the multi-engine rendering interface, and each engine performs its rendering copy according to the opengles standard; the unified engine interfaces initCapturer, bindFbo, drawTexture, captureFrame, unbindFbo and stopCapturer are defined; an end-of-frame swapGlBuffers method is also defined, in which fbo, pbo and the android gpu shared graphic buffer are used to generate a corresponding rendertarget that replaces the engine's own rendertarget.
As a preferred technical solution of the present invention, the multi-engine general texture generation converts the texture of each engine into a general opengl texture, and this module uniformly outputs a general opengl texture id; the multi-engine rendering module is used to clear the rendering environment of each engine system, switch to the egl environment and hooked rendering environment created by the multi-engine rendering module itself, switch to the module's own rendering Shader script, bind the unified texture of each engine, and copy the texture data rendered by each engine to the unified rendering texture through fbo and pbo.
As a preferred technical solution of the present invention, the texture video coding module creates a surface with createInputSurface of MediaCodec, then creates an egl rendering environment from that surface and renders the unified texture textureId in it, so that the MediaCodec has data input; the MediaCodec then takes the coding buffer index with dequeueOutputBuffer and the encoded data with getOutputBuffers, which is passed to the stream pushing module.
As a preferred technical solution of the present invention, the stream pushing module converts the encoded data obtained by the texture video encoding module into rtmp data, and pushes the rtmp data to the client or cdn server.
The invention has the following beneficial effects: the invention creates a complete set of multi-engine unified rendering interfaces and multi-engine general texture generation interfaces; the rendering environment of each engine system is switched, using the hook opengles technique, to an egl environment created by the method itself, and the texture data rendered by each engine is copied to a unified rendering texture through fbo and pbo; finally the unified texture is rendered in the method's own egl rendering environment. All core functions of the method are implemented in C++, the interfaces are unified, and the performance is high.
Drawings
FIG. 1 is a block diagram of a process of the present invention;
In the figure: 10. initialization module; 11. set top box android application end integration environment initialization; 111. code and picture resource initialization; 112. sdk interface integration; 12. server cdn receiving service start initialization; 20. multi-engine rendering unified interface module; 21. unified interface position definition module for each engine; 22. unified interface parameter meaning definition module for each engine layer; 30. multi-engine general texture generation; 31. multi-engine texture acquisition module; 32. multi-engine texture parameter generation module; 33. engine-to-opengl texture conversion module; 40. multi-engine rendering module; 41. multi-engine rendering initialization module; 42. general opengles rendering environment initialization; 43. multi-engine rendertarget texture acquisition module; 44. engine-rendertarget-to-unified-texture rendering module; 50. texture video coding module; 51. soft/hard coding capability analysis module; 52. hard-coding initialization module; 53. soft-coding initialization module; 54. egl rendering environment creation module of the texture-to-coding system; 55. texture-to-hard-coding MediaCodec module; 56. texture-to-soft-coding module; 561. texture-to-cpu-cache module; 57. coding result output module; 60. stream pushing module; 61. network environment analysis module; 62. audio and video data sending module.
Detailed Description
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings, so that the advantages and features of the invention can be more easily understood by those skilled in the art and the scope of protection of the invention can be defined more clearly.
Example: referring to fig. 1, the present invention provides the following technical solution: a multi-engine image capturing and screen recording method based on an android set top box platform comprises an initialization module 10, a multi-engine rendering unified interface module 20, a multi-engine general texture generation module 30, a multi-engine rendering module 40, a texture video coding module 50 and a stream pushing module 60. The initialization module 10 comprises set top box android application end integrated environment initialization 11 and server cdn receiving service start initialization 12; the multi-engine rendering unified interface module 20 comprises a unified interface position definition module 21 for each engine and a unified interface parameter meaning definition module 22 for each engine layer; the multi-engine general texture generation module 30 comprises a multi-engine texture acquisition module 31, a multi-engine texture parameter generation module 32 and an engine-to-opengl texture conversion module 33; the multi-engine rendering module 40 comprises a multi-engine rendering initialization module 41, general opengles rendering environment initialization 42, a multi-engine rendertarget texture acquisition module 43 and an engine-rendertarget-to-unified-texture rendering module 44; the texture video coding module 50 comprises a soft/hard coding capability analysis module 51, a hard-coding initialization module 52, a soft-coding initialization module 53, an egl rendering environment creation module 54 of the texture-to-coding system, a texture-to-hard-coding MediaCodec module 55, a texture-to-soft-coding module 56 and a coding result output module 57; the stream pushing module 60 comprises a network environment analysis module 61 and an audio and video data sending module 62.
Set-top box android application side integration environment initialization 11 includes code and picture resource initialization 111 and sdk interface integration 112.
The texture-to-soft-coding module 56 includes a texture-to-cpu-cache module 561.
The initialization module 10 is respectively connected to the multi-engine rendering unified interface module 20 and the multi-engine general texture generation module 30, the multi-engine rendering unified interface module 20 and the multi-engine general texture generation module 30 are both connected to the multi-engine rendering module 40, and the texture video coding module 50 is respectively connected to the multi-engine rendering module 40 and the stream pushing module 60.
The initialization module 10 is used to start receiving the audio and video stream, initialize the environment space and automatically start the monitoring program. Set top box android application end integration environment initialization 11: this part mainly accesses the initialization and registration interface for authorization, service event registration and the like. Code and picture resource initialization 111: this part places the code, the so libraries and the picture resources at the corresponding locations in the program and controls the android permission application flow. Sdk interface integration 112: this part is where each engine accesses the sdk, which facilitates subsequent unified access and allows non-standard access or flow errors to be verified. Server cdn receiving service start initialization 12: this part actively responds to feedback on the transport stream; because it is started in advance, a backup plan can be arranged in time when the network is poor.
The multi-engine rendering unified interface module 20 unifies the multi-engine rendering interface, and each engine performs its rendering copy according to the opengles standard; the unified engine interfaces initCapturer, bindFbo, drawTexture, captureFrame, unbindFbo and stopCapturer are defined; an end-of-frame swapGlBuffers method is also defined, in which fbo, pbo and the android gpu shared graphic buffer are used to generate a corresponding rendertarget that replaces the engine's own rendertarget. The unified interface position definition module 21 for each engine: this module is responsible for abstracting and encapsulating the specifics of each engine. For example, an android GLSurfaceView accesses the unified interface before onDrawFrame with fbo/pbo data initialization (prepareFbo) and fbo texture binding for writing (bindFbo), and the picture data acquisition interface is called in super.onDrawFrame (captureFrame, unbindFbo). Unity uses double fbo and pbo: the unified interface is accessed in the OnPreRender interface, with double fbo/pbo data initialization (prepareFbo), fbo texture binding for writing (bindFbo) and so on; versions before unity 5.2 call texture collection (captureFrame, glReadPixels, unbindFbo), viewport setting and the like in the OnPostRender interface, while versions after unity 5.2 use the OnRenderImage interface for screen capture collection. The unified interface parameter meaning definition module 22 for each engine layer: this module encapsulates the meaning of each interface for each engine, for example fbo for opengles 2.0 and pbo for opengles 3.0 in the android surfaceview case, and double fbo and double pbo for unity under opengles 2.0 and 3.0 respectively.
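By way of illustration, the unified capture interface described above could be sketched in C++ as an abstract class. The method names (initCapturer, bindFbo, drawTexture, captureFrame, unbindFbo, stopCapturer, swapGlBuffers) come from the text; the class name, parameter types and comments are assumptions added here for clarity and are not the patent's actual code.

```cpp
#include <GLES2/gl2.h>

class EngineCapturer {
public:
    virtual ~EngineCapturer() = default;

    // Called once when capture starts: allocate FBO/PBO and the unified texture.
    virtual bool initCapturer(int width, int height) = 0;

    // Called just before the engine draws a frame (e.g. before
    // GLSurfaceView.Renderer.onDrawFrame, or in Unity's OnPreRender):
    // redirect rendering into the capture FBO.
    virtual void bindFbo() = 0;

    // Draw the unified texture with the module's own shader program.
    virtual void drawTexture(GLuint textureId) = 0;

    // Called after the engine has finished drawing (super.onDrawFrame /
    // OnPostRender / OnRenderImage): copy the frame into the unified texture.
    virtual void captureFrame() = 0;

    // Restore the engine's original render target.
    virtual void unbindFbo() = 0;

    // End-of-frame hook that replaces the engine's own render-target swap,
    // backed by FBO/PBO or a shared GraphicBuffer as described in the text.
    virtual void swapGlBuffers() = 0;

    virtual void stopCapturer() = 0;
};
```

Each engine adapter (GLSurfaceView, unity before 5.2, unity after 5.2) would implement this interface and call the methods at the hook points listed above.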
fbo: one of its most common uses is render-to-texture, which supports effects such as lighting, environment mapping and shadow mapping. The Frame Buffer Object extension of OpenGL is the recommended way to render data into a texture object. Compared with other comparable techniques, such as data copies or swapping buffers, fbo is more efficient and easier to implement.
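A minimal GLES2 render-to-texture sketch of the fbo usage just described follows; the function name, texture format and error handling are illustrative assumptions, not the patent's code.

```cpp
#include <GLES2/gl2.h>

// Attach an existing texture to a new FBO so that subsequent draws land in the
// texture instead of the screen.
GLuint createRenderTargetFbo(GLuint texture, int width, int height) {
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, texture, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        glDeleteFramebuffers(1, &fbo);
        fbo = 0;  // caller falls back to on-screen rendering
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return fbo;
}
```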
pbo: the Buffer Object storing the Pixel data is called a Pixel Buffer Object (PBO). The ARB _ pixel _ buffer _ object extension mirrors all the frames and APIs of VBO and adds two "Target" tags. These two targets assist the PBO store manager (OpenGL driven) in deciding the optimal location of the buffer object: system memory, shared memory, and video card memory.
The multi-engine general texture generation 30 converts the texture of each engine into a general opengl texture, and the module uniformly outputs a general opengl texture id; the multi-engine rendering module 40 is used to clear the rendering environment of each engine system, switch to the egl environment and hooked rendering environment created by the multi-engine rendering module itself, switch to the module's own rendering Shader script, bind the unified texture of each engine, and copy the texture data rendered by each engine to the unified rendering texture through fbo and pbo. The multi-engine texture acquisition module 31: this module exists because the texture acquisition mode differs from engine to engine. For example, in versions after unity 5.2 the texture is protected and no texture can be captured by setting an external fbo. In that case, the final frame of the picture rendered by the engine is received through the RenderTexture of unity's camera. Since unity's RenderTexture differs from the rendertarget concept of opengles, a dedicated flow for receiving engine pictures is implemented for unity's RenderTexture: in the OnPostRender callback the unity image cache is copied to the custom RenderTexture through Graphics, and the texture bound by renderTexture.GetNativeTexturePtr is then handed to the "engine-to-opengl texture conversion module 33" for texture conversion. The unified texture format adopted by the multi-engine texture parameter generation module 32 is the GL_RGB color format, with GL_UNSIGNED_SHORT_5_6_5 used for rgb; defining such a format has the advantages of small memory consumption and fast processing. GL_LINEAR linear interpolation filtering is defined, which takes the weighted average of the 4 pixels near a coordinate point; this keeps the picture as sharp as possible while keeping memory consumption low. GL_CLAMP_TO_EDGE is set so that texture coordinates outside the texture range are clamped to 0 and 1, producing a texture edge extension effect. Because some 3d application textures are also used with a render buffer, the unified texture is made compatible with a depth buffer using glGenRenderbuffers to achieve the best display effect. Engine-to-opengl texture conversion module 33: since the engine's texture differs from our unified texture in parameter format and viewport size, this module redoes the format rendering conversion with fbo.
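A short C++ sketch of the unified-texture parameters listed above (GL_RGB with GL_UNSIGNED_SHORT_5_6_5, GL_LINEAR filtering, GL_CLAMP_TO_EDGE wrapping, plus a depth renderbuffer) follows; the struct and function names are invented here for illustration.

```cpp
#include <GLES2/gl2.h>

struct UnifiedTexture {
    GLuint textureId;
    GLuint depthRbo;
};

UnifiedTexture createUnifiedTexture(int width, int height) {
    UnifiedTexture t = {0, 0};

    glGenTextures(1, &t.textureId);
    glBindTexture(GL_TEXTURE_2D, t.textureId);
    // GL_RGB + GL_UNSIGNED_SHORT_5_6_5: 2 bytes per pixel, low memory footprint.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, nullptr);
    // Linear filtering keeps the picture sharp when the viewport is rescaled.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Clamp coordinates outside [0,1] so edges extend instead of wrapping.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // Depth renderbuffer so 3d engines that need a depth buffer render correctly.
    glGenRenderbuffers(1, &t.depthRbo);
    glBindRenderbuffer(GL_RENDERBUFFER, t.depthRbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);
    glBindRenderbuffer(GL_RENDERBUFFER, 0);

    return t;
}
```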
The multi-engine rendering initialization module 41: this module stores the rendering parameters, rendering textures, shaders and other state values of the different engines, and encapsulates a dedicated unity engine render-texture initialization flow; the purpose of this initialization is to be able to restore each engine's rendering afterwards and to perform the unified opengles rendering analysis. General opengles rendering environment initialization 42: this module creates double fbo, initializes the unified texture binding, binds the rbo depth cache area and saves all current opengles state values, so that the previous rendering state can be cleared without affecting the unified opengles texture-copy rendering flow. The multi-engine rendertarget texture acquisition module 43: this module encapsulates the rendertarget acquisition flow of each engine, which splits into two cases: for an opengl-compliant Framebuffer, the rendertarget is encapsulated according to the conventional fbo; where unity does not expose the fbo, the unity engine first renders into unity's own RenderTexture, which is then rendered into the unified opengles render texture through the "engine-to-opengl texture conversion module 33". Engine-rendertarget-to-unified-texture rendering module 44: this module redefines the viewport, resets the color buffer, clears the gl depth cache, resets the unified glUseProgram (including the vertex shader and fragment shader), rebinds the rendered texture, binds each engine's unified texture, binds the custom fbo and binds the mvp matrix for viewport scaling; through this series of state machine settings the engine rendertarget is rendered to the unified texture.
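The state-machine sequence of module 44 can be sketched as a single copy pass. The shader program, quad vertex buffer and the attribute name "aPosition" are assumed to have been created during initialization; all identifiers below are illustrative, not the patent's code.

```cpp
#include <GLES2/gl2.h>

void copyEngineFrameToUnifiedTexture(GLuint captureFbo, GLuint engineTexture,
                                     GLuint copyProgram, GLuint quadVbo,
                                     int dstWidth, int dstHeight) {
    // 1. Redirect output into the FBO that backs the unified texture.
    glBindFramebuffer(GL_FRAMEBUFFER, captureFbo);
    glViewport(0, 0, dstWidth, dstHeight);

    // 2. Reset state left over from the engine's own rendering.
    glClearColor(0.f, 0.f, 0.f, 1.f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glUseProgram(copyProgram);

    // 3. Sample the engine's texture and draw a full-screen quad.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, engineTexture);
    glBindBuffer(GL_ARRAY_BUFFER, quadVbo);
    GLint posLoc = glGetAttribLocation(copyProgram, "aPosition");
    glEnableVertexAttribArray(posLoc);
    glVertexAttribPointer(posLoc, 2, GL_FLOAT, GL_FALSE, 0, nullptr);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // 4. Restore the default framebuffer so the engine keeps rendering on screen.
    glDisableVertexAttribArray(posLoc);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}
```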
The texture video coding module 50 creates a surface with createInputSurface of MediaCodec, then creates an egl rendering environment from that surface and renders the unified texture textureId in it, so that the MediaCodec has data input; the MediaCodec then takes the coding buffer index with dequeueOutputBuffer and the encoded data with getOutputBuffers, which is passed to the stream pushing module 60. The soft/hard coding capability analysis module 51: soft and hard coding each have advantages and disadvantages. Soft coding is direct and simple to implement, its parameters are easy to adjust and it is easy to upgrade, but the CPU load is heavy; at low bit rates its quality is usually better than hard coding. Hard coding has high performance; at low bit rates its quality is usually lower than soft coding, although some products port an excellent soft-coding algorithm (such as x264) to the GPU hardware platform, where the quality is basically equal to soft coding. Where hard coding is well supported it is best to use it. Whether a device model is compatible with hard coding is judged through compatibility.xml under android/system/etc, and whether it is well compatible is judged by whether the system MediaCodec configuration throws an exception; selecting the correct coding mode through this capability analysis markedly improves the user experience. Hard-coding initialization module 52: this module adopts the system's general MediaCodec initialization flow and, on top of the general method, performs latency optimization for set top boxes of different manufacturers; since the low-latency parameters of each hard coder differ, the concrete process is to adjust the MediaCodec parameter values. Soft-coding initialization module 53: this module is implemented with ffmpeg, and at compile time the experience is optimized with assembly instructions according to the hardware capability. The egl rendering environment creation module 54 of the texture-to-coding system: this module uses the hook opengles technique to control the opengles api flexibly; it first creates a graphic buffer, then creates an EGLImageKHR from the buffer address, then creates a texture and binds the EGLImage to it through glEGLImageTargetTexture2DOES; once this environment is established, the texture's cpu cache can later be obtained from it. The texture-to-hard-coding MediaCodec module 55: if hard coding is adopted, the texture updated by the multi-engine rendering module 40 is rendered again, through an event response mechanism, to the MediaCodec surface. The texture-to-soft-coding module 56: if soft coding is adopted, the main technique is texture-to-cpu caching; based on the "egl rendering environment creation module 54 of the texture-to-coding system", after one frame of rendering finishes the fbo data is synchronized to the graphic buffer, an image cpu cache is then obtained through graphic buffer->lock, and the cpu cache is handed to ffmpeg for encoding. Coding result output module 57: this module outputs from the frame callbacks of soft and hard coding respectively, and defines the output format as H264.
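The hard-coding path (a MediaCodec input surface wrapped by an egl window surface, then draining the encoder output) might look roughly like the following C++ sketch. It uses the NDK AMediaCodec API, which exists only on newer Android releases; on the old set top boxes targeted here the same flow would reach the Java MediaCodec via JNI. All names and parameters are illustrative assumptions, not the patent's implementation.

```cpp
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <EGL/egl.h>
#include <android/native_window.h>
#include <cstdint>

struct HardEncoder {
    AMediaCodec* codec;
    ANativeWindow* inputWindow;
    EGLSurface eglSurface;
};

bool startHardEncoder(HardEncoder& enc, AMediaFormat* format,
                      EGLDisplay display, EGLConfig config) {
    enc.codec = AMediaCodec_createEncoderByType("video/avc");
    if (!enc.codec) return false;
    AMediaCodec_configure(enc.codec, format, nullptr, nullptr,
                          AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
    // The encoder hands back a Surface; frames rendered into it are encoded
    // on the GPU side, without a CPU pixel copy.
    AMediaCodec_createInputSurface(enc.codec, &enc.inputWindow);
    enc.eglSurface = eglCreateWindowSurface(display, config, enc.inputWindow, nullptr);
    AMediaCodec_start(enc.codec);
    return enc.eglSurface != EGL_NO_SURFACE;
}

// Called after each frame of the unified texture has been drawn to eglSurface
// and eglSwapBuffers() has pushed it into the encoder.
void drainEncoder(HardEncoder& enc, void (*onPacket)(const uint8_t*, size_t)) {
    AMediaCodecBufferInfo info;
    ssize_t index = AMediaCodec_dequeueOutputBuffer(enc.codec, &info, 0);
    while (index >= 0) {
        size_t size = 0;
        uint8_t* data = AMediaCodec_getOutputBuffer(enc.codec, index, &size);
        if (data && info.size > 0) onPacket(data + info.offset, info.size);
        AMediaCodec_releaseOutputBuffer(enc.codec, index, false);
        index = AMediaCodec_dequeueOutputBuffer(enc.codec, &info, 0);
    }
}
```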
The stream pushing module 60 converts the encoded data obtained from the texture video coding module into rtmp data and pushes it to the client or the cdn server. Network environment analysis module 61: this module evaluates the current network conditions, i.e. whether the current network has the capacity to send to the cdn or the client. If there is currently no network, the data is stored locally as a file, its size and time are stored in a separate configuration file, and after the network recovers the configuration file is read and the video data is taken out and sent. If the current network is poor, the resolution and bit rate are reduced according to the bandwidth coefficient before the data is sent. If the current network is good, the resolution and bit rate are raised according to the bandwidth coefficient before the data is sent. At the same time, the adjusted parameters are written into the qos.
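As a rough illustration of the bandwidth-based adjustment above, the following sketch scales resolution and bit rate by a bandwidth coefficient and falls back to local buffering when there is no network; the thresholds, caps and the coefficient semantics are invented here for illustration only.

```cpp
struct StreamParams {
    int width;
    int height;
    int bitrateKbps;
};

StreamParams adjustForBandwidth(StreamParams base, double bandwidthCoeff,
                                bool networkUp, bool& bufferToFile) {
    bufferToFile = false;
    if (!networkUp) {
        // No network: keep encoding but write packets to a local file and
        // record their size/time in a config file so they can be sent later.
        bufferToFile = true;
        return base;
    }
    StreamParams p = base;
    if (bandwidthCoeff < 1.0) {
        // Poor network: scale resolution and bit rate down (keep even sizes).
        p.width = static_cast<int>(base.width * bandwidthCoeff) & ~1;
        p.height = static_cast<int>(base.height * bandwidthCoeff) & ~1;
        p.bitrateKbps = static_cast<int>(base.bitrateKbps * bandwidthCoeff);
    } else if (bandwidthCoeff > 1.2) {
        // Good network: raise quality, capped at 1.5x the base bit rate.
        p.bitrateKbps = static_cast<int>(base.bitrateKbps * 1.5);
    }
    return p;  // the adjusted parameters would also be written into the qos
}
```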
The audio and video data sending module 62: this module contains the qos and pushes the rtmp data to the cdn or the server according to the "network environment analysis module 61".
The invention creates a complete set of multi-engine unified rendering interfaces and multi-engine general texture generation interfaces; the rendering environment of each engine system is switched, using the hook opengles technique, to an egl environment created by the method itself, and the texture data rendered by each engine is copied to a unified rendering texture through fbo and pbo; finally the unified texture is rendered in the method's own egl rendering environment. All core functions of the method are implemented in C++, the interfaces are unified, and the performance is high. The method is compatible with picture capture and screen recording for applications on Android 4.0 and above, and is more efficient: on Android 4.0-4.4, encoding is done by copying through the gpu -> egl graphic buffer -> cpu fast copy channel to ffmpeg; on Android 4.4 and above, the rendered texture is obtained using opengl fbo, pbo, egl and the android shared graphic buffer, and the gpu then copies it to MediaCodec for encoding. Both paths are faster than the gpu-cpu-gpu approach found on the market. The invention simultaneously supports applications built on all versions of engines such as cocos, android and unity, and constructs a general texture onto which each engine can draw. Different versions of the unity engine are handled differently: versions before unity 5.2 obtain the engine rendering texture by setting double fbo, while versions after unity 5.2 obtain the texture through the unity engine RenderTexture, whose concept is then converted into an opengl rendering texture. In addition, because the API compatibility of each opengles version and each android model is inconsistent, the rendering data copy is completed with fbo for opengles 2.0 and with pbo for opengles 3.0. The modules involved in the invention include the multi-engine rendering unified interface module, the multi-engine general texture generation module, the multi-engine rendering module, the multi-engine rendertarget texture acquisition module, the texture video coding module and the stream pushing module.
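The gpu -> egl graphic buffer -> cpu fast copy channel mentioned above can be sketched with the public AHardwareBuffer/EGLImage NDK APIs (API 26+); on the Android 4.x boxes described in the text the same idea relies on the private GraphicBuffer class and its lock() method, so everything below is an illustrative assumption of that flow rather than the patent's implementation.

```cpp
#define EGL_EGLEXT_PROTOTYPES
#define GL_GLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <android/hardware_buffer.h>

// Create a CPU-readable shared buffer, wrap it in an EGLImage and bind it to a
// texture; rendering into an FBO that attaches this texture writes straight
// into the shared buffer, which the CPU can later map without glReadPixels.
GLuint createSharedBufferTexture(EGLDisplay display, int w, int h,
                                 AHardwareBuffer** outBuffer) {
    AHardwareBuffer_Desc desc = {};
    desc.width = w; desc.height = h; desc.layers = 1;
    desc.format = AHARDWAREBUFFER_FORMAT_R8G8B8A8_UNORM;
    desc.usage = AHARDWAREBUFFER_USAGE_GPU_COLOR_OUTPUT |
                 AHARDWAREBUFFER_USAGE_CPU_READ_OFTEN;
    AHardwareBuffer_allocate(&desc, outBuffer);

    EGLClientBuffer clientBuf = eglGetNativeClientBufferANDROID(*outBuffer);
    EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
                                          EGL_NATIVE_BUFFER_ANDROID, clientBuf,
                                          nullptr);
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);
    return tex;
}

// After a frame has been rendered into the texture, map the buffer and hand the
// pixels to the software encoder (e.g. ffmpeg) without a gpu-cpu-gpu detour.
void readFrameForSoftEncode(AHardwareBuffer* buffer,
                            void (*encode)(const void* pixels, int strideBytes)) {
    void* pixels = nullptr;
    AHardwareBuffer_lock(buffer, AHARDWAREBUFFER_USAGE_CPU_READ_OFTEN, -1,
                         nullptr, &pixels);
    AHardwareBuffer_Desc desc;
    AHardwareBuffer_describe(buffer, &desc);
    if (pixels) encode(pixels, static_cast<int>(desc.stride) * 4);
    AHardwareBuffer_unlock(buffer, nullptr);
}
```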
The above examples only show some embodiments of the present invention, and the description thereof is specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention.

Claims (9)

1. A multi-engine picture-capturing and screen-recording method based on an android set top box platform, comprising an initialization module (10), a multi-engine rendering unified interface module (20), a multi-engine general texture generation module (30), a multi-engine rendering module (40), a texture video coding module (50) and a stream pushing module (60), characterized in that: the initialization module (10) comprises set top box android application end integrated environment initialization (11) and server cdn receiving service start initialization (12); the multi-engine rendering unified interface module (20) comprises a unified interface position definition module (21) for each engine and a unified interface parameter meaning definition module (22) for each engine layer; the multi-engine general texture generation (30) comprises a multi-engine texture acquisition module (31), a multi-engine texture parameter generation module (32) and an engine-to-opengl texture conversion module (33); the multi-engine rendering module (40) comprises a multi-engine rendering initialization module (41), general opengles rendering environment initialization (42), a multi-engine rendertarget texture acquisition module (43) and an engine-rendertarget-to-unified-texture rendering module (44); the texture video coding module (50) comprises a soft/hard coding capability analysis module (51), a hard-coding initialization module (52), a soft-coding initialization module (53), an egl rendering environment creation module (54) of the texture-to-coding system, a texture-to-hard-coding MediaCodec module (55), a texture-to-soft-coding module (56) and a coding result output module (57); the stream pushing module (60) comprises a network environment analysis module (61) and an audio and video data sending module (62).
2. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the initialization (11) of the android application end integration environment of the set top box comprises code and picture resource initialization (111) and sdk interface integration (112).
3. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the texture-to-soft-coding module (56) includes a texture-to-cpu-cache module (561).
4. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the initialization module (10) is respectively connected with the multi-engine rendering unified interface module (20) and the multi-engine general texture generation module (30), the multi-engine rendering unified interface module (20) and the multi-engine general texture generation module (30) are both connected with the multi-engine rendering module (40), and the texture video coding module (50) is respectively connected with the multi-engine rendering module (40) and the stream pushing module (60).
5. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the initialization module (10) is used to start receiving the audio and video stream, initialize the environment space and automatically start the monitoring program.
6. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the multi-engine rendering unified interface module (20) unifies the multi-engine rendering interface, and each engine performs its rendering copy according to the opengles standard; the unified engine interfaces initCapturer, bindFbo, drawTexture, captureFrame, unbindFbo and stopCapturer are defined; an end-of-frame swapGlBuffers method is also defined, in which fbo, pbo and the android gpu shared graphic buffer are used to generate a corresponding rendertarget that replaces the engine's own rendertarget.
7. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the multi-engine general texture generation (30) converts the texture of each engine into a general opengl texture, and the module uniformly outputs a general opengl texture id; the multi-engine rendering module (40) is used to clear the rendering environment of each engine system, switch to the egl environment and hooked rendering environment created by the multi-engine rendering module itself, switch to the module's own rendering Shader script, bind the unified texture of each engine, and copy the texture data rendered by each engine to the unified rendering texture through fbo and pbo.
8. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: the texture video coding module (50) creates a surface with createInputSurface of MediaCodec, then creates an egl rendering environment from that surface and renders the unified texture textureId in it, so that the MediaCodec has data input; the MediaCodec then takes the coding buffer index with dequeueOutputBuffer and the encoded data with getOutputBuffers, which is passed to the stream pushing module (60).
9. The multi-engine picture-capturing and screen-recording method based on the android set-top box platform as claimed in claim 1, characterized in that: and the stream pushing module (60) is used for converting the coded data obtained by the texture video coding module into rtmp data and pushing the rtmp data to a client or cdn server.
CN202010058323.1A 2020-01-19 2020-01-19 Multi-engine image capturing and screen recording method based on android set top box platform Pending CN111225271A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010058323.1A CN111225271A (en) 2020-01-19 2020-01-19 Multi-engine image capturing and screen recording method based on android set top box platform


Publications (1)

Publication Number Publication Date
CN111225271A true CN111225271A (en) 2020-06-02

Family

ID=70828452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010058323.1A Pending CN111225271A (en) 2020-01-19 2020-01-19 Multi-engine image capturing and screen recording method based on android set top box platform

Country Status (1)

Country Link
CN (1) CN111225271A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112218148A (en) * 2020-09-11 2021-01-12 杭州易现先进科技有限公司 Screen recording method and device, computer equipment and computer readable storage medium
CN113411661A (en) * 2021-06-11 2021-09-17 北京百度网讯科技有限公司 Method, apparatus, device, storage medium and program product for recording information
CN113663328A (en) * 2021-08-25 2021-11-19 腾讯科技(深圳)有限公司 Picture recording method and device, computer equipment and storage medium
CN113663328B (en) * 2021-08-25 2023-09-19 腾讯科技(深圳)有限公司 Picture recording method, device, computer equipment and storage medium
CN113923510A (en) * 2021-10-11 2022-01-11 深圳创维-Rgb电子有限公司 Method, device and equipment for forwarding digital television content and readable storage medium
CN113923510B (en) * 2021-10-11 2024-02-02 深圳创维-Rgb电子有限公司 Method, device, equipment and readable storage medium for forwarding digital television content
CN115361583A (en) * 2022-08-10 2022-11-18 吉林动画学院 Method for real-time rendering of video frame textures by aiming at APP and Unity
CN115361583B (en) * 2022-08-10 2024-05-17 吉林动画学院 Method for rendering video frame textures in real time aiming at APP and Unity


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200602