CN117041515A - Stereoscopic display method, stereoscopic display device, stereoscopic display equipment and readable medium - Google Patents

Stereoscopic display method, stereoscopic display device, stereoscopic display equipment and readable medium

Info

Publication number
CN117041515A
CN117041515A (application number CN202311111805.9A)
Authority
CN
China
Prior art keywords
eye
rendering target
determining
executing
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311111805.9A
Other languages
Chinese (zh)
Inventor
周清会
翁开宇
师国超
张建国
刘成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Manheng Digital Technology Co ltd
Original Assignee
Shanghai Manheng Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Manheng Digital Technology Co ltd filed Critical Shanghai Manheng Digital Technology Co ltd
Priority to CN202311111805.9A
Publication of CN117041515A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/122Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a stereoscopic display method, apparatus, device and readable medium for a display device. The method is applied to a stereoscopic display system comprising a host and at least one display device, the host being communicatively connected to the at least one display device, and comprises the following steps: performing an initialization operation and creating a context object corresponding to the display device; creating a left-eye camera and a right-eye camera according to the context object; determining a left-eye rendering target according to the left-eye camera, and determining a right-eye rendering target according to the right-eye camera; executing a first operation of a frame cycle according to the left-eye rendering target, and executing a second operation of the frame cycle according to the right-eye rendering target; and performing synthesis processing according to the result of the first operation and the result of the second operation to realize stereoscopic display. This solves the technical problem in the related art that some display devices whose hardware supports stereoscopic display nevertheless cannot realize it because of their software implementation.

Description

Stereoscopic display method, stereoscopic display device, stereoscopic display equipment and readable medium
Technical Field
The present application relates to the field of information technologies, and in particular, to a stereoscopic display method, apparatus, device and readable medium for a display device.
Background
Virtual reality (English name "virtual reality", abbreviated "VR") is, simply put, a computer system that can create and let users experience a virtual world. As an advanced man-machine interaction technology, virtual reality has been widely used in fields such as military simulation, visual simulation, virtual manufacturing, virtual design, virtual assembly and scientific visualization. Stereoscopic display technology is an extremely important supporting technology for realizing virtual reality, and mainly comprises: two-color (anaglyph) glasses display, active stereoscopic display, passive synchronized stereoscopic projection display, stereoscopic display, true three-dimensional stereoscopic display, and other more advanced device display modes.
The stereoscopic display principle is as follows: because a person's two eyes are typically separated by about 4 cm to 6 cm, each eye receives a slightly different image when looking at the same object. After the two different images are transmitted to the brain, they are fused into an image with depth of field; stereoscopic display on computers and projection systems can be realized according to the same principle.
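As a non-limiting illustration of this binocular-parallax principle (our own sketch, not part of the patent), the two eye positions and the on-screen horizontal disparity of a point can be computed with simple similar-triangle geometry; the function names and the pinhole-model assumption are ours:

```python
def eye_positions(head_x, interocular_cm=6.0):
    """Return the left and right eye x-coordinates for a head centred
    at head_x, with the eyes separated by interocular_cm."""
    half = interocular_cm / 2.0
    return head_x - half, head_x + half

def screen_disparity(point_depth_cm, screen_depth_cm, interocular_cm=6.0):
    """Horizontal disparity (in cm) on the screen plane for a point at
    point_depth_cm, by similar triangles; positive when the point lies
    behind the screen, zero when it lies exactly on it."""
    return interocular_cm * (point_depth_cm - screen_depth_cm) / point_depth_cm

left, right = eye_positions(0.0)          # (-3.0, 3.0)
d = screen_disparity(100.0, 50.0)          # 3.0 cm of disparity
```

Feeding each eye the image rendered from its own position, offset by this disparity, is what produces the depth impression described above.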
However, the inventors found at least the following technical problem in the related art: some display devices support stereoscopic display in hardware, but cannot realize stereoscopic display because of their software implementation; in practical applications, there is often a need for these display devices to support stereoscopic display.
Disclosure of Invention
An object of the present application is to provide a stereoscopic display method, apparatus, device and readable medium for a display device, at least to solve the technical problem in the related art that some display devices whose hardware supports stereoscopic display cannot realize it because of their software implementation.
To achieve the above object, some embodiments of the present application provide the following aspects:
in a first aspect, some embodiments of the present application also provide a stereoscopic display method of a display device, the method being applied to a stereoscopic display system including a host and at least one display device, the host and the at least one display device being communicatively connected, the method comprising: performing an initialization operation and creating a context object corresponding to the display device; creating a left eye camera and a right eye camera according to the context object; determining a left-eye rendering target according to the left-eye camera, and determining a right-eye rendering target according to the right-eye camera; executing a first operation of a frame cycle according to the left-eye rendering target, and executing a second operation of the frame cycle according to the right-eye rendering target; and carrying out synthesis processing according to the result of the first operation and the result of the second operation to realize three-dimensional display.
In a second aspect, some embodiments of the present application further provide a stereoscopic display apparatus of a display device, the apparatus including: a first creation module for performing an initialization operation and creating a context object corresponding to the display device; a second creating module for creating a left eye camera and a right eye camera according to the context object; the determining module is used for determining a left-eye rendering target according to the left-eye camera and determining a right-eye rendering target according to the right-eye camera; an execution module, configured to execute a first operation of a frame cycle according to the left-eye rendering target and execute a second operation of the frame cycle according to the right-eye rendering target; and the display module is used for carrying out synthesis processing according to the result of the first operation and the result of the second operation so as to realize three-dimensional display.
In a third aspect, some embodiments of the present application also provide a computer apparatus, the apparatus comprising: one or more processors; and a memory storing computer program instructions that, when executed, cause the processor to perform the method as described above.
In a fourth aspect, some embodiments of the application also provide a computer readable medium having stored thereon computer program instructions executable by a processor to implement a method as described above.
Compared with the prior art, in the scheme provided by the embodiment of the application, the method is applied to a stereoscopic display system, the stereoscopic display system comprises a host and at least one display device, the host and the at least one display device are in communication connection, and by performing an initialization operation and creating a context object corresponding to the display device, then creating a left-eye camera and a right-eye camera according to the context object, determining a left-eye rendering target according to the left-eye camera, determining a right-eye rendering target according to the right-eye camera, further performing a first operation of frame cycle according to the left-eye rendering target, performing a second operation of frame cycle according to the right-eye rendering target, and performing a synthesis process according to the result of the first operation and the result of the second operation, stereoscopic display is realized, so that the display device which does not support stereoscopic display can achieve the effect of supporting stereoscopic display.
Drawings
Fig. 1 is an exemplary flowchart of a stereoscopic display method of a display device according to a first embodiment of the present application;
fig. 2 is a schematic flowchart of the sub-steps of step S103 in a stereoscopic display method of a display device according to a second embodiment of the present application;
fig. 3 is a schematic structural diagram of a computer device according to a sixth embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The following terms are used herein.
Unity is a game engine and content development platform. The Unity SDK is mainly used to help game developers quickly implement in-game scene functions, such as world channels, game guilds, team group chat and one-to-one private chat. In addition, the Unity SDK can provide a safer and more stable service, bringing players a better experience.
LinkXR (its Chinese name translates literally as "creation chain") is a virtual reality content hardware adaptation platform.
Swap chain: manages the display of graphics frames in a windowed application, allowing frames to be prepared in the background and then presented on the screen at the appropriate time, so as to achieve smooth animation and graphics rendering.
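The double-buffering behaviour described in this definition can be modelled with a toy class; the names below are our own illustration, not the DirectX 11 API:

```python
# A toy model of swap-chain behaviour: commands accumulate in a hidden
# back buffer, and "presenting" swaps it with the visible front buffer.
class SwapChain:
    def __init__(self):
        self.front = None  # the buffer currently shown on screen
        self.back = []     # the buffer being prepared in the background

    def draw(self, primitive):
        # Rendering commands go into the hidden back buffer.
        self.back.append(primitive)

    def present(self):
        # Swap: the finished back buffer becomes the visible frame and
        # a fresh back buffer is started for the next frame.
        self.front, self.back = self.back, []
        return self.front

chain = SwapChain()
chain.draw("triangle")
frame = chain.present()  # the triangle is now "on screen"
```

Because drawing always targets the hidden buffer, the viewer never sees a half-finished frame, which is what makes the animation appear smooth.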
Example 1
An embodiment of the present application provides a stereoscopic display method of a display device, where the method is applied to a stereoscopic display system, and the stereoscopic display system includes a host and at least one display device, where the host is communicatively connected to the at least one display device, and the method may include the following steps, as shown in fig. 1:
step S101, performing an initialization operation and creating a context object corresponding to the display device;
step S102, creating a left eye camera and a right eye camera according to the context object;
step S103, determining a left-eye rendering target according to the left-eye camera, and determining a right-eye rendering target according to the right-eye camera;
step S104, executing a first operation of frame circulation according to the left-eye rendering target, and executing a second operation of frame circulation according to the right-eye rendering target;
and step 105, performing synthesis processing according to the result of the first operation and the result of the second operation, so as to realize stereoscopic display.
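Steps S101 to S105 can be sketched as a hypothetical per-display pipeline; all names below are illustrative stand-ins (the patent's implementation targets DirectX 11 and Unity, not plain Python):

```python
# A minimal sketch of the five-step flow, one call per display device.
def stereoscopic_display(display, render, compose):
    ctx = {"display": display}                           # S101: context object
    left_cam, right_cam = ("left", ctx), ("right", ctx)  # S102: two cameras
    left_target = {"camera": left_cam, "pixels": None}   # S103: render targets
    right_target = {"camera": right_cam, "pixels": None}
    left_target["pixels"] = render(left_cam)             # S104: first operation
    right_target["pixels"] = render(right_cam)           # S104: second operation
    return compose(left_target["pixels"],
                   right_target["pixels"])               # S105: synthesis

frame = stereoscopic_display(
    "device_0",
    render=lambda cam: f"{cam[0]}-image",
    compose=lambda l, r: (l, r),
)
```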
The above steps are described in detail below.
Specifically, in some examples, the stereoscopic display system may include a host and n display devices, where n is an integer greater than or equal to 1. A display device may be a conventional screen, an arc screen, a special-shaped screen, a creative screen, etc.; the embodiments of the present application do not specifically limit this. The method of steps S101 to S105 described below is performed for each display device. For ease of understanding, the i-th display device is denoted device_i below.
Specifically, in some examples, before executing step S101, the stereoscopic display system may determine whether the hardware of the display device supports stereoscopic display. If the hardware does not support stereoscopic display, the flow exits so that rendering can fall back to a monocular rendering method; if the hardware supports stereoscopic display, the process advances to step S101.
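This capability check amounts to a simple branch before the pipeline starts; as a hypothetical sketch (the function name and return values are ours):

```python
def choose_render_path(hardware_supports_stereo):
    # Fall back to the ordinary monocular path when the panel's hardware
    # cannot display stereo; otherwise continue to step S101.
    return "stereo" if hardware_supports_stereo else "monocular"
```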
Specifically, in some examples, for step S101, an initialization operation may be performed and a context object corresponding to the display device may be created automatically at system start-up. Different display devices may correspond to different context objects, and the correspondence between a display device and its context object may be preset. The context object of a display device refers to the core object used to manage and execute graphics rendering operations; it is associated with a specific graphics processor (GPU) and allows an application to send rendering commands to the GPU to draw, render and manipulate graphics.
Specifically, in some examples, for step S102, two cameras for the left and right eyes may be added to the upper layer of the Unity engine so that their output can be displayed stereoscopically. Here, the created left-eye camera is denoted leftcamera_i and the right-eye camera rightcamera_i. The process then advances to step S103, where a left-eye rendering target is determined from the left-eye camera and a right-eye rendering target is determined from the right-eye camera.
Specifically, in some examples, for step S104, the first operation of the frame cycle may be performed according to the left-eye rendering target and then the second operation of the frame cycle according to the right-eye rendering target, or the second operation may be performed first and the first operation second. The order of the two operations is not particularly limited in the embodiments of the present application.
Specifically, in some examples, for step S105, the display device can realize stereoscopic display after the first and second operations of the frame cycle have been performed. Viewers wearing devices such as 3D glasses can therefore watch the images displayed by the display device stereoscopically.
Specifically, in some examples, the steps described above may be implemented based on DirectX 11.
In addition, in some embodiments of the present application, the application program corresponding to the method may be integrated into a Unity plug-in, the Unity plug-in installed in a Unity engine, and the Unity engine installed on the host. In some other examples, the method may also be applied to the Unreal engine, the IdeaXR engine, etc., which is not particularly limited in the embodiments of the present application. When the application program corresponding to the method is integrated into a Unity plug-in, the plug-in may specifically be a Unity SDK. Those skilled in the art will understand that the Unity SDK is a file in unitypackage format, a proprietary format for Unity plug-in packages, mainly used by developers to import plug-ins into the Unity engine and call the relevant interfaces to create and further develop virtual reality immersive content. Built-in functions may include, but are not limited to, scene roaming, scene jumping, character snap, UI interactive click, UI interactive drag, object interactive click, object interactive drag, etc.
It can be understood that support for the stereoscopic effect was removed from Unity engine versions 2020 and above, and the code provided by the Unity engine in the related art is closed-source, so those skilled in the art cannot learn the logic of its workflow; the stereoscopic effect is therefore difficult to realize on those versions.
Further, in practical applications, virtual reality content produced with the Unity, Unreal, IdeaVR or VRDO content creation engines can be quickly adapted, through LinkXR, to virtual reality immersive large-screen interaction environments such as direct-view LED screens, three-fold screens and CAVEs, as well as to zSpace hardware environments. In some examples, a user may have an immersive interactive experience of VR content with any VRPN-protocol-based optical tracking system and interaction device, and VR content made for head-mounted displays can be experienced directly, without modification, in a virtual reality immersive large-screen interaction environment or a zSpace hardware environment, with a 3D stereoscopic effect. In addition, positioning unified with the real physical space can be performed for head-mounted display devices, so that the relative positions of the participants in the virtual scene are consistent with the real scene, providing a local multi-person collaborative content experience consistent with the real physical space.
It can be seen that, compared with the related art, the embodiment of the present application provides a stereoscopic display method of a display device, applied to a stereoscopic display system including a host communicatively connected to at least one display device. By performing an initialization operation and creating a context object corresponding to the display device, then creating a left-eye camera and a right-eye camera according to the context object, determining a left-eye rendering target according to the left-eye camera and a right-eye rendering target according to the right-eye camera, then executing a first operation of a frame cycle according to the left-eye rendering target and a second operation of the frame cycle according to the right-eye rendering target, and performing synthesis processing according to the results of the first and second operations, stereoscopic display is achieved, so that a display device that did not originally support stereoscopic display can achieve the effect of supporting stereoscopic display.
Example two
In the second embodiment of the present application, the determining of a left-eye rendering target according to the left-eye camera and of a right-eye rendering target according to the right-eye camera, that is, step S103 in the first embodiment, may further include the following steps, as shown in fig. 2:
step S1021, determining a window corresponding to the display device;
step S1022, creating a swap chain according to the context object and the window;
step S1023, determining a left-eye rendering target according to the swap chain and the left-eye camera, and determining a right-eye rendering target according to the swap chain and the right-eye camera.
Specifically, in some examples, assuming that the display device is device_i and its corresponding window is window_i, a swap chain swapchain_i is created according to the context object and the window window_i; a left-eye rendering target leftrendertarget_i is then created based on the swap chain swapchain_i and the left-eye camera, and a right-eye rendering target rightrendertarget_i is created according to the swap chain swapchain_i and the right-eye camera.
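Sub-steps S1021 to S1023 for the i-th display can be sketched as follows; dictionaries stand in for the real window, context, swap-chain and render-target objects, and all names are illustrative:

```python
# Hypothetical per-display setup: window -> swap chain -> two eye targets.
def setup_display(i):
    window = f"window_{i}"                              # S1021: the window
    context = f"context_{i}"                            # from step S101
    swapchain = {"context": context, "window": window}  # S1022: swap chain
    left_target = {"chain": swapchain, "eye": "left"}   # S1023: both targets
    right_target = {"chain": swapchain, "eye": "right"} #   share one chain
    return swapchain, left_target, right_target
```

Note that, as in the embodiment, the two rendering targets are built on the same swap chain, so both eyes present through the one window of the display.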
Example III
The third embodiment of the present application is a further improvement based on the first embodiment, and the specific improvement is that in the third embodiment of the present application, a specific implementation manner of executing the first operation of the frame cycle according to the left-eye rendering target and executing the second operation of the frame cycle according to the right-eye rendering target is provided respectively.
In some embodiments of the present application, the performing the first operation of the frame loop according to the left-eye rendering target may include: acquiring first texture data for rendering the left-eye camera; transmitting the first texture data to the left-eye rendering target; the first texture data is processed based on the left-eye rendering target to perform a first operation of a frame loop.
In some embodiments of the present application, the performing of the second operation of the frame loop according to the right-eye rendering target may include: acquiring second texture data for rendering the right eye camera; transmitting the second texture data to the right-eye rendering target; processing the second texture data based on the right-eye rendering target to perform a second operation of a frame loop.
In general, the left-eye camera and the right-eye camera are each rendered for every frame and their rendered texture data acquired; the first texture data of the left-eye camera is then transmitted to the left-eye rendering target and the second texture data of the right-eye camera to the right-eye rendering target, after which the first texture data is processed based on the left-eye rendering target and the second texture data based on the right-eye rendering target. The processed results for the context object corresponding to the display device are then synthesized, realizing the stereoscopic effect. These steps are executed cyclically until all frames have been processed.
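This per-frame flow can be sketched with plain functions; grab_texture and composite below are our own stand-ins for the engine calls, not the patent's API:

```python
# Hypothetical frame loop: per frame, fetch each eye's texture, hand it
# to its rendering target, then synthesize the two results.
def run_frames(frames, grab_texture, composite):
    results = []
    for frame in frames:
        left_tex = grab_texture(frame, "left")    # first texture data
        right_tex = grab_texture(frame, "right")  # second texture data
        # transmit to the eye targets, process, then synthesize
        results.append(composite(left_tex, right_tex))
    return results

out = run_frames(
    range(2),
    grab_texture=lambda f, eye: (f, eye),
    composite=lambda l, r: {"left": l, "right": r},
)
```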
In particular, the system may create the vertex buffer object vertexbuffer_i, index buffer object indexbuffer_i, input layout object inputlayout_i, sampler object sampler_i, vertex shader object vsshader_i and fragment shader object psshader_i needed for rendering.
Further, assume that the first texture data of the left-eye camera leftcamera_i is lefttexture_i. In some embodiments of the present application, the processing of the first texture data based on the left-eye rendering target to perform the first operation of the frame cycle may further include: creating a first shader resource view object leftshaderresourceview_i according to the first texture data lefttexture_i; executing a clearing operation on the left-eye rendering target leftrendertarget_i, and setting background information of the left-eye rendering target after the clearing operation; determining the vertex buffer object vertexbuffer_i, index buffer object indexbuffer_i, input layout object inputlayout_i, vertex shader object vsshader_i, fragment shader object psshader_i, first shader resource view object leftshaderresourceview_i and sampler object sampler_i of the left-eye rendering target; and executing the first operation of the frame cycle according to an objective function together with the vertex buffer object vertexbuffer_i, index buffer object indexbuffer_i, input layout object inputlayout_i, vertex shader object vsshader_i, fragment shader object psshader_i, first shader resource view object leftshaderresourceview_i and sampler object sampler_i, so as to complete the rendering of each frame of the left-eye rendering target.
Similarly, in some embodiments of the present application, processing the second texture data based on the right-eye rendering target to perform a second operation of a frame loop may further include: acquiring second texture data for rendering the right eye camera; creating a second shader resource view object from the second texture data; executing a clearing operation on a historical rendering level of the right-eye rendering target, and setting background information of the right-eye rendering target after executing the clearing operation; determining a vertex buffer object, an index buffer object, a layout object, a vertex shader object, a fragment shader object, a second shader resource view object, and a sample object of the right eye render target; and executing a second operation of frame circulation according to the objective function and the vertex buffer object, the index buffer object, the layout object, the vertex shader object, the fragment shader object, the second shader resource view object and the sampling object so as to complete the rendering of each frame of the right eye rendering target.
It should be noted that the vertex buffer object vertexbuffer_i, index buffer object indexbuffer_i, input layout object inputlayout_i, sampler object sampler_i, vertex shader object vsshader_i and fragment shader object psshader_i required for rendering the left-eye and right-eye rendering targets are the same resources, created once in advance before the system renders and then called directly for both targets. The operation flow of executing the frame cycle according to the objective function and the corresponding vertex buffer, index buffer, layout, vertex shader, fragment shader, shader resource view and sampler objects is the same as in the prior art and is not detailed here.
In some examples, when setting the background information of the left-eye rendering target after the clearing operation is performed, black (RGBA (0, 0, 0, 1)) may be set as the background of the left-eye rendering target. Similarly, when setting the background information of the right-eye rendering target after the clearing operation is performed, black (RGBA (0, 0, 0, 1)) may be set as the background of the right-eye rendering target.
Wherein, in some examples, the above-mentioned objective function may be, but is not limited to, a Draw function.
Further, in some embodiments of the present application, performing a clearing operation on the historical rendering level of the left-eye rendering target comprises: during the N-th frame cycle of the left-eye rendering target, executing a clearing operation on the left-eye rendering target obtained in the (N-1)-th frame, where N is a positive integer greater than or equal to 2.
Similarly, in some embodiments of the present application, performing a clearing operation on the historical rendering level of the right-eye rendering target comprises: during the N-th frame cycle of the right-eye rendering target, executing a clearing operation on the right-eye rendering target obtained in the (N-1)-th frame; likewise, N is a positive integer greater than or equal to 2.
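The clear-then-draw discipline described above (before drawing frame N, the target still holds frame N-1 and is wiped to the black background) can be sketched as follows; the names are ours and a tuple stands in for the RGBA clear colour:

```python
# RGBA background applied after clearing, matching the black background
# described in the embodiments above (assumed to be (0, 0, 0, 1)).
BLACK = (0.0, 0.0, 0.0, 1.0)

def begin_frame(target):
    """Before rendering frame N, wipe the (N-1)-th frame's contents so
    no stale pixels from the previous frame leak into the new one."""
    target["pixels"] = BLACK
    return target

t = {"pixels": "frame N-1 contents"}
begin_frame(t)  # t now holds only the black background
```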
It should be noted that, in some examples, the first texture data and the second texture data are obtained directly through the Unity engine, which ensures that the data format of the texture data used in the scheme provided by the embodiments of the present application is consistent with the data format of the original texture data, thereby ensuring a consistent stereoscopic display effect.
The third embodiment of the present application may be an improvement on the second embodiment.
Example IV
In the fourth embodiment of the present application, after the synthesis processing is performed according to the result of the first operation and the result of the second operation, the method further includes: if an exit instruction is received, destroying at least the context object corresponding to the display device, the left-eye camera and the right-eye camera.
Specifically, in some examples, if an exit instruction is received, the context object created for each display device needs to be released, so the context object corresponding to the display device is destroyed, and the left-eye camera and the right-eye camera are destroyed at the same time.
Further, assuming the context object created for display device device_i needs to be released, the swap chain swapchain_i, left-eye rendering target leftrendertarget_i, right-eye rendering target rightrendertarget_i, vertex buffer object vertexbuffer_i, index buffer object indexbuffer_i, input layout object inputlayout_i, sampler object sampler_i, vertex shader object vsshader_i, fragment shader object psshader_i, left-eye camera leftcamera_i and right-eye camera rightcamera_i need to be destroyed.
The fourth embodiment of the present application may be an improvement on the basis of the first and/or second embodiments.
Example five
The fifth embodiment of the application provides a stereoscopic display device of display equipment.
In particular, the device may comprise: a first creation module for performing an initialization operation and creating a context object corresponding to the display device; a second creating module for creating a left eye camera and a right eye camera according to the context object; the determining module is used for determining a left-eye rendering target according to the left-eye camera and determining a right-eye rendering target according to the right-eye camera; an execution module, configured to execute a first operation of a frame cycle according to the left-eye rendering target and execute a second operation of the frame cycle according to the right-eye rendering target; and the display module is used for carrying out synthesis processing according to the result of the first operation and the result of the second operation so as to realize three-dimensional display.
In some examples, the determining a left-eye rendering target from the left-eye camera and determining a right-eye rendering target from the right-eye camera includes: determining a window corresponding to the display device; creating a swap chain according to the context object and the window; and determining the left-eye rendering target according to the swap chain and the left-eye camera, and determining the right-eye rendering target according to the swap chain and the right-eye camera.
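The order of operations in this example can be sketched as below: resolve the device's window, create a swap chain from the context object and the window, then derive one rendering target per eye from that swap chain. All names here are illustrative assumptions; a real implementation would call the graphics API's swap-chain and render-target-view creation functions.

```python
# Illustrative model of: window -> swap chain -> per-eye render targets.

def create_swap_chain(context, window):
    # Stand-in for a graphics-API swap chain tied to a context and a window.
    return {"context": context, "window": window, "buffers": []}

def create_render_target(swap_chain, camera):
    target = {"swap_chain": swap_chain, "camera": camera}
    swap_chain["buffers"].append(target)
    return target

def setup_targets(context, display_window, left_camera, right_camera):
    swap_chain = create_swap_chain(context, display_window)
    left_target = create_render_target(swap_chain, left_camera)
    right_target = create_render_target(swap_chain, right_camera)
    return swap_chain, left_target, right_target

chain, left, right = setup_targets("ctx_0", "window_0", "leftcam", "rightcam")
assert left["camera"] == "leftcam" and right["camera"] == "rightcam"
assert len(chain["buffers"]) == 2
```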
In some examples, the executing a first operation of a frame cycle according to the left-eye rendering target includes: acquiring first texture data rendered by the left-eye camera; transmitting the first texture data to the left-eye rendering target; and processing the first texture data based on the left-eye rendering target to perform the first operation of the frame cycle.
In some examples, the executing a second operation of the frame cycle according to the right-eye rendering target includes: acquiring second texture data rendered by the right-eye camera; transmitting the second texture data to the right-eye rendering target; and processing the second texture data based on the right-eye rendering target to perform the second operation of the frame cycle.
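The per-eye sequence in the two examples above (acquire the camera's texture data, transmit it to that eye's rendering target, then process it there) can be sketched as follows. The class names are illustrative assumptions, not the patent's API.

```python
# Minimal model of the per-eye frame-cycle operation: acquire -> transmit -> process.

class Camera:
    def __init__(self, eye):
        self.eye = eye
    def render(self):
        return f"{self.eye}_texture"      # texture data rendered by this camera

class RenderTarget:
    def __init__(self, eye):
        self.eye = eye
        self.texture = None
    def process(self):
        # Stand-in for the frame-cycle operation performed on this target.
        return f"processed:{self.texture}"

def run_eye_pass(camera, target):
    texture = camera.render()             # step 1: acquire texture data
    target.texture = texture              # step 2: transmit to the render target
    return target.process()               # step 3: process to perform the operation

left_result = run_eye_pass(Camera("left"), RenderTarget("left"))
right_result = run_eye_pass(Camera("right"), RenderTarget("right"))
assert left_result == "processed:left_texture"
assert right_result == "processed:right_texture"
```

The two results would then be composited to realize the stereoscopic display, as the text describes.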
In some examples, the processing the first texture data based on the left-eye rendering target to perform a first operation of the frame cycle includes: creating a first shader resource view object from the first texture data; executing a clearing operation on the historical rendering level of the left-eye rendering target, and setting background information of the left-eye rendering target after executing the clearing operation; determining a vertex buffer object, an index buffer object, a layout object, a vertex shader object, a fragment shader object, the first shader resource view object, and a sampling object of the left-eye rendering target; and executing the first operation of the frame cycle according to the objective function and the vertex buffer object, the index buffer object, the layout object, the vertex shader object, the fragment shader object, the first shader resource view object and the sampling object, so as to complete the rendering of each frame of the left-eye rendering target.
In some examples, the processing the second texture data based on the right-eye rendering target to perform a second operation of the frame cycle includes: creating a second shader resource view object from the second texture data; executing a clearing operation on the historical rendering level of the right-eye rendering target, and setting background information of the right-eye rendering target after executing the clearing operation; determining a vertex buffer object, an index buffer object, a layout object, a vertex shader object, a fragment shader object, the second shader resource view object, and a sampling object of the right-eye rendering target; and executing the second operation of the frame cycle according to the objective function and the vertex buffer object, the index buffer object, the layout object, the vertex shader object, the fragment shader object, the second shader resource view object and the sampling object, so as to complete the rendering of each frame of the right-eye rendering target.
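The per-frame sequence described in these two examples can be sketched as below: build a shader resource view from the texture data, clear the historical rendering, set the background, gather the pipeline objects, and invoke the draw (the "objective function"). Every name here is an illustrative assumption, not the patent's actual API.

```python
# Sketch of one eye's per-frame operation: SRV -> clear -> background -> bind -> draw.

def render_frame(target, texture_data, pipeline, draw_fn):
    srv = {"texture": texture_data}           # shader resource view object
    target["history"] = None                  # clear historical rendering level
    target["background"] = (0, 0, 0, 1)       # set background after clearing
    # Bind vertex/index buffers, layout, shaders and sampler, plus the new SRV.
    bindings = {**pipeline, "srv": srv}
    return draw_fn(target, bindings)

pipeline = {k: object() for k in (
    "vertex_buffer", "index_buffer", "layout",
    "vertex_shader", "fragment_shader", "sampler")}

def draw(target, bindings):
    # Stand-in for the objective function that completes this frame's rendering.
    return sorted(bindings)

bound = render_frame({"history": "frame N-1"}, b"tex", pipeline, draw)
assert "srv" in bound and len(bound) == 7
```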
In some examples, the executing a clearing operation on the historical rendering level of the left-eye rendering target comprises: during execution of the N-th frame cycle on the left-eye rendering target, executing a clearing operation on the left-eye rendering target obtained in the (N-1)-th frame cycle; wherein N is a positive integer greater than or equal to 2.
In some examples, the executing a clearing operation on the historical rendering level of the right-eye rendering target comprises: during execution of the N-th frame cycle on the right-eye rendering target, executing a clearing operation on the right-eye rendering target obtained in the (N-1)-th frame cycle; wherein N is a positive integer greater than or equal to 2.
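The N / (N-1) relationship in the two examples above can be modeled directly: from the second frame cycle onward, each cycle first clears the result left over from the previous cycle. This is an illustrative sketch with assumed names, not the patent's implementation.

```python
# Model of the clearing rule: at frame N (N >= 2), clear the target from frame N-1.

class EyeTarget:
    def __init__(self):
        self.frame = 0
        self.cleared_frames = []   # which historical frames were cleared

    def run_frame(self):
        self.frame += 1
        if self.frame >= 2:
            # At frame N, clear the rendering left over from frame N-1.
            self.cleared_frames.append(self.frame - 1)

t = EyeTarget()
for _ in range(4):
    t.run_frame()
assert t.cleared_frames == [1, 2, 3]   # frame 1 has no history to clear
```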
In some examples, after the synthesizing process is performed according to the result of the first operation and the result of the second operation, the method further includes: if an exit instruction is received, at least the context object corresponding to the display device, the left-eye camera and the right-eye camera are destroyed.
In some examples, the application corresponding to the method is integrated into a Unity plugin, the Unity plugin is installed in a Unity engine, and the Unity engine is installed in the host.
It should be noted that the stereoscopic display apparatus provided in the fifth embodiment of the present application is the apparatus embodiment corresponding to any one or more of the first to fourth embodiments; the implementation details given in the foregoing method embodiments are equally applicable here, and, to avoid repetition, are not described again.
Example six
A sixth embodiment of the present application further provides a computer device, the structure of which is shown in Fig. 3. The device comprises a memory 10 for storing computer-readable instructions and a processor 20 for executing the computer-readable instructions, wherein the computer-readable instructions, when executed by the processor, trigger the processor to perform the above method.
The methods and/or embodiments of the present application may be implemented as a computer software program. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. The above-described functions defined in the method of the application are performed when the computer program is executed by a processing unit.
The computer readable medium according to the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowchart or block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
As another aspect, the embodiment of the present application also provides a computer-readable medium that may be contained in the apparatus described in the above embodiment; or may be present alone without being fitted into the device. The computer readable medium carries one or more computer readable instructions executable by a processor to perform the steps of the methods and/or aspects of the various embodiments of the application described above.
In one exemplary configuration of the application, the terminal and the devices of the service network each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include both permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computing device.
In addition, the embodiment of the application also provides a computer program stored in the computer device which, when executed, causes the computer device to perform the above-described method.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, e.g., using Application Specific Integrated Circuits (ASIC), a general purpose computer or any other similar hardware device. In some embodiments, the software program of the present application may be executed by a processor to implement the above steps or functions. Likewise, the software programs of the present application (including associated data structures) may be stored on a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. In addition, some steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is evident that the word "comprising" does not exclude other elements or steps, and that the singular does not exclude a plurality. A plurality of units or means recited in the apparatus claims can also be implemented by means of one unit or means in software or hardware. The terms first, second, etc. are used to denote a name, but not any particular order.

Claims (10)

1. A stereoscopic display method of a display device, the method being applied to a stereoscopic display system including a host and at least one display device, the host and the at least one display device being communicatively connected, the method comprising:
performing an initialization operation and creating a context object corresponding to the display device;
creating a left eye camera and a right eye camera according to the context object;
determining a left-eye rendering target according to the left-eye camera, and determining a right-eye rendering target according to the right-eye camera;
executing a first operation of a frame cycle according to the left-eye rendering target, and executing a second operation of the frame cycle according to the right-eye rendering target;
and carrying out synthesis processing according to the result of the first operation and the result of the second operation to realize three-dimensional display.
2. The method of claim 1, wherein the determining a left-eye rendering target from the left-eye camera and determining a right-eye rendering target from the right-eye camera comprises:
determining a window corresponding to the display device;
creating a swap chain according to the context object and the window;
and determining a left-eye rendering target according to the swap chain and the left-eye camera, and determining a right-eye rendering target according to the swap chain and the right-eye camera.
3. The method of claim 1, wherein the executing a first operation of a frame cycle according to the left-eye rendering target comprises:
acquiring first texture data for rendering the left-eye camera;
transmitting the first texture data to the left-eye rendering target;
processing the first texture data based on the left-eye rendering target to perform a first operation of a frame cycle;
the executing a second operation of the frame cycle according to the right-eye rendering target comprises:
acquiring second texture data for rendering the right eye camera;
transmitting the second texture data to the right-eye rendering target;
processing the second texture data based on the right-eye rendering target to perform a second operation of a frame loop.
4. The method of claim 3, wherein the processing the first texture data based on the left-eye rendering target to perform a first operation of a frame cycle comprises:
creating a first shader resource view object from the first texture data;
executing a clearing operation on a historical rendering level of the left-eye rendering target, and setting background information of the left-eye rendering target after executing the clearing operation;
determining a vertex buffer object, an index buffer object, a layout object, a vertex shader object, a fragment shader object, a first shader resource view object, and a sample object of the left eye render target;
performing a first operation of the frame cycle according to an objective function and the vertex buffer object, the index buffer object, the layout object, the vertex shader object, the fragment shader object, the first shader resource view object and the sampling object, to complete the rendering of each frame of the left-eye rendering target;
the processing the second texture data based on the right-eye rendering target to perform a second operation of a frame loop includes:
creating a second shader resource view object from the second texture data;
executing a clearing operation on a historical rendering level of the right-eye rendering target, and setting background information of the right-eye rendering target after executing the clearing operation;
determining a vertex buffer object, an index buffer object, a layout object, a vertex shader object, a fragment shader object, a second shader resource view object, and a sample object of the right eye render target;
and executing a second operation of the frame cycle according to the objective function and the vertex buffer object, the index buffer object, the layout object, the vertex shader object, the fragment shader object, the second shader resource view object and the sampling object, to complete the rendering of each frame of the right-eye rendering target.
5. The method of claim 4, wherein:
the performing a clearing operation on the historical rendering level of the left-eye rendering target comprises: during execution of the N-th frame cycle on the left-eye rendering target, performing a clearing operation on the left-eye rendering target obtained in the (N-1)-th frame cycle;
the performing a clearing operation on the historical rendering level of the right-eye rendering target comprises: during execution of the N-th frame cycle on the right-eye rendering target, performing a clearing operation on the right-eye rendering target obtained in the (N-1)-th frame cycle;
wherein N is a positive integer greater than or equal to 2.
6. The method according to claim 1, wherein after the synthesizing process is performed according to the result of the first operation and the result of the second operation, the method further comprises:
if an exit instruction is received, at least the context object corresponding to the display device, the left-eye camera and the right-eye camera are destroyed.
7. The method according to any one of claims 1 to 6, wherein an application corresponding to the method is integrated in a Unity plugin, the Unity plugin is installed in a Unity engine, and the Unity engine is installed in the host.
8. A stereoscopic display apparatus of a display device, the apparatus comprising:
a first creation module for performing an initialization operation and creating a context object corresponding to the display device;
a second creating module for creating a left eye camera and a right eye camera according to the context object;
the determining module is used for determining a left-eye rendering target according to the left-eye camera and determining a right-eye rendering target according to the right-eye camera;
an execution module, configured to execute a first operation of a frame cycle according to the left-eye rendering target and execute a second operation of the frame cycle according to the right-eye rendering target;
and the display module is used for carrying out synthesis processing according to the result of the first operation and the result of the second operation so as to realize three-dimensional display.
9. A computer device, the device comprising:
one or more processors; and
a memory storing computer program instructions that, when executed, cause the processor to perform the method of any of claims 1 to 7.
10. A computer readable medium having stored thereon computer program instructions executable by a processor to implement the method of any of claims 1 to 7.
CN202311111805.9A 2023-08-31 2023-08-31 Stereoscopic display method, stereoscopic display device, stereoscopic display equipment and readable medium Pending CN117041515A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311111805.9A CN117041515A (en) 2023-08-31 2023-08-31 Stereoscopic display method, stereoscopic display device, stereoscopic display equipment and readable medium


Publications (1)

Publication Number Publication Date
CN117041515A true CN117041515A (en) 2023-11-10

Family

ID=88631709


Country Status (1)

Country Link
CN (1) CN117041515A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination