CN111831353B - Runtime library based on the OpenXR standard, data interaction method, device and medium - Google Patents

Runtime library based on the OpenXR standard, data interaction method, device and medium

Info

Publication number
CN111831353B (application CN202010655829.0A)
Authority
CN
China
Prior art keywords
client
application program
module
library
pose data
Prior art date
Legal status
Active
Application number
CN202010655829.0A
Other languages
Chinese (zh)
Other versions
CN111831353A (en)
Inventor
李孟臻
范承鑫
赵凯
李岩
Current Assignee
Parallel Cloud Technology Beijing Co ltd
Original Assignee
Parallel Cloud Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Parallel Cloud Technology Beijing Co ltd filed Critical Parallel Cloud Technology Beijing Co ltd
Priority to CN202010655829.0A
Publication of CN111831353A
Application granted
Publication of CN111831353B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/445 - Program loading or initiating
    • G06F9/44521 - Dynamic linking or loading; link editing at or after load time, e.g. Java class loading
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 - Multiprogramming arrangements
    • G06F9/54 - Interprogram communication
    • G06F9/544 - Buffers; shared memory; pipes
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application provides a runtime library based on the OpenXR standard, together with a data interaction method, a device and a medium. The runtime library comprises: an identifier acquisition module for acquiring a unique identifier, the unique identifier being matched with the VR/AR application request sent by a client, the runtime library being loaded by the VR/AR application when the application starts; a configuration information reading module for reading the configuration information in the corresponding named shared memory according to the acquired unique identifier; a network communication module for establishing a UDP socket connection with the client; and a video stream generation module for generating a video stream and sending it to the network communication module. By creating unique identifiers, the application achieves one-to-one binding between VR/AR applications and clients, so that different VR/AR applications requested by multiple clients can be started on one PC at the same time without confusion, effectively saving system resources and cost.

Description

Runtime library based on the OpenXR standard, data interaction method, device and medium
Technical Field
The application relates to the technical field of virtual reality, in particular to a runtime library based on the OpenXR standard, a data interaction method, a device and a medium.
Background
VR (Virtual Reality) and AR (Augmented Reality) have received widespread attention in recent years, but a problem has followed: while a large number of hardware and software companies have redoubled their efforts in this field, the ever-growing variety of devices, each with its own incompatible API and SDK, keeps making the fragmentation problem more severe.
API fragmentation forces application developers to spend a great deal of time, money and resources integrating with various hardware in order to be compatible with more devices. Even large teams are forced to choose which platforms and devices to support, and for small teams the problem is worse still: lacking the money and resources of large teams, they are squeezed out, which leads to a severe polarization of the VR and AR markets and greatly harms the richness and diversity of content.
To address this, Valve proposed OpenVR, a common API for VR/AR devices. Publishing OpenVR does solve the API fragmentation problem, but its runtime, SteamVR, is not open source, so there is no opportunity for secondary development and customization needs cannot be met. In addition, OpenVR allows only one VR/AR application instance to run on a computer, which wastes resources: even if one computer has enough resources to run five VR/AR applications of equal resource consumption, with OpenVR it can still run only one, so a server built for VR/AR applications can serve only one user at a time.
At present, some companies have developed runtime libraries according to the OpenXR specification. For example, Microsoft has developed its own runtime based on OpenXR, and Oculus has released one as well; however, Microsoft's runtime can only connect to its HoloLens series of VR/AR devices, and the Oculus runtime likewise targets only its own devices, so the compatibility of these runtime libraries is poor. Moreover, traditional VR/AR is mostly wired: an HDMI or DP cable connects the VR/AR device to the PC, so the user cannot move around freely, convenience is limited, and the user experience suffers.
Disclosure of Invention
In view of this, the application provides a runtime library, a data interaction method, a device and a medium based on the OpenXR standard, aiming at a runtime library solution with better universality and compatibility, so that users can smoothly enjoy VR/AR content on different VR/AR devices.
To achieve the above purpose, the technical scheme adopted in the application is as follows:
In a first aspect, the present application provides a runtime library based on the OpenXR standard, where the runtime library resides on a PC that further contains a server, a VR/AR application and a named shared memory, and the runtime library includes:
an identifier acquisition module for acquiring a unique identifier, the unique identifier being matched with the VR/AR application request sent by a client; the runtime library is loaded by the VR/AR application when the application starts; the unique identifier identifies the named shared memory, into which the configuration information of the client that sent the VR/AR application request is written; the configuration information includes the client's field-of-view information, resolution, frame rate, interpupillary distance (IPD) and height information;
a configuration information reading module for reading the configuration information in the corresponding named shared memory according to the acquired unique identifier;
a network communication module for establishing a UDP socket connection with the client, receiving pose data from the client, and sending the generated video stream to the client;
and a video stream generation module for processing the received pose data, generating a video stream, and sending the video stream to the network communication module.
Optionally, the video stream generation module specifically includes:
a data operation module for performing matrix rotation and matrix translation operations on the pose data, the IPD and the height information to obtain processed pose data for the left and right eyes, and providing the processed pose data to the VR/AR application;
a texture extraction module for extracting the left-eye and right-eye textures that the VR/AR application has rendered from the processed pose data, and sending them to the rendering and encoding module;
a rendering and encoding module for re-rendering the received textures, encoding and packaging the rendered textures into a video stream, and submitting the video stream to the network communication module.
Optionally, the runtime library further includes:
a buffer queue module for buffering pose data received from the client;
a monitoring module for monitoring the length of the pose data buffer queue;
and a refresh module for adjusting, after the runtime library receives pose data, the frequency at which pose information is provided to the VR/AR application according to the client's refresh rate.
Optionally, the runtime library further includes:
a first key set module for storing the set of controller key actions the VR/AR application listens for;
a second key set module for storing the set of controller key actions the client can send;
and a key action sending module for taking the intersection of the key actions stored in the first key set module and the second key set module and providing it to the VR/AR application.
In a second aspect, the present application provides a data interaction method for a runtime library based on the OpenXR standard, where the runtime library resides on a PC that further contains a server, a VR/AR application and a named shared memory, and the method includes:
the server on the PC receives a VR/AR application request from a client, where the request contains the client's configuration information, including field-of-view information, resolution, frame rate, interpupillary distance (IPD) and height information;
the server parses the configuration information in the VR/AR application request, writes the parsed configuration information into a named shared memory, and creates a unique identifier matched with the request to identify the named shared memory;
the runtime library is loaded by the VR/AR application requested by the client;
the runtime library obtains the unique identifier matched with the VR/AR application request and reads the configuration information in the corresponding shared memory through the unique identifier;
after a UDP socket connection is established between the runtime library and the client, the client sends pose data to the runtime library;
and the runtime library receives and processes the pose data, generates a video stream, and sends the video stream to the client.
Optionally, the specific method by which the runtime library processes the received pose data, generates a video stream and sends it to the client includes:
the runtime library performs matrix rotation and matrix translation operations on the pose data, the IPD and the height information to obtain processed pose data for the left and right eyes, and provides the processed pose data to the VR/AR application;
the VR/AR application renders the left-eye and right-eye textures from the processed pose data, generates texture information, and sends it to the runtime library;
the runtime library receives the texture information and extracts the textures from it;
and the runtime library re-renders the received textures, encodes and packages the rendered textures into a video stream, and sends the video stream to the client.
Optionally, after the runtime library receives the pose data, the method further includes:
placing the pose data into a buffer queue.
Optionally, the specific method for establishing the UDP socket connection between the client and the runtime library is:
the runtime library starts a UDP socket service and provides the port number bound to the UDP protocol to the client;
and the client connects to the runtime library using that port number.
In a third aspect, embodiments of the present application further provide a device, including: a processor, a memory, and a communication unit;
the memory stores machine-readable instructions executable by the processor; when the device operates, the processor and the memory communicate via the communication unit;
the processor executes the machine-readable instructions to perform the method of the above aspects.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, performs the method of the above aspects.
The beneficial effects of this application are:
1. by creating unique identifiers, one-to-one binding between VR/AR applications and clients is achieved, so that different VR/AR applications requested by multiple clients can be started on one PC at the same time without confusion, effectively saving system resources and cost;
2. the runtime library of the application communicates with clients over the UDP protocol, so any client able to establish a UDP communication link with the runtime library can connect to it, with no particular restriction on the client's manufacturer, model, or software and hardware configuration; the runtime library of the application therefore has better universality and compatibility.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting in scope; a person skilled in the art may obtain other related drawings from them without inventive effort.
FIG. 1 is a block diagram of the architecture of the runtime library based on the OpenXR standard of the present application;
FIG. 2 is a block diagram of the video stream generation module of the present application;
FIG. 3 is a flowchart of the data interaction method of the runtime library based on the OpenXR standard of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments.
The runtime library and a VR/AR (VR or AR) application are in a one-to-one relationship: each time a VR/AR application starts, it loads the runtime library once, and each load creates an independent space in memory for the runtime library; that is, each loaded instance of the runtime library is independent. Suppose 3 VR/AR applications are started: A, B and C, meaning there are 3 clients, i.e. 3 pairs of VR/AR glasses: a, b and c. Each runtime instance must then know which client to send its encoded video to, in a strict one-to-one relationship: if application A was started by glasses a, then glasses a should receive the image of application A captured and encoded by the runtime instance loaded by application A, not that of application B or C. Likewise, the pose information sent by glasses a should go to the runtime instance loaded by application A rather than to application B or C.
If glasses a, b and c come from 3 different vendors, the parameter requirements of the 3 pairs of glasses for the received video may also differ, for example in resolution and FOV (field of view). Even when the resolution and FOV requirements are the same, some VR/AR applications may need a higher FPS (frame rate) for a better experience while others do not, and the height of each client's user may differ; in VR/AR applications, user height is a significant parameter affecting the experience. These individual personalized parameters likewise require that each client's corresponding runtime instance process them and then pass them to the VR/AR application through the interface between the runtime library and the application.
To solve the technical problem of achieving one-to-one matching between VR/AR applications and clients, a first aspect of the present application provides a runtime library based on the OpenXR standard, where the runtime library resides on a PC that further contains a server, the VR/AR application and a named shared memory. As shown in fig. 1, the runtime library includes:
the identifier acquisition module 110, configured to acquire a unique identifier matched with the VR/AR application request sent by a client; the runtime library is loaded by the VR/AR application when the application starts; the unique identifier identifies the named shared memory, into which the configuration information of the client that sent the request is written; the configuration information includes the client's field-of-view information, resolution, frame rate, interpupillary distance (IPD) and height information;
Typically, starting a VR/AR application is accompanied by loading the runtime library. It is the same library on disk, but in memory each load is a different instance with its own memory range. The runtime library communicates with external processes through shared memory: each loaded instance of the runtime library, which is essentially a dynamic library, has a separate block of shared memory for communicating with external processes.
In the present application, each time the server receives a VR/AR application request sent by a client, it generates a unique identifier matched with that request; the unique identifier identifies the named shared memory storing the client's configuration information. The export functions of the runtime library include not only the export interfaces specified by the OpenXR standard but also an interface for passing in the unique identifier; before the runtime library can be loaded correctly, that is, before the application calls the interfaces specified by the OpenXR standard, this added interface is called first to pass in the unique identifier.
Specifically, the unique identifier may be used as a suffix of the name of the named shared memory; for example, if the named shared memory is called cloudlark_123456, then 123456 is the unique identifier.
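This naming convention can be sketched with a pair of helper functions (a minimal illustration; the cloudlark_ prefix is taken from the example above, and the function names are hypothetical):

```python
SHM_PREFIX = "cloudlark_"  # prefix taken from the example name in the text

def shm_name(unique_id: str) -> str:
    """Build the named-shared-memory name by appending the unique identifier."""
    return SHM_PREFIX + unique_id

def parse_unique_id(name: str) -> str:
    """Recover the unique identifier from a shared-memory name."""
    if not name.startswith(SHM_PREFIX):
        raise ValueError("not a runtime shared-memory name: " + name)
    return name[len(SHM_PREFIX):]
```

Because the identifier is embedded in the name itself, the runtime instance needs nothing more than the identifier to locate its own block of configuration data.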
Specifically, the client resides in a client device, which may be smart glasses, an all-in-one VR headset, or the like, and the height information is the height of the VR/AR application's user.
The configuration information reading module 120, configured to read the configuration information in the corresponding named shared memory according to the obtained unique identifier;
After the runtime library obtains the unique identifier, it uses the identifier as the name of the named shared memory to read its contents. Note again that each loaded instance of the runtime library has its own independent block of shared memory for communicating with external processes.
The network communication module 130, configured to establish a UDP socket connection with the client, receive pose data from the client, and send the generated video stream to the client;
In the present application, the client and the runtime library communicate over the UDP protocol. To keep the client and the VR/AR application one-to-one, that is, the client and the runtime instance loaded by the VR/AR application one-to-one, the runtime library must tell the client the port number bound to the UDP protocol: after its UDP component starts, the runtime library informs the client of the UDP port number, and the client sends pose data to, and receives the encoded video stream from, the runtime library at that port.
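A minimal sketch of this handshake, under assumptions: loopback addresses, a toy pose payload, and hypothetical function names. The runtime binds a UDP socket, learns the OS-assigned port, and the client then sends pose data to exactly that port:

```python
import socket

def start_pose_service():
    """Bind a UDP socket to an OS-chosen port; the runtime would then
    report this port number to its one matched client, so pose data and
    the encoded video stream stay strictly one-to-one."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", 0))  # port 0 lets the OS pick a free port
    return sock, sock.getsockname()[1]

# Usage: a client that has learned the port sends a pose packet to it.
server, port = start_pose_service()
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"pose:0.0,1.7,0.0", ("127.0.0.1", port))
packet, _ = server.recvfrom(1024)
```

Because each runtime instance binds its own port, packets from different clients can never be confused even when several instances run on the same PC.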
The video stream generation module 140, configured to process the received pose data, generate a video stream, and send the video stream to the network communication module.
Because the unique identifier is matched with the VR/AR application request, the request is matched with the client that sent it, and the runtime library is loaded once per started VR/AR application, the runtime instance reads the information of the corresponding client by obtaining the unique identifier; this matches the runtime instance with the client, and hence matches the VR/AR application that loaded it with the client.
Creating unique identifiers thus achieves one-to-one matching between VR/AR applications and clients, so that different VR/AR applications requested by multiple clients can be started simultaneously on one PC without confusion. In addition, the runtime library communicates with clients over the UDP protocol, so any client able to establish a UDP communication link with it can connect, with no particular restriction on the client's manufacturer, model, or software and hardware configuration; the runtime library of the application therefore has better universality and compatibility.
Specifically, as shown in fig. 2, the video stream generation module 140 includes:
the data operation module 141, configured to perform matrix rotation and matrix translation operations on the pose data, the IPD and the height information, obtain processed pose data for the left and right eyes, and provide the processed pose data to the VR/AR application;
To give the rendered picture a stereoscopic effect, the application needs to compute the pose data of the left and right eyes using the interpupillary distance. The default IPD is 0.064 metres and can be set according to actual conditions.
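The left/right-eye computation can be sketched as follows, with the matrix work reduced to a yaw rotation about the vertical axis (a simplification; the function and its exact parameterization are illustrative, and only the 0.064 m default IPD comes from the text):

```python
import math

DEFAULT_IPD = 0.064  # metres, the default interpupillary distance

def eye_positions(head_pos, yaw_rad, ipd=DEFAULT_IPD, user_height=0.0):
    """Offset the head position half an IPD to each side along the head's
    local x-axis (rotated by yaw about the vertical y-axis), and raise the
    result by the user's height."""
    x, y, z = head_pos
    rx, rz = math.cos(yaw_rad), -math.sin(yaw_rad)  # rotated right vector
    half = ipd / 2.0
    left = (x - half * rx, y + user_height, z - half * rz)
    right = (x + half * rx, y + user_height, z + half * rz)
    return left, right
```

A full runtime would use 4x4 rotation and translation matrices on the complete 6-DOF pose; this sketch only shows why both the IPD and the height enter the transform.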
The texture extraction module 142, configured to extract the left-eye and right-eye textures that the VR/AR application has rendered from the provided processed pose data, and send the extracted textures to the rendering and encoding module;
Specifically, after the VR/AR application obtains the processed pose data, it renders the left-eye and right-eye textures accordingly, generates texture information, and sends it to the texture extraction module 142 of the runtime library, which extracts the left-eye and right-eye textures from the texture information.
In actual communication, the VR/AR application may call an interface of the runtime library to indicate whether its textures are submitted as separate left and right eyes or as a combined left-right texture. If they are separate, the texture extraction module 142 must combine the left-eye and right-eye textures before submitting them to the rendering and encoding module 143; if they are already combined, it submits them directly.
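The separate-eyes case can be sketched as a side-by-side join of the two textures, here modeled as row lists of pixels (a toy stand-in for GPU texture copies; the function name is hypothetical):

```python
def combine_eyes(left_rows, right_rows):
    """Join the left-eye and right-eye textures side by side, row by row,
    producing the single combined texture handed to the encoder."""
    if len(left_rows) != len(right_rows):
        raise ValueError("eye textures must have the same height")
    return [l + r for l, r in zip(left_rows, right_rows)]
```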
The rendering and encoding module 143, configured to re-render the received textures, package the rendered and encoded textures into a video stream, and submit it to the network communication module 130.
The rendering and encoding module 143 re-renders the received textures on the local device, i.e. the graphics card, encodes and packages them into a video stream, and submits it to the network communication module 130. For encoding, the runtime library of the application can use NVIDIA's hardware encoding interface, which is more efficient and stable than software encoding. When encoding the video, the rendering and encoding module 143 can encode according to the resolution requirement in the client's configuration information, so that after receiving and decoding the video, the client obtains a resolution matching its own.
As an alternative embodiment, the runtime library further includes:
a signaling processing module for processing the various received signaling messages, such as starting or stopping the sending of the video stream;
For example, when the client disconnects, the VR/AR application is also closed and the video stream stops being sent.
The buffer queue module, for buffering pose data received from the client;
The monitoring module, for monitoring the length of the pose data buffer queue;
The runtime library creates a buffer queue of pose data in preparation for receiving pose data sent by the client. At the same time, the length of the pose data queue must be monitored: if the queue grows too long, pose data congestion arises and communication latency rises; when the latency reaches a certain value, for example 80 milliseconds or more, the pose data queue can be cleared to reduce the latency caused by congestion.
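The congestion rule can be sketched as follows, under assumptions: timestamped queue entries and a hypothetical function name, with the 80 ms threshold taken from the text. When the oldest queued pose is older than the limit, everything but the freshest sample is dropped:

```python
from collections import deque

LATENCY_LIMIT = 0.080  # seconds; the 80 ms threshold mentioned above

def drain_if_congested(queue, now):
    """Drop stale pose samples when the oldest entry exceeds the latency
    limit, keeping only the newest one. Entries are (timestamp, pose)
    tuples; returns the number of samples dropped."""
    if queue and now - queue[0][0] >= LATENCY_LIMIT:
        newest = queue[-1]
        dropped = len(queue) - 1
        queue.clear()
        queue.append(newest)
        return dropped
    return 0
```

Keeping the newest sample rather than clearing the queue entirely means the application always has a recent pose to render from.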
The refresh module, for adjusting, after the runtime library receives pose data, the frequency at which pose information is provided to the VR/AR application according to the client's refresh rate.
After the client and the runtime library establish a UDP socket connection, the client starts sending pose data to the runtime library. After receiving the pose data, the runtime library adjusts the frequency at which pose data is provided to the VR/AR application according to the client's refresh rate, i.e. its frame rate (a client typically sends pose data to the runtime library at 60 Hz). The frequency at which the VR/AR application requests pose information can be adjusted by briefly suspending the requesting thread, because the VR/AR application renders the next frame from the pose information it receives.
Note that the frame rate the client reports to the runtime library is in fact the frequency at which the client sends pose data: to produce one frame, the VR/AR application must obtain one piece of pose data, render once, and submit the result to the runtime library.
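The thread-suspension pacing can be sketched like this (illustrative; the function name and the monotonic-clock bookkeeping are assumptions, while the 60 Hz figure comes from the text):

```python
import time

def pace_pose_delivery(client_hz, last_delivery):
    """Sleep until one refresh interval has elapsed since the previous
    delivery, so poses reach the application at the client's frame rate.
    Returns the timestamp of this delivery."""
    interval = 1.0 / client_hz
    remaining = interval - (time.monotonic() - last_delivery)
    if remaining > 0:
        time.sleep(remaining)  # briefly suspend the requesting thread
    return time.monotonic()
```

At 60 Hz the interval is about 16.7 ms; if the application asks for a pose again sooner than that, the call sleeps away the difference.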
As an alternative embodiment, the runtime library further includes:
the first key set module, for storing the set of controller key actions the VR/AR application listens for;
the second key set module, for storing the set of controller key actions the client can send;
and the key action sending module, for taking the intersection of the key actions stored in the first key set module and the second key set module and providing it to the VR/AR application.
When VR/AR applications are in use, the pose data covers not only the pose of the headset: many VR/AR applications use controllers to let the user interact with the application. After a VR/AR application starts, it does not know what kind of hardware, i.e. which controller type, is currently connected to the runtime library, nor does the hardware know which key states the application needs; this creates a problem of bidirectional blind selection.
To solve this problem, the application builds a mapping in the runtime library that treats all controller types and key actions supported by the OpenXR standard as the full set of key actions. After the application starts, it tells the runtime library, through a specific interface, which key actions it listens for; the runtime library puts these keys into set 1. When the client, i.e. the VR/AR device, connects to the runtime library, it likewise reports which key actions it can currently send; the runtime library puts these into set 2. When the application requests the controller key action states, the runtime library provides it with the contents of the intersection of set 1 and set 2. In this scheme, set 1 and set 2 must both be subsets of the full set of key actions, that is, devices and key actions supported within the OpenXR specification.
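The two-set scheme above reduces to a set intersection, sketched here with a hypothetical handful of action names standing in for the OpenXR full set:

```python
# Hypothetical stand-ins for the full set of key actions defined by OpenXR.
FULL_SET = {"trigger_click", "grip_click", "thumbstick_x", "a_click", "b_click"}

def deliverable_actions(set1, set2):
    """Set 1 (actions the application listens for) intersected with set 2
    (actions the connected client can send); both must be subsets of the
    OpenXR-defined full set."""
    if not (set1 <= FULL_SET and set2 <= FULL_SET):
        raise ValueError("key actions must come from the OpenXR full set")
    return set1 & set2
```

Only actions in the intersection ever reach the application, which resolves the bidirectional blind selection: neither side needs to know the other in advance, only the shared OpenXR vocabulary.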
In a second aspect, the present application provides a data interaction method for a runtime library based on the OpenXR standard, where the runtime library resides in a PC, and the PC further includes a server, a VR/AR application program and a named shared memory; as shown in fig. 3, the method includes:
S301: a server in the PC receives a VR/AR application program request from a client, where the VR/AR application program request includes configuration information of the client, and the configuration information includes the client's view angle information, resolution, frame rate, pupil distance information and height information;
Specifically, the client runs on a client device, which may be smart glasses, a VR all-in-one headset, or the like, and the height information is the height of the person experiencing or using the VR/AR application.
S302: the server parses the configuration information in the VR/AR application program request, writes the parsed configuration information into a named shared memory, and creates a unique identifier matched with the VR/AR application program request to identify the named shared memory;
Specifically, the unique identifier may be used as a suffix of the name of the named shared memory; for example, if the name of the named shared memory is cloudlark_123456, then 123456 is the unique identifier.
S303: the runtime library is loaded by the VR/AR application program requested by the client;
S304: the runtime library obtains the unique identifier matched with the VR/AR application program request, and reads the configuration information in the corresponding memory through the unique identifier;
In the present application, the exported functions of the runtime library include not only the export interfaces specified by the OpenXR standard but also an interface for passing in the unique identifier. Before the runtime library finishes loading, that is, before the application program calls the interfaces specified by the OpenXR standard, this added interface is called in the current operating system to pass in the unique identifier.
After the runtime library obtains the unique identifier, it uses the identifier as the name of the named shared memory and reads the contents of that memory. It should be noted that each loaded instance of the runtime library has its own independent block of shared memory for communicating with external processes.
Application example:
Glasses A request 2400 x 1200 resolution at 60 fps for application A, and glasses B request 2800 x 1400 resolution at 72 fps for application B. Based on the request from glasses A, the server opens up a block of memory named app_a whose content is 2400,1200,60. The name app_a is passed, through the export interface of the dynamic library, to the runtime library loaded by application A; after obtaining the name of the shared memory, the runtime library reads the block of memory named app_a, obtains 2400, 1200 and 60, and processes them accordingly.
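The hand-off in this example can be sketched with Python's named shared memory. The name app_a and the comma-separated "width,height,fps" layout follow the example above; the helper functions themselves are illustrative, not the actual server or runtime interface.

```python
# Sketch: the server writes the parsed configuration into a named shared-memory
# block; the runtime later opens the block by name and reads it back.
from multiprocessing import shared_memory

def server_write(name, width, height, fps):
    payload = f"{width},{height},{fps}".encode()
    shm = shared_memory.SharedMemory(name=name, create=True, size=len(payload))
    shm.buf[:len(payload)] = payload
    return shm  # the server keeps a handle so the block stays alive

def runtime_read(name, size):
    shm = shared_memory.SharedMemory(name=name)  # open the existing block by name
    raw = bytes(shm.buf[:size])
    shm.close()
    width, height, fps = (int(x) for x in raw.split(b","))
    return width, height, fps

writer = server_write("app_a", 2400, 1200, 60)
print(runtime_read("app_a", 12))  # (2400, 1200, 60)
writer.close()
writer.unlink()
```

In a real deployment the server and the runtime are separate processes; the shared-memory name (here app_a) is the only piece of information that has to be handed from one to the other.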
S305: after a UDP socket link is established between the runtime library and the client, the client sends gesture data to the runtime library;
In the present application, the client and the runtime library communicate over the UDP protocol. To ensure a one-to-one mapping between the client and the VR/AR application program, that is, between the client and the runtime library loaded by that application program, the runtime library needs to inform the client of the port number bound to its UDP socket. After the UDP component starts, the runtime library sends the UDP port number to the client, and the client sends gesture data to the runtime library at that port.
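The port handshake can be sketched on the loopback interface: the "runtime" binds a UDP socket, learns its port, and the "client" sends a pose packet to that port. The packet layout (a quaternion plus a position packed as seven 32-bit floats) is an assumption made for the example.

```python
# Sketch of the UDP handshake: bind, announce the port, receive gesture data.
import socket
import struct

runtime = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
runtime.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
runtime.settimeout(2.0)
port = runtime.getsockname()[1]      # this is the port number told to the client

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Assumed layout: orientation quaternion (w, x, y, z) + position (x, y, z).
pose = struct.pack("7f", 1.0, 0.0, 0.0, 0.0, 0.5, 1.5, -0.25)
client.sendto(pose, ("127.0.0.1", port))

data, addr = runtime.recvfrom(1024)
print(struct.unpack("7f", data))     # (1.0, 0.0, 0.0, 0.0, 0.5, 1.5, -0.25)
client.close()
runtime.close()
```

Because UDP is connectionless, the per-port binding is what keeps each client paired with exactly one loaded runtime instance.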
S306: the runtime library receives and processes the gesture data, generates a video stream, and sends the video stream to the client.
After receiving the gesture data, the runtime library performs matrix rotation and matrix translation operations on the gesture data, the pupil distance information and the height information to obtain processed gesture data for the left eye and the right eye, and sends the processed gesture data to the VR/AR application program.
To give the rendered picture a stereoscopic effect, the application needs to compute the gesture data of the left eye and the right eye in combination with the pupil distance, whose default value is 0.064 metre and which can be set according to the actual situation.
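A simplified stand-in for the per-eye computation: offset the head position by half the pupil distance along the head's right vector. The real runtime applies full rotation and translation matrices; this sketch handles only a pure yaw rotation about the vertical axis, with an assumed forward-along-negative-Z convention.

```python
# Sketch: derive left/right eye positions from a head pose and the pupil distance.
import math

IPD_DEFAULT = 0.064  # metres, the default value mentioned above

def eye_positions(head_pos, yaw_rad, ipd=IPD_DEFAULT):
    # Right vector of the head for a pure yaw rotation about the Y axis.
    right = (math.cos(yaw_rad), 0.0, -math.sin(yaw_rad))
    half = ipd / 2.0
    left_eye = tuple(p - half * r for p, r in zip(head_pos, right))
    right_eye = tuple(p + half * r for p, r in zip(head_pos, right))
    return left_eye, right_eye

left, right = eye_positions((0.0, 1.6, 0.0), yaw_rad=0.0)
print(left, right)  # (-0.032, 1.6, 0.0) (0.032, 1.6, 0.0)
```

The 1.6 m head height here stands in for the height information carried in the client configuration.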
After the VR/AR application program receives the processed gesture data, it renders the left-eye and right-eye textures accordingly, generates texture information, and sends the texture information to the runtime library, which extracts the left-eye and right-eye textures from it.
In the actual communication process, whether the textures the VR/AR application program provides to the runtime library are separate left-eye and right-eye textures or a single combined texture must be determined from the configuration interface used earlier. If the left and right eyes are separate, the left-eye and right-eye textures are composited before rendering and encoding; if they are already combined, rendering and encoding proceed directly. After rendering and encoding, the encoded textures are packaged into a video stream and sent to the client.
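The branch can be sketched as follows, with nested lists of pixel values standing in for GPU textures (illustrative only; the real implementation operates on rendered textures):

```python
# Sketch: separate left/right eye textures are composited side by side before
# encoding; an already-combined texture is passed through unchanged.
def composite_side_by_side(left, right):
    # Concatenate each row of the left texture with the matching right row.
    return [l + r for l, r in zip(left, right)]

def prepare_for_encoding(textures, combined):
    if combined:
        return textures                 # one texture already holds both eyes
    left, right = textures              # two separate eye textures
    return composite_side_by_side(left, right)

left = [[1, 1], [1, 1]]    # 2x2 left-eye "texture"
right = [[2, 2], [2, 2]]   # 2x2 right-eye "texture"
print(prepare_for_encoding((left, right), combined=False))
# [[1, 1, 2, 2], [1, 1, 2, 2]]
```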
In terms of encoding, the runtime library of the present application can use the NVIDIA hardware encoding interface, which is more efficient and stable than software encoding. During encoding, the video is encoded at the resolution required in the client's configuration information, so that after the client receives and decodes the video it obtains a resolution matched to itself.
By creating the unique identifier, one-to-one binding between a VR/AR application program and a client can be achieved, so that several different VR/AR application programs requested by multiple clients can run on one PC at the same time without confusion. In addition, the runtime library communicates with the client over the UDP protocol, so any client that can establish a UDP communication link with the runtime library can connect to it, with no particular restriction on the client's manufacturer, model, or software and hardware configuration; the runtime library of the present application therefore has good universality and compatibility.
As an optional implementation, after receiving the gesture data, the runtime library further includes:
and placing the gesture data into a buffer queue.
The runtime library creates a buffer queue for gesture data in preparation for receiving the gesture data sent by the client. At the same time, the length of the gesture data queue needs to be monitored, because an over-long queue causes gesture data congestion and can even directly raise the communication delay; when the delay reaches a certain value, for example 80 milliseconds or more, the gesture data queue can be cleared to reduce the delay caused by congestion.
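The queue housekeeping can be sketched as follows. The 80 ms threshold follows the text; the 72 Hz sample interval, and the assumption that each queued sample contributes roughly one frame interval of delay, are invented for the example.

```python
# Sketch: flush the gesture buffer queue when its estimated delay exceeds 80 ms,
# keeping only the newest pose so tracking stays current.
from collections import deque

FRAME_MS = 1000 / 72      # assumed per-sample interval at a 72 Hz client
DELAY_LIMIT_MS = 80       # clearing threshold from the text

class PoseQueue:
    def __init__(self):
        self.queue = deque()

    def push(self, pose):
        self.queue.append(pose)
        if len(self.queue) * FRAME_MS >= DELAY_LIMIT_MS:
            latest = self.queue[-1]
            self.queue.clear()         # drop the congested backlog...
            self.queue.append(latest)  # ...but keep the newest pose

q = PoseQueue()
for i in range(10):
    q.push(i)
print(len(q.queue), q.queue[-1])  # 5 9
```

Keeping the newest sample after a flush means the application never loses the current head pose, only the stale backlog.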
It should be noted that the present application can receive, over wireless network communication, the various information sent by the VR/AR all-in-one headset (i.e., the client) and deliver it to the application program when the application program needs it. This realizes wireless VR/AR interaction and avoids the bulkiness and inconvenience of wired PC VR.
In a third aspect, embodiments of the present application further provide an apparatus, including: a processor, a memory, and a communication unit;
the memory stores machine-readable instructions executable by the processor, and when the apparatus operates, the processor and the memory communicate via the communication unit;
wherein the processor executes the machine-readable instructions to perform the method of the above aspects.
The memory may be used to store the processor's execution instructions and may be implemented by any type of volatile or non-volatile storage device or combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk. When the execution instructions in the memory are executed by the processor, the apparatus performs some or all of the steps in the method embodiments described above.
The processor is the control centre of the apparatus; it connects the various parts of the entire electronic terminal using various interfaces and lines, and executes various functions of the electronic terminal and/or processes data by running or executing the software programs and/or modules stored in the memory and invoking the data stored in the memory. The processor may consist of integrated circuits (ICs), for example a single packaged IC, or several packaged ICs with the same or different functions connected together. For example, the processor may include only a central processing unit (CPU). In embodiments of the present application, the CPU may have a single computing core or may include multiple computing cores.
The communication unit is used for establishing a communication channel so that the storage apparatus can communicate with other terminals, receiving user data sent by other terminals, or sending user data to other terminals.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, performs the method of the above aspects.
The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The present application enables VR/AR application programs that implement the OpenXR standard at the application layer to perform data interaction simultaneously on the same PC, thereby saving system resources and cost. In addition, the runtime library communicates with the client over the UDP protocol, so any client that can establish a UDP communication link with the runtime library can connect to it, with no particular restriction on the client's manufacturer, model, or software and hardware configuration; the runtime library of the present application therefore has good universality and compatibility.
In the embodiments provided herein, it should be understood that the disclosed systems and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division into modules is merely a logical functional division, and there may be other divisions in actual implementation, for example multiple units or components may be combined or integrated into another device, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via interfaces, devices or units, and may be electrical, mechanical or in other forms.
The modules described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A runtime library system based on the OpenXR standard, characterized in that the system is arranged in a PC, the PC further comprises a server, a VR/AR application program and a named shared memory, and the system comprises:
an identifier acquisition module, used for acquiring a unique identifier, wherein the unique identifier is matched with a VR/AR application program request sent by a client, the system is loaded by the VR/AR application program when the VR/AR application program starts, the unique identifier is used for identifying the named shared memory, the configuration information of the client that sent the VR/AR application program request is written in the named shared memory, and the configuration information comprises view angle information, resolution, frame rate, pupil distance information and height information of the client;
the configuration information reading module is used for reading the configuration information in the corresponding named shared memory according to the acquired unique identifier;
the network communication module is used for establishing a UDP socket link with the client, receiving gesture data from the client and sending the generated video stream to the client;
the video stream generating module is used for processing the received gesture data, generating a video stream and sending the video stream to the network communication module;
the video stream generating module specifically comprises:
the data operation module is used for performing matrix rotation and matrix translation operation on the posture data, the pupil distance information and the height information to obtain processed posture data of the left eye and the right eye, and sending the processed posture data to the VR/AR application program;
the texture extraction module is used for extracting the left-eye textures and right-eye textures rendered by the VR/AR application program according to the provided processed gesture data, and sending the extracted left-eye and right-eye textures to the rendering and encoding module;
and the rendering and encoding module is used for re-rendering the received textures, encoding the rendered textures, packaging them into a video stream and submitting the video stream to the network communication module.
2. The OpenXR standard-based runtime library system of claim 1, wherein the system further comprises:
the buffer queue module is used for buffering gesture data received from the client;
the monitoring module is used for monitoring the length of the gesture data buffer queue;
and the refreshing module is used for adjusting, according to the refresh rate of the client, the frequency at which gesture information is provided to the VR/AR application program after the runtime library receives the gesture data.
3. The OpenXR standard-based runtime library system of claim 1, wherein the system further comprises:
the first key set module is used for storing the set of handle key actions that the VR/AR application program is interested in;
the second key set module is used for storing the set of handle key actions that the client is able to send;
and the key action sending module is used for taking the intersection of the key actions stored in the first key set module and the second key set module and providing the intersection to the VR/AR application program.
4. A data interaction method for a runtime library based on the OpenXR standard, characterized in that the runtime library is arranged in a PC, the PC further comprises a server, a VR/AR application program and a named shared memory, and the method comprises the following steps:
the server in the PC receives a VR/AR application program request from a client, wherein the VR/AR application program request comprises configuration information of the client, and the configuration information comprises view angle information, resolution, frame rate, pupil distance information and height information of the client;
the server side analyzes configuration information in the VR/AR application program request, writes the analyzed configuration information into a named shared memory, and creates a unique identifier matched with the VR/AR application program request to identify the named shared memory;
the runtime library is loaded by the VR/AR application program requested by the client;
the runtime library obtains the unique identifier matched with the VR/AR application program request, and reads the configuration information in the corresponding memory through the unique identifier;
after a UDP socket link is established between the runtime library and the client, the client sends gesture data to the runtime library;
and the runtime library receives and processes the gesture data, generates a video stream and sends the video stream to the client.
5. The data interaction method of the runtime library based on the OpenXR standard according to claim 4, wherein the runtime library receiving and processing the gesture data, generating a video stream and sending the video stream to the client specifically comprises:
the runtime library performs matrix rotation and matrix translation operations on the gesture data, the pupil distance information and the height information to obtain processed gesture data for the left eye and the right eye, and provides the processed gesture data to the VR/AR application program;
the VR/AR application program renders the left-eye textures and right-eye textures according to the obtained processed gesture data, generates texture information and sends the texture information to the runtime library;
the runtime library receives the texture information and extracts the textures therein;
and the runtime library re-renders the received textures, encodes the rendered textures, packages them into a video stream and sends the video stream to the client.
6. The data interaction method of the runtime library based on the OpenXR standard according to claim 5, further comprising, after the runtime library receives the gesture data:
and placing the gesture data into a buffer queue.
7. The data interaction method of the runtime library based on the OpenXR standard according to claim 6, wherein the UDP socket link between the client and the runtime library is established by the following specific method:
the runtime library starts a UDP socket service and provides the client with the port number bound to the UDP protocol;
and the client connects to the runtime library according to the port number.
8. A computer device, comprising: a processor, a memory, and a communication unit;
the memory stores machine-readable instructions executable by the processor, and when the device operates, the processor and the memory communicate via the communication unit;
wherein the processor executes the machine readable instructions to perform the method of any of claims 4 to 7.
9. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the method of any of claims 4-7.
CN202010655829.0A 2020-07-09 2020-07-09 Operation library based on OpenXR standard, data interaction method, device and medium Active CN111831353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010655829.0A CN111831353B (en) 2020-07-09 2020-07-09 Operation library based on OpenXR standard, data interaction method, device and medium


Publications (2)

Publication Number Publication Date
CN111831353A CN111831353A (en) 2020-10-27
CN111831353B true CN111831353B (en) 2024-02-20

Family

ID=72900368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010655829.0A Active CN111831353B (en) 2020-07-09 2020-07-09 Operation library based on OpenXR standard, data interaction method, device and medium

Country Status (1)

Country Link
CN (1) CN111831353B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112612456A (en) * 2020-12-25 2021-04-06 深圳市引力创新科技有限公司 Multi-program systematic management framework and management method
CN115209178A (en) * 2021-04-14 2022-10-18 华为技术有限公司 Information processing method, device and system
CN116560858A (en) * 2023-07-07 2023-08-08 北京蔚领时代科技有限公司 VR cloud server container isolation method and system
CN117596377B (en) * 2024-01-18 2024-05-28 腾讯科技(深圳)有限公司 Picture push method, device, electronic equipment, storage medium and program product

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7266616B1 (en) * 2001-08-08 2007-09-04 Pasternak Solutions Llc Method and system for digital rendering over a network
CN107197342A (en) * 2017-06-16 2017-09-22 深圳创维数字技术有限公司 A kind of data processing method, intelligent terminal, VR equipment and storage medium
CN107203434A (en) * 2017-06-22 2017-09-26 武汉斗鱼网络科技有限公司 A kind of texture shared method, device and computer-readable recording medium
CN109814719A (en) * 2018-07-26 2019-05-28 亮风台(上海)信息科技有限公司 A kind of method and apparatus of the display information based on wearing glasses
US10325410B1 (en) * 2016-11-07 2019-06-18 Vulcan Inc. Augmented reality for enhancing sporting events
US10452868B1 (en) * 2019-02-04 2019-10-22 S2 Systems Corporation Web browser remoting using network vector rendering
CN110413386A (en) * 2019-06-27 2019-11-05 深圳市富途网络科技有限公司 Multiprocessing method, apparatus, terminal device and computer readable storage medium
US10497180B1 (en) * 2018-07-03 2019-12-03 Ooo “Ai-Eksp” System and method for display of augmented reality
US10558824B1 (en) * 2019-02-04 2020-02-11 S2 Systems Corporation Application remoting using network vector rendering
CN111030990A (en) * 2019-11-05 2020-04-17 华为技术有限公司 Method for establishing communication connection, client and server
CN111064985A (en) * 2018-10-16 2020-04-24 北京凌宇智控科技有限公司 System, method and device for realizing video streaming
CN111316334A (en) * 2017-11-03 2020-06-19 三星电子株式会社 Apparatus and method for dynamically changing virtual reality environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180336069A1 (en) * 2017-05-17 2018-11-22 Tsunami VR, Inc. Systems and methods for a hardware agnostic virtual experience


Also Published As

Publication number Publication date
CN111831353A (en) 2020-10-27

Similar Documents

Publication Publication Date Title
CN111831353B (en) Operation library based on OpenXR standard, data interaction method, device and medium
US10315109B2 (en) Qualified video delivery methods
CN108563517B (en) Calling method and device of system interface
CN102707986B (en) Shared storage between child partition and father's subregion
CN102301360B (en) The data of peripherals are transmitted selectively to multiple sending computer
CN113542757B (en) Image transmission method and device for cloud application, server and storage medium
US20090044112A1 (en) Animated Digital Assistant
WO2022257699A1 (en) Image picture display method and apparatus, device, storage medium and program product
CN102413150A (en) Server and virtual desktop control method and virtual desktop control system
CN102196033B (en) A kind ofly transmit and receive the long-range method and system presenting data
CN113034629B (en) Image processing method, image processing device, computer equipment and storage medium
CN111984114A (en) Multi-person interaction system based on virtual space and multi-person interaction method thereof
CN106797398B (en) For providing the method and system of virtual desktop serve to client
CN104765636B (en) A kind of synthetic method and device of remote desktop image
CN112023402B (en) Game data processing method, device, equipment and medium
US7075544B2 (en) Apparatus and method of processing image in thin-client environment and apparatus and method of receiving the processed image
CN115065684A (en) Data processing method, device, equipment and medium
CN113778593B (en) Cloud desktop control method and device, electronic equipment, storage medium and program product
WO2023179395A1 (en) Data transmission system and method, service system, device, and storage medium
US20210346799A1 (en) Qualified Video Delivery Methods
CN114398018A (en) Picture display method, device, storage medium and electronic equipment
US6618685B1 (en) Non-invasive testing of smart cards
CN111491210A (en) Data processing method, device and system
CN111013144B (en) Game picture drawing and rendering method and device and mobile terminal
CN108596825A (en) 3D effect display methods and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant