CN112040209B - VR scene projection method and device, projection system and server - Google Patents

Info

Publication number
CN112040209B
Authority
CN
China
Prior art keywords
virtual
picture
patient
image
camera
Prior art date
Legal status
Active
Application number
CN202010958004.6A
Other languages
Chinese (zh)
Other versions
CN112040209A (en)
Inventor
聂镭
黄海
聂颖
Current Assignee
Longma Zhixin Zhuhai Hengqin Technology Co ltd
Original Assignee
Longma Zhixin Zhuhai Hengqin Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Longma Zhixin Zhuhai Hengqin Technology Co ltd
Priority to CN202010958004.6A
Publication of CN112040209A
Application granted
Publication of CN112040209B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/70: ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; multi-view video systems; details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/363: Image reproducers using image projection screens

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Psychiatry (AREA)
  • Epidemiology (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Human Computer Interaction (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application is suitable for the technical field of virtual reality, and provides a VR scene projection method, a projection apparatus, a projection system, a server and a computer-readable storage medium. The method comprises the following steps: acquiring a first virtual picture of a virtual scene shot by a first virtual camera; acquiring a 3D image of a patient; superposing the 3D image onto the position of the first virtual camera in the virtual scene, and acquiring a second virtual picture, shot by a second virtual camera, of the virtual scene with the 3D image superposed; and sending the first virtual picture to virtual reality equipment while sending the second virtual picture to projection equipment. In this way, the virtual scene the patient experiences and the patient's human figure appear in the same picture, so that while a patient wearing a helmet is treated through virtual reality technology, a therapist who does not wear a helmet can view the virtual scene experienced by the patient and the patient's posture in real time, thereby improving the treatment effect.

Description

VR scene projection method and device, projection system and server
Technical Field
The application belongs to the technical field of virtual reality, and particularly relates to a VR scene projection method, a projection apparatus, a projection system, a server and a computer-readable storage medium.
Background
With the development of science and technology, virtual reality can be used to assist a therapist during psychotherapy. For example, in the treatment of psychological conditions such as post-traumatic stress disorder (PTSD), virtual reality technology can simulate a traumatic event related to the patient's traumatic memory, so that the patient wearing the helmet is directly exposed to the traumatic event, regains a sense of safety, reduces cognitive bias and avoidance behavior, and relieves the fear attached to the traumatic memory, thereby achieving the purpose of psychotherapy.
However, when a patient wearing the helmet is directly exposed to a traumatic event, a therapist who does not wear the helmet cannot view the virtual scene experienced by the patient in real time, so that the exposure treatment process cannot be observed, and the expected treatment effect cannot be achieved.
Disclosure of Invention
In view of this, embodiments of the present application provide a projection method and apparatus for a VR scene, a projection system, a server, and a computer-readable storage medium, so as to solve the problem that, in the prior art, a therapist who does not wear a helmet cannot view a virtual scene experienced by a patient in real time during a treatment process of the patient wearing the helmet through a virtual reality technology.
A first aspect of an embodiment of the present application provides a projection method of a VR scene, including:
acquiring a first virtual picture of a virtual scene shot by a first virtual camera, wherein the first virtual camera is a virtual camera for shooting a first visual angle of a patient using virtual reality equipment;
acquiring a 3D image of the patient;
superposing the 3D image to the position of the first virtual camera in the virtual scene, and acquiring a second virtual image of the virtual scene superposed with the 3D image shot by a second virtual camera;
each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle;
sending the first virtual picture to virtual reality equipment, and sending the second virtual picture to projection equipment at the same time;
the first virtual picture is used for instructing the virtual reality device to play the first virtual picture to a patient, the second virtual picture is used for instructing the projection device to project the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to a therapist, each projection device corresponds to one second virtual picture, and each projection device corresponds to one curtain.
In a possible implementation manner of the first aspect, the number of the second virtual cameras is 4, the number of the orientations around the first virtual camera is 4, the number of the second virtual pictures is 4, the number of the projection devices is 4, and the number of the curtains is 4.
In one possible implementation manner of the first aspect, acquiring a 3D image of the patient includes:
acquiring a 3D model of the patient;
obtaining a human skeleton model of the patient;
and synthesizing the 3D model and the human skeleton model to obtain the 3D image.
In one possible implementation manner of the first aspect, acquiring the 3D model of the patient includes:
acquiring a 2D image of the patient;
constructing a 3D model of the patient from the 2D image;
obtaining a human skeletal model of the patient, comprising:
acquiring a 2D image of the patient;
and generating a human skeleton model of the patient according to the 2D image.
In a possible implementation manner of the first aspect, after sending the first virtual picture to a virtual reality device and sending the second virtual picture to a projection device at the same time, the method further includes:
detecting a distance change value of a therapist relative to the curtain;
and adjusting the zooming degree of the picture lens displayed on the curtain according to the distance change value.
In a possible implementation manner of the first aspect, after sending the first virtual picture to a virtual reality device and sending the second virtual picture to a projection device at the same time, the method further includes:
detecting a change value of the moving direction of the therapist relative to the curtain;
and adjusting the rotation amplitude of the picture lens displayed on the curtain according to the movement direction change value.
A second aspect of an embodiment of the present application provides a projection apparatus for a VR scene, including:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a first virtual picture of a virtual scene shot by a first virtual camera, and the first virtual camera is used for shooting a first visual angle of a patient using virtual reality equipment;
a second acquisition module for acquiring a 3D image of the patient;
the superposition module is used for superposing the 3D image to the position of the first virtual camera in the virtual scene and acquiring a second virtual image of the virtual scene superposed with the 3D image shot by a second virtual camera;
each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle;
the sending module is used for sending the first virtual picture to virtual reality equipment and sending the second virtual picture to projection equipment;
the first virtual picture is used for instructing the virtual reality device to play the first virtual picture to a patient, the second virtual picture is used for instructing the projection device to project the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to a therapist, each projection device corresponds to one second virtual picture, and each projection device corresponds to one curtain.
In a possible implementation manner of the second aspect, the number of the second virtual cameras is 4, the number of the orientations around the first virtual camera is 4, the number of the second virtual pictures is 4, the number of the projection devices is 4, and the number of the curtains is 4.
In a possible implementation manner of the second aspect, the second obtaining module includes:
a first acquisition unit for acquiring a 3D model of the patient;
the second acquisition unit is used for acquiring a human skeleton model of the patient;
and the synthesis unit is used for synthesizing the 3D model and the human skeleton model to obtain the 3D image.
In a possible implementation manner of the second aspect, the first obtaining unit includes:
a first acquisition subunit for acquiring a 2D image of the patient;
a construction subunit for constructing a 3D model of the patient from the 2D image;
The second acquisition unit includes:
a second acquisition subunit for acquiring a 2D image of the patient;
and the generating subunit is used for generating the human skeleton model of the patient according to the 2D image.
In a possible implementation manner of the second aspect, the apparatus further includes:
the first detection module is used for detecting a distance change value of a therapist relative to the curtain;
and the first adjusting module is used for adjusting the zooming degree of the picture lens displayed on the curtain according to the distance change value.
In a possible implementation manner of the second aspect, the apparatus further includes:
the second detection module is used for detecting a movement direction change value of the therapist relative to the curtain;
and the second rotation module is used for adjusting the rotation amplitude of the picture lens displayed on the curtain according to the movement direction change value.
A third aspect of an embodiment of the present application provides a server, including: a memory, a processor, an image pick-up device and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to the first aspect as described above when executing the computer program.
A fourth aspect of an embodiment of the present application provides a computer-readable storage medium, including: the computer readable storage medium stores a computer program which, when executed by a processor, performs the steps of the method of the first aspect as described above.
Compared with the prior art, the embodiment of the application has the advantages that:
the embodiment of the application sends the first virtual picture of shooting the virtual scene with the first virtual camera to the virtual reality equipment through the server, simultaneously sends the second virtual picture of shooting the virtual scene with the superimposition patient 3D picture to the projection equipment with the second virtual camera, so that the virtual scene experienced by the patient and the human body portrait of the patient appear on the same picture, in the process of treating the patient wearing the helmet through the virtual reality technology, a therapist who does not wear the helmet can check the virtual scene experienced by the patient and the posture of the patient in real time, and the treatment effect is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic diagram of a projection system according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a VR projection method according to an embodiment of the present application;
fig. 3 is a schematic flowchart illustrating a specific process of step S202 in fig. 2 of a VR projection method according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a VR projection method provided by an embodiment of the application after step S204 in fig. 2;
fig. 5 is another schematic flowchart of a VR projection method provided by an embodiment of the application after step S204 in fig. 2;
fig. 6 is a schematic structural diagram of a projection apparatus for a VR scene according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of how a projection apparatus and a curtain of a projection system provided in an embodiment of the present application are arranged;
FIG. 9 is a schematic diagram of a human skeleton model of a VR projection method according to an embodiment of the present application;
fig. 10 is a schematic view of a virtual scene of a VR scene projection method provided in an embodiment of the present application;
fig. 11 is a schematic diagram illustrating calculation of a distance variation value of a therapist relative to a curtain for a projection method of a VR scene according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, a schematic structural diagram of a projection system 1 provided in the embodiment of the present disclosure includes a server 10, a virtual reality device 20 connected to the server, and a projection device 30 connected to the server. The server may be a backend server or a cloud server, for example a computing device such as a GPU server, and the virtual reality device may be a virtual reality helmet.
The server is used for acquiring a first virtual picture of a virtual scene shot by a first virtual camera; acquiring a 3D image of a patient; superposing the 3D image to the position of a first virtual camera in a virtual scene, and acquiring a second virtual image of the virtual scene superposed with the 3D image shot by a second virtual camera; each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, the first virtual camera is a virtual camera for shooting a first visual angle of a patient using the virtual reality equipment, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle; sending a first virtual picture to the virtual reality equipment, and simultaneously sending a second virtual picture to the projection equipment; the first virtual picture is used for instructing the virtual reality equipment to play the first virtual picture to the patient, the second virtual picture is used for instructing the projection equipment to project the second virtual picture to the curtain corresponding to the projection equipment so as to display the second virtual picture to the therapist, each projection equipment corresponds to one second virtual picture, and each projection equipment corresponds to one curtain.
And the virtual reality equipment is used for playing the first virtual picture to the patient.
The projection device is used for projecting the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to the therapist.
Illustratively, the number of the second virtual cameras is 4, the number of the orientations around the first virtual camera is 4, the number of the second virtual pictures is 4, the number of the projection devices is 4, and the number of the curtains is 4. The arrangement of the projection devices and the curtains can be as shown in fig. 8: each projection device corresponds to one curtain, the patient wearing the virtual reality device stands inside the area surrounded by the curtains, and the therapist stands outside that area. In an alternative embodiment, the projection system may further comprise four depth cameras (not shown), each arranged on one of the curtains, for acquiring images of the patient standing within the area surrounded by the curtains.
In the embodiment of the application, the server sends the first virtual picture of the virtual scene shot by the first virtual camera to the virtual reality device, and sends the second virtual pictures, shot by the second virtual cameras, of the virtual scene with the patient's 3D image superposed to the projection devices, so that the virtual scene experienced by the patient and the patient's human figure appear in the same picture. In the process of treating a patient wearing a helmet through virtual reality technology, a therapist not wearing a helmet can check the virtual scene experienced by the patient and the patient's posture in real time, and the treatment effect is improved.
Referring to fig. 2, a schematic flow chart of a projection method of a VR scene provided in an embodiment of the present application is applied to the server in the above embodiment, where the method includes the following steps:
step S201, acquiring a first virtual picture of a virtual scene shot by a first virtual camera.
Wherein the first virtual camera is a virtual camera that shoots the first visual angle of a patient using the virtual reality device; the first virtual picture is a 3D picture.
Step S202, acquiring a 3D image of the patient.
Exemplarily, referring to fig. 3, a detailed flowchart of step S202 in fig. 2 of a VR scene projection method provided in an embodiment of the present application is shown, and acquiring a 3D image of a patient includes:
step S301, acquiring a 3D model of a patient.
Specifically, the server obtains a 2D image of the patient and constructs a 3D model of the patient from the 2D image.
For example, the server acquires 2D images of the patient through the four depth cameras arranged on the curtains, then performs depth measurement, update reconstruction, surface prediction and pose prediction on the 2D images through the KinectFusion algorithm to obtain a 3D model, which can be stored in a common format such as FBX or OBJ.
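As a purely illustrative sketch of this kind of depth-fusion pipeline (the patent names KinectFusion but specifies no implementation), the following fragment uses Open3D's TSDF volume integration; the library choice, file paths, intrinsics and identity extrinsics are all assumptions for demonstration:

```python
# Hedged sketch of a KinectFusion-style fusion step using Open3D's TSDF
# volume. Library, file names, intrinsics and poses are illustrative
# assumptions, not the patented implementation.
import numpy as np
import open3d as o3d

intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

volume = o3d.pipelines.integration.ScalableTSDFVolume(
    voxel_length=4.0 / 512.0,   # ~8 mm voxels
    sdf_trunc=0.04,             # truncation distance in metres
    color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8)

for i in range(4):  # one RGB-D frame per depth camera on the curtains
    color = o3d.io.read_image(f"color_{i}.png")   # hypothetical file names
    depth = o3d.io.read_image(f"depth_{i}.png")
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_trunc=4.0, convert_rgb_to_intensity=False)
    extrinsic = np.eye(4)       # placeholder pose; calibration would supply it
    volume.integrate(rgbd, intrinsic, extrinsic)

mesh = volume.extract_triangle_mesh()  # the patient's reconstructed 3D model
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("patient_model.obj", mesh)  # e.g. OBJ storage
```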
Step S302, obtaining a human skeleton model of the patient.
Specifically, the server obtains a 2D image of the patient and generates a human skeleton model of the patient according to the 2D image.
For example, after a 2D image is acquired by a depth camera, basic background denoising is first performed on the 2D image; then 16 human bone points are selected through a posture recognition algorithm, such as the OpenPose recognition algorithm, to form a human skeleton model. The position of each joint point is represented in the depth camera's frame by a three-dimensional vector Pi = (xi, yi, zi)^T, where xi and yi give the position on the color image and zi gives the distance between the joint point and the sensor. As shown in fig. 9, among the 16 skeleton points, P1, P2, ... and P7 are respectively the right shoulder, neck, left shoulder, spine, hip center, right hip and left hip of the human body, which together form the trunk; P8 is the head; and P9, P10, ... and P16 represent the four limbs, including the left and right elbow joints, wrist joints, knee joints and ankle joints. The human skeleton model of the patient is thereby obtained.
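To make the Pi = (xi, yi, zi)^T representation concrete, here is a minimal sketch; the joint names, their ordering and the inputs are illustrative assumptions, and in practice the 2D keypoints would come from a pose estimator such as OpenPose:

```python
# Minimal sketch: lifting 16 detected 2D joints to the P_i = (x_i, y_i, z_i)^T
# form using a registered depth map. Joint names/order and the inputs are
# illustrative assumptions.
import numpy as np

JOINTS = ["r_shoulder", "neck", "l_shoulder", "spine", "hip_center",
          "r_hip", "l_hip", "head", "r_elbow", "l_elbow", "r_wrist",
          "l_wrist", "r_knee", "l_knee", "r_ankle", "l_ankle"]

def build_skeleton(keypoints_2d: np.ndarray, depth_map: np.ndarray) -> dict:
    """keypoints_2d: (16, 2) pixel coordinates; depth_map: HxW in metres."""
    skeleton = {}
    for name, (x, y) in zip(JOINTS, keypoints_2d):
        z = float(depth_map[int(y), int(x)])   # sensor-to-joint distance z_i
        skeleton[name] = np.array([x, y, z])   # P_i = (x_i, y_i, z_i)^T
    return skeleton

# Toy usage with random data standing in for a real detection.
kp = np.random.rand(16, 2) * [640, 480]       # hypothetical 640x480 image
depth = np.full((480, 640), 2.0)              # flat 2 m depth plane
print(build_skeleton(kp, depth)["neck"])
```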
Step S303, synthesizing the 3D model and the human skeleton model to obtain a 3D image.
For example, after the basic human skeleton model is acquired, each joint point is restored in the 3D environment through a posture recognition algorithm such as the OpenPose recognition algorithm; the 3D model is then bound to the skeleton model through a model tool built into the server, and the coordinate points of the skeleton model are stored, thereby restoring the 3D image.
Step S203, superposing the 3D image onto the position of the first virtual camera in the virtual scene, and acquiring a second virtual picture, shot by the second virtual camera, of the virtual scene with the 3D image superposed.
Wherein the second virtual picture is a 2D picture; each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, the first virtual camera is a virtual camera for shooting a first visual angle of a patient using the virtual reality equipment, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle.
Exemplarily, fig. 10 is a virtual scene schematic diagram of the VR scene projection method provided in the embodiment of the present application. It can be seen that in this embodiment there are 1 first virtual camera and 4 second virtual cameras, each at one orientation around the first virtual camera. The first virtual camera shoots at the patient's first visual angle to obtain the first virtual picture, i.e., a 3D picture, and each second virtual camera shoots at a second visual angle following the first visual angle to obtain a second virtual picture, i.e., a 2D picture.
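A minimal, engine-agnostic sketch of this camera rig follows; the 2-metre radius, the axis convention and the follow rule are assumptions for illustration, since the patent does not fix them:

```python
# Hedged sketch of the camera rig: one first-person camera plus four follower
# cameras at the four surrounding orientations. Radius and axes are assumed.
import numpy as np

def follower_positions(first_cam_pos: np.ndarray, radius: float = 2.0) -> dict:
    """Return the four second-camera positions around the first camera."""
    offsets = {
        "front": np.array([0.0, 0.0,  radius]),
        "right": np.array([radius, 0.0, 0.0]),
        "back":  np.array([0.0, 0.0, -radius]),
        "left":  np.array([-radius, 0.0, 0.0]),
    }
    # Each second camera follows the first camera's position and aims at it,
    # so its 2D picture always contains the superposed patient image.
    return {k: first_cam_pos + v for k, v in offsets.items()}

head_pose = np.array([0.0, 1.7, 0.0])   # first camera at the patient's eyes
for name, pos in follower_positions(head_pose).items():
    print(name, pos)
```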
Step S204, the first virtual picture is sent to the virtual reality device, and meanwhile, the second virtual picture is sent to the projection device.
The first virtual picture is used for instructing the virtual reality equipment to play the first virtual picture to the patient, the second virtual picture is used for instructing the projection equipment to project the second virtual picture to the curtain corresponding to the projection equipment so as to display the second virtual picture to the therapist, each projection equipment corresponds to one second virtual picture, and each projection equipment corresponds to one curtain.
In a possible implementation manner, as shown in fig. 4, which is a flowchart following step S204 in fig. 2 provided in this embodiment of the application, after sending the first virtual picture to the virtual reality device while sending the second virtual picture to the projection device, the method further includes:
step S401, detecting the distance change value of the therapist relative to the curtain.
In a specific application, a binocular camera is arranged outside the curtain to calculate the distance change value of the therapist relative to the curtain. First, the binocular camera is calibrated to obtain the intrinsic and extrinsic parameters and the homography matrix of its two cameras. After these parameters are obtained, the images of the therapist shot by the two cameras are acquired and compared according to the camera parameters: the background around the therapist is removed from each image through the GrabCut algorithm, and the images are horizontally rectified. Pixel matching is then performed between the two images of the therapist, and the binocular camera calculates the therapist's position as follows:
as shown in fig. 11, P is a certain point on the object to be measured, i.e. a point on the therapist, OR and OT are optical centers of the two cameras, respectively, imaging points of the point P on photoreceptors of the two cameras are P and P '(an imaging plane of the camera is placed in front of a lens after being rotated), f is a focal length of the camera, B is a center distance between the two cameras, Z is depth information that we want to obtain, and if a distance from the point P to the point P' is dis:
dis = B - (XR - XT)
according to the similar triangle principle:
(B - (XR - XT)) / B = (Z - f) / Z
the following can be obtained:
Z = f × B / (XR - XT) = f × B / d
In the above formulas, the focal length f and the camera center distance B are obtained by calibration, so the distance change value of the therapist relative to the curtain can be obtained once the disparity d = XR - XT is known.
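A hedged sketch of this depth-from-disparity step is shown below; the stereo matcher, its parameters and the calibration values are illustrative assumptions, with only the Z = f × B / d relation taken from the text:

```python
# Sketch of the binocular depth step: disparity from a rectified stereo pair,
# then Z = f*B/d. Matcher parameters and image paths are assumptions.
import cv2
import numpy as np

left = cv2.imread("therapist_left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical
right = cv2.imread("therapist_right.png", cv2.IMREAD_GRAYSCALE)  # hypothetical

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # sub-pixel

f = 700.0   # focal length in pixels, from calibration (assumed value)
B = 0.12    # camera center distance in metres, from calibration (assumed)
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]   # Z = f * B / d

print("median therapist distance (m):", np.median(depth[valid]))
```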
Step S402, adjusting the zooming degree of the picture shot displayed on the curtain according to the distance change value.
By calculating the distance change value of the therapist relative to the curtain, the virtual camera can be finely adjusted in the virtual environment, which enhances the sense of reality. For example, when the therapist walks toward the curtain, the second virtual camera also moves forward, and the zoom level of the picture shot displayed on the curtain changes accordingly.
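As a minimal sketch of such a zoom rule (the gain, the field-of-view bounds and the FOV-based formulation are assumptions; a real engine would drive its own camera object):

```python
# Hedged sketch: map the therapist's distance change to the second virtual
# camera's field of view. Gain and clamping bounds are assumed values.
def adjust_fov(current_fov_deg: float, distance_change_m: float,
               gain: float = 5.0) -> float:
    """Walking toward the curtain (negative change) narrows the FOV (zooms in)."""
    new_fov = current_fov_deg + gain * distance_change_m
    return max(30.0, min(90.0, new_fov))   # clamp to a sane range

print(adjust_fov(60.0, -0.5))   # therapist stepped 0.5 m closer -> 57.5 deg
```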
It can be understood that, on the basis that the virtual scene experienced by the patient and the posture of the patient are displayed to the therapist for viewing, the zooming degree of the screen shot displayed on the curtain can be changed according to the distance between the therapist and the curtain, so that the viewing experience of the therapist is further improved.
In a possible implementation manner, as shown in fig. 5, which is another flowchart following step S204 in fig. 2 provided in this embodiment of the application, after sending the first virtual picture to the virtual reality device while sending the second virtual picture to the projection device, the method further includes:
step S501 detects a change value of the movement direction of the therapist with respect to the curtain.
Specifically, the change value of the therapist's movement direction relative to the curtain can be recognized using an optical flow method. The optical flow method uses the temporal variation of pixels in an image sequence and the correlation between adjacent frames to find correspondences between the previous frame and the current frame, thereby calculating the motion information of objects between adjacent frames.
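A hedged sketch of this step using OpenCV's dense Farneback optical flow follows; the frame sources, the flow parameters and the decision threshold are illustrative assumptions:

```python
# Sketch: dense Farneback flow between two frames of the therapist, reduced to
# a dominant horizontal movement direction. Parameters are assumed values.
import cv2
import numpy as np

prev = cv2.imread("frame_prev.png", cv2.IMREAD_GRAYSCALE)   # hypothetical
curr = cv2.imread("frame_curr.png", cv2.IMREAD_GRAYSCALE)   # hypothetical

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)
mean_dx = float(np.mean(flow[..., 0]))   # mean horizontal motion in pixels

if abs(mean_dx) < 0.5:                   # assumed noise threshold
    direction = "static"
else:
    direction = "right" if mean_dx > 0 else "left"
# The detected direction would then drive the rotation amplitude in step S502.
print(direction, mean_dx)
```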
Step S502, adjusting the rotation amplitude of the picture shot displayed on the curtain according to the change value of the movement direction.
Specifically, according to the movement direction change value, the second virtual camera is adjusted left or right, thereby adjusting the rotation amplitude of the picture shot displayed on the curtain.
It can be understood that, in the embodiment of the application, on the basis that the virtual scene experienced by the patient and the posture of the patient are displayed to the therapist for viewing, the rotation amplitude of the screen shot displayed on the curtain can be adjusted according to the change value of the movement direction of the therapist relative to the curtain, so that the viewing experience of the therapist is further improved.
In summary, the server sends the first virtual picture of the virtual scene, shot by the first virtual camera, to the virtual reality equipment, and simultaneously sends the second virtual picture of the virtual scene with the patient's 3D image superposed, shot by the second virtual camera, to the projection equipment, so that the virtual scene experienced by the patient and the patient's human figure appear in the same picture. While a patient wearing a helmet is treated through virtual reality technology, a therapist who does not wear a helmet can check the virtual scene experienced by the patient and the patient's posture in real time, and the treatment effect is improved.
EXAMPLE five
A projection apparatus for a VR scene according to an embodiment of the present application will be described below. The projection apparatus of the VR scene of this embodiment corresponds to the projection method of the VR scene.
Fig. 6 is a schematic structural diagram of a projection apparatus for a VR scene according to an embodiment of the present disclosure, where the apparatus may be specifically integrated in a server, and the apparatus may include:
a first obtaining module 61, configured to obtain a first virtual picture of a virtual scene captured by a first virtual camera, where the first virtual camera is a virtual camera that captures a first perspective of a patient using a virtual reality device;
a second acquisition module 62 for acquiring a 3D image of the patient;
the superposition module 63, configured to superpose the 3D image onto the position of the first virtual camera in the virtual scene, and acquire a second virtual picture, shot by a second virtual camera, of the virtual scene on which the 3D image is superposed; each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle;
a sending module 64, configured to send the first virtual image to a virtual reality device, and send the second virtual image to a projection device at the same time;
the first virtual picture is used for instructing the virtual reality device to play the first virtual picture to a patient, the second virtual picture is used for instructing the projection device to project the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to a therapist, each projection device corresponds to one second virtual picture, and each projection device corresponds to one curtain.
In a possible implementation manner, the number of the second virtual cameras is 4, the number of the orientations around the first virtual camera is 4, the number of the second virtual pictures is 4, the number of the projection devices is 4, and the number of the curtains is 4.
In a possible implementation manner, the second obtaining module includes:
a first acquisition unit for acquiring a 3D model of the patient;
the second acquisition unit is used for acquiring a human skeleton model of the patient;
and the synthesis unit is used for synthesizing the 3D model and the human skeleton model to obtain the 3D image.
In one possible implementation manner, the first obtaining unit includes:
a first acquisition subunit for acquiring a 2D image of the patient;
a construction subunit for constructing a 3D model of the patient from the 2D image;
The second acquisition unit includes:
a second acquisition subunit for acquiring a 2D image of the patient;
and the generating subunit is used for generating the human skeleton model of the patient according to the 2D image.
In one possible implementation, the apparatus further includes:
the first detection module is used for detecting a distance change value of a therapist relative to the curtain;
and the first adjusting module is used for adjusting the zooming degree of the picture lens displayed on the curtain according to the distance change value.
In one possible implementation, the apparatus further includes:
the second detection module is used for detecting a movement direction change value of the therapist relative to the curtain;
and the second rotation module is used for adjusting the rotation amplitude of the picture lens displayed on the curtain according to the movement direction change value.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Fig. 7 is a schematic structural diagram of a server according to an embodiment of the present application. As shown in fig. 7, the server 7 of this embodiment includes: at least one processor 70, a memory 71 and a computer program 72 stored in the memory 71 and executable on the at least one processor 70, the processor 70 implementing the steps in any of the above described embodiments of the method of projecting a VR scene when executing the computer program 72.
The server 7 may be a computing device such as a cloud server. The server may include, but is not limited to, a processor 70 and a memory 71. Those skilled in the art will appreciate that fig. 7 is merely an example of the server 7 and does not constitute a limitation of the server 7, which may include more or fewer components than shown, a combination of certain components, or different components, such as input/output devices, network access devices, etc.
The processor 70 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 71 may in some embodiments be an internal storage unit of the server 7, such as a hard disk or memory of the server 7.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method of projecting a VR scene, the method comprising:
acquiring a first virtual picture of a virtual scene shot by a first virtual camera, wherein the first virtual camera is a virtual camera for shooting a first visual angle of a patient using virtual reality equipment;
acquiring a 3D image of the patient;
superposing the 3D image to the position of the first virtual camera in the virtual scene, and acquiring a second virtual image of the virtual scene superposed with the 3D image shot by a second virtual camera;
each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle;
sending the first virtual picture to virtual reality equipment, and sending the second virtual picture to projection equipment at the same time;
the first virtual picture is used for instructing the virtual reality device to play the first virtual picture to a patient, the second virtual picture is used for instructing the projection device to project the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to a therapist, each projection device corresponds to one second virtual picture, and each projection device corresponds to one curtain.
2. The method of claim 1, wherein the number of the second virtual cameras is 4, the number of the orientations around the first virtual camera is 4, the number of the second virtual pictures is 4, the number of the projection devices is 4, and the number of the curtains is 4.
3. The method of projecting a VR scene of claim 1, wherein acquiring the 3D image of the patient comprises:
acquiring a 3D model of the patient;
obtaining a human skeleton model of the patient;
and synthesizing the 3D model and the human skeleton model to obtain the 3D image.
4. The method of projecting a VR scene of claim 3, wherein obtaining the 3D model of the patient comprises:
acquiring a 2D image of the patient;
constructing a 3D model of the patient from the 2D image;
obtaining a human skeletal model of the patient, comprising:
acquiring a 2D image of the patient;
and generating a human skeleton model of the patient according to the 2D image.
5. The method of any of claims 1-4, wherein after sending the first virtual picture to a virtual reality device while sending the second virtual picture to a projection device, the method further comprises:
detecting a distance change value of a therapist relative to the curtain;
and adjusting the zooming degree of the picture lens displayed on the curtain according to the distance change value.
6. The method of any of claims 1-4, wherein after sending the first virtual picture to a virtual reality device while sending the second virtual picture to a projection device, the method further comprises:
detecting a change value of the moving direction of the therapist relative to the curtain;
and adjusting the rotation amplitude of the picture lens displayed on the curtain according to the movement direction change value.
7. An apparatus for projecting a VR scene, the apparatus comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a first virtual picture of a virtual scene shot by a first virtual camera, and the first virtual camera is used for shooting a first visual angle of a patient using virtual reality equipment;
a second acquisition module for acquiring a 3D image of the patient;
the superposition module is used for superposing the 3D image to the position of the first virtual camera in the virtual scene and acquiring a second virtual image of the virtual scene superposed with the 3D image shot by a second virtual camera;
each second virtual camera is arranged at a position around the first virtual camera corresponding to the second virtual camera, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle;
the sending module is used for sending the first virtual picture to virtual reality equipment and sending the second virtual picture to projection equipment;
the first virtual picture is used for instructing the virtual reality device to play the first virtual picture to a patient, the second virtual picture is used for instructing the projection device to project the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to a therapist, each projection device corresponds to one second virtual picture, and each projection device corresponds to one curtain.
8. A projection system for a VR scene, comprising:
the server is used for acquiring a first virtual picture of a virtual scene shot by a first virtual camera; acquiring a 3D image of a patient;
superposing the 3D image to the position of the first virtual camera in the virtual scene, and acquiring a second virtual image of the virtual scene superposed with the 3D image shot by a second virtual camera; each second virtual camera is arranged at an orientation corresponding to the second virtual camera around the first virtual camera, the first virtual camera is a virtual camera for shooting a first visual angle of a patient using the virtual reality equipment, and the second virtual camera is a virtual camera for shooting a second visual angle following the first visual angle; sending the first virtual picture to virtual reality equipment, and sending the second virtual picture to projection equipment at the same time; the first virtual picture is used for instructing the virtual reality device to play the first virtual picture to a patient, the second virtual picture is used for instructing the projection device to project the second virtual picture to a curtain corresponding to the projection device so as to display the second virtual picture to a therapist, each projection device corresponds to one second virtual picture, and each projection device corresponds to one curtain;
the virtual reality equipment is used for playing the first virtual picture to the patient;
and the projection equipment is used for projecting the second virtual picture to a curtain corresponding to the projection equipment so as to display the second virtual picture to a therapist.
9. A server comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202010958004.6A 2020-09-14 2020-09-14 VR scene projection method and device, projection system and server Active CN112040209B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010958004.6A CN112040209B (en) 2020-09-14 2020-09-14 VR scene projection method and device, projection system and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010958004.6A CN112040209B (en) 2020-09-14 2020-09-14 VR scene projection method and device, projection system and server

Publications (2)

Publication Number Publication Date
CN112040209A CN112040209A (en) 2020-12-04
CN112040209B 2021-09-03

Family

ID=73589023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010958004.6A Active CN112040209B (en) 2020-09-14 2020-09-14 VR scene projection method and device, projection system and server

Country Status (1)

Country Link
CN (1) CN112040209B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112365956A (en) * 2020-12-13 2021-02-12 龙马智芯(珠海横琴)科技有限公司 Psychological treatment method, psychological treatment device, psychological treatment server and psychological treatment storage medium based on virtual reality

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105391988A (en) * 2015-12-11 2016-03-09 谭圆圆 Multi-view unmanned aerial vehicle and multi-view display method thereof
CN106412424A (en) * 2016-09-20 2017-02-15 乐视控股(北京)有限公司 View adjusting method and device for panoramic video
CN107396077A (en) * 2017-08-23 2017-11-24 深圳看到科技有限公司 Virtual reality panoramic video stream projecting method and equipment
CN107479699A (en) * 2017-07-28 2017-12-15 深圳市瑞立视多媒体科技有限公司 Virtual reality exchange method, apparatus and system
CN109985385A (en) * 2018-12-06 2019-07-09 派视觉虚拟现实(深圳)软件技术有限公司 The control method and device of a kind of game station and its game role
CN111429348A (en) * 2020-03-20 2020-07-17 中国铁建重工集团股份有限公司 Image generation method, device and system and readable storage medium
CN111586304A (en) * 2020-05-25 2020-08-25 重庆忽米网络科技有限公司 Panoramic camera system and method based on 5G and VR technology

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10038887B2 (en) * 2015-05-27 2018-07-31 Google Llc Capture and render of panoramic virtual reality content

Also Published As

Publication number Publication date
CN112040209A (en) 2020-12-04

Similar Documents

Publication Publication Date Title
CN110599540B (en) Real-time three-dimensional human body shape and posture reconstruction method and device under multi-viewpoint camera
JP6632443B2 (en) Information processing apparatus, information processing system, and information processing method
CN106791784B (en) A kind of the augmented reality display methods and device of actual situation coincidence
JP6515813B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
KR101323966B1 (en) A system and method for 3D space-dimension based image processing
CN110913751B (en) Wearable eye tracking system with slip detection and correction functions
US8897502B2 (en) Calibration for stereoscopic capture system
CN111007939B (en) Virtual reality system space positioning method based on depth perception
CN108830905A (en) The binocular calibration localization method and virtual emulation of simulating medical instrument cure teaching system
CN109448105B (en) Three-dimensional human body skeleton generation method and system based on multi-depth image sensor
CN106843507A (en) A kind of method and system of virtual reality multi-person interactive
KR20150120066A (en) System for distortion correction and calibration using pattern projection, and method using the same
US9558719B2 (en) Information processing apparatus
CN107071388A (en) A kind of three-dimensional augmented reality display methods and device
KR20220092998A (en) Co-located pose estimation in a shared artificial reality environment
JP2020052979A (en) Information processing device and program
US20230024396A1 (en) A method for capturing and displaying a video stream
CN112040209B (en) VR scene projection method and device, projection system and server
CN107659772B (en) 3D image generation method and device and electronic equipment
CN113902845A (en) Motion video generation method and device, electronic equipment and readable storage medium
JP2015201734A (en) Image processing system, control method of the same, and program
JP2005252482A (en) Image generating apparatus and three-dimensional distance information acquisition apparatus
Rafighi et al. Automatic and adaptable registration of live RGBD video streams
Kim et al. AR timewarping: A temporal synchronization framework for real-Time sensor fusion in head-mounted displays
CN109644259A (en) 3-dimensional image preprocess method, device and wear display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 519031 office 1316, No. 1, lianao Road, Hengqin new area, Zhuhai, Guangdong

Patentee after: LONGMA ZHIXIN (ZHUHAI HENGQIN) TECHNOLOGY Co.,Ltd.

Address before: Room 417.418.419, building 20, creative Valley, 1889 Huandao East Road, Hengqin New District, Zhuhai City, Guangdong Province

Patentee before: LONGMA ZHIXIN (ZHUHAI HENGQIN) TECHNOLOGY Co.,Ltd.