CN113516031A - VR teaching system and multimedia classroom - Google Patents


Info

Publication number
CN113516031A
Authority
CN
China
Prior art keywords
student
audio
teacher
image
course
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110473070.9A
Other languages
Chinese (zh)
Other versions
CN113516031B (en)
Inventor
王亚刚
李元元
程思锦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Feidie Virtual Reality Technology Co ltd
Original Assignee
Shenzhen Feidie Virtual Reality Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Feidie Virtual Reality Technology Co., Ltd.
Priority to CN202110473070.9A
Publication of CN113516031A
Application granted
Publication of CN113516031B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065 Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/08 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/14 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations with provision for individual teacher-student communication


Abstract

The invention provides a VR teaching system and a multimedia classroom. The system comprises: a teacher terminal for receiving courses recorded or compiled by a teacher; a server in communication connection with the teacher terminal, for receiving the courses uploaded by the teacher terminal; and a student terminal in communication connection with the server, for receiving and playing the courses from the server and for receiving the students' interactions while a course is played. The student terminal includes a VR handle for receiving the student's interactions during course playback. The VR teaching system enables users to actively participate in VR scenes so as to achieve a better learning effect.

Description

VR teaching system and multimedia classroom
Technical Field
The invention relates to the technical field of teaching systems, in particular to a VR teaching system and a multimedia classroom.
Background
Virtual reality (VR) technology, also known as immersive technology, is a practical technology developed in the 20th century. It draws on computing, electronic information and simulation; its basic implementation is a computer-simulated virtual environment that gives the user a sense of immersion in that environment. With the continuous development of social productivity and of science and technology, demand for VR technology is growing across many industries. VR technology has made great progress and is gradually becoming a new field of science and technology.
The core of education informatization is the informatization of teaching. Teaching is the central work of the education field; informatizing it means applying science and technology to teaching methods, informatizing the dissemination of education, and modernizing the teaching model. Education informatization requires the comprehensive application, throughout the education process, of modern information technology based on computers, multimedia, big data, artificial intelligence and network communication, in order to promote education reform and adapt the education system to the new requirements of the information society; it is of great significance for deepening education reform and implementing quality-oriented education. Applying VR technology to education informatization is therefore important, yet existing VR technology can only record and play back courses and cannot let users actively participate in VR scenes.
Disclosure of Invention
One of the objectives of the present invention is to provide a VR teaching system that enables users to actively participate in VR scenes so as to achieve a better learning effect.
An embodiment of the present invention provides a VR teaching system, comprising:
a teacher terminal for receiving courses recorded or compiled by a teacher;
a server in communication connection with the teacher terminal, for receiving the courses uploaded by the teacher terminal;
a student terminal in communication connection with the server, for receiving and playing the courses from the server and for receiving the students' interactions while a course is played;
wherein the student terminal comprises:
a VR handle for receiving the student's interactions while the course is played.
Preferably, the teacher terminal comprises:
an audio acquisition device for collecting the teacher's audio information;
an image acquisition device for collecting the teacher's image information;
a first control device electrically connected with the audio acquisition device and the image acquisition device, the first control device being configured to acquire the audio information and image information collected by those devices and to receive the teacher's editing operations on that information so as to generate a course;
the first control device comprising:
a first communication module in communication connection with the server;
a display module for displaying the audio information and image information;
a human body input module for receiving the teacher's editing operations;
and a processing module electrically connected with the first communication module, the display module, the human body input module, the audio acquisition device and the image acquisition device.
Preferably, the image acquisition device comprises:
at least four cameras for shooting the teacher recording the course from the teacher's front, rear, left and right respectively.
Preferably, the audio acquisition device comprises:
a plurality of microphones arranged in an array around the teacher while the teacher records the course.
Preferably, the student terminal comprises:
a wearable VR device electrically connected with the VR handle, for playing the courses sent by the server;
a first movement sensing module arranged in the wearable VR device, for sensing the student's movement;
wherein the server performs the following operations:
constructing a virtual course space based on the image information collected by the image acquisition device and the audio information collected by the audio acquisition device;
storing the virtual course space in correspondence with the course;
calling the corresponding virtual course space upon learning that the student terminal is playing the course;
mapping the student to the initial point of the virtual course space, and playing the image within the student's line of sight in the virtual course space;
sensing the student's movement through the first movement sensing module;
determining and playing the image within the student's line of sight in real time based on the student's movement;
and determining the student's position in the virtual course space based on the movement and the initial point, and playing the audio information collected by the audio acquisition device corresponding to that position.
Preferably, determining and playing the image within the student's line of sight in real time based on the student's movement comprises the following steps:
acquiring the central shooting vector of each camera used to construct the virtual space, wherein a camera's central shooting vector starts at the center point of the camera lens and points outward along the shooting direction, perpendicular to the lens;
acquiring the student's central view vector, which starts at the student's pupil and points outward;
determining a position vector based on the student's position and the teacher's position in the virtual course space;
when the angle between the position vector and the central view vector is smaller than a preset first threshold, calculating the angle between the central view vector and the central shooting vector of each camera according to:

$$\theta_i=\arccos\frac{\sum_{j=1}^{n}x_j\,y_{ij}}{\sqrt{\sum_{j=1}^{n}x_j^{2}}\,\sqrt{\sum_{j=1}^{n}y_{ij}^{2}}}$$

where θ_i is the angle between the central view vector and the i-th central shooting vector, x_j is the j-th component of the central view vector, y_{ij} is the j-th component of the i-th central shooting vector, and n is the number of data dimensions;
comparing the angles, selecting the image shot by the camera with the smallest angle as the image within the student's line of sight, and playing it;
determining the student's position in the virtual course space based on the movement and the initial point, and playing the audio information collected by the audio acquisition device corresponding to that position, comprises:
calculating the distance between the student's position and the set position of each audio acquisition device according to:

$$L_k=\sqrt{(A_{1k}-B_1)^2+(A_{2k}-B_2)^2+(A_{3k}-B_3)^2}$$

where L_k is the distance between the student's position and the set position of the k-th audio acquisition device; A_{1k}, A_{2k} and A_{3k} are the X-, Y- and Z-axis coordinates of the set position of the k-th audio acquisition device; and B_1, B_2 and B_3 are the X-, Y- and Z-axis coordinates of the student's position;
comparing the distances between the student's position and the set position of each audio acquisition device, and taking the audio information of the nearest device as the basic audio;
determining a second distance between the student's position and the teacher's position in the virtual course space, and a third distance between the nearest audio acquisition device and the teacher's position in the virtual course space;
querying a preset correlation-coefficient table based on the first distance (the shortest of the distances L_k), the second distance and the third distance, to determine the relation coefficient between the finally played audio and the basic audio;
and determining the finally played audio based on the relation coefficient and the basic audio.
Preferably, the server performs the following operations:
upon receiving, through the VR handle, a first action of the student indicating note recording, taking as the recording start point the playing point that lies a preset time before the first position (the current playing position) of the course;
upon receiving, through the VR handle, a second action of the student indicating the end of the note, taking the second position of the currently playing course as the recording end point;
converting the audio data between the recording start point and the recording end point into text data, and outputting the text data to the student when the course ends;
receiving the student's editing operations on the text data to form a course note;
wherein the first action comprises pressing a key of the VR handle, and the second action comprises releasing the pressed key.
Preferably, the VR handle comprises:
a housing, which is cylindrical and has anti-slip texture on its periphery;
a key arranged on one end face of the housing;
a pressure sensor array arranged on the periphery of the housing;
a second movement sensing module arranged in the housing, for sensing the movement of the VR handle;
a processor arranged in the housing and electrically connected with the key, the second movement sensing module and the pressure sensor array respectively;
and a second communication module electrically connected with the processor and in communication connection with the server;
wherein the processor detects the student's grip strength through the pressure sensor array, and when the grip strength is greater than a preset first pressure threshold, the hand of the student's avatar closes to grab an object in the virtual scene;
and the processor senses the movement of the VR handle through the second movement sensing module, so that the student's avatar changes synchronously and its hand moves accordingly.
The invention also provides a multimedia classroom for use with any of the above VR teaching systems, comprising:
a plurality of VR handle connection interfaces, arranged on the student seats respectively, for connecting VR handles.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic diagram of a VR teaching system in an embodiment of the invention;
FIG. 2 is a diagram of a teacher terminal according to an embodiment of the present invention;
fig. 3 is a schematic view of a VR handle in an embodiment of the invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
An embodiment of the present invention provides a VR teaching system, as shown in fig. 1, including:
a teacher terminal 1 for receiving courses recorded or compiled by a teacher;
a server 2 in communication connection with the teacher terminal 1, for receiving the courses uploaded by the teacher terminal 1;
a student terminal 3 in communication connection with the server 2, for receiving and playing the courses from the server 2 and for receiving the students' interactions while a course is played;
wherein the student terminal 3 comprises:
at least one VR handle 31 for receiving the student's interactions while the course is played.
The working principle and the beneficial effects of the technical scheme are as follows:
the teacher terminal 1 is used by a teacher, and the teacher can use the teacher terminal 1 to compile or record courses; the course recording mainly records the course of the teacher by adopting the video recording function; the course preparation is complex, and the interaction possibility of students needs to be considered when the course is played. For example: when a concentrated sulfuric acid dilution experiment is compiled, the correct steps are that concentrated sulfuric acid is poured into water along the wall of a cup; there may be a number of situations in the operation of a student: firstly, directly pouring into the concentrated sulfuric acid instantly, and secondly, pouring water into the concentrated sulfuric acid; thirdly, pouring concentrated sulfuric acid into water along the wall of the cup; during compiling, videos or phenomena of various interactive operations of students need to be compiled respectively.
The student terminal 3 is used by students, and is mainly used by the students logging in the server end 2; when the student uses student terminal 3 to watch the course that server end 2 sent, realize interactive action through VR handle 31, for example: video pause, playback, fast forward, etc.; in addition, experiment operation can be carried out when the experiment is played. For example, concentrated sulfuric acid dilution experiments can be realized by directly pouring concentrated sulfuric acid into water instantly and pouring water into concentrated sulfuric acid; and thirdly, pouring concentrated sulfuric acid into water along the wall of the cup, and the like.
The VR teaching system enables a user to actively participate in VR scenes so as to achieve better learning effect, and when dangerous experiments are carried out, accidents cannot happen due to errors of operation steps, and students can clearly and intuitively feel consequences caused by operation errors.
In one embodiment, as shown in fig. 2, the teacher terminal 1 includes:
an audio acquisition device 11 for collecting the teacher's audio information;
an image acquisition device 12 for collecting the teacher's image information;
a first control device 13 electrically connected with the audio acquisition device 11 and the image acquisition device 12; the first control device 13 acquires the audio information and image information collected by the audio acquisition device 11 and the image acquisition device 12, and receives the teacher's editing operations on that information to generate a course;
the first control device 13 comprises:
a first communication module 132 in communication connection with the server 2;
a display module 133 for displaying the audio information and image information;
a human body input module 134 for receiving the teacher's editing operations;
and a processing module 131 electrically connected with the first communication module 132, the display module 133, the human body input module 134, the audio acquisition device 11 and the image acquisition device 12.
The working principle and the beneficial effects of the technical scheme are as follows:
the teacher terminal 1 collects the video and audio of the teacher's recorded course through the image acquisition device 12 and the audio acquisition device 11 respectively, realizing course recording. The recorded course then needs to be edited; editing operations include clipping and setting triggered interactions. The display module 133 is a display; the human body input module 134 comprises a keyboard and a mouse; the first communication module 132 comprises a network card; and the processing module 131 is a computer host.
In one embodiment, the image acquisition device 12 includes:
at least four cameras for shooting the teacher recording the course from the teacher's front, rear, left and right respectively.
The working principle and the beneficial effects of the technical scheme are as follows:
omnidirectional shooting enables panoramic recording; when the student terminal 3 plays the course, the playing angle can be switched according to the student's interactions, giving the student the feeling of being on the scene and improving the learning experience.
In one embodiment, the audio acquisition device 11 comprises:
a plurality of microphones arranged in an array around the teacher while the teacher records the course.
The working principle and the beneficial effects of the technical scheme are as follows:
the microphones arranged in an array can capture sound from all directions, so that the sound being played switches synchronously when the student switches the playing angle. For example, four microphones may be used in one-to-one correspondence with the four cameras; when playback switches to the picture recorded by a given camera, the sound collected by the corresponding microphone is played.
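This one-to-one camera-microphone pairing can be sketched as a simple lookup; the direction names and identifiers below are illustrative assumptions, not part of the patent:

```python
# Hypothetical pairing of the four cameras with four microphones recorded at
# the same bearings; switching the picture also switches the sound.
CAMERA_TO_MIC = {
    "front": "mic_front",
    "rear": "mic_rear",
    "left": "mic_left",
    "right": "mic_right",
}

def audio_for_camera(camera: str) -> str:
    """Return the microphone whose feed should play for the active camera."""
    return CAMERA_TO_MIC[camera]
```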
In one embodiment, the student terminal 3 includes:
a wearable VR device electrically connected with the VR handle 31, for playing the courses sent by the server 2;
a first movement sensing module arranged in the wearable VR device, for sensing the student's movement;
wherein the server 2 performs the following operations:
constructing a virtual course space based on the image information collected by the image acquisition device 12 and the audio information collected by the audio acquisition device 11;
storing the virtual course space in correspondence with the course;
calling the corresponding virtual course space upon learning that the student terminal 3 is playing the course;
mapping the student to the initial point of the virtual course space, and playing the image within the student's line of sight in the virtual course space;
sensing the student's movement through the first movement sensing module;
determining and playing the image within the student's line of sight in real time based on the student's movement;
and determining the student's position in the virtual course space based on the movement and the initial point, and playing the audio information collected by the audio acquisition device 11 corresponding to that position.
The working principle and the beneficial effects of the technical scheme are as follows:
mapping between the student and the virtual course space is realized through the wearable VR device and the first movement sensing module, so that images are displayed from the student's viewpoint, and the corresponding view angle and sound are switched as the student moves, providing an immersive learning experience.
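A minimal sketch of the position-tracking part of these server-side steps (mapping the student to the initial point, then accumulating movements sensed by the first movement sensing module); all names are illustrative assumptions, not from the patent:

```python
def track_position(origin, movements):
    """Accumulate movement deltas sensed by the first movement sensing module,
    starting from the student's initial mapping point in the virtual course
    space. Each delta and the origin are (x, y, z) tuples."""
    x, y, z = origin
    for dx, dy, dz in movements:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)
```

The resulting position would then drive both the image selection and the audio selection that the description details next.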
In one embodiment, determining and playing the image within the student's line of sight in real time based on the student's movement comprises the following steps:
acquiring the central shooting vector of each camera used to construct the virtual space, wherein a camera's central shooting vector starts at the center point of the camera lens and points outward along the shooting direction, perpendicular to the lens;
acquiring the student's central view vector, which starts at the student's pupil and points outward;
determining a position vector based on the student's position and the teacher's position in the virtual course space;
when the angle between the position vector and the central view vector is smaller than a preset first threshold, calculating the angle between the central view vector and the central shooting vector of each camera according to:

$$\theta_i=\arccos\frac{\sum_{j=1}^{n}x_j\,y_{ij}}{\sqrt{\sum_{j=1}^{n}x_j^{2}}\,\sqrt{\sum_{j=1}^{n}y_{ij}^{2}}}$$

where θ_i is the angle between the central view vector and the i-th central shooting vector, x_j is the j-th component of the central view vector, y_{ij} is the j-th component of the i-th central shooting vector, and n is the number of data dimensions;
comparing the angles, selecting the image shot by the camera with the smallest angle as the image within the student's line of sight, and playing it;
determining the student's position in the virtual course space based on the movement and the initial point, and playing the audio information collected by the audio acquisition device 11 corresponding to that position, comprises:
calculating the distance between the student's position and the set position of each audio acquisition device 11 according to:

$$L_k=\sqrt{(A_{1k}-B_1)^2+(A_{2k}-B_2)^2+(A_{3k}-B_3)^2}$$

where L_k is the distance between the student's position and the set position of the k-th audio acquisition device 11; A_{1k}, A_{2k} and A_{3k} are the X-, Y- and Z-axis coordinates of the set position of the k-th audio acquisition device 11; and B_1, B_2 and B_3 are the X-, Y- and Z-axis coordinates of the student's position;
comparing the distances between the student's position and the set position of each audio acquisition device 11, and taking the audio information of the nearest device as the basic audio;
determining a second distance between the student's position and the teacher's position in the virtual course space, and a third distance between the nearest audio acquisition device 11 and the teacher's position in the virtual course space;
querying a preset correlation-coefficient table based on the first distance (the shortest of the distances L_k), the second distance and the third distance, to determine the relation coefficient between the finally played audio and the basic audio;
and determining the finally played audio based on the relation coefficient and the basic audio.
The working principle and the beneficial effects of the technical scheme are as follows:
the central view vector serves as the characteristic parameter of the student's line of sight; the angle between it and the position vector determines whether the student is watching the teacher's lecture. The first threshold may be set to 60 degrees: only when the angle is smaller than the first threshold is the student considered to be watching the teacher. When the angle is larger than the first threshold, a preset background of the virtual course space can be played, with an arrow indicating the direction in which to adjust the line of sight, so that the student can conveniently and quickly re-orient. When the student is watching the teacher's lecture, the best image to display is determined by the angle between the central view vector and each central shooting vector. The finally played audio is determined from the student's position, achieving accurate audio selection. In this way audio and video change as the student moves, improving the student's sense of immersion and VR experience.
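The camera-selection and microphone-selection rules above can be sketched as follows. This is an illustrative implementation assuming 3-D vectors and positions; the function names are invented for the example:

```python
import math

def angle_between(u, v):
    """Angle θ = arccos(u·v / (|u||v|)) between two vectors, matching the
    formula used for the central view and central shooting vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v))))

def pick_camera(view_vec, camera_vecs):
    """Index of the camera whose central shooting vector makes the smallest
    angle with the student's central view vector."""
    return min(range(len(camera_vecs)),
               key=lambda i: angle_between(view_vec, camera_vecs[i]))

def pick_microphone(student_pos, mic_positions):
    """Index of the nearest audio acquisition device (the distance L_k)."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return min(range(len(mic_positions)),
               key=lambda k: dist(student_pos, mic_positions[k]))
```

For instance, with four cameras whose central shooting vectors point along the ±X and ±Y axes, a view vector close to +X selects the +X camera; the nearest microphone then supplies the basic audio.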
In one embodiment, the server side 2 performs the following operations:
upon receiving, through the VR handle 31, a first action of the student indicating note recording, taking as the recording start point the playing point that lies a preset time before the first position (the current playing position) of the course;
upon receiving, through the VR handle 31, a second action of the student indicating the end of the note, taking the second position of the currently playing course as the recording end point;
converting the audio data between the recording start point and the recording end point into text data, and outputting the text data to the student when the course ends;
receiving the student's editing operations on the text data to form a course note;
wherein the first action comprises pressing the key 313 of the VR handle 31, and the second action comprises releasing the pressed key 313.
The working principle and the beneficial effects of the technical scheme are as follows:
pressing the key 313 starts the teaching system's note-recording function, solving the problem that users cannot take notes while watching a VR course; converting the audio data directly into text improves the accuracy and efficiency of note-taking. Pushing the recording start point back from the trigger point of the first action allows for the student's reaction time and the system's command latency, ensuring the note is complete. Finally, the text data is edited according to the student's needs, including deleting redundant data, so that the note is concise. Further, the preset time may be associated with the pressing force on the key 313: the greater the force, the longer the preset time. A correspondence table can be set in advance, and the preset time extracted according to the pressing force to determine the recording start point.
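A sketch of this note-span logic, assuming a force-to-lead-time lookup table; the table values and all names are illustrative assumptions (the patent only states that a greater pressing force maps to a longer preset time):

```python
# (minimum force, lead seconds): a larger pressing force selects a longer
# lead time, so the recording start point is pushed further back.
FORCE_TO_LEAD_SECONDS = [(0.0, 2.0), (5.0, 4.0), (10.0, 6.0)]

def note_span(press_t, release_t, force):
    """Return (start, end) playback points for the note's audio segment:
    the start is the key-press time minus the looked-up lead time, and the
    end is the key-release time."""
    lead = 0.0
    for threshold, seconds in FORCE_TO_LEAD_SECONDS:
        if force >= threshold:
            lead = seconds
    return (max(0.0, press_t - lead), release_t)
```

The audio between the two returned points would then be converted to text for the student to edit into a note.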
In one embodiment, as shown in fig. 3, the VR handle 31 includes:
a housing, cylindrical in shape, with anti-slip texture on its outer periphery;
a key 313 arranged on one end face of the housing;
a pressure sensor array 314 arranged on the outer periphery of the housing;
a second movement sensing module 315 arranged in the housing for sensing movement of the VR handle 31;
a processor 311 arranged in the housing and electrically connected to the key 313, the second movement sensing module 315, and the pressure sensor array 314 respectively;
a second communication module 312 electrically connected to the processor 311 and the server side 2;
the processor 311 detects the student's grip strength through the pressure sensor array 314; when the grip strength exceeds a preset first pressure threshold, the hand of the student's avatar closes to grab an object in the virtual scene;
the processor 311 senses movement of the VR handle 31 through the second movement sensing module 315 to synchronously move the hand of the student's avatar.
The working principle and the beneficial effects of the technical scheme are as follows:
The second movement sensing module 315, the pressure sensor array 314, and the key 313 together implement the input of interactive actions; synchronizing the avatar with the student's operation of the VR handle 31 makes parameter operations convenient during the immersive experience. The pressure sensor array 314 comprises a plurality of pressure sensors arranged in an array; the second movement sensing module 315 comprises a three-axis gyroscope and a three-axis accelerometer.
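The grip-detection rule above (grip strength compared against a first pressure threshold) can be sketched as follows. This is a minimal sketch, not the patent's implementation: the threshold value, the sensor-reading format, and the choice of max() as the aggregation over the pressure sensor array 314 are all assumptions made for illustration.

```python
# Hypothetical threshold, in arbitrary sensor units.
FIRST_PRESSURE_THRESHOLD = 5.0

def avatar_hand_grabs(sensor_readings: list[float]) -> bool:
    """Return True when the detected grip should close the avatar's hand.

    Grip strength is taken here as the maximum reading across the
    pressure sensor array; a sum or mean over the contacted sensors
    would be an equally plausible aggregation.
    """
    if not sensor_readings:
        return False
    grip = max(sensor_readings)
    return grip > FIRST_PRESSURE_THRESHOLD

print(avatar_hand_grabs([1.2, 6.3, 0.4]))  # True: 6.3 exceeds the threshold
print(avatar_hand_grabs([1.2, 3.1, 0.4]))  # False: below the threshold
```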
The invention also provides a multimedia classroom applied to any of the above VR teaching systems, comprising:
a plurality of VR handle connection interfaces, arranged on the student seats respectively, for connecting the VR handles 31.
The working principle and the beneficial effects of the technical scheme are as follows:
the VR handle connection interfaces allow students to access the VR teaching system, enabling multiple students to study VR courses simultaneously.
A specific application scenario: a large screen at the teacher's podium displays the teacher's course; when the course enters the interactive stage, the operation interface is projected onto the small screens on the student seats, and the students perform interactive operations through the VR handles 31.
Another specific application scenario: the teacher explains the course at the podium; when the course enters the interactive stage, the operation interface is projected onto the small screens on the student seats, the students perform interactive operations through the VR handles 31, and after the interaction is completed the operation results are summarized to the teacher terminal 1 at the teacher's position.
Furthermore, a holographic projection device may be arranged at the teacher's position to realize holographic projection of the course.
In addition, the multimedia classroom also provides access ports for wearable VR devices.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (9)

1. A VR teaching system comprising:
the teacher terminal is used for receiving courses recorded or compiled by a teacher;
the server side is in communication connection with the teacher terminal and is used for receiving the courses uploaded by the teacher terminal;
the student terminal is in communication connection with the server terminal and is used for receiving and playing the courses of the server terminal and receiving the interaction actions of students when the courses are played;
wherein the student terminal comprises:
at least one VR handle for receiving the student's interaction actions while the course is playing.
2. The VR teaching system of claim 1 wherein the teacher terminal includes:
the audio acquisition equipment is used for acquiring audio information of the teacher;
the image acquisition equipment is used for acquiring image information of the teacher;
the first control equipment is electrically connected with the audio acquisition equipment and the image acquisition equipment; the first control device is used for acquiring the audio information and the image information acquired by the audio acquisition device and the image acquisition device, and receiving the editing operation of a teacher on the audio information and the image information to generate the course;
the first control apparatus includes:
the first communication module is in communication connection with the server side;
the display module is used for displaying the audio information and the image information;
the human body input module is used for receiving the editing operation of a teacher;
and the processing module is electrically connected with the first communication module, the display module, the human body input module, the audio acquisition equipment and the image acquisition equipment.
3. The VR teaching system of claim 2, wherein the image capture device includes:
at least four cameras for shooting images of the teacher recording the course from the front, rear, left, and right of the teacher respectively.
4. The VR teaching system of claim 3, wherein the audio capture device includes:
a plurality of microphones arranged in an array around the teacher when the teacher records the course.
5. The VR teaching system of claim 4, wherein the student terminal includes:
the wearable VR equipment is electrically connected with the VR handle and is used for playing the courses sent by the server end;
the first movement sensing module is arranged in the wearable VR equipment and used for sensing movement of a student;
the server side executes the following operations:
constructing a virtual course space based on the image information acquired by the image acquisition equipment and the audio information acquired by the audio acquisition equipment;
correspondingly storing the virtual course space and the courses;
when notification that the student terminal is playing the course is received, calling the corresponding virtual course space;
mapping the student to an initial point of the virtual course space, and playing the image within the student's sight range in the virtual course space;
sensing the movement of the student through the first movement sensing module;
determining and playing the image within the student's sight range in real time based on the student's movement;
and determining the student's position in the virtual course space based on the student's movement and the initial point, and playing the audio information collected by the audio acquisition device corresponding to that position.
6. The VR teaching system of claim 5, wherein the determining and playing in real time of the image within the student's sight range based on the student's movement comprises:
acquiring the central shooting vector of each camera used to construct the virtual space, wherein the central shooting vector starts at the center point of the camera lens, is perpendicular to the lens plane, and points in the camera's shooting direction;
acquiring the central visual field vector of the student, wherein the central visual field vector points outward from the student's pupil;
determining a position vector based on the student's position and the teacher's position in the virtual course space;
when the included angle between the position vector and the central visual field vector is smaller than a preset first threshold, calculating the included angle between the central visual field vector and the central shooting vector of each camera according to the following formula:
θ_i = arccos( (Σ_{j=1}^{n} x_j · y_{ij}) / ( √(Σ_{j=1}^{n} x_j²) · √(Σ_{j=1}^{n} y_{ij}²) ) )
wherein θ_i is the included angle between the central visual field vector and the i-th central shooting vector; x_j is the j-th dimension parameter value of the central visual field vector; y_{ij} is the j-th dimension parameter value of the i-th central shooting vector; n is the data dimension;
comparing the included angles, selecting the image shot by the camera with the smallest included angle as the image within the student's sight range, and playing it;
the determining of the student's position in the virtual course space based on the student's movement and the initial point, and the playing of the audio information collected by the audio acquisition device corresponding to that position, comprise:
calculating the distance between the student's position and the setting position of each audio acquisition device according to the following formula:
L_k = √( (A_{1k} - B_1)² + (A_{2k} - B_2)² + (A_{3k} - B_3)² )
wherein L_k is the distance between the student's position and the setting position of the k-th audio acquisition device; A_{1k}, A_{2k}, A_{3k} are the X-, Y-, and Z-axis coordinate values of the setting position of the k-th audio acquisition device; B_1, B_2, B_3 are the X-, Y-, and Z-axis coordinate values of the student's position;
comparing the distances between the student's position and the setting positions of the audio acquisition devices, and taking the audio information of the device with the shortest distance (the first distance) as the base audio;
determining a second distance between the student's position and the teacher's position in the virtual course space, and a third distance between the shortest-distance audio acquisition device and the teacher's position in the virtual course space;
querying a preset association coefficient table based on the first distance, the second distance, and the third distance to determine a relation coefficient between the finally played audio and the base audio;
and determining the final playing audio based on the relation coefficient and the base audio.
7. The VR teaching system of claim 1, wherein the server side performs the following operations:
when a first action indicating note recording is received from a student through the VR handle, acquiring the current playing position of the course as a first position, and taking a point a preset time before the first position as the recording start point;
when a second action indicating the end of the note is received from the student through the VR handle, acquiring the current playing position of the course as a second position, which serves as the recording end point;
converting the audio data between the recording start point and the recording end point into text data, and outputting the text data to the student when the course ends;
receiving the student's editing operations on the text data to form a course note;
wherein the first action comprises pressing a key of the VR handle, and the second action comprises releasing the pressed key.
8. The VR teaching system of claim 1 wherein the VR handle includes:
a housing, cylindrical in shape, with anti-slip texture on its outer periphery;
a key arranged on one end face of the housing;
a pressure sensor array arranged on the outer periphery of the housing;
a second movement sensing module arranged in the housing for sensing movement of the VR handle;
a processor arranged in the housing and electrically connected to the key, the second movement sensing module, and the pressure sensor array respectively;
a second communication module electrically connected to the processor and the server side;
the processor detects the student's grip strength through the pressure sensor array; when the grip strength exceeds a preset first pressure threshold, the hand of the student's avatar closes to grab an object in the virtual scene;
the processor senses movement of the VR handle through the second movement sensing module to synchronously move the hand of the student's avatar.
9. A multimedia classroom for use in the VR teaching system of any of claims 1 to 8, comprising:
a plurality of VR handle connection interfaces, arranged on the student seats respectively, for connecting the VR handles.
CN202110473070.9A 2021-04-29 2021-04-29 VR teaching system and multimedia classroom Active CN113516031B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110473070.9A CN113516031B (en) 2021-04-29 2021-04-29 VR teaching system and multimedia classroom


Publications (2)

Publication Number Publication Date
CN113516031A true CN113516031A (en) 2021-10-19
CN113516031B CN113516031B (en) 2024-03-19

Family

ID=78063502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110473070.9A Active CN113516031B (en) 2021-04-29 2021-04-29 VR teaching system and multimedia classroom

Country Status (1)

Country Link
CN (1) CN113516031B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114743419A (en) * 2022-03-04 2022-07-12 广州容溢教育科技有限公司 VR-based multi-user virtual experiment teaching system

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844987A (en) * 2016-05-30 2016-08-10 深圳科润视讯技术有限公司 Multimedia teaching interaction operating method and device
CN108172040A (en) * 2018-01-18 2018-06-15 安徽三弟电子科技有限责任公司 A kind of tutoring system based on VR virtual classrooms
CN109064811A (en) * 2018-09-03 2018-12-21 温州大学 A kind of tutoring system based on VR virtual classroom
CN109949187A (en) * 2019-03-01 2019-06-28 北华大学 A kind of novel Internet of Things teleeducation system and control method
CN110246383A (en) * 2019-07-18 2019-09-17 许昌学院 A kind of multimedia education system and its application method
CN110364041A (en) * 2019-06-24 2019-10-22 科谊达(北京)智能科技有限公司 A kind of classroom outdoor scene reproduction tutoring system
CN111081099A (en) * 2019-09-28 2020-04-28 马鞍山问鼎网络科技有限公司 Remote education system based on virtual reality technology
KR20200050281A (en) * 2018-11-01 2020-05-11 유엔젤주식회사 Learning Support System And Method Using Augmented Reality And Virtual reality based on Artificial Intelligence
CN111402651A (en) * 2020-05-22 2020-07-10 丽水学院 Intelligent teaching system based on VR technique
JP6739611B1 (en) * 2019-11-28 2020-08-12 株式会社ドワンゴ Class system, viewing terminal, information processing method and program
CN111787343A (en) * 2020-06-23 2020-10-16 深圳市思考乐文化教育科技发展有限公司 Classroom live broadcasting system with virtual live broadcasting and implementation method thereof
CN211787619U (en) * 2020-05-14 2020-10-27 刘班 Remote education and teaching system
CN111949822A (en) * 2020-08-20 2020-11-17 山东大学 Intelligent education video service system based on cloud computing and mobile terminal and operation method thereof
CN112087656A (en) * 2020-09-08 2020-12-15 远光软件股份有限公司 Online note generation method and device and electronic equipment
AU2020103849A4 (en) * 2020-12-02 2021-02-11 G. L. Bhong QCIU- Education Environment System: Quantum Computing Integrated Development Education Environment Using IoT-Based System


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUANG Juan: "Giving Wings to Distance Education: On Courseware Production Based on Intelligent Recording and Broadcasting ***", Science & Technology Information, no. 03, page 486 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114743419A (en) * 2022-03-04 2022-07-12 广州容溢教育科技有限公司 VR-based multi-user virtual experiment teaching system
CN114743419B (en) * 2022-03-04 2024-03-29 国育产教融合教育科技(海南)有限公司 VR-based multi-person virtual experiment teaching system

Also Published As

Publication number Publication date
CN113516031B (en) 2024-03-19

Similar Documents

Publication Publication Date Title
CN107103801B (en) Remote three-dimensional scene interactive teaching system and control method
EP4184927A1 (en) Sound effect adjusting method and apparatus, device, storage medium, and computer program product
CN108389249B (en) Multi-compatibility VR/AR space classroom and construction method thereof
CN109600559B (en) Video special effect adding method and device, terminal equipment and storage medium
US11442685B2 (en) Remote interaction via bi-directional mixed-reality telepresence
JP6683864B1 (en) Content control system, content control method, and content control program
CN109951718A (en) A method of it can 360 degree of panorama captured in real-time live streamings by 5G and VR technology
CN114402276A (en) Teaching system, viewing terminal, information processing method, and program
CN112331001A (en) Teaching system based on virtual reality technology
CN113516031B (en) VR teaching system and multimedia classroom
CN108335542A (en) A kind of VR tutoring systems
CN205540577U (en) Live device of virtual teaching video
CN109844600A (en) Information processing equipment, information processing method and program
CN112675527A (en) Family education game system and method based on VR technology
CN112288876A (en) Long-distance AR identification server and system
CN112037090A (en) Knowledge education system based on VR technology and 6DOF posture tracking
JP7130290B2 (en) information extractor
CN110262662A (en) A kind of intelligent human-machine interaction method
JP7465736B2 (en) Content control system, content control method, and content control program
JP7465737B2 (en) Teaching system, viewing terminal, information processing method and program
CN210072615U (en) Immersive training system and wearable equipment
JP7361612B2 (en) Information processing method, information processing device, and program
JP6864041B2 (en) Information storage method and information storage system
CN115004281A (en) Viewing terminal, viewing method, viewing system, and program
CN206353386U (en) A kind of digital education device based on augmented reality

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: Room 502-503, Floor 5, Building 5, Hongtai Smart Valley, No. 19, Sicheng Road, Tianhe District, Guangzhou, Guangdong 510000

Applicant after: Guangdong Feidie Virtual Reality Technology Co.,Ltd.

Address before: 518000 3311, 3rd floor, building 1, aerospace building, No.51, Gaoxin South nine road, Gaoxin community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen FEIDIE Virtual Reality Technology Co.,Ltd.

Country or region before: China

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant