CN110806803A - Integrated interactive system based on virtual reality and multi-source information fusion - Google Patents


Info

Publication number
CN110806803A
Authority
CN
China
Prior art keywords
user
scene
video
relaxation
experience
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911061261.3A
Other languages
Chinese (zh)
Inventor
卞玉龙
马浩凯
耿文秀
周超
陈叶青
刘娟
盖伟
杨承磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201911061261.3A
Publication of CN110806803A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • Health & Medical Sciences (AREA)
  • Dermatology (AREA)
  • Biomedical Technology (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The integrated interactive system comprises an interaction device, electrocardiographic (ECG) monitoring equipment and a server. The interaction device is connected to the server and is configured to provide a virtual scene and an interactive environment; the ECG monitoring equipment collects the wearer's ECG data and transmits it to the server; the server is configured to provide interactive training games, audio and video in both ordinary and VR experience modes, to supply the corresponding scenes for play or viewing, to monitor the user's physiological state in real time from the ECG data during training, and to generate reports that determine the user's state in each scene. The result is a more intuitive experience with higher cognitive fluency, which can improve the user's training efficiency.

Description

Integrated interactive system based on virtual reality and multi-source information fusion
Technical Field
The disclosure belongs to the technical field of human-computer interaction, and relates to an integrated interaction system based on virtual reality and multi-source information fusion.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
As a widely applied computer technology, virtual reality has received great attention in fields such as national defense, scientific research, industry, entertainment and education, and has been a key element of information-technology construction in related sectors in recent years. However, virtual reality is a collection of technologies, and the expected effect can be achieved only by integrating them through a reasonable software and hardware architecture and suitable interaction means. Unfortunately, although many interaction technologies exist, including physiological interaction and voice interaction, there is little guidance on how to integrate them into a coherent virtual reality system, leaving practitioners without a clear starting point.
In addition, virtual reality generates a virtual environment from multi-dimensional information such as visual, auditory and tactile cues, and allows the user to become an internal participant in that environment. The visual information is often multi-source. Taking VR education as an example, the flat multimedia video teaching of the traditional classroom is not enough to give students a strong sense of immersion and interactivity; combining live 360-degree panoramic video with the interaction techniques of VR game experiences offers a new direction for edutainment-style education. An integrated interactive virtual reality system with multi-source information fusion therefore has important practical value in fields such as education, medicine and relaxation training.
Disclosure of Invention
To solve the above problems, an integrated interactive system based on virtual reality and multi-source information fusion is provided. By integrating interaction equipment and techniques that combine physiological-signal acquisition with perception of psychological states, the system can perceive the user's psychological and physiological characteristics in real time during VR experience and training, thereby improving the VR training effect. The interaction equipment and natural interaction techniques matched to VR training also reinforce the virtual training content, giving a more intuitive experience and higher cognitive fluency, which can improve the user's training efficiency.
According to some embodiments, the following technical scheme is adopted in the disclosure:
the utility model provides an integrated form interactive system based on virtual reality and multisource information fusion, includes interactive device, electrocardio monitoring facilities and server, wherein:
the interaction device is connected with the server and is configured to provide a virtual scene and an interaction environment;
the electrocardio monitoring equipment acquires electrocardio data of a wearer and transmits the electrocardio data to the server;
the server is configured to provide game, music and video experience modes in both ordinary and VR environments, to supply the corresponding scenes for play or viewing, to monitor the user's physiological state in real time from the ECG data during the experience, and to generate reports that determine the user's state in each scene.
As an alternative embodiment, the server comprises:
the voice interaction module is used for identifying the received user voice information, determining a corresponding instruction and executing the instruction;
the common video experience module comprises a video material library and provides multi-channel classified material, similar to a video platform, for a given experience or training theme;
the VR panoramic video experience module uses a VR panoramic camera and integrates video shooting, production and playback; it can give viewers a fully immersive experience without requiring complex interaction or the learning cost such interaction brings, and various special effects can be obtained through offline rendering and shooting;
the VR scene experience module comprises a VR scene material library, enriched through scene modeling with the Maya modeling tool, which gives the user the experience of being personally on the scene;
the VR game experience module comprises a plurality of VR interactive games and can be selected according to training requirements;
and the analysis module is configured to analyze the user's state (such as tension or relaxation) in real time from the acquired ECG data, and to present the computed state level (such as relaxation degree) and heart-rate value for each scene the user enters through a graphical interface.
In an alternative embodiment, the interaction device is an HTC VIVE Pro.
As an alternative implementation, the voice interaction module converts the user's speech into text input and supports content inference and event registration: it creates a dictation object, registers the result callback, completion, error and hypothesis (recognized-phrase) events, and then starts and runs speech recognition.
As an alternative embodiment, the common video experience module includes a Curved UI plugin and an AVPro Video plugin. The Curved UI plugin creates a 180° cylindrical surface in world space to wrap a curved video player, allowing the user to view it from various angles and interact with it by handle or voice, which strengthens the user's sense of immersion. The AVPro Video plugin implements basic video playback as well as progress-bar dragging and playback-rate adjustment. Together they enable playback of high-definition 4K video and of 360° panoramic video in VR, greatly improving the user's immersion.
As an optional implementation, the VR game experience module uses a VR scene camera roaming technique: the roaming direction is obtained from the orientation sensed by the interaction device, and the roaming movement angle is obtained as the angle between the input position and the point (0, 1), realizing forward, backward and left-right translation while roaming.
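The roaming-angle computation is language-neutral; the following Python sketch (the system itself is implemented in Unity C#; the function names and direction thresholds here are illustrative assumptions, not from the patent) shows how the angle between the touchpad input position and the forward point (0, 1) can be mapped to a movement direction:

```python
import math

def roaming_angle(x: float, y: float) -> float:
    """Signed angle in degrees between the touchpad press position (x, y)
    and the forward direction (0, 1); positive to the right."""
    return math.degrees(math.atan2(x, y))

def movement_from_angle(angle: float) -> str:
    """Map the roaming angle to a coarse movement command.
    The 45-degree sector boundaries are an illustrative choice."""
    if -45.0 <= angle <= 45.0:
        return "forward"
    if angle > 135.0 or angle < -135.0:
        return "backward"
    return "right" if angle > 0 else "left"
```

Pressing straight up on the touchpad gives an angle of 0° (forward), while pressing on the right edge gives 90° (right translation).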
As an alternative implementation, the VR game experience module measures the current sound level through a microphone: the maximum sound value of each frame is obtained and displayed visually in the VR scene. The user starts inside the scene, and the scene changes when the decibel level of the user's shout reaches a certain threshold.
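A minimal sketch of this per-frame loudness check, in Python for illustration (the embodiment uses a Unity C# method; the reference amplitude and threshold value below are assumptions):

```python
import math

def frame_decibels(samples, reference=1.0):
    """Convert the largest absolute sample amplitude in one audio frame
    to a decibel value relative to an assumed full-scale reference."""
    peak = max(abs(s) for s in samples)
    if peak <= 0.0:
        return float("-inf")  # a silent frame
    return 20.0 * math.log10(peak / reference)

def scene_should_change(samples, threshold_db=-6.0):
    """True once the shout in this frame reaches the decibel threshold."""
    return frame_decibels(samples) >= threshold_db
```

A full-scale sample yields 0 dB, quieter frames yield negative values, and the scene transition fires when the frame peak crosses the chosen threshold.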
As an optional implementation, the ECG monitoring equipment comprises a PC, a BMD ECG collector and three adhesive electrodes. The BMD ECG collector is connected to the PC via Bluetooth and to the electrodes via leads; the BMD chip receives analog signals from the electrodes through its SEP and SEN pins, converts the analog signals into digital signals, and finally sends the digital signals to the PC through RX and TX.
As an alternative embodiment, the analysis module uses a PDF generation technique: the PDF content is obtained from typed data in each scene's code, and tables and their contents are arranged and combined through PdfPTable and PdfPCell.
Compared with the prior art, the beneficial effect of this disclosure is:
the system integrates the functions of video, immersive scenes, electrocardiosignal monitoring, psychological training and the like, perfectly realizes the environment-psychology-physiology interaction promotion effect, improves the user experience through music, scenes and games, and learns and masters some effective training methods; providing diversified training modes for effective learning tools for users by utilizing VR technology and providing learning guidance; meanwhile, the VR game allows users to further control the change of the virtual scene through interactive technologies such as voice and the like, and helps them better control the training process by using the externalization technology, and presents a beautiful virtual reality situation after successful experience, thereby really realizing 'situation-by-situation-and-mind'.
The present disclosure provides VR somatosensory games for thematic training. Interacting with the game through limb movements gives a strong sense of participation, simulates real game scenes, and offers good interactivity, feedback and rich game content, making the system suitable for closed, space-limited environments.
The present disclosure integrates diversified computer-aided training techniques and provides a VR immersive guided training method to improve the learning effect. Through biofeedback, music therapy and similar techniques, the VR biofeedback training tool gives users practical training methods and, through immersive experience and learning, effectively helps them master those techniques.
The VR auxiliary training tool of the present disclosure acquires the user's physiological information, such as ECG and HRV, through physiological sensors, computes states such as stress, and gives instant feedback through real-time changes in the virtual reality scene, helping users clearly recognize and master their current state.
The training approach is developed progressively: following the usual path of training adjustment, different virtual reality scenes are combined to form multiple training modes across the two categories of passive experience and active training.
The training content is diversified: the multi-content training modules included in the system help the user build a reasonable training mode and genuinely meet varied user needs.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and are not to limit the disclosure.
FIG. 1 is a system architecture diagram of the present disclosure;
FIG. 2 is a flow chart of electrocardiographic monitoring of the present disclosure;
FIG. 2(a) is a schematic diagram of an electrocardiographic system according to the present disclosure;
FIG. 2(b) is a hardware diagram of an electrocardiographic system of the present disclosure;
FIG. 2(c) is a software diagram of an electrocardiographic system of the present disclosure;
FIG. 3 is a system flow diagram of the present disclosure;
FIG. 4 is a screenshot of a login interface of the present disclosure;
FIG. 5 is a screenshot of a relaxation mode selection page of the present disclosure;
FIG. 6 is a screenshot of a generic relaxation video selection page of the present disclosure;
FIG. 6(a) is a screenshot of various generic relaxation video pages of the present disclosure;
FIG. 7 is a screenshot of the VR panoramic relaxation video selection page of the present disclosure;
FIG. 7(a) is a screenshot of a certain VR panoramic relaxation video page of the present disclosure;
FIG. 8 is a screenshot of a VR relaxation scene selection page of the present disclosure;
fig. 8(a) is a screenshot of various VR relaxation scene pages of the present disclosure;
FIG. 9 is a screenshot of a VR relaxation game selection page of the present disclosure;
FIG. 9(a) is a screenshot of various VR relaxation game pages of the present disclosure;
FIG. 10 is a fill in relaxation effect page of the present disclosure;
fig. 11 is a PDF screenshot finally generated by the present disclosure.
Detailed Description
the present disclosure is further described with reference to the drawings and the embodiments, taking an integrated interactive relaxation system based on virtual reality and multi-source information fusion as an example.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
An integrated interactive relaxation system based on virtual reality and multi-source information fusion mainly comprises three hardware parts: a host, an HTC VIVE Pro, and an ECG sensor. The software mainly comprises a voice recognition module, a common video relaxation module, a VR panoramic video relaxation module, a VR scene relaxation module, a VR game relaxation module, an ECG-based relaxation evaluation module, and an evaluation and feedback module.
Voice interaction module. This embodiment provides speech recognition. The administrator can choose handle control or voice control according to the user's needs. Under voice control, the user can enter a scene directly by speaking its name, watch a video by speaking its name, join a game by speaking its name, return to the main interface by saying "exit", and score the relaxation training by stating how relaxed the training made them feel.
The voice interaction part performs speech recognition with Unity's DictationRecognizer, then performs keyword retrieval on the recognized content and compares it with the established instruction library to complete recognition. The DictationRecognizer listens to speech input and attempts to determine the phrase spoken by the user. The instruction library is an important module of the system: each phrase produced by the DictationRecognizer is matched against the instruction library to identify the user's command.
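The phrase-to-command matching can be sketched as follows; this Python sketch is purely illustrative (the actual system matches DictationRecognizer output in Unity C#, and the keyword and command names below are invented for the example):

```python
from typing import Optional

# Hypothetical instruction library: recognized keyword -> system command.
COMMAND_LIBRARY = {
    "pause": "PAUSE_VIDEO",
    "play": "PLAY_VIDEO",
    "forward": "SEEK_FORWARD",
    "backward": "SEEK_BACKWARD",
    "exit": "RETURN_TO_MAIN_MENU",
}

def match_command(phrase: str) -> Optional[str]:
    """Return the command for the first library keyword found in the
    recognized phrase, or None if no keyword matches."""
    words = phrase.lower().split()
    for keyword, command in COMMAND_LIBRARY.items():
        if keyword in words:
            return command
    return None
```

Splitting the phrase into whole words before matching avoids spurious substring hits (for example, "please" would otherwise contain "p-l-a" fragments of other keywords).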
Common video relaxation module. This embodiment contains a relaxation video material library, including teaching videos for breathing relaxation, muscle relaxation, imagery relaxation and music relaxation, as well as light decompression videos, so that in the virtual environment the user can not only relieve stress by watching light decompression videos but also learn relaxation methods such as breathing relaxation and muscle relaxation. Under handle control, the user can play and pause the video by clicking the handle and seek forward and backward by dragging it; under voice control, the user can pause, play, and seek forward and backward by saying "pause", "play", "forward" and "backward". Through this module, the user can master various methods of handling stress and achieving relaxation, so that body and mind relax, physiological and psychological activity tends toward balance, and the user recovers from a high-stress state to inner relaxation and calm.
VR panoramic video relaxation module. This embodiment contains a VR panoramic relaxation video material library, including panoramic videos of the sky, the seabed, lakesides, forests and bamboo groves. In the VIVE Pro the user can experience 360° spatial panoramic video, enjoy the beauty of nature, and feel the visual effect of being personally on the scene.
VR scene relaxation module. This embodiment comprises a VR relaxation scene material library, including VR relaxation scenes such as forests, lakesides and the sea, matched with dedicated relaxation music, so that the user can quickly relieve stress and enter a relaxed state.
VR game relaxation module. This embodiment comprises two VR relaxation games, seabed roaming and stress release. Seabed roaming lets the user enjoy undersea scenery accompanied by pleasant music, while stress release lets the user vent inner stress and dissatisfaction by shouting; both help the user relieve stress.
ECG-based relaxation evaluation module. This embodiment collects and analyzes the user's electrocardiographic (ECG) data in real time. The relaxation-degree algorithm represents the user's degree of relaxation with a value from 1 to 100: a lower value indicates a more excited, stressed physiological state (sympathetic activation), while a higher value indicates a more relaxed state (parasympathetic activation).
Evaluation and feedback module. After the relaxation training is finished, this embodiment produces a PDF feedback report containing the recorded indexes: the physiologically computed relaxation degree and the user's heart-rate values in each scene entered, presented together as a line graph.
The VIVE Pro is the professional-edition kit, creating a vivid virtual environment through ultra-high-definition image quality, optimized ergonomic design and a high-resolution sound field. It provides an immersive experience through three components: a head-mounted display, two single-hand controllers, and a positioning system (Lighthouse) that tracks the display and controllers in a space simultaneously. The positioning system is Valve's patented design that uses lasers and light sensors rather than cameras to determine the position of moving objects, so the VIVE Pro allows the user to move about within a certain range.
The host is a Dell Alienware ALW R7 gaming desktop with an eighth-generation six-core Intel Core processor and a discrete graphics card.
In the voice control module, this embodiment leverages a Unity API called DictationRecognizer, which listens to voice input and attempts to determine the spoken phrase, converting the user's speech to text input while supporting content inference and event registration. To use the API, a DictationRecognizer dictation object is created, then the DictationResult result callback event, DictationComplete completion event, DictationError error event and DictationHypothesis recognized-phrase event are registered, after which speech-recognition dictation can be started.
In the common video relaxation module, this embodiment makes full use of the Curved UI plug-in and the AVPro Video plug-in. The Curved UI plug-in is an integrated VR interface package designed for Unity's Canvas system; it creates a 180° cylindrical surface in world space that wraps a curved video player, allowing the user to view it from various angles and interact with it by handle or voice, which strengthens the user's sense of immersion. The AVPro Video plug-in is a powerful video-playback plug-in for Unity3D that implements not only basic playback but also progress-bar dragging and playback-rate adjustment, enabling both high-definition 4K video and 360° panoramic video playback in VR and greatly improving the user's immersion.
In the VR panoramic video relaxation module, this embodiment makes full use of a material library of VR panoramic videos shot with a VR camera, together with a panoramic-video plug-in. The Insta360 Titan VR camera has eight Micro Four Thirds sensors and can record video at resolutions up to 10560 × 5280 at 30 fps. Multiple VR panoramic videos shot with it constitute the module's VR panoramic relaxation video material library, from which the user can freely choose what to view.
In the VR scene relaxation module, this embodiment adopts Maya animation technology and iFLYTEK offline speech synthesis. Maya, from Autodesk, is excellent software for CG film and virtual imagery, providing complete 3D modeling, animation, special effects and efficient rendering, which makes scenes more realistic. The offline speech synthesis technology provides different users with different timbres, tones and speech rates, converting text into audible audio data. It uses a synthesis engine based on an advanced machine-learning algorithm, and its rich emotional corpus makes the synthesized speech more natural. Referencing the IFLYSpeech namespace in a Unity C# script and adding the iFLYTEK speech-synthesis code is enough to synthesize pre-designed content into speech. Offline synthesis meets the need for text-to-speech in network-free environments, and the SDK is lightweight, responds in real time, and requires no network traffic.
In this embodiment, the VR game relaxation module adopts a VR scene camera roaming technique and a technique for obtaining the current sound decibel value through a microphone. Camera roaming is implemented with the VIVE Pro handle: the roaming direction is obtained from the orientation of the VIVE Pro headset, and the roaming movement angle is the angle between the pressed position on the TouchPad and the point (0, 1), realizing forward, backward and left-right translation. The microphone technique obtains the maximum sound value of each frame through a GetMaxVolume() method in a Unity C# script and displays the value visually in the VR scene. The user starts surrounded by stones and trees; when the decibel level of the user's shout reaches a certain threshold, stones fall from the mountain opposite, and once the stones have completely scattered the user sees a beautiful lakeside scene.
In this embodiment, the ECG-based relaxation evaluation module uses a BMD101 chip, a NeuroSky SoC for biological-signal monitoring and processing. The BMD101 consists of an advanced analog front-end circuit and a powerful digital signal processing architecture. Thanks to extremely low system noise and programmable gain, it can detect the biosignal and convert it through an ADC into a 16-bit high-resolution digital signal. Its purpose is to take a biosignal input (from μV to mV levels) and compute the relaxation degree from the heart-rate values via NeuroSky's proprietary algorithm.
In this embodiment, the evaluation and feedback module adopts a PDF file generation technique and a line-graph generation technique. PDF generation uses the C# namespace iTextSharp to typeset the generated files: the PDF content is obtained from static typed data in each scene's code, and tables and their contents are arranged and combined through PdfPTable and PdfPCell. Line-graph generation uses the GraphMaker plug-in in Unity: the ECG data obtained from the ECG-based relaxation evaluation module is displayed as a line graph in the PDF file generated automatically after the user finishes training, so that administrators can see the effect of the relaxation training at a glance.
Fig. 1 is the system architecture diagram of this embodiment. The voice recognition, common video relaxation, VR panoramic video relaxation, VR scene relaxation, relaxation game, ECG-based relaxation evaluation, and evaluation and feedback modules of the integrated interactive relaxation training tool are implemented through the Unity3D rendering engine and an HTC VIVE Pro headset. The user's physiological state is monitored in real time through the ECG device, the physiological data are converted into a visual relaxation value, and after the experience the user's relaxation in each scene is displayed in a PDF document as a line graph.
Relaxation monitoring based on ECG follows the flow of data acquisition, data analysis, data processing, control output and feedback: the ECG sensor acquires ECG values and sends them to the host via Bluetooth; the host processes, computes, analyzes and evaluates them, and finally the user's current relaxation is estimated and stored in a local txt file. Unity reads this txt file through a C# script to obtain the current relaxation value and draws it into a line graph presented in the PDF file.
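The host-side step of reading the stored relaxation values back for plotting can be sketched as follows (Python for illustration; the actual system does this in a Unity C# script, and the one-value-per-line txt format assumed here is illustrative):

```python
def load_relaxation_series(text):
    """Parse relaxation values (1-100) from the txt file contents,
    assuming one value per line and skipping blank lines."""
    values = []
    for line in text.splitlines():
        line = line.strip()
        if line:
            values.append(float(line))
    return values

def summarize(values):
    """Average relaxation over one scene, as a single report figure."""
    return sum(values) / len(values) if values else None
```

The parsed series feeds the per-scene line graph, while the average gives a single summary figure per scene for the report.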
As shown in figs. 2(a)-(b), the whole ECG module consists of a host (running ECG data acquisition and analysis software), a BMD ECG collector, and three adhesive electrodes (see fig. 2(a)). The BMD ECG collector is connected to the host via Bluetooth and to the three electrodes via leads. The BMD chip receives analog signals from the electrodes through SEP and SEN, converts them into digital signals, and finally sends the digital signals to the PC through RX and TX.
As shown in fig. 2(c), the ECG software of this embodiment computes relaxation by measuring the relationship between the high-frequency component (0.15-0.4 Hz) and the low-frequency component (0.04-0.15 Hz) of heart rate variability (HRV). Many studies have shown that high-frequency HRV is associated with parasympathetic activity of the autonomic nervous system, while low-frequency HRV is associated with sympathetic activity. Research also indicates that the parasympathetic system contributes to bodily relaxation and recovery, while the sympathetic nervous system makes people excited or tense under stress. Thus, a lower relaxation value indicates a more excited, stressed physiological state (sympathetic dominance), while a higher relaxation value indicates a more relaxed state (parasympathetic dominance).
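The LF/HF comparison can be illustrated with a minimal band-power computation over an evenly resampled HRV series. The plain FFT periodogram and the 0-100 scaling are assumptions for illustration, not the patent's actual relaxation algorithm:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Sum FFT power of an evenly sampled signal within [low, high) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return psd[(freqs >= low) & (freqs < high)].sum()

def relaxation_score(hrv, fs=4.0):
    """Map HF/(LF+HF) of the HRV series to a 0-100 value
    (higher = more parasympathetic activity, i.e. more relaxed)."""
    lf = band_power(hrv, fs, 0.04, 0.15)   # sympathetic-related band
    hf = band_power(hrv, fs, 0.15, 0.40)   # parasympathetic-related band
    return 100.0 * hf / (lf + hf)
```

A pure 0.3 Hz oscillation (all power in the HF band) scores near 100, while a pure 0.1 Hz oscillation (all power in the LF band) scores near 0.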
Fig. 3 is a system flowchart of the present embodiment:
(1) The administrator fills in the user's basic information.
(2) The user autonomously selects the relaxation category to perform.
(3) From the selected relaxation category page, the user chooses the video to watch, the scene to enter, or the game to play; after an experience ends, the user can continue to other scenes or categories.
(4) When the user chooses to quit the integrated interactive relaxation training tool, the user first enters a page to rate the relaxation effect of this embodiment, and exits after filling it in.
As shown in fig. 4, the login interface of this embodiment: after the user's basic information is filled in on the page, the administrator may select "handle control login" to start relaxation training under controller interaction, or "voice control login" to start relaxation training under voice interaction.
As shown in fig. 5, on the relaxation mode selection page of this embodiment the user selects a preferred relaxation mode.
As shown in fig. 6, on the general relaxation video selection page of this embodiment, the user can select the general relaxation video to watch.
Fig. 6(a) shows the general relaxation video pages of this embodiment; each thumbnail represents one of the general relaxation videos available in this embodiment.
Fig. 7 is the VR panoramic relaxation video selection page of this embodiment, on which the user may select the VR panoramic relaxation video to watch.
Fig. 7(a) shows the VR panoramic relaxation video pages of this embodiment; each thumbnail represents a VR panoramic relaxation video available in this embodiment.
Fig. 8 is the VR relaxation scene selection page of this embodiment, on which the user may select the VR relaxation scene to enter.
Fig. 8(a) shows the VR relaxation scene pages of this embodiment; each thumbnail represents a VR relaxation scene available in this embodiment.
As shown in fig. 9, on the VR relaxation game selection page of this embodiment, the user may select to enter the "ocean bottom roaming" game page or the "shout to vent" game page.
Fig. 9(a) shows the VR relaxation game pages of this embodiment. The upper two views are "ocean bottom roaming" game pages, where the user can change position via the handle, roam the seabed scene, and open treasure boxes to obtain diamonds. On the right is the "shout to vent" game page, where the user can express inner pressure and discontent by shouting: the stone mountain in front of the user sheds stones gradually as the user shouts, and once the mountain disappears completely, an open lake is revealed, helping the user enter a relaxed state.
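The shout interaction can be sketched as a per-frame threshold check. The decibel-conversion offset and the threshold value below are hypothetical calibration numbers, not values given in the patent:

```python
import math

def frame_level_db(samples):
    """Per-frame maximum amplitude (as described in the text), mapped
    to a decibel-like level; the +90 offset is an assumed calibration."""
    peak = max(abs(s) for s in samples)
    return 20.0 * math.log10(max(peak, 1e-9)) + 90.0

def update_stones(samples, stones_left, threshold_db=70.0):
    """Drop one stone when the frame's level exceeds the (hypothetical)
    threshold; the lake is revealed once all stones are gone."""
    if frame_level_db(samples) >= threshold_db and stones_left > 0:
        stones_left -= 1
    return stones_left, stones_left == 0
```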
Fig. 10 is the relaxation-effect rating page of this embodiment, on which the user can score the degree of relaxation experienced in the integrated interactive relaxation training tool.
Fig. 11 is a screenshot of the finally generated PDF of this embodiment, which contains the user's personal details and a line graph of the user's relaxation values in each scene entered.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present disclosure and is not intended to limit the present disclosure, and various modifications and changes may be made to the present disclosure by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.
Although the present disclosure has been described with reference to specific embodiments, it should be understood that the scope of the present disclosure is not limited thereto, and those skilled in the art will appreciate that various modifications and changes can be made without departing from the spirit and scope of the present disclosure.

Claims (9)

1. An integrated interactive system based on virtual reality and multi-source information fusion, characterized by comprising an interaction device, ECG monitoring equipment and a server, wherein:
the interaction device is connected with the server and is configured to provide a virtual scene and an interaction environment;
the electrocardio monitoring equipment acquires electrocardio data of a wearer and transmits the electrocardio data to the server;
the server is configured to provide game, song and video experience modes in both ordinary and VR environments, to provide the corresponding scenes for playing or viewing, to monitor the user's physiological state in real time using the ECG data during relaxation, to form a report, and to determine the user's state in each scene.
2. The integrated interactive system based on virtual reality and multi-source information fusion of claim 1, wherein: the server includes:
the voice recognition module is used for recognizing the received user voice information and determining a corresponding instruction;
the common video experience module comprises a relaxation video material library and provides multi-channel classified material content, similar to a video platform, for a given experience or training theme;
the VR panoramic video experience module comprises a VR panoramic relaxation video material library and adopts a VR panoramic camera that integrates video shooting, production and playback; it brings the viewer a fully immersive experience without requiring excessive interaction modes or the learning cost they entail, and achieves effects hard to obtain live through offline rendering and shooting;
the VR scene experience module comprises a VR scene material library, which is enriched by scene modeling with the Maya modeling tool, so that the user obtains an immersive, on-the-scene experience in the VR scene experience module;
the VR game experience module comprises a plurality of VR experience games;
and the analysis module is configured to analyze the relaxation state of the user in real time according to the acquired electrocardiogram data and graphically display the calculated relaxation degree and the heart rate value of the user in each scene.
3. The integrated interactive system based on virtual reality and multi-source information fusion of claim 1, wherein: the interaction device is an HTC VIVE Pro.
4. The integrated interactive system based on virtual reality and multi-source information fusion of claim 2, wherein: the voice recognition module converts the user's speech into text input and supports content inference and event registration; after a dictation object is created and the result callback event, completion event, error event and recognition statement event are registered, voice recognition dictation can be started.
5. The integrated interactive system based on virtual reality and multi-source information fusion of claim 2, wherein: the common video experience module comprises a Curved UI plug-in and an AVPro Video plug-in. The Curved UI plug-in creates a 180-degree cylindrical surface in world space that wraps a curved video player, allowing the user to view it from various angles and interact with it via the handle or voice, further improving the user's sense of immersion. The AVPro Video plug-in implements basic video playback as well as progress-bar dragging and speed adjustment. Together they enable playback of high-definition 4K video and of 360-degree panoramic video in VR, greatly improving the user's immersion.
6. The integrated interactive system based on virtual reality and multi-source information fusion of claim 2, wherein: the VR game experience module adopts a VR scene camera roaming technique: the roaming direction is obtained from the orientation sensed by the interaction device, and the roaming movement angle is obtained from the angle between the input position and the (0, 1) vector, thereby realizing forward, backward and left-right translation during roaming.
7. The integrated interactive system based on virtual reality and multi-source information fusion of claim 2, wherein: the VR game venting module obtains the current sound decibel value through the microphone by taking the maximum sound amplitude of each frame and displays this value visually in the VR scene; the user starts in the scene, and the scene changes when the decibel level of the user's shout reaches a certain threshold.
8. The integrated interactive system based on virtual reality and multi-source information fusion of claim 1, wherein: the ECG monitoring equipment comprises a PC, a BMD ECG collector and a plurality of adhesive electrodes; the BMD ECG collector is connected to the PC via Bluetooth and to the three adhesive electrodes via leads; the BMD chip receives analog signals from the electrodes through its SEP and SEN pins, converts the analog signals into digital signals, and finally sends the digital signals to the PC through RX and TX.
9. The integrated interactive system based on virtual reality and multi-source information fusion of claim 1, wherein: the analysis module adopts a PDF file generation technique: PDF content is obtained from typed data in each scene's code, and tables and their specific contents are arranged and combined through PdfPTable and PdfPCell.
CN201911061261.3A 2019-11-01 2019-11-01 Integrated interactive system based on virtual reality and multi-source information fusion Pending CN110806803A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911061261.3A CN110806803A (en) 2019-11-01 2019-11-01 Integrated interactive system based on virtual reality and multi-source information fusion


Publications (1)

Publication Number Publication Date
CN110806803A true CN110806803A (en) 2020-02-18

Family

ID=69501024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911061261.3A Pending CN110806803A (en) 2019-11-01 2019-11-01 Integrated interactive system based on virtual reality and multi-source information fusion

Country Status (1)

Country Link
CN (1) CN110806803A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102641541A (en) * 2012-05-15 2012-08-22 北京思博纳科技发展有限公司 Three-dimensional (3D) somatosensory biofeedback comprehensive relaxing and training system
CN107411726A (en) * 2017-05-11 2017-12-01 浙江凡聚科技有限公司 Biofeedback confrol method and system based on the training of HRV heart rate variabilities
CN107680165A (en) * 2017-09-25 2018-02-09 中国电子科技集团公司第二十八研究所 Computer operation table holography based on HoloLens shows and natural interaction application process
CN108461126A (en) * 2018-03-19 2018-08-28 傅笑 In conjunction with virtual reality(VR)The novel intelligent psychological assessment of technology and interfering system
CN108986888A (en) * 2018-07-16 2018-12-11 上海赞彤医疗科技有限公司 Examination anxiety and pressure regulating system and method, storage medium, operating system
CN109102862A (en) * 2018-07-16 2018-12-28 上海赞彤医疗科技有限公司 Concentrate the mind on breathing depressurized system and method, storage medium, operating system
CN109998570A (en) * 2019-03-11 2019-07-12 山东大学 Inmate's psychological condition appraisal procedure, terminal, equipment and system
CN110064117A (en) * 2019-03-27 2019-07-30 佛山职业技术学院 A kind of psychological venting exchange method and device based on scream sound
CN110325112A (en) * 2017-01-04 2019-10-11 斯托瑞阿普股份有限公司 The movable system and method for bioassay are modified using virtual reality therapy


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHU YAOWEN: "Research on Key Technologies of Psychological Measurement Based on Multi-source Information Fusion", China Master's Theses Full-text Database, Philosophy and Humanities *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111477061A (en) * 2020-06-02 2020-07-31 谭明新 Supervision place VR education correction system based on VR technology
CN111885130A (en) * 2020-07-10 2020-11-03 深圳市瑞立视多媒体科技有限公司 Voice communication method, device, system, equipment and storage medium
CN111885130B (en) * 2020-07-10 2023-06-30 深圳市瑞立视多媒体科技有限公司 Voice communication method, device, system, equipment and storage medium
CN112274755A (en) * 2020-10-10 2021-01-29 中山大学孙逸仙纪念医院 Immersive virtual VR biofeedback therapeutic instrument for tinnitus
CN112274755B (en) * 2020-10-10 2022-09-06 中山大学孙逸仙纪念医院 Immersive virtual VR biofeedback therapeutic instrument for tinnitus
CN112337082A (en) * 2020-10-20 2021-02-09 深圳市杰尔斯展示股份有限公司 AR immersive virtual visual perception interaction system and method

Similar Documents

Publication Publication Date Title
CN110806803A (en) Integrated interactive system based on virtual reality and multi-source information fusion
US20230362457A1 (en) Intelligent commentary generation and playing methods, apparatuses, and devices, and computer storage medium
CN104298722B (en) Digital video interactive and its method
Fothergill et al. Instructing people for training gestural interactive systems
Brown et al. Using a head-mounted video camera to understand social worlds and experiences
KR20200130231A (en) Direct live entertainment using biometric sensor data for detection of neural conditions
Zhou et al. Dance and choreography in HCI: a two-decade retrospective
US11205408B2 (en) Method and system for musical communication
CN112905015B (en) Meditation training method based on brain-computer interface
Gebhard et al. Exploring interaction strategies for virtual characters to induce stress in simulated job interviews
CN111953910B (en) Video processing method and device based on artificial intelligence and electronic equipment
CN110874859A (en) Method and equipment for generating animation
CN106345035A (en) Sleeping system based on virtual reality
Tan et al. Can you copyme? an expression mimicking serious game
Jégo et al. User-defined gestural interaction: A study on gesture memorization
Ahmadpour et al. Building enriching realities with children: creating makerspaces that intertwine virtual and physical worlds in pediatric hospitals
Yan et al. Exploring audience response in performing arts with a brain-adaptive digital performance system
Kolykhalova et al. A serious games platform for validating sonification of human full-body movement qualities
US20230335139A1 (en) Systems and methods for voice control in virtual reality
Kerr et al. A breath controlled AAC system
Kim et al. Perceptually motivated automatic dance motion generation for music
Kataoka Language and body in place and space: Discourse of Japanese rock climbing
JP3829005B2 (en) Virtual environment presentation device
Kang et al. One-Man Movie: A System to Assist Actor Recording in a Virtual Studio
Özkul et al. Multimodal analysis of upper-body gestures, facial expressions and speech

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200218