CN116594511B - Scene experience method and device based on virtual reality, computer equipment and medium - Google Patents


Info

Publication number: CN116594511B
Application number: CN202310870684.XA
Authority: CN (China)
Prior art keywords: user, scene, virtual scene, psychological, virtual
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN116594511A
Inventor: name withheld at the inventor's request
Assignee (original and current): Tian'an Star Control Beijing Technology Co., Ltd.
History: application filed by Tian'an Star Control Beijing Technology Co., Ltd.; publication of CN116594511A; application granted; publication of CN116594511B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01: Indexing scheme relating to G06F 3/01
    • G06F 2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a virtual reality-based scene experience method, apparatus, computer device, and medium. The method includes: receiving current psychological data of a first user sent by a data monitoring device, where the current psychological data includes heart rate, body temperature, and blood pressure; determining, based on the emotional state of the first user, an original virtual scene to be displayed to the first user by a VR device; setting the display duration of the original virtual scene based on the first user's heart rate, body temperature, and blood pressure; and compressing the original virtual scene based on its display duration to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device. During the experience, the user does not need to perform any manual selection: a corresponding scene experience is generated from the user's psychological data, which can satisfy users' different experience requirements, improve the user's sense of experience, and effectively relieve the user's psychological stress.

Description

Scene experience method and device based on virtual reality, computer equipment and medium
Technical Field
Embodiments of the present disclosure relate to the technical field of virtual reality, and in particular to a scene experience method, apparatus, computer device, and medium based on virtual reality.
Background
With the accelerating pace of modern life, work intensity keeps increasing, and most people choose to work overtime to finish their work, which places considerable psychological stress on them.
In the related art, the psychological stress of a user may be relieved through a VR (Virtual Reality) device: for example, several fixed virtual scenes of different types may be preset in the VR device, and the user must select one of them to experience.
However, because the existing virtual scenes for user experience are all fixed, they can hardly satisfy users' different experience requirements; moreover, the user has to select the experience scene manually, so the user experience is poor.
Disclosure of Invention
Embodiments described herein provide a virtual reality-based scene experience method, apparatus, computer device, and medium that overcome the above-described problems.
According to a first aspect of the present disclosure, a virtual reality-based scene experience method is provided, in which a first user wears a virtual reality (VR) device for displaying a virtual scene to the first user and a data monitoring device for monitoring psychological data of the first user;
The method comprises the following steps:
receiving current psychological data of the first user sent by the data monitoring equipment, wherein the current psychological data comprises: heart rate, body temperature, and blood pressure;
determining an original virtual scene to be displayed to the first user by the VR device based on the emotional state of the first user;
setting a display duration of the original virtual scene based on the heart rate of the first user, the body temperature of the first user, and the blood pressure of the first user;
compressing the original virtual scene based on its display duration to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device;
wherein the target virtual scene includes: a virtual farm scene, a virtual garden scene, a virtual autumn pasture scene, a virtual autumn defoliation scene, and a virtual snowfield scene.
According to a second aspect of the present disclosure, a virtual reality-based scene experience apparatus is provided, in which a first user wears a virtual reality (VR) device for displaying a virtual scene to the first user and a data monitoring device for monitoring psychological data of the first user;
The device comprises:
a receiving module, configured to receive current psychological data of the first user sent by the data monitoring device, where the current psychological data includes: heart rate, body temperature, and blood pressure;
a determining module, configured to determine, based on an emotional state of the first user, an original virtual scene to be displayed to the first user by the VR device;
a setting module, configured to set a display duration of the original virtual scene based on the heart rate of the first user, the body temperature of the first user, and the blood pressure of the first user;
a compression module, configured to compress the original virtual scene based on its display duration to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device;
wherein the target virtual scene includes: a virtual farm scene, a virtual garden scene, a virtual autumn pasture scene, a virtual autumn defoliation scene, and a virtual snowfield scene.
In a third aspect, a computer device is provided, including a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the virtual reality-based scene experience method in any one of the above embodiments.
In a fourth aspect, a computer-readable storage medium is provided, on which a computer program is stored, which, when executed by a processor, implements the steps of the virtual reality-based scene experience method in any one of the above embodiments.
According to the virtual reality-based scene experience method provided by the embodiments of the present application, a first user wears a virtual reality (VR) device and a data monitoring device, where the VR device displays a virtual scene to the first user and the data monitoring device monitors the first user's psychological data; current psychological data of the first user sent by the data monitoring device is received, the current psychological data including heart rate, body temperature, and blood pressure; an original virtual scene to be displayed to the first user by the VR device is determined based on the emotional state of the first user; the display duration of the original virtual scene is set based on the first user's heart rate, body temperature, and blood pressure; and the original virtual scene is compressed based on its display duration to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device, where the target virtual scene includes a virtual farm scene, a virtual garden scene, a virtual autumn pasture scene, a virtual autumn defoliation scene, and a virtual snowfield scene. In this way, the user is helped to experience a natural scene suited to his or her own psychological feedback; no manual selection is needed during the experience, a corresponding scene experience is generated from the user's psychological data, different experience requirements can be satisfied, and the user's psychological stress is effectively relieved while the sense of experience is improved.
The foregoing is only an overview of the technical solutions of the embodiments of the present application, which may be implemented according to the content of the specification. To make the technical means of the embodiments clearer and easier to understand, specific embodiments of the present application are given below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings of the embodiments are briefly described below. It should be understood that the drawings described below relate only to some embodiments of the present disclosure and do not limit it:
fig. 1 is a flow chart of a scene experience method based on virtual reality provided by the present disclosure.
Fig. 2 is a schematic structural diagram of a scene experience device based on virtual reality provided by the present disclosure.
Fig. 3 is a schematic structural diagram of a computer device provided in the present disclosure.
It is noted that the elements in the drawings are schematic and are not drawn to scale.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings. It will be apparent that the described embodiments are some, but not all, of the embodiments of the present disclosure. All other embodiments, which can be made by those skilled in the art based on the described embodiments of the present disclosure without the need for creative efforts, are also within the scope of the protection of the present disclosure.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the presently disclosed subject matter belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the specification and relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein. As used herein, a statement that two or more parts are "connected" or "coupled" together shall mean that the parts are joined together either directly or joined through one or more intermediate parts.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of the phrase "an embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may represent three cases: A alone, B alone, or both A and B. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship. Terms such as "first" and "second" are used merely to distinguish one component (or portion of a component) from another component (or another portion of a component).
In the description of the present application, unless otherwise indicated, the meaning of "plurality" means two or more (including two), and similarly, "plural sets" means two or more (including two).
This embodiment provides a virtual reality-based scene experience method that employs an all-weather VR psychological adjustment system. Psychological stress can be relieved through virtual reality relaxation, so that the user can unwind in a pleasant environment, relieving pressure and achieving the goal of psychological adjustment.
The psychological adjustment system offers highly immersive natural scenes across spring, summer, autumn, and winter, with all-weather, time-of-day changes expressed through prominent visual changes in the scene. Weather affects a person's state of mind: fine conditions such as bright sunshine lift the mood and bring pleasure, while bad conditions such as overcast and rainy weather lower the mood and cause gloom. Based on the monitored psychological-state input data, the adjustment system automatically adapts to the user's psychological adjustment needs by adjusting the weather conditions and the ambient day-and-night changes of the scene, achieving individualized psychological therapy through all-weather sensory stimulation from a variety of virtual natural scenes.
By receiving the psychological monitoring data transmitted by the external monitoring device, the psychological adjustment system selects a corresponding scene for psychological intervention. The system is divided into eight functional modules, five of which are virtual reality scenes showing natural landscapes in different seasons: a psychological monitoring data receiving module, a farm scene module, a garden scene module, an autumn pasture scene module, an autumn defoliation scene module, a snowfield scene module, a psychological adjustment mapping module, and a scheduling control module. Each scene runs on virtual reality hardware and is highly immersive and interactive.
To help those skilled in the art better understand the solution of the present application, the technical solutions of the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a virtual reality-based scene experience method according to an embodiment of the disclosure. The first user wears a virtual reality (VR) device and a data monitoring device; the VR device can display the virtual scene to the first user through its internal display screen, and the data monitoring device can monitor the first user's psychological data. As shown in fig. 1, the specific process of the virtual reality-based scene experience method includes:
S110, receiving current psychological data of the first user sent by the data monitoring device, where the current psychological data includes: heart rate, body temperature, and blood pressure.
The data monitoring device is an external detection device that collects the user's psychological data; it can be communicatively connected to the psychological adjustment system of this embodiment so that the data is collected effectively. When the psychological adjustment system receives current psychological data from different users, it can store the received psychological data separately based on each user's identity information.
S120, determining, based on the emotional state of the first user, an original virtual scene to be displayed to the first user by the VR device.
The emotional state of the first user may be obtained in several ways: a doctor's diagnosis, the user's own input (suitable when the user adjusts the scene personally), or a facial recognition device. The facial recognition device can recognize the user's emotional state effectively and accurately.
The facial recognition device may send the emotional state of the first user to the psychological adjustment system through a communication connection, or the psychological adjustment system may read the emotional state of the first user from the facial recognition device in real time.
For example, before determining the original virtual scene to be displayed by the VR device based on the emotional state of the first user, the method may include: receiving the emotional state of the first user sent by the facial recognition device, or reading the emotional state of the first user from the facial recognition device. The emotional state of the first user may be recognized by the facial recognition device from the first user's facial expression.
Each original virtual scene has a unique corresponding keyword that expresses its scene type; scene types may include: soothing, releasing, activating, and so on.
Determining, based on the emotional state of the first user, the original virtual scene to be displayed by the VR device may include: matching the emotional state of the first user against the keyword corresponding to each original virtual scene, and taking the original virtual scene whose keyword has the highest matching degree as the original virtual scene to be displayed to the first user by the VR device.
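The matching step above can be sketched as follows. The scene keywords, the emotion-to-type table, and the exact-match scoring are illustrative assumptions; the patent only states that each scene has a keyword and that the scene with the highest matching degree is chosen.

```python
# Hypothetical sketch of the keyword-matching step. All keyword and
# emotion values below are invented placeholders, not from the patent.

SCENE_KEYWORDS = {
    "virtual farm": "soothing",
    "virtual garden": "activating",
    "virtual autumn pasture": "soothing",
    "virtual autumn defoliation": "releasing",
    "virtual snowfield": "releasing",
}

# Assumed mapping from recognized emotional states to preferred scene types.
EMOTION_TO_TYPE = {
    "anxious": "soothing",
    "depressed": "activating",
    "stressed": "releasing",
}

def select_original_scene(emotional_state: str) -> str:
    """Return the scene whose keyword best matches the user's emotion."""
    wanted = EMOTION_TO_TYPE.get(emotional_state, "soothing")
    # "Highest matching degree" is reduced here to an exact keyword match.
    for scene, keyword in SCENE_KEYWORDS.items():
        if keyword == wanted:
            return scene
    return "virtual farm"  # fallback scene
```

In a real system the exact-match test would be replaced by whatever similarity score the implementation uses; the control flow stays the same.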
S130, setting the display duration of the original virtual scene based on the heart rate of the first user, the body temperature of the first user and the blood pressure of the first user.
The suitable duration for the first user to view a scene can be judged from the first user's heart rate, body temperature, and blood pressure, and the display duration of the original virtual scene is set based on this suitable duration.
The suitable viewing duration can be determined through a duration comparison table, which stores the correspondence between heart rate, body temperature, blood pressure, and duration.
When setting the display duration based on the suitable duration, the suitable duration itself may be used as the display duration of the original virtual scene; alternatively, a certain amount of time may be added on top of the suitable duration as the display duration, which facilitates an immersive experience for the user.
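A minimal sketch of the duration look-up just described, assuming invented threshold bands and durations; the patent only says that a comparison table maps heart rate, body temperature, and blood pressure to a suitable viewing duration, optionally extended by a margin.

```python
# Illustrative duration comparison "table" as threshold bands.
# All numeric thresholds and durations are assumptions for demonstration.

def suitable_duration_minutes(heart_rate: int, body_temp: float,
                              systolic_bp: int) -> int:
    """Look up a suitable viewing duration from the monitored data."""
    # Assumed bands: more elevated vitals lead to a shorter exposure.
    if heart_rate > 100 or systolic_bp > 140 or body_temp > 37.5:
        return 5
    if heart_rate > 80 or systolic_bp > 120:
        return 10
    return 15

def display_duration_minutes(heart_rate: int, body_temp: float,
                             systolic_bp: int, extra_minutes: int = 2) -> int:
    """Add an optional margin on top of the suitable duration."""
    return suitable_duration_minutes(heart_rate, body_temp,
                                     systolic_bp) + extra_minutes
```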
S140, compressing the original virtual scene based on its display duration to obtain a target virtual scene.
The original virtual scene is compressed based on its display duration to obtain the target virtual scene, so that the first user experiences the target virtual scene through the VR device.
The target virtual scene may include: virtual farm scenes, virtual garden scenes, virtual autumn pasture scenes, virtual autumn defoliation scenes, and virtual snowfield scenes.
For example, when the first user experiences the virtual farm scene through the VR device, type areas of the farm scene may be displayed to the first user through the VR device based on the first user's movement track, the type areas including a virtual orchard area and a virtual vegetable area; the odor playing device is controlled to emit the odor corresponding to the type area; and based on the weather state of the farm scene, the ventilation device is controlled to adjust the environmental information of the real scene where the first user is located, including temperature, humidity, wind force, and wind direction.
The virtual farm scene includes a virtual orchard and a virtual vegetable field, planted with various fruit trees and various vegetables and fruits respectively. The first user can tour the scene on foot or in low-altitude flight, seeing the growth states of different vegetables and the flowering and fruiting of the orchard's fruit trees, and smelling the fragrance of flowers or fruits. The virtual farm scene also introduces the melons, fruits, and vegetables. It can simulate the 24-hour day-and-night cycle and natural wind effects, and features weather changes such as rain and alternating overcast and clear skies.
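As an illustrative sketch of the environment-adjustment step (the preset values and the settings format are assumptions, not taken from the patent), the virtual weather state might be mapped to real-scene ventilation settings like this:

```python
# Hedged sketch: mapping a virtual weather state to real-environment
# ventilation settings. All preset values are invented placeholders.

def ventilation_settings(weather: str) -> dict:
    """Return the real-scene environment settings for a virtual weather state."""
    presets = {
        "sunny": {"temperature_c": 26, "humidity_pct": 40, "wind": "light"},
        "rainy": {"temperature_c": 22, "humidity_pct": 70, "wind": "none"},
        "windy": {"temperature_c": 24, "humidity_pct": 45, "wind": "strong"},
    }
    # Fall back to a neutral setting for unknown weather states.
    return presets.get(weather, {"temperature_c": 24, "humidity_pct": 50,
                                 "wind": "none"})
```

The returned dictionary would then be handed to whatever device interface controls the ventilation hardware.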
When the first user experiences the virtual garden scene through the VR device, virtual flowers and plants matching the season information in the target virtual scene can be displayed in the VR device, together with virtual animals matching the season information; the odor playing device is controlled to emit the odor corresponding to the virtual flowers and plants; the sound playing device is controlled to play the sound corresponding to the virtual animals; and based on the weather state of the garden scene, the ventilation device is controlled to adjust the environmental information of the real scene where the first user is located, including temperature, humidity, wind force, and wind direction.
Different kinds of virtual flowers are planted in the virtual garden scene, and seas of flowers in different flowering periods show their respective seasonal characteristics. This scene is characterized by composite stimuli: visual stimuli comprising different kinds of flowers in different colors, auditory stimuli comprising various kinds of buzzing and birdsong, and olfactory stimuli comprising different types of floral and plant scents. The scene can simulate natural wind effects and features weather changes such as rain and overcast skies.
When the first user experiences the virtual autumn pasture scene, which contains virtual animals, through the VR device, the display tone of the VR device can be adjusted to the matching tone of the autumn pasture scene; the sound playing device is controlled to play the sounds of the virtual animals and the footsteps of the first user; and based on the virtual weather state of the autumn pasture scene, the ventilation device is controlled to adjust the environmental information of the real scene where the first user is located, including temperature, humidity, wind force, and wind direction.
The virtual autumn pasture scene is dominated by cool green tones, with a wide field of view and mainly docile animals, and can promote mental relaxation. The first user can watch typical scenes of animals and plants moving freely as on a real pasture, and can hear the wind, animal calls, and his or her own footsteps. The scene can simulate natural wind effects and features weather changes such as rain and alternating overcast and clear skies.
When the first user experiences the virtual autumn defoliation scene through the VR device, the display tone of the VR device can be adjusted to the matching tone of the autumn defoliation scene; the sound playing device is controlled to play the sound of falling leaves and the footsteps of the first user; and based on the virtual weather state of the autumn defoliation scene, the ventilation device is controlled to adjust the environmental information of the real scene where the first user is located, including temperature, humidity, wind force, and wind direction.
The virtual autumn defoliation scene is dominated by warm tones: orange, yellow, and brown fallen leaves set against brown trunks, with the display of falling leaves conveying inner tranquility. The first user can see typical autumn deciduous plants and fallen leaves, hear the wind blowing through the leaves and his or her own footsteps, and tour the scene on foot or in low-altitude flight. The scene can simulate natural wind effects and features weather changes such as rain and overcast skies.
When the first user experiences the virtual snowfield scene through the VR device, the display tone of the VR device can be adjusted to the matching tone of the snowfield scene; the sound playing device is controlled to play the wind and the footsteps of the first user; and based on the virtual weather state of the snowfield scene, the ventilation device is controlled to adjust the environmental information of the real scene where the first user is located, including temperature, humidity, wind force, and wind direction.
The virtual snowfield scene is dominated by neutral gray, white, black, and brown; the weather can be a clear day after snowfall, giving a wide, clear field of view and a calm psychological experience. The first user can see plants common to cold regions and typical snow-covered scenery, and hear the wind and his or her own footsteps. The scene can simulate natural wind effects and features weather changes such as rain, snow, and alternating overcast and clear skies.
In this embodiment, the first user wears a virtual reality (VR) device and a data monitoring device, where the VR device displays a virtual scene to the first user and the data monitoring device monitors the first user's psychological data; current psychological data of the first user sent by the data monitoring device is received, including heart rate, body temperature, and blood pressure; an original virtual scene to be displayed to the first user by the VR device is determined based on the emotional state of the first user; the display duration of the original virtual scene is set based on the first user's heart rate, body temperature, and blood pressure; and the original virtual scene is compressed based on its display duration to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device, where the target virtual scene includes a virtual farm scene, a virtual garden scene, a virtual autumn pasture scene, a virtual autumn defoliation scene, and a virtual snowfield scene. In this way, the user is helped to experience a natural scene suited to his or her own psychological feedback; no manual selection is needed during the experience, a corresponding scene experience is generated from the user's psychological data, different experience requirements can be satisfied, and the user's psychological stress is effectively relieved while the sense of experience is improved.
In some embodiments, compressing the original virtual scene based on its display duration to obtain the target virtual scene further includes:
obtaining mapping comparison information, where the mapping comparison information describes the mapping relationship between psychological-experience control points and environment change sequences, and a psychological-experience control point represents the adjustment direction of the first user's rhythm phase.
Compressing the original virtual scene based on its display duration to obtain the target virtual scene includes: fusing a target natural scene with the original virtual scene based on the mapping comparison information to obtain a fused scene; and compressing the fused scene with a compression mapping algorithm based on the display duration, to obtain a target virtual scene of that display duration.
For example, for the individualized psychological adjustment mapping combination, several typical weather phenomena randomly shuffled from nature are combined with natural scenes according to the exposure-time setting, and smooth transitions let the combination be displayed in a continuous form; a compression mapping algorithm then modulates the continuous combination along dimensional parameters such as time and intensity, automatically generating a customized scene for psychological therapy.
The weather change can be controlled by time, that is, inputting the current moment changes the weather state at that moment. Meanwhile, to keep the combination continuous, a cooperative mode is adopted: the picture is played at a frame rate of 24 frames per second, while a new weather image is rendered every 4 seconds of real time; the system issues rendering instructions at 4-second intervals and calls the program's API to render the weather picture.
Depending on the exposure duration setting, the system lets the user experience a full 24-hour weather change in a short period. Taking an exposure duration of 10 minutes as an example, the compression mapping algorithm compresses 24 hours into 10 minutes. The exposure duration corresponds to a continuous combined play time of 10 minutes, i.e. 10 x 60 = 600 seconds, while 24 hours is 24 x 60 x 60 = 86400 seconds. Dividing 86400 by 600 gives 144, i.e. the ratio of virtual time to real time is 144: for every 1 second of real time, the weather (virtual) time advances 144 seconds.
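The time compression described above can be sketched as a simple mapping from elapsed real seconds to elapsed virtual seconds. This is a minimal illustration of the arithmetic in the text, not the patent's actual implementation; the function and variable names are assumptions.

```python
# A 24-hour weather cycle compressed into a 10-minute exposure:
# virtual-to-real time ratio = 86400 / 600 = 144.
DAY_SECONDS = 24 * 60 * 60        # 86400 s of virtual weather time
EXPOSURE_SECONDS = 10 * 60        # 600 s of real playback time
RATIO = DAY_SECONDS // EXPOSURE_SECONDS  # 144

def virtual_time(real_elapsed_s: float) -> float:
    """Map elapsed real seconds to elapsed virtual seconds (wraps at 24 h)."""
    return (real_elapsed_s * RATIO) % DAY_SECONDS

def virtual_clock(real_elapsed_s: float) -> str:
    """Render the virtual time of day as HH:MM:SS for the weather controller."""
    t = int(virtual_time(real_elapsed_s))
    return f"{t // 3600:02d}:{(t % 3600) // 60:02d}:{t % 60:02d}"

print(RATIO)                 # 144
print(virtual_clock(300.0))  # after 5 real minutes, half the virtual day: 12:00:00
```

Feeding `virtual_clock` into the time-controlled weather state (as described above) would then drive the compressed day-night weather sequence.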
In this embodiment, a scene adaptation function with multiple dynamically adjustable parameters may also be provided. Each scene supports dynamic adjustment of the weather state. The intensity of different weather types is continuously adjustable, such as rainfall intensity, wind intensity, and sunlight intensity; likewise, the duration of different weather types is adjustable. The method can simulate a 24-hour day-night change, embed this short-period day-night cycle into a long-period seasonal timeline, realize scene illumination effects of different periods by adjusting illumination intensity and color temperature, and intervene in and adjust the user's psychology by means of variable day-night modes.
In some embodiments, obtaining the mapping comparison information includes:
Obtaining psychological rhythm influence features, including a feature corresponding to a first dimension, a feature corresponding to a second dimension, and a feature corresponding to a third dimension, where the first dimension is weather state, the second dimension is illumination intensity, and the third dimension is spectrum information; extracting features with the same emotion adjustment direction and the same rhythm phase influence from the psychological rhythm influence features to obtain an environment change sequence, where the environment change sequence is used for describing different kinds of psychological emotion adjustment; and determining the mapping comparison information based on the environment change sequence and the psychological experience control point.
The external natural environment has a direct influence on human psychology. Different seasons have distinct weather features; for example, autumn has little overcast and rainy weather and more crisp, clear days, while winter weather is mainly snow and generally without rain.
Meanwhile, different seasons and different moments of the day also differ in illumination intensity and color temperature. Psychological influence features mainly include the influence of weather states (such as sunny, cloudy, and rainy days), illumination intensity changes, and spectrum (color temperature) levels on psychological mood and circadian rhythm.
Psychological rhythm influence features are classified by dimension, such as weather (the first dimension), illumination (the second dimension), and spectrum (the third dimension), and features with the same emotion adjustment direction and rhythm phase influence are extracted to generate an environment change sequence related to psychological adjustment.
For example, environment change sequences such as a sunny summer day, strong afternoon illumination, and a higher color temperature have an enhancing adjustment characteristic for psychological mood, while sequences such as overcast and rainy weather, sunset and dusk, and a lower color temperature have a weakening adjustment characteristic. In this way, environment change sequences for different kinds of psychological emotion adjustment can be obtained, and the mapping comparison information can be effectively determined based on the environment change sequence and the psychological experience control point.
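The extraction step above — grouping features that share the same emotion adjustment direction into one environment change sequence — can be sketched as follows. The feature values come from the examples in the text; the direction labels and data layout are illustrative assumptions.

```python
# Psychological rhythm influence features, each tagged with its dimension and
# (assumed) emotion adjustment direction.
features = [
    {"dim": "weather",  "value": "sunny summer day",   "direction": "enhance"},
    {"dim": "light",    "value": "strong afternoon",   "direction": "enhance"},
    {"dim": "spectrum", "value": "higher color temp",  "direction": "enhance"},
    {"dim": "weather",  "value": "overcast and rainy", "direction": "reduce"},
    {"dim": "light",    "value": "sunset and dusk",    "direction": "reduce"},
    {"dim": "spectrum", "value": "lower color temp",   "direction": "reduce"},
]

def extract_sequences(feats):
    """Group features with the same adjustment direction into one sequence."""
    sequences = {}
    for f in feats:
        sequences.setdefault(f["direction"], []).append(f["value"])
    return sequences

seqs = extract_sequences(features)
print(seqs["enhance"])  # the enhancing environment change sequence
print(seqs["reduce"])   # the weakening environment change sequence
```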
In some embodiments, determining the mapping comparison information based on the environment change sequence and the psychological experience control point includes:
configuring weights for the psychological rhythm influence features in different dimensions; ordering the environment change sequences based on the weight configuration; and determining the mapping comparison information based on the psychological experience control point and the ordering result of the environment change sequences.
For example, according to the analysis of how environmental condition changes influence psychology, the psychological influence features of the weather states in different dimensions are assigned weights using a correlation coefficient method; different weights represent the importance or influence intensity of each dimensional feature. The environment change sequences of weather, illumination, color temperature change, and the like can then be ordered by weight, and different mapping comparison combinations (i.e. the mapping comparison information) are formed according to the psychological adjustment control points.
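The weighting-and-ordering step can be sketched as a weighted score followed by a sort. The patent names a correlation coefficient method but gives no numbers, so the weights and per-dimension scores below are purely illustrative assumptions.

```python
# Assumed per-dimension weights (would come from correlation analysis).
DIM_WEIGHTS = {"weather": 0.5, "light": 0.3, "spectrum": 0.2}

# Candidate environment change sequences with assumed per-dimension scores.
sequences = [
    {"name": "sunny / strong light / high color temp",
     "scores": {"weather": 0.9, "light": 0.8, "spectrum": 0.7}},
    {"name": "rainy / dusk / low color temp",
     "scores": {"weather": 0.6, "light": 0.4, "spectrum": 0.5}},
]

def weighted_score(seq):
    """Combine per-dimension scores using the configured weights."""
    return sum(DIM_WEIGHTS[d] * s for d, s in seq["scores"].items())

# Order the sequences by weighted influence intensity, strongest first.
ranked = sorted(sequences, key=weighted_score, reverse=True)
print(ranked[0]["name"])  # the strongest (enhancing) sequence ranks first
```

Pairing each ranked sequence with a psychological adjustment control point would then yield the mapping comparison combinations described above.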
In some embodiments, the method of the present embodiment may further include:
determining a scene odor corresponding to the original virtual scene/target virtual scene; and controlling the odor playing device to emit the scene odor based on the display duration, where the emission duration of the scene odor is less than or equal to the display duration.
In this way, the odor playing device is controlled to release fragrance, so that the user smells a scent such as flowers or fruit while performing input control responses on the VR interactive device, giving the user an experience close to a real scene.
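The only constraint the text places on the odor playing device is that its emission duration must not exceed the scene display duration. A trivial sketch of that constraint (function name assumed):

```python
def scent_emission_duration(requested_s: int, display_s: int) -> int:
    """Clamp the scent emission time so it never exceeds the display duration."""
    return min(requested_s, display_s)

print(scent_emission_duration(900, 600))  # 600: capped at the display duration
print(scent_emission_duration(300, 600))  # 300: shorter requests pass through
```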
In some embodiments, the method of this embodiment further includes:
receiving current psychological data of a first user sent by data monitoring equipment when the first user experiences a target virtual scene through VR equipment; based on the current psychological data, adjusting the display time length of the target virtual scene; or, based on the current psychological data, adjusting the environmental information displayed in the target virtual scene, wherein the environmental information displayed in the target virtual scene comprises: weather status, illumination intensity, or spectral information.
That is, a corresponding scene is selected according to the emotional state of each user, the adjustment time is set according to the user's psychological monitoring data, and the adjusted and compressed scene is automatically invoked for psychological adjustment, realizing intelligent adjustment during the user experience and improving the user's sense of immersion.
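The closed-loop adjustment described above — monitoring psychological data during the experience and adjusting the display duration in response — can be sketched minimally. The thresholds and step sizes are illustrative assumptions; the patent only says the duration is adjusted based on current psychological data.

```python
def adjust_display_duration(display_s: int, heart_rate: int) -> int:
    """Shorten the remaining display duration when heart rate stays elevated,
    extend it slightly when the user is calm. All bounds are assumed."""
    if heart_rate > 100:          # assumed "stressed" threshold
        return max(60, display_s - 120)
    if heart_rate < 70:           # assumed "calm" threshold
        return min(1800, display_s + 60)
    return display_s

print(adjust_display_duration(600, 110))  # 480: elevated heart rate shortens the scene
print(adjust_display_duration(600, 65))   # 660: calm user gets a slightly longer scene
```

A fuller version would fold in body temperature and blood pressure the same way, or switch the displayed weather state, illumination intensity, or spectrum instead of the duration.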
In some embodiments, the method of this embodiment further includes:
receiving historical psychological data of a second user sent by the data monitoring device, wherein the second user comprises: the first user or the first user and other users; determining a rhythmic phase difference based on historical psychological data of the second user; a mental experience control point is determined based on the rhythmic phase difference and the emotional state of the second user.
According to the psychological state characteristics of different users, discrete mapping comparison combinations can be formed automatically through common-phase feature extraction and psychological influence analysis, based on fine-grained psychological adjustment control points (i.e. psychological experience control points) and the combined influence of factors such as weather, illumination, spectrum, and phase. Combined with the dynamically adjustable scene adjustment technology for parameters such as day-night illumination, the psychological adjustment of the user is embedded implicitly in the VR scene, realizing all-weather intelligent psychological adjustment.
Psychological adjustment control points are the basic points of psychological adjustment and control, used for adjusting and controlling emotion. Depending on the direction and degree of emotion adjustment, adjustment may be categorized into reduced-form, enhanced-form, and maintenance-form adjustment; for the purpose of psychological adjustment, this embodiment may employ reduced-form and enhanced-form adjustment.
Reduced-form adjustment primarily refers to reducing or weakening an emotion of excessive intensity, such as calming an individual's extreme anger; enhanced-form adjustment strengthens a certain emotion, such as amplifying an already happy mood. Physiological indexes such as heart rate, body temperature, and blood pressure can reflect the emotional state of the human body and also exhibit obvious circadian rhythms; when the circadian rhythm deviates, it can adversely affect psychological mood, and it can be adjusted and improved by shifting the phase forward or backward.
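Combining the two ideas above — rhythm phase difference selecting a phase-shift direction, and emotional state selecting reduced- vs enhanced-form adjustment — a psychological experience control point might be derived as follows. The mapping rules, emotion labels, and return format are all assumptions for illustration; the patent does not specify them.

```python
def control_point(phase_diff_h: float, emotion: str) -> dict:
    """Derive a psychological experience control point (sketch).

    A positive phase difference (rhythm advanced, in hours) is assumed to call
    for a backward phase shift, a negative one for a forward shift; emotions of
    excessive intensity select reduced-form adjustment, others enhanced-form.
    """
    direction = "phase_backward" if phase_diff_h > 0 else "phase_forward"
    mode = "reduce" if emotion in {"anger", "anxiety"} else "enhance"
    return {"direction": direction, "mode": mode}

print(control_point(1.5, "anger"))    # advanced rhythm + anger -> calm and shift back
print(control_point(-0.5, "happy"))   # delayed rhythm + happiness -> enhance and shift forward
```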
Fig. 2 is a schematic structural diagram of a scene experience device based on virtual reality, where a first user wears a virtual reality VR device and a data monitoring device, where the VR device is configured to display a virtual scene to the first user, and the data monitoring device is configured to monitor psychological data of the first user. The scene experience device based on virtual reality can comprise: the device comprises a receiving module 210, a determining module 220, a setting module 230 and a compressing module 240.
A receiving module 210, configured to receive current psychological data of the first user sent by the data monitoring device, where the current psychological data includes: heart rate, body temperature, and blood pressure.
A determining module 220, configured to determine, based on the emotional state of the first user, an original virtual scene to be displayed to the first user by the VR device.
A setting module 230, configured to set a display duration of the original virtual scene based on the heart rate of the first user, the body temperature of the first user, and the blood pressure of the first user.
And the compression module 240 is configured to compress the original virtual scene based on the display duration of the original virtual scene, so as to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device.
Wherein, the target virtual scene includes: virtual farm scenes, virtual garden scenes, virtual autumn pasture scenes, virtual autumn defoliation scenes, and virtual snowfield scenes.
In this embodiment, optionally, the method further includes: and an acquisition module.
An acquisition module, configured to acquire mapping comparison information, where the mapping comparison information is used for describing a mapping relationship between a psychological experience control point and an environment change sequence, and the psychological experience control point is used for representing the adjustment direction of the rhythm phase of the first user.
The compression module 240 is specifically configured to:
Based on the mapping comparison information, fusing a target natural scene and the original virtual scene to obtain a fused scene; and based on the display duration, compressing the fusion scene by adopting a compression mapping algorithm to obtain the target virtual scene of the display duration.
In this embodiment, optionally, the acquiring module includes: an acquisition unit, an extraction unit, and a determination unit.
An acquisition unit, configured to acquire psychological rhythm influence features, including: a feature corresponding to a first dimension, a feature corresponding to a second dimension, and a feature corresponding to a third dimension, where the first dimension is weather state, the second dimension is illumination intensity, and the third dimension is spectrum information;
the extraction unit is used for extracting the characteristics with the same emotion regulating direction and the same rhythm phase influence from the psychological rhythm influence characteristics to obtain an environment change sequence, wherein the environment change sequence is used for describing different psychological emotion regulation;
and the determining unit is used for determining the mapping comparison information based on the environment change sequence and the psychological experience control point.
In this embodiment, optionally, the determining unit is specifically configured to:
Carrying out weight configuration on psychological rhythm influence characteristics in different dimensions; sorting the sequence of environmental changes based on the weight configuration; and determining the mapping comparison information based on the psychological experience control point and the sequencing result of the environment change sequence.
In this embodiment, optionally, the method further includes: and a control module.
The determining module 220 is further configured to determine a scene smell corresponding to the original virtual scene/the target virtual scene.
The control module is configured to control the odor playing device to emit the scene odor based on the display duration, where the emission duration of the scene odor is less than or equal to the display duration.
In this embodiment, optionally, the method further includes: and an adjustment module.
The receiving module 210 is further configured to receive current psychological data of the first user sent by the data monitoring device when the first user experiences the target virtual scene through the VR device.
The adjusting module is used for adjusting the display duration of the target virtual scene based on the current psychological data; or, based on the current psychological data, adjusting the environmental information displayed in the target virtual scene, where the environmental information displayed in the target virtual scene includes: weather status, illumination intensity, or spectral information.
In this embodiment, optionally, the receiving module 210 is further configured to receive historical psychological data of a second user sent by the data monitoring device, where the second user includes: the first user or the first user and other users.
The determining module 220 is further configured to determine a rhythm phase difference based on the historical psychological data of the second user.
The determining module 220 is further configured to determine the psychological experience control point based on the rhythm phase difference and the emotional state of the second user.
The virtual reality-based scene experience device provided by the present disclosure may execute the above method embodiments; for its specific implementation principles and technical effects, reference may be made to the above method embodiments, which are not repeated here.
The embodiment of the application also provides computer equipment. Referring specifically to fig. 3, fig. 3 is a basic structural block diagram of a computer device according to the present embodiment.
The computer device includes a memory 310 and a processor 320 communicatively coupled to each other via a system bus. It should be noted that only a computer device having components 310-320 is shown in the figure, but it should be understood that not all of the illustrated components must be implemented, and more or fewer components may be implemented instead. Those skilled in the art will appreciate that the computer device here is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), embedded devices, and the like.
The computer device may be a desktop computer, a notebook computer, a palm computer, a cloud server, or the like. The computer device can perform man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch pad or voice control equipment and the like.
The memory 310 includes at least one type of readable storage medium, including non-volatile memory or volatile memory, such as flash memory, hard disks, multimedia cards, card memory (e.g. SD or DX memory), random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disks, or optical disks, where the RAM may be static or dynamic. In some embodiments, the memory 310 may be an internal storage unit of the computer device, such as the hard disk or memory of the computer device. In other embodiments, the memory 310 may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the computer device. Of course, the memory 310 may also include both an internal storage unit of the computer device and an external storage device. In this embodiment, the memory 310 is typically used to store the operating system installed on the computer device and various types of application software, such as the program code of the above-described method. In addition, the memory 310 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 320 is typically used to perform the overall operations of the computer device. In this embodiment, the memory 310 is used for storing program codes or instructions, the program codes include computer operation instructions, and the processor 320 is used for executing the program codes or instructions stored in the memory 310 or processing data, such as the program codes for executing the above-mentioned method.
Herein, the bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, or an Extended Industry Standard Architecture (EISA) bus, among others. The bus may be classified into an address bus, a data bus, a control bus, and so on. For ease of illustration, the bus is shown with only a single bold line in the figure, but this does not mean there is only one bus or one type of bus.
Still another embodiment of the present application provides a computer-readable medium, which may be a computer-readable signal medium or a computer-readable storage medium. A processor in a computer reads the computer-readable program code stored in the computer-readable medium, so that the processor can perform the functional actions specified in each step, or combination of steps, of the above-described method, and generate the functional actions specified in each block, or combination of blocks, of the block diagram.
The computer readable medium includes, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared memory or semiconductor system, apparatus or device, or any suitable combination of the foregoing, the memory storing program code or instructions, the program code including computer operating instructions, and the processor executing the program code or instructions of the above-described methods stored by the memory.
The definition of memory and processor may refer to the description of the embodiments of the computer device described above, and will not be repeated here.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The functional units or modules in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps other than those listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of first, second, third, etc. does not denote any order, and the words are to be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (9)

1. The scene experience method based on virtual reality is characterized in that a first user wears Virtual Reality (VR) equipment and data monitoring equipment, wherein the VR equipment is used for displaying a virtual scene to the first user, and the data monitoring equipment is used for monitoring psychological data of the first user;
the method comprises the following steps:
receiving current psychological data of the first user sent by the data monitoring equipment, wherein the current psychological data comprises: heart rate, body temperature, and blood pressure;
determining an original virtual scene to be displayed to the first user by the VR device based on the emotional state of the first user, the emotional state of the first user being input by the first user; each original virtual scene has a uniquely corresponding keyword, and the determining, based on the emotional state of the first user, the original virtual scene to be displayed to the first user by the VR device includes: matching the emotional state of the first user against the keywords corresponding to each original virtual scene, and determining the original virtual scene corresponding to the keyword with the highest matching degree as the original virtual scene to be displayed to the first user by the VR device;
Setting a display duration of the original virtual scene based on the heart rate of the first user, the body temperature of the first user, and the blood pressure of the first user, including: judging a suitable scene-viewing duration for the first user from the heart rate, body temperature, and blood pressure of the first user, and setting the display duration of the original virtual scene based on the suitable duration, where the suitable duration is determined through a duration comparison table that stores the correspondence between heart rate, body temperature, blood pressure, and duration;
obtaining mapping comparison information, wherein the mapping comparison information is used for describing a mapping relation between a psychological experience control point and an environment change sequence, and the psychological experience control point is used for representing the adjustment direction of the rhythm phase of the first user;
based on the display duration of the original virtual scene, compressing the original virtual scene to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR equipment; based on the display duration of the original virtual scene, compressing the original virtual scene to obtain a target virtual scene, including: based on the mapping comparison information, fusing a target natural scene and the original virtual scene to obtain a fused scene; based on the display duration, compressing the fusion scene by adopting a compression mapping algorithm to obtain the target virtual scene of the display duration; the compression mapping algorithm adopts a mapping mode to compress 24 hours within 10 minutes;
Wherein, the target virtual scene includes: virtual farm scenes, virtual garden scenes, virtual autumn pasture scenes, virtual autumn defoliation scenes, and virtual snowfield scenes.
2. The method of claim 1, wherein the obtaining mapping comparison information comprises:
obtaining psychological rhythm influence features, the psychological rhythm influence features comprising: a feature corresponding to a first dimension, a feature corresponding to a second dimension, and a feature corresponding to a third dimension, wherein the first dimension is weather state, the second dimension is illumination intensity, and the third dimension is spectrum information;
extracting features with the same emotion regulating direction and the same rhythm phase influence from the psychological rhythm influence features to obtain an environment change sequence, wherein the environment change sequence is used for describing different psychological emotion regulation;
and determining the mapping comparison information based on the environment change sequence and the psychological experience control point.
3. The method of claim 2, wherein the determining the mapping contrast information based on the sequence of environmental changes and a mental experience control point comprises:
carrying out weight configuration on psychological rhythm influence characteristics in different dimensions;
Sorting the sequence of environmental changes based on the weight configuration;
and determining the mapping comparison information based on the psychological experience control point and the sequencing result of the environment change sequence.
4. The method according to claim 1, wherein the method further comprises:
determining scene smell corresponding to the original virtual scene/the target virtual scene;
and controlling the odor playing device to emit the scene odor based on the display duration, wherein the emission duration of the scene odor is smaller than or equal to the display duration.
5. The method according to claim 1, wherein the method further comprises:
receiving current psychological data of the first user sent by the data monitoring device when the first user experiences the target virtual scene through the VR device;
based on the current psychological data, adjusting the display duration of the target virtual scene;
or, based on the current psychological data, adjusting the environmental information displayed in the target virtual scene, where the environmental information displayed in the target virtual scene includes: weather status, illumination intensity, or spectral information.
6. The method according to claim 1, wherein the method further comprises:
Receiving historical psychological data of a second user sent by the data monitoring equipment, wherein the second user comprises: the first user or the first user and other users;
determining a rhythm phase difference based on the historical psychological data of the second user;
the mental experience control point is determined based on the rhythmic phase difference and the emotional state of the second user.
7. The scene experience device based on virtual reality is characterized in that a first user wears Virtual Reality (VR) equipment and data monitoring equipment, wherein the VR equipment is used for displaying a virtual scene to the first user, and the data monitoring equipment is used for monitoring psychological data of the first user;
the device comprises:
the receiving module is configured to receive current psychological data of the first user, where the current psychological data is sent by the data monitoring device and includes: heart rate, body temperature, and blood pressure;
a determining module, configured to determine, based on an emotional state of the first user, an original virtual scene to be displayed to the first user by the VR device; the emotional state of the first user is input by the first user, each original virtual scene has a uniquely corresponding keyword, and the determining module is specifically configured to: match the emotional state of the first user against the keywords corresponding to each original virtual scene, and determine the original virtual scene corresponding to the keyword with the highest matching degree as the original virtual scene to be displayed to the first user by the VR device;
A setting module, configured to set a display duration of the original virtual scene based on the heart rate of the first user, the body temperature of the first user, and the blood pressure of the first user; the setting module is specifically configured to: judging the proper time length for watching the scene of the first user through the heart rate, the body temperature and the blood pressure of the first user, and setting the display time length of the original virtual scene based on the proper time length, wherein the proper time length is determined through a time length comparison table, and the heart rate, the body temperature and the blood pressure and the corresponding relation of the time length are stored in the time length comparison table;
an acquisition module, configured to acquire mapping comparison information, where the mapping comparison information is used for describing a mapping relationship between a psychological experience control point and an environment change sequence, and the psychological experience control point is used for representing the adjustment direction of the rhythm phase of the first user;
a compression module, configured to compress the original virtual scene based on the display duration of the original virtual scene to obtain a target virtual scene, so that the first user experiences the target virtual scene through the VR device; the compression module is specifically configured to: fuse a target natural scene with the original virtual scene based on the mapping comparison information to obtain a fused scene; and compress the fused scene using a compression mapping algorithm based on the display duration to obtain the target virtual scene of the display duration, where the compression mapping algorithm compresses 24 hours into 10 minutes by means of mapping;
wherein the target virtual scene includes: a virtual farm scene, a virtual garden scene, a virtual autumn pasture scene, a virtual autumn leaf-fall scene, and a virtual snowfield scene.
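The stated 24-hours-into-10-minutes compression can be read as a time mapping from the fused scene's day onto the playback window. The sketch below assumes a simple linear scaling (a factor of 144); the claim fixes only the two endpoints, not the shape of the mapping:

```python
# Linear sketch of the compression mapping: a 24-hour in-scene timeline is
# mapped onto a 10-minute playback window (1440 min / 10 min = factor 144).
# The linear form and the function names are assumptions for illustration.
SCENE_HOURS = 24
PLAYBACK_MINUTES = 10
COMPRESSION_FACTOR = SCENE_HOURS * 60 / PLAYBACK_MINUTES  # 144.0

def playback_seconds(scene_hour: float) -> float:
    """Map a time-of-day in the fused scene (hours) to playback time (s)."""
    return scene_hour * 3600 / COMPRESSION_FACTOR
```

Under this scaling, a 06:00 sunrise in the fused scene appears 150 s into playback and an 18:00 sunset at 450 s, with the full day closing out the 600-second session.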
8. A computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the virtual reality-based scene experience method according to any one of claims 1-6 when executing the computer program.
9. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the virtual reality-based scene experience method according to any one of claims 1-6.
CN202310870684.XA 2023-07-17 2023-07-17 Scene experience method and device based on virtual reality, computer equipment and medium Active CN116594511B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310870684.XA CN116594511B (en) 2023-07-17 2023-07-17 Scene experience method and device based on virtual reality, computer equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310870684.XA CN116594511B (en) 2023-07-17 2023-07-17 Scene experience method and device based on virtual reality, computer equipment and medium

Publications (2)

Publication Number Publication Date
CN116594511A CN116594511A (en) 2023-08-15
CN116594511B true CN116594511B (en) 2023-11-07

Family

ID=87599464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310870684.XA Active CN116594511B (en) 2023-07-17 2023-07-17 Scene experience method and device based on virtual reality, computer equipment and medium

Country Status (1)

Country Link
CN (1) CN116594511B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249903A (en) * 2016-08-30 2016-12-21 广东小天才科技有限公司 Virtual reality scene content playing method and device
CN106598229A (en) * 2016-11-11 2017-04-26 歌尔科技有限公司 Virtual reality scene generation method and equipment, and virtual reality system
CN107320836A (en) * 2017-05-02 2017-11-07 华南理工大学 Depression auxiliary treatment virtual scene system and its implementation based on VR technologies
CN109036519A (en) * 2018-07-24 2018-12-18 四川大学华西医院 Virtual experience decompression method and device
CN109215804A (en) * 2018-10-09 2019-01-15 华南理工大学 Mental disorder assistant diagnosis system based on virtual reality technology and physio-parameter detection
CN110418095A (en) * 2019-06-28 2019-11-05 广东虚拟现实科技有限公司 Processing method, device, electronic equipment and the storage medium of virtual scene
CN112150778A (en) * 2019-06-29 2020-12-29 华为技术有限公司 Environmental sound processing method and related device
CN113633870A (en) * 2021-08-31 2021-11-12 武汉轻工大学 Emotional state adjusting system and method
CN113893429A (en) * 2020-06-23 2022-01-07 浙江瑞尚智能科技有限公司 Virtual/augmented reality auxiliary stabilization device and method
CN113975583A (en) * 2021-10-11 2022-01-28 苏州易富网络科技有限公司 Emotion persuasion system based on virtual reality technology
CN115808883A (en) * 2022-11-24 2023-03-17 珠海格力电器股份有限公司 Scene experience method, device, system and storage medium
CN115957419A (en) * 2023-02-15 2023-04-14 中国人民解放军军事科学院军事医学研究院 Information processing method, virtual reality system and device about psychological relaxation
CN116269385A (en) * 2023-02-27 2023-06-23 上海闻泰电子科技有限公司 Method, device, equipment and storage medium for monitoring equipment use experience

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200201434A1 (en) * 2018-12-20 2020-06-25 Samsung Electronics Co., Ltd. Bioresponsive virtual reality system and method of operating the same


Also Published As

Publication number Publication date
CN116594511A (en) 2023-08-15

Similar Documents

Publication Publication Date Title
JP7229843B2 (en) multimedia presentation system
Clifford Your guide to forest bathing (expanded edition): Experience the healing power of nature
Sternberg Healing spaces: The science of place and well-being
Ohta A phenomenological approach to natural landscape cognition
CN107341333A (en) A kind of VR apparatus and method for aiding in psychological consultation
Liu Invisible planets: Contemporary Chinese science fiction in translation
CN116594511B (en) Scene experience method and device based on virtual reality, computer equipment and medium
Chapple Living landscapes: meditations on the five elements in Hindu, Buddhist, and Jain yogas
Kaneko et al. Izihlahla Ezikhuluma Ngezandla (" Trees Who Talk with Hands"): Tree Poems in South African Sign Language
CN111672006A (en) Sleep-aiding regulation and control system
Alam et al. Imageries in William Wordsworth’s Poems
Ryan Plants that perform for you'? From floral aesthetics to floraesthesis in the Southwest of Western Australia
Clare John Clare and Ecological Love
O'Mordha An Exploration of My Undergraduate Poetry Works
Chari Eschatology for Dummies
CN114842701A (en) Control method, system, device, equipment and medium for polar environment training
Stack REIMAGINING THE WEST
Amaris Beauty and the street: 72 hours in Union Square, NYC
Seifert Viewing Humans and Nonhumans in Fairy-Tale Animation: The Case of Michel Ocelot's Kirikou Films
Bottoms Utopia
Pavlik-Malone Deconstructing Dreamscapes of Femininity: Lolitas, in the Mist
Sinha Poetic Abstraction at Attention Restoration: Investigating the Effects of Nature Based Poetry on Attention Restoration
CN115981195A (en) Control method and device of intelligent mattress, intelligent mattress and storage medium
BARRETT Graduation Project
CN115463417A (en) System and method for plant scene type game

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant