CN112926116B - System and method for collecting fire evacuation behavior data of stadium based on virtual reality - Google Patents

System and method for collecting fire evacuation behavior data of stadium based on virtual reality

Info

Publication number
CN112926116B
CN112926116B (Application CN202110224035.3A)
Authority
CN
China
Prior art keywords
data
human body
smoke
evacuation
fire
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110224035.3A
Other languages
Chinese (zh)
Other versions
CN112926116A (en)
Inventor
刘莹
孙澄
朱玉洁
刘敏
黄丽蒂
杜家旺
甄蒙
杨阳
孙适
张洪瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202110224035.3A priority Critical patent/CN112926116B/en
Publication of CN112926116A publication Critical patent/CN112926116A/en
Application granted granted Critical
Publication of CN112926116B publication Critical patent/CN112926116B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/13Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311Scheduling, planning or task assignment for a person or group
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • G06Q50/265Personal security, identity or safety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2119/00Details relating to the type or aim of the analysis or the optimisation
    • G06F2119/02Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]

Abstract

The invention relates to a virtual-reality-based gymnasium fire evacuation behavior data collection system and collection method, in the technical field of building safety and evacuation simulation. A three-dimensional building model of a virtual gymnasium is built, and a visual fire smoke scene is simulated within the gymnasium model. A behavior data acquisition module acquires the behavior data of experimenters in motion during the fire simulation, and after the simulation ends the experimenters subjectively rate the exercise intensity with the subjective physical strength scale RPE (rating of perceived exertion). The collected behavior data are analyzed to determine the motion behaviors under different evacuation postures. The experimenters' basic information, human body posture data, human body motion parameters, fatigue evaluation grades and RPE evaluation data are matched, the data are numbered in groups (the number also records the experiment date and experiment sequence), and the data are sealed and stored.

Description

System and method for collecting fire evacuation behavior data of stadium based on virtual reality
Technical Field
The invention relates to the technical field of building safety and evacuation simulation, in particular to a gymnasium fire evacuation behavior data collection system and a method based on virtual reality.
Background
In recent years, the continuously rising national fitness trend has driven rapid development of the sports industry; the number of gymnasium facilities has grown substantially and gymnasiums are used more frequently. A gymnasium is a large public building and a densely occupied place, and once a fire accident occurs, improper evacuation can have serious consequences. A large number of fire cases at home and abroad show that most fire casualties are caused by smoke; deaths caused directly by smoke account for roughly one third to two thirds of total fire deaths. The hazards of fire smoke are mainly expressed in three aspects: toxicity, light attenuation and high-temperature radiation. Because hot air rises, the dense smoke generated in a fire accumulates in the upper part of the space; as the smoke spreads, the height of the safe evacuation space decreases, and occupants must adopt unconventional evacuation postures (such as stooping, kneeling and crawling) to improve their probability of escape. Therefore, studying the unconventional evacuation behavior patterns of a stadium crowd in a fire is crucial to achieving rapid and safe emergency evacuation.
Existing research on the emergency evacuation behavior of people is mainly divided into the group level and the individual level, and research on individual evacuation behavior is still immature; the main research methods include case analysis, questionnaires and interviews, drills and exercises, and animal experiments. Each method has its limitations, and more rational and effective research methods are needed. Virtual reality (VR) technology has the three characteristics of immersion, interaction and imagination (the 3-I characteristics), and its rapid development in recent years has made it possible to create highly realistic evacuation scenarios and to conduct individual evacuation behavior experiments, providing a new method with broad application prospects for research on individual evacuation behavior. Conducting experiments in virtual reality scenes, combined with methods from experimental psychology for studying evacuation behavior, has been adopted by more and more researchers. Applying virtual reality to building fire simulation allows human psychology and behavior to be studied under controlled experimental conditions: with reasonably designed behavior experiments, the behavior of people during emergency evacuation can be observed, evacuation behavior data can be collected, and the decision and movement processes can then be inferred and analyzed, providing an important basis for establishing evacuation behavior models.
Three-dimensional motion capture falls into five types: acoustic, optical, inertial, electromagnetic and mechanical. The mainstream types at present are inertial and optical. Optical motion capture systems are based on computer vision: multiple high-speed cameras monitor and track target feature points from different angles, and motion capture is completed by combining skeleton-solving algorithms; the precision is high, the constraints are few, and the somatosensory interaction is close to real experience. However, optical motion capture is strongly affected by the external environment, for example by insufficient ambient light or occlusion of moving limbs, and it is further limited by the reduced visibility caused by the light-attenuating effect of fire smoke. Motion capture based on inertial sensors is little affected by the environment, is not limited by lighting, and does not require installing lighthouses, cameras or other equipment in the space being used; it offers a large amount of obtainable motion information, high sensitivity, good dynamic performance, a wide range of movement, and somatosensory interaction close to real experience.
Disclosure of Invention
The invention provides a virtual-reality-based stadium fire evacuation behavior data collection system and method, with the aim of analyzing and studying the emergency evacuation behavior patterns of individuals in unconventional postures, thereby providing data support for simulation experiments and allowing evacuation and escape strategies to be formulated accurately and effectively. The invention provides the following technical scheme:
a virtual reality based stadium fire evacuation behavior data collection system, the system comprising: the system comprises a virtual scene construction module, a behavior data acquisition module, a behavior data processing and analyzing module and a data sealing and storing module;
the virtual scene building module carries out three-dimensional modeling, simulates fire smoke visualization and simulates smoke smell and heat sensation;
the behavior data acquisition module comprises an inertial sensing unit, a UWB positioning tag, a blood oxygen sensor and a heart rate sensor, and the inertial sensing unit, the UWB positioning tag, the blood oxygen sensor and the heart rate sensor are used for acquiring the behavior data of the person during exercise;
the behavior data processing and analyzing module determines motion behavior parameters under different evacuation postures according to the behavior data, and the data sealing and storing module seals and stores the evaluation data.
A method for collecting fire evacuation behavior data of a stadium based on virtual reality comprises the following steps:
step 1: building a virtual gymnasium three-dimensional building model, and simulating a fire smoke visualization scene in the gymnasium model;
step 2: acquiring behavior data of experimenters during movement during fire simulation through a behavior data acquisition module, and performing subjective evaluation on movement intensity by using a subjective physical strength scale RPE after the simulation is finished;
and step 3: analyzing the collected behavior data to determine the motion behaviors under different evacuation postures;
and 4, step 4: matching basic information of experimenters, human body posture data, human body motion parameters, fatigue evaluation grade and RPE evaluation data, numbering the data in groups, wherein the numbering also comprises experiment date and experiment sequence, and sealing the data.
Preferably, the step 1 specifically comprises:
building a three-dimensional building model of the virtual gymnasium: the gymnasium evacuation walkway is modeled at 1:1 scale as a three-dimensional building model in 3D Max software; the flow of fire smoke is simulated with PyroSim software, a fire virtual scene is constructed through secondary development in Unity, and the virtual scene is superposed on the gymnasium model to visualize the fire smoke in the gymnasium space; the experimenter wears a head-mounted VR terminal, and the odor generator simulates the smoke based on the PyroSim simulation results, which include the smoke concentration, smoke layer height and fire temperature, building a virtual reality simulation environment with realistic smell and heat sensation.
Preferably, in simulating the fire smoke flow, the smoke simulation is initialized and the initial smoke layer heights are set according to the five emergency evacuation postures of the human body (upright walking, stooped walking, hand-and-foot crawling, hand-and-knee crawling and prone crawling), with the height of the lower smoke interface above the evacuation ground forming an arithmetic progression with common difference d = 0.4 m: 2.0 m, 1.6 m, 1.2 m, 0.8 m and 0.4 m.
Preferably, the upper part of the head-mounted VR terminal is provided with a spray can, a spray valve and a spray head, and the odor is generated to match the smoke concentration, the smoke layer height and the smoke temperature; when the head-mounted VR terminal is positioned above the smoke layer, the spray head continuously sprays simulated smoke, the smoke spraying rate is matched with smoke concentration data, and the smoke temperature is matched with fire temperature data; when the head-mounted VR terminal is located below the smoke layer, the aerosol valve is closed.
Preferably, the step 2 specifically comprises:
step 2.1: inertial sensing units are respectively installed on the head, trunk, hip, both upper arms, both lower arms, both hands, both thighs, both shanks and both feet of the human body, so that the inertial motion capture sensors sense the motion of 15 human body nodes; each inertial sensing unit consists of a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer, and measures the acceleration, angular velocity and magnetoresistive yaw angle of the limbs during human motion to obtain the raw human body posture data;
step 2.2: a UWB positioning tag is placed at the waist of the human body, and three-dimensional spatial positioning data of the experimenter relative to four positioning base stations are obtained with a double-sided two-way ranging method (a ranging sketch is given after this step list);
step 2.3: the acquisition ends of the blood oxygen sensor and the heart rate sensor are placed where blood vessels are dense on the wrist, and the blood oxygen saturation signal and the real-time exercise heart rate during human motion are acquired to obtain vital sign data;
step 2.4: the subjective physical strength scale (RPE) is determined, and the experimenters subjectively rate the exercise intensity after each group of experiments to obtain the evaluation data.
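As an illustration of the double-sided two-way ranging mentioned in step 2.2, the following is a minimal Python sketch of the standard asymmetric DS-TWR time-of-flight formula; the function and variable names are assumptions made for this example and are not taken from the patent.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def ds_twr_distance(t_round1, t_reply1, t_round2, t_reply2):
    """Estimate the tag-to-base-station distance from one asymmetric
    double-sided two-way ranging exchange.

    t_round1 / t_reply1 are the round-trip and reply times measured at the
    initiating node (the tag); t_round2 / t_reply2 are those measured at the
    responding node (the base station); all times are in seconds.
    """
    time_of_flight = (t_round1 * t_round2 - t_reply1 * t_reply2) / (
        t_round1 + t_round2 + t_reply1 + t_reply2
    )
    return time_of_flight * SPEED_OF_LIGHT

# With four base stations, one such distance is obtained per station and the
# tag position is then solved by multilateration (see the observation equation
# in the detailed embodiment).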
Preferably, the initial state of the experimenter is set to the upright state, and the basic information of each experimenter is determined: ID and label name, sex, age interval, height and weight; the measuring instruments are calibrated and initialized, with the instrument sampling frequency uniformly set to t1; before the experiment starts, the experimenter's resting heart rate, resting blood oxygen saturation, initial posture information and initial spatial positioning information are recorded; the maximum evacuation time in the experiment is set to 60 s.
Preferably, the step 3 specifically comprises:
step 3.1: based on an extended Kalman filtering algorithm, the accelerometer, gyroscope and magnetometer data are fused to obtain the carrier attitude angle, and attitude calculation yields an attitude quaternion, giving the human body posture data under different evacuation postures: the spatial position, rotation angle, orientation and inclination angle of each limb node and the range of motion of the joints (a simplified attitude-estimation sketch is given after this step list);
step 3.2: the human body's center of gravity is synthesized with the SKC algorithm from the spatial positions of five nodes (the trunk, both thighs and both shanks), giving the time-varying center-of-gravity trajectory of the human body under the different evacuation postures;
step 3.3: an ultra-wideband positioning and inertial sensor fusion algorithm improves the indoor positioning accuracy; the experimenter's spatial position is obtained and the motion trajectory is reconstructed, yielding the human motion parameters under different evacuation postures, including movement duration, movement distance, movement direction, average speed, maximum speed and evacuation path;
step 3.4: fatigue indexes are set, the blood oxygen sensor and heart rate sensor data are fused, and the vital sign data are analyzed for fatigue degree, the fatigue grades comprising four levels: relaxed, suitable, relatively heavy and excessive; when any sensor reading of an experimenter exceeds its threshold, the experiment is stopped.
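As a simplified stand-in for the extended Kalman filter of step 3.1, the sketch below integrates the gyroscope rate into a quaternion and applies an accelerometer-based tilt correction; the magnetometer yaw correction and the EKF covariance bookkeeping are omitted, and all names and the gain value are illustrative assumptions rather than part of the invention.

import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def propagate(q, gyro, dt):
    """Integrate the body-frame angular rate (rad/s) over dt seconds."""
    dq = 0.5 * quat_multiply(q, np.array([0.0, *gyro]))
    q = q + dq * dt
    return q / np.linalg.norm(q)

def tilt_correct(q, accel, gain=0.02):
    """Nudge the attitude so the gravity direction predicted from the
    quaternion matches the (normalized) accelerometer reading."""
    a = np.asarray(accel, float)
    a = a / np.linalg.norm(a)
    w, x, y, z = q
    # world "up" axis expressed in the body frame, from the current attitude
    v = np.array([2*(x*z - w*y), 2*(w*x + y*z), w*w - x*x - y*y + z*z])
    error = np.cross(a, v)                  # small-angle misalignment
    corr = np.array([1.0, *(0.5 * gain * error)])
    q = quat_multiply(q, corr)
    return q / np.linalg.norm(q)

# Example: start level, integrate a small yaw rate, then correct the tilt.
q = np.array([1.0, 0.0, 0.0, 0.0])
q = propagate(q, gyro=[0.0, 0.0, 0.1], dt=0.01)
q = tilt_correct(q, accel=[0.0, 0.0, 9.81])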
Preferably, the experimental data are uploaded in real time in a wireless communication mode after being processed, basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data are matched, the data are numbered in groups, and the numbering further comprises experimental date and experimental sequence; and in a preset time period, after the virtual reality experiment is finished, grouping and numbering basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data, and packaging and storing the basic information, the human body posture data, the human body motion parameters, the fatigue degree evaluation grades and the RPE evaluation data.
The invention has the following beneficial effects:
the invention provides a virtual reality-based stadium fire evacuation behavior data collection experimental method and system for researching emergency evacuation behavior modes of individual unconventional postures under the condition of stadium fire. The virtual reality technology simulates fire smoke spreading, the psychology and behavior of evacuation progress of the non-conventional evacuation posture of people are observed under the experimental control condition, evacuation behavior data are collected, the decision and motion process are analyzed, and an important basis is provided for the establishment of an evacuation behavior model, so that the evacuation time and the evacuation safety are accurately and effectively predicted.
Drawings
Fig. 1 is a flow chart of an experimental method for collecting fire evacuation behavior data of a stadium based on virtual reality;
FIG. 2 is a diagram of a virtual reality-based stadium fire evacuation behavior data collection experiment system;
FIG. 3 is a diagram of a multi-sensor position profile;
figure 4 is an ergonomic illustration.
Detailed Description
The present invention will be described in detail with reference to specific examples.
The first embodiment is as follows:
according to fig. 1 to 4, the invention provides a virtual reality-based stadium fire evacuation behavior data collection system and a collection method thereof:
a virtual reality based stadium fire evacuation behavior data collection system, the system comprising: the system comprises a virtual scene construction module, a behavior data acquisition module, a behavior data processing and analyzing module and a data sealing and storing module;
the virtual scene building module carries out three-dimensional modeling, simulates fire smoke visualization, and simulates smoke smell and heat sensation;
the behavior data acquisition module comprises an inertial sensing unit, a UWB positioning tag, a blood oxygen sensor and a heart rate sensor, and the inertial sensing unit, the UWB positioning tag, the blood oxygen sensor and the heart rate sensor are used for acquiring the behavior data of the person during exercise;
the behavior data processing and analyzing module determines motion behavior parameters under different evacuation postures according to the behavior data, and the data sealing and storing module seals and stores the evaluation data.
The invention provides a method for collecting fire evacuation behavior data of a gymnasium based on virtual reality, which comprises the following steps:
step 1: building a virtual gymnasium three-dimensional building model, and simulating a fire smoke visualization scene in the gymnasium model;
the step 1 specifically comprises the following steps:
building a three-dimensional building model of the virtual gymnasium: the gymnasium evacuation walkway is modeled at 1:1 scale as a three-dimensional building model in 3D Max software; the flow of fire smoke is simulated with PyroSim software, a fire virtual scene is constructed through secondary development in Unity, and the virtual scene is superposed on the gymnasium model to visualize the fire smoke in the gymnasium space; the experimenter wears a head-mounted VR terminal, and the odor generator simulates the smoke based on the PyroSim simulation results, which include the smoke concentration, smoke layer height and fire temperature, building a virtual reality simulation environment with realistic smell and heat sensation.
In simulating the fire smoke flow, the smoke simulation is initialized and the initial smoke layer heights are set according to the five emergency evacuation postures of the human body (upright walking, stooped walking, hand-and-foot crawling, hand-and-knee crawling and prone crawling); the height of the lower smoke interface above the evacuation ground forms an arithmetic progression with common difference d = 0.4 m: 2.0 m, 1.6 m, 1.2 m, 0.8 m and 0.4 m.
The upper part of the head-mounted VR terminal is provided with a spray can, a spray valve and a spray head, and the odor is generated to match the smoke concentration, the smoke layer height and the smoke temperature; when the head-mounted VR terminal is positioned above the smoke layer, the spray head continuously sprays simulated smoke, the smoke spraying rate is matched with smoke concentration data, and the smoke temperature is matched with fire temperature data; when the head-mounted VR terminal is located below the smoke layer, the aerosol valve is closed.
Step 2: acquiring behavior data of experimenters during movement during fire simulation through a behavior data acquisition module, and performing subjective evaluation on movement intensity by using a subjective physical strength scale RPE after the simulation is finished;
the step 2 specifically comprises the following steps:
step 2.1: inertial sensing units are respectively installed on the head, trunk, hip, both upper arms, both lower arms, both hands, both thighs, both shanks and both feet of the human body, so that the inertial motion capture sensors sense the motion of 15 human body nodes; each inertial sensing unit consists of a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer, and measures the acceleration, angular velocity and magnetoresistive yaw angle of the limbs during human motion to obtain the raw human body posture data;
step 2.2: a UWB positioning tag is placed at the waist of the human body, and three-dimensional spatial positioning data of the experimenter relative to four positioning base stations are obtained with a double-sided two-way ranging method;
step 2.3: the acquisition ends of the blood oxygen sensor and the heart rate sensor are placed where blood vessels are dense on the wrist, and the blood oxygen saturation signal and the real-time exercise heart rate during human motion are acquired to obtain vital sign data;
step 2.4: the subjective physical strength scale (RPE) is determined, and the experimenters subjectively rate the exercise intensity after each group of experiments to obtain the evaluation data.
The initial state of the experimenter is set to the upright state, and the basic information of each experimenter is determined: ID and label name, sex, age interval, height and weight; the measuring instruments are calibrated and initialized, with the instrument sampling frequency uniformly set to t1; before the experiment starts, the experimenter's resting heart rate, resting blood oxygen saturation, initial posture information and initial spatial positioning information are recorded; the maximum evacuation time in the experiment is set to 60 s.
And step 3: analyzing the collected behavior data to determine the motion behaviors under different evacuation postures;
the step 3 specifically comprises the following steps:
step 3.1: based on an extended Kalman filtering algorithm, the accelerometer, gyroscope and magnetometer data are fused to obtain the carrier attitude angle, and attitude calculation yields an attitude quaternion, giving the human body posture data under different evacuation postures: the spatial position, rotation angle, orientation and inclination angle of each limb node and the range of motion of the joints;
step 3.2: the human body's center of gravity is synthesized with the SKC algorithm from the spatial positions of five nodes (the trunk, both thighs and both shanks), giving the time-varying center-of-gravity trajectory of the human body under the different evacuation postures;
step 3.3: an ultra-wideband positioning and inertial sensor fusion algorithm improves the indoor positioning accuracy; the experimenter's spatial position is obtained and the motion trajectory is reconstructed, yielding the human motion parameters under different evacuation postures, including movement duration, movement distance, movement direction, average speed, maximum speed and evacuation path;
step 3.4: fatigue indexes are set, the blood oxygen sensor and heart rate sensor data are fused, and the vital sign data are analyzed for fatigue degree, the fatigue grades comprising four levels: relaxed, suitable, relatively heavy and excessive; when any sensor reading of an experimenter exceeds its threshold, the experiment is stopped (a fatigue-grading sketch is given after this step list).
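The following sketch shows one way the fused heart-rate and blood-oxygen readings of step 3.4 could be mapped to the four fatigue grades and checked against stop thresholds; all numeric thresholds and the heart-rate reserve formulation are illustrative assumptions, not values given in the patent.

def fatigue_grade(heart_rate, spo2, resting_hr, age):
    """Map vital signs to one of the four grades: relaxed / suitable /
    relatively heavy / excessive.  The percentage bands and the SpO2 floor
    are assumed values for illustration only."""
    hr_max = 220 - age                      # common estimate of maximum heart rate
    reserve_used = (heart_rate - resting_hr) / (hr_max - resting_hr)
    if spo2 < 90 or reserve_used >= 0.85:
        return "excessive"
    if reserve_used >= 0.70:
        return "relatively heavy"
    if reserve_used >= 0.40:
        return "suitable"
    return "relaxed"

def should_stop(heart_rate, spo2, age, hr_limit_fraction=0.9, spo2_floor=90.0):
    """Stop the trial when any reading crosses its (assumed) threshold."""
    return heart_rate > hr_limit_fraction * (220 - age) or spo2 < spo2_floor

# The detailed embodiment evaluates the grade periodically (every 5 s during
# the 60 s trial), e.g.:
# for hr, ox in samples_every_5s:
#     print(fatigue_grade(hr, ox, resting_hr=70, age=25))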
step 4: the basic information of the experimenters, the human body posture data, the human body motion parameters, the fatigue evaluation grades and the RPE evaluation data are matched, the data are numbered in groups (the number also including the experiment date and experiment sequence), and the data are sealed and stored.
The experimental data are processed and uploaded in real time in a wireless communication mode, basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data are matched, the data are numbered in groups, and the numbering further comprises experimental date and experimental sequence; and in a preset time period, after the virtual reality experiment is finished, grouping and numbering basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data, and packaging and storing the basic information, the human body posture data, the human body motion parameters, the fatigue degree evaluation grades and the RPE evaluation data.
The second concrete embodiment:
fig. 1 shows a virtual reality-based gymnasium fire evacuation behavior data collection experiment method, which includes the steps:
s1: a three-dimensional building model of the virtual gymnasium is built, a visual fire smoke scene is simulated in the gymnasium model, and the odor generator further enhances the realism of the smell;
s2: the inertial sensing units, the UWB positioning tag and the acquisition ends of the blood oxygen sensor and heart rate sensor acquire the behavior data of the person in motion in real time, and after the simulation ends the exercise intensity is subjectively rated with the subjective physical strength scale (RPE);
s3: the data processing and analyzing module calculates and processes the experimental data to obtain relevant parameters of the motion behaviors under different evacuation postures;
s4: and matching basic information of experimenters, human body posture data, human body motion parameters, fatigue evaluation levels and RPE evaluation data, and numbering and packaging the data in groups.
According to an embodiment of the present application, the step S1 specifically includes:
s11: scene object modeling: the stadium evacuation walkway is modeled at 1:1 scale as a three-dimensional building model in 3D Max software;
s12: scene modeling: the flow of fire smoke is simulated with PyroSim software, a fire virtual scene is constructed through secondary development in Unity, and the virtual scene is superposed on the gymnasium model to visualize the fire smoke in the gymnasium space;
s13: the experimenter wears the head-mounted VR terminal, and the odor generator simulates the smell of the smoke based on the PyroSim simulation results (smoke concentration, smoke layer height and fire temperature), building a virtual reality simulation environment with realistic smell and heat sensation.
In step S12,
the smoke simulation is initialized, and the initial smoke layer heights are set according to the five emergency evacuation postures of the human body (upright walking, stooped walking, hand-and-foot crawling, hand-and-knee crawling and prone crawling); the height of the lower smoke interface above the evacuation ground forms an arithmetic progression with common difference d = 0.4 m: 2.0 m, 1.6 m, 1.2 m, 0.8 m and 0.4 m.
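For illustration, the five initial smoke-layer heights can be generated as the stated arithmetic progression and paired with the evacuation postures; the dictionary structure and posture labels in this Python sketch are assumptions made for the example.

# Sketch: the lower smoke interface heights (metres above the evacuation ground)
# as an arithmetic progression with common difference d = 0.4 m.
POSTURES = [
    "upright walking",
    "stooped walking",
    "hand-and-foot crawling",
    "hand-and-knee crawling",
    "prone crawling",
]

D = 0.4        # common difference of the progression, metres
H_MAX = 2.0    # height paired with upright walking, metres

smoke_layer_heights = {
    posture: round(H_MAX - i * D, 1) for i, posture in enumerate(POSTURES)
}

if __name__ == "__main__":
    for posture, height in smoke_layer_heights.items():
        print(f"{posture}: lower smoke interface at {height} m")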
The heights of the lower smoke interface above the evacuation ground for the different evacuation postures are derived from ergonomics, as shown for example in Fig. 4. According to one embodiment of the present application, in step S13,
a display screen is installed on the front side of the head-mounted VR terminal; a spray valve is installed at the upper end of the display screen, a spray head is installed at the front end of the spray valve, and a spray pipe is installed at the rear end of the spray valve and fixed to the head-mounted VR terminal; several spray nozzles fitted with electromagnetic switches are installed on the spray pipe, and the spray nozzles are connected to a spray can. Odor generation is matched in real time to the PyroSim simulation data (fire smoke concentration, smoke layer height and fire temperature): when the head-mounted VR terminal is above the lower smoke interface (that is, within the smoke layer), the spray head continuously sprays simulated smoke, the spraying rate is matched to the smoke concentration data, and the smoke temperature is matched to the fire temperature data; when the head-mounted VR terminal is below the smoke layer, the spray valve is closed. Simulating realistic smell and heat sensation further reproduces the factors by which fire smoke influences the evacuation posture.
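A minimal control-loop sketch of the spray logic described above, assuming the PyroSim results are available as per-time-step records; the record fields, the scaling factor and the function name are hypothetical and only illustrate the valve-open / valve-closed behaviour.

from dataclasses import dataclass

@dataclass
class SmokeState:
    """One time step of the (assumed) PyroSim output used by the odor generator."""
    smoke_layer_height: float   # lower smoke interface above the floor, metres
    smoke_concentration: float  # smoke concentration value from the simulation
    fire_temperature: float     # degrees Celsius

def update_spray(headset_height: float, state: SmokeState) -> dict:
    """Return the spray command for the current time step.

    The valve is open only while the head-mounted terminal is above the lower
    smoke interface (inside the smoke layer); the spraying rate follows the
    smoke concentration and the heating follows the fire temperature.  The
    scaling below is an illustrative placeholder, not a value from the patent."""
    inside_smoke = headset_height >= state.smoke_layer_height
    return {
        "valve_open": inside_smoke,
        "spray_rate": state.smoke_concentration if inside_smoke else 0.0,
        "target_temperature": state.fire_temperature if inside_smoke else None,
    }

# Example: a stooped-walking experimenter (head at about 1.3 m) under a 1.6 m
# smoke layer keeps the valve closed.
print(update_spray(1.3, SmokeState(1.6, 0.25, 120.0)))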
The step S2 specifically includes:
s21: inertial sensing units are respectively arranged on the head 1A, the trunk 2A, the hip 3A, the upper arms 4A and 5A, the lower arms 6A and 7A, the hands 8A and 9A, the thighs 10A and 11A, the shanks 12A and 13A, and the feet 14A and 15A, so that the inertial motion capture sensors sense the motion of 15 human body nodes. The inertial sensing units are numbered 1A to 15A with the suffix A according to node position. Each inertial sensing unit consists of a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer, and measures the acceleration, angular velocity and magnetoresistive yaw angle of the limbs during human motion to obtain the raw human body posture data;
s22: the UWB positioning tag 1B is placed at the waist of the human body, three-dimensional spatial positioning data of the experimenter relative to the four positioning base stations are obtained with the double-sided two-way ranging method, and the UWB positioning tag is named with the suffix B;
s23: the blood oxygen sensor 1C and the heart rate sensor acquisition end 2C are placed where blood vessels are dense on the human wrist, the blood oxygen saturation signal and the real-time exercise heart rate during human motion are accurately acquired to obtain vital sign data, and the blood oxygen and heart rate sensor acquisition ends are named with the suffix C;
s24: the subjective physical strength scale (RPE) shown in Table 1 is filled in, and the experimenter subjectively rates the exercise intensity after each group of experiments to obtain the evaluation data.
TABLE 1 Subjective physical strength scale (RPE)
[The scale is provided as an image in the original publication.]
The initial state of the experimenter is set to the upright state, and the basic information of each experimenter is determined: ID and label name, sex, age interval, height and weight. The measuring instruments are calibrated and initialized, and the instrument sampling frequency is uniformly set to t1. Before the experiment starts, the experimenter's resting heart rate, resting blood oxygen saturation, initial posture information and initial spatial positioning information are recorded. The maximum evacuation time in the experiment is set to 60 s.
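The per-experimenter information recorded before a trial could be held in a simple record such as the following sketch; the field names and default values (including the sampling frequency) are assumptions made for illustration.

from dataclasses import dataclass

@dataclass
class ExperimenterBaseline:
    """Information recorded before each trial (field names are illustrative)."""
    experimenter_id: str          # ID / label name
    sex: str
    age_interval: str             # e.g. "20-29"
    height_cm: float
    weight_kg: float
    resting_heart_rate: float     # beats per minute
    resting_spo2: float           # resting blood oxygen saturation, percent
    initial_posture: str = "upright"
    initial_position: tuple = (0.0, 0.0, 0.0)  # initial UWB spatial fix
    sampling_frequency_hz: float = 100.0       # "t1" in the text; value assumed
    max_evacuation_time_s: float = 60.0        # fixed by the protocol above

# Example record for one experimenter (values are made up for illustration).
baseline = ExperimenterBaseline("E01", "female", "20-29", 165.0, 55.0, 68.0, 98.0)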
The processing and analysis of the raw human body posture data in step S3 specifically comprises the following steps:
a. based on an extended Kalman filtering algorithm, the accelerometer, gyroscope and magnetometer data are fused to obtain the carrier attitude angle, attitude calculation yields an attitude quaternion, and the accumulated error of the inertial sensing units is corrected, giving the human body posture data under different evacuation postures: the spatial position of each limb node, the rotation angle of the node, the orientation of the node, the inclination angle of the trunk, and the range of motion of the joints;
b. the human body's center of gravity is synthesized with the SKC algorithm from the posture data of five nodes, the trunk (2A), both thighs (10A, 11A) and both shanks (12A, 13A), giving the time-varying center-of-gravity trajectory of the human body under the different evacuation postures. The experimenter's initial state is set to the upright state, and the center-of-gravity trajectory reflects the whole process of the experimenter's unconventional evacuation posture decisions and movements. The center-of-gravity trajectory is synthesized from the mass ratio of each body segment and the centers of gravity of the segmented joints; the conversion formula is as follows:
r_G = (1/M) * Σ_{i=1}^{N} m_i · r_i
where r_i = [x_i, y_i, z_i] is the center-of-gravity trajectory of the i-th body segment, m_i is the mass of the i-th segment, M is the body weight, and N = 15. The x-axis direction represents the front-to-back distribution of the center of gravity, the y-axis direction represents the lateral distribution, and the z-axis direction represents the vertical distribution.
In this embodiment, the three-dimensional barycentric coordinate formula simplified according to the five-node attitude data is as follows:
[The simplified expressions for the x, y and z coordinates of the center of gravity, computed from the five node positions, are given as images in the original publication.]
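A sketch of the weighted-average center-of-gravity computation over the five nodes named in step b; the segment mass fractions used below are illustrative placeholders and are not the values used in the patent or taken from a specific anthropometric table.

import numpy as np

# Node order: trunk (2A), left thigh (10A), right thigh (11A),
#             left shank (12A), right shank (13A).
# Illustrative mass fractions that sum to 1.
MASS_FRACTIONS = np.array([0.60, 0.13, 0.13, 0.07, 0.07])

def center_of_gravity(node_positions: np.ndarray) -> np.ndarray:
    """Weighted mean of the five node positions, shape (5, 3) -> (3,)."""
    return MASS_FRACTIONS @ node_positions

def cog_trajectory(frames: np.ndarray) -> np.ndarray:
    """Apply the weighted mean to every frame of a (T, 5, 3) sequence,
    giving the time-varying center-of-gravity trajectory of shape (T, 3)."""
    return np.einsum("i,tij->tj", MASS_FRACTIONS, frames)

# Example: a single frame with all nodes at 1 m height gives a CoG at 1 m.
frame = np.array([[0.0, 0.0, 1.0]] * 5)
print(center_of_gravity(frame))   # -> [0. 0. 1.]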
according to an embodiment of the present application, the processing and analyzing of the spatial location data in step S3 specifically includes:
the ultra-wideband positioning and inertial sensor fusion algorithm improves indoor positioning accuracy, obtains the spatial position of experimenters, reconstructs a motion track, and further obtains human motion parameters under different evacuation postures: the moving time, the moving distance, the moving direction, the average speed, the maximum speed and the evacuation path.
In the UWB positioning technique, an observation equation can be constructed from the distances between the experimenter and each UWB base station measured by the UWB equipment:
d_{i,k} = sqrt( (x_k - X_i)^2 + (y_k - Y_i)^2 + (z_k - Z_i)^2 )
where d_{i,k} is the distance between the experimenter and the i-th UWB base station measured by the UWB equipment at the k-th measurement, (x_k, y_k, z_k) is the experimenter's position at the k-th measurement, (X_i, Y_i, Z_i) is the position of the i-th UWB base station, and i ∈ [1, n].
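One common way to solve the observation equations above for the tag position is an iterative (Gauss-Newton) least-squares step; the sketch below assumes the four base-station positions and one epoch of measured ranges are available, and it is not the specific solver used in the patent.

import numpy as np

def multilaterate(anchors, distances, x0, iterations=10):
    """Gauss-Newton solution of d_i = ||x - anchor_i|| for the tag position x.

    anchors:   (n, 3) base-station coordinates (n >= 4 for a 3-D fix)
    distances: (n,)   measured UWB ranges
    x0:        (3,)   initial guess of the tag position
    """
    x = np.asarray(x0, float).copy()
    for _ in range(iterations):
        diff = x - anchors                      # vectors from anchors to the tag
        pred = np.linalg.norm(diff, axis=1)     # predicted ranges
        J = diff / pred[:, None]                # Jacobian of the range w.r.t. x
        residual = np.asarray(distances, float) - pred
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

# Example with four corner base stations of a 10 m x 10 m x 3 m hall.
anchors = np.array([[0, 0, 3], [10, 0, 3], [10, 10, 3], [0, 10, 3]], float)
true_pos = np.array([4.0, 6.0, 1.2])
ranges = np.linalg.norm(anchors - true_pos, axis=1)
print(multilaterate(anchors, ranges, x0=np.array([5.0, 5.0, 1.5])))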
The ultra-wideband positioning and inertial sensor fusion algorithm work flow is as follows:
(1) Read the acceleration data from the inertial sensors;
(2) Read the position data of the experimenter from the ultra-wideband positioning system;
(3) Judge whether the ultra-wideband positioning system is in a non-line-of-sight condition;
(4) In the non-line-of-sight case, use positioning from two base stations together with the position estimate of the inertial sensors;
(5) In the line-of-sight case, use a simple fusion algorithm.
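A minimal sketch following the workflow listed above; the NLOS test and the weighting used in the line-of-sight case are illustrative assumptions rather than the specific algorithm of the patent.

import numpy as np

def is_nlos(range_residuals, threshold=0.5):
    """Step (3): a crude NLOS flag, raised when any range residual from the
    last least-squares fix exceeds a threshold (placeholder heuristic)."""
    return bool(np.max(np.abs(range_residuals)) > threshold)

def fuse_position(uwb_pos, ins_pos, nlos, uwb_weight=0.8):
    """Steps (4)-(5): under NLOS fall back to the inertial position estimate
    (in the patent this is combined with a two-base-station fix); otherwise
    blend the UWB and inertial estimates.  The 0.8 weight is assumed."""
    uwb_pos = np.asarray(uwb_pos, float)
    ins_pos = np.asarray(ins_pos, float)
    if nlos:
        return ins_pos
    return uwb_weight * uwb_pos + (1.0 - uwb_weight) * ins_pos

# Steps (1)-(2) correspond to reading the inertial and UWB measurements each
# epoch before calling is_nlos() and fuse_position().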
According to an embodiment of the present application, the processing and analyzing of the vital sign data in step S3 specifically includes:
and setting fatigue indexes, fusing the data of the blood oxygen sensor and the data of the heart rate sensor, and further carrying out fatigue degree analysis on the vital sign data, wherein the fatigue degree grades comprise four grades of relaxation, fitness, larger and transition. The maximum evacuation time is set to be 60S in an experiment, the fatigue degree grade of the sensor data is evaluated once every t =5S, and the vital sign change of the irregular evacuation posture along with the time is judged. When any sensor data exceeds a threshold value in the experiment process of the experimenter, the experiment is stopped.
According to an embodiment of the present application, the step 4 specifically includes:
the experimental data are processed by the central processing unit and then uploaded to a computer storage system in a wireless communication mode in real time, basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data are matched, the data are numbered in groups, and the numbering further comprises experimental date and experimental sequence.
Common wireless communication modes include Wi-Fi, Bluetooth, ZigBee, NB-IoT and the like.
In addition, as shown in fig. 2, the invention also provides a virtual reality-based gymnasium fire evacuation behavior data collection experiment system, which comprises a virtual scene construction module, a behavior data acquisition module, a behavior data processing and analyzing module and a data encapsulation and storage module.
The virtual scene building module is used for building the three-dimensional building model of the virtual gymnasium and simulating the visual fire smoke scene in the gymnasium model, with the odor generator further enhancing the realism of smell and heat sensation; the behavior data acquisition module is used for acquiring the real-time motion behavior data of the experimenters in the gymnasium fire virtual scene and, after the simulation ends, collecting the subjective exercise-intensity evaluation data with the subjective physical strength scale (RPE); the behavior data processing and analyzing module is used for calculating and processing the experimental data to obtain the parameters of the motion behavior under different evacuation postures and to analyze the evacuation behavior patterns; and the data packaging and storage module is used for grouping, numbering and packaging the experimenters' basic information, human body posture data, human body motion parameters, fatigue evaluation grades and RPE evaluation data within a preset time period after the virtual reality experiment ends.
According to the invention, a three-dimensional building model of the virtual gymnasium is built and a visual fire smoke scene is simulated in the gymnasium model, with the odor generator further enhancing the realism of smell and heat sensation; the inertial sensing units, the UWB positioning tag and the acquisition ends of the blood oxygen sensor and heart rate sensor acquire the behavior data of the person in motion in real time, and after the simulation ends the exercise intensity is subjectively rated with the subjective physical strength scale (RPE); the data processing and analyzing module calculates and processes the experimental data to obtain the parameters of the motion behavior under different evacuation postures; and the experimenters' basic information, human body posture data, human body motion parameters, fatigue evaluation grades and RPE evaluation data are matched, numbered in groups and packaged. Through the collection and analysis of pedestrian evacuation behavior data, the emergency evacuation behavior patterns of individuals in unconventional postures can be studied, data support is provided for simulation experiments, and at the same time the escape time can be predicted accurately and effectively and evacuation escape strategies can be formulated.
The above description is only a preferred embodiment of the virtual-reality-based stadium fire evacuation behavior data collection system and collection method, and the protection scope is not limited to the above embodiments; all technical solutions under this concept belong to the protection scope of the present invention. It should be noted that modifications and variations that do not depart from the gist of the invention, as would occur to those skilled in the art, are also intended to be within the scope of the invention.

Claims (8)

1. A gymnasium fire evacuation behavior data collection system based on virtual reality is characterized in that: the system comprises: the system comprises a virtual scene construction module, a behavior data acquisition module, a behavior data processing and analyzing module and a data sealing and storing module;
the virtual scene building module carries out three-dimensional modeling, simulates fire smoke visualization and simulates smoke smell and heat sensation;
the behavior data acquisition module comprises an inertial sensing unit, a UWB positioning tag, a blood oxygen sensor and a heart rate sensor, and the inertial sensing unit, the UWB positioning tag, the blood oxygen sensor and the heart rate sensor are used for acquiring the behavior data of the person during exercise;
the behavior data processing and analyzing module determines motion behavior parameters under different evacuation postures according to the behavior data, and the data sealing and storing module seals and stores evaluation data;
the method comprises the steps that through a behavior data acquisition module, behavior data of experimenters during movement during fire simulation are acquired, after simulation is finished, subjective evaluation on movement intensity is carried out through a subjective physical strength scale RPE, inertial sensing units are respectively installed on the head, the trunk, the hip, the upper arms on two sides, the lower arms on two sides, two hands, the thighs on two sides, the lower legs on two sides and the parts of two feet of a human body, and an inertial motion capture sensor senses 15 actions of human body nodes; each inertial sensing unit consists of a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer and is used for measuring the acceleration, the angular velocity and the magnetic resistance yaw angle of limbs in the movement process of a human body to obtain the original data of the posture of the human body; placing a UWB positioning tag at the waist part of a human body, and realizing the three-dimensional space positioning data of an experimenter with four positioning base stations by using a double-sided two-way distance measurement method; placing the acquisition ends of the blood oxygen sensor and the heart rate sensor at the position where blood vessels on the wrist of a human body are dense, and acquiring a blood oxygen saturation signal and a real-time exercise heart rate in the exercise process of the human body to obtain vital sign data; and (4) determining a physical strength scale (RPE), and carrying out subjective evaluation on the exercise intensity by experimenters after a group of experiments are finished to obtain evaluation data.
2. A method for collecting data of fire evacuation behaviors of a virtual reality-based stadium, the method being based on the system for collecting data of fire evacuation behaviors of a virtual reality-based stadium of claim 1, characterized by: the method comprises the following steps:
step 1: building a virtual gymnasium three-dimensional building model, and simulating a fire smoke visualization scene in the gymnasium model;
step 2: acquiring behavior data of experimenters during movement during fire simulation through a behavior data acquisition module, and performing subjective evaluation on movement intensity by using a subjective physical strength scale RPE after the simulation is finished;
the step 2 specifically comprises the following steps:
step 2.1: inertial sensing units are respectively installed on the head, trunk, hip, both upper arms, both lower arms, both hands, both thighs, both shanks and both feet of the human body, so that the inertial motion capture sensors sense the motion of 15 human body nodes; each inertial sensing unit consists of a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer, and measures the acceleration, angular velocity and magnetoresistive yaw angle of the limbs during human motion to obtain the raw human body posture data;
step 2.2: a UWB positioning tag is placed at the waist of the human body, and three-dimensional spatial positioning data of the experimenter relative to four positioning base stations are obtained with a double-sided two-way ranging method;
step 2.3: the acquisition ends of the blood oxygen sensor and the heart rate sensor are placed where blood vessels are dense on the wrist, and the blood oxygen saturation signal and the real-time exercise heart rate during human motion are acquired to obtain vital sign data;
step 2.4: the subjective physical strength scale RPE is determined, and the experimenters subjectively rate the exercise intensity after each group of experiments to obtain the evaluation data;
and step 3: analyzing the collected behavior data to determine the motion behaviors under different evacuation postures;
step 4: the basic information of the experimenters, the human body posture data, the human body motion parameters, the fatigue evaluation grades and the RPE evaluation data are matched, the data are numbered in groups (the number also including the experiment date and experiment sequence), and the data are sealed and stored.
3. The method of claim 2, wherein the method comprises the following steps: the step 1 specifically comprises the following steps:
building a three-dimensional building model of the virtual gymnasium: the gymnasium evacuation walkway is modeled at 1:1 scale as a three-dimensional building model in 3D Max software; the flow of fire smoke is simulated with PyroSim software, a fire virtual scene is constructed through secondary development in Unity, and the virtual scene is superposed on the gymnasium model to visualize the fire smoke in the gymnasium space; the experimenter wears a head-mounted VR terminal, and the odor generator simulates the smoke based on the PyroSim simulation results, which include the smoke concentration, smoke layer height and fire temperature, building a virtual reality simulation environment with realistic smell and heat sensation.
4. The method as claimed in claim 3, wherein the method comprises the following steps: in simulating the fire smoke flow, the smoke simulation is initialized and the initial smoke layer heights are set according to the five emergency evacuation postures of the human body (upright walking, stooped walking, hand-and-foot crawling, hand-and-knee crawling and prone crawling); the height of the lower smoke interface above the evacuation ground forms an arithmetic progression with common difference d = 0.4 m: 2.0 m, 1.6 m, 1.2 m, 0.8 m and 0.4 m.
5. The method of claim 3, wherein the method comprises the following steps: the upper part of the head-mounted VR terminal is provided with a spray can, a spray valve and a spray head, and the odor is generated to match the smoke concentration, the smoke layer height and the smoke temperature; when the head-mounted VR terminal is positioned above the smoke layer, the spray head continuously sprays simulated smoke, the smoke spraying rate is matched with smoke concentration data, and the smoke temperature is matched with fire temperature data; when the head-mounted VR terminal is located below the smoke layer, the aerosol valve is closed.
6. The method of claim 2, wherein the method comprises the following steps: the initial state of the experimenter is set to the upright state, and the basic information of each experimenter is determined: ID and label name, sex, age interval, height and weight; the measuring instruments are calibrated and initialized, with the instrument sampling frequency uniformly set to t1; before the experiment starts, the experimenter's resting heart rate, resting blood oxygen saturation, initial posture information and initial spatial positioning information are recorded; the maximum evacuation time in the experiment is set to 60 s.
7. The method of claim 2, wherein the method comprises the following steps: the step 3 specifically comprises the following steps:
step 3.1: based on an extended Kalman filtering algorithm, the accelerometer, gyroscope and magnetometer data are fused to obtain the carrier attitude angle, and attitude calculation yields an attitude quaternion, giving the human body posture data under different evacuation postures: the spatial position, rotation angle, orientation and inclination angle of each limb node and the range of motion of the joints;
step 3.2: the human body's center of gravity is synthesized with the SKC algorithm from the spatial positions of five nodes (the trunk, both thighs and both shanks), giving the time-varying center-of-gravity trajectory of the human body under the different evacuation postures;
step 3.3: an ultra-wideband positioning and inertial sensor fusion algorithm improves the indoor positioning accuracy; the experimenter's spatial position is obtained and the motion trajectory is reconstructed, yielding the human motion parameters under different evacuation postures, including movement duration, movement distance, movement direction, average speed, maximum speed and evacuation path;
step 3.4: fatigue indexes are set, the blood oxygen sensor and heart rate sensor data are fused, and the vital sign data are analyzed for fatigue degree, the fatigue grades comprising four levels: relaxed, suitable, relatively heavy and excessive; when any sensor reading of an experimenter exceeds its threshold, the experiment is stopped.
8. The method of claim 2, wherein the method comprises the following steps:
the experimental data are processed and uploaded in real time in a wireless communication mode, basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data are matched, the data are numbered in groups, and the numbering further comprises experimental date and experimental sequence; and in a preset time period, after the virtual reality experiment is finished, grouping and numbering basic information of experimenters, human body posture data, human body motion parameters, fatigue degree evaluation grades and RPE evaluation data, and packaging and storing the basic information, the human body posture data, the human body motion parameters, the fatigue degree evaluation grades and the RPE evaluation data.
CN202110224035.3A 2021-03-01 2021-03-01 System and method for collecting fire evacuation behavior data of stadium based on virtual reality Active CN112926116B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110224035.3A CN112926116B (en) 2021-03-01 2021-03-01 System and method for collecting fire evacuation behavior data of stadium based on virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110224035.3A CN112926116B (en) 2021-03-01 2021-03-01 System and method for collecting fire evacuation behavior data of stadium based on virtual reality

Publications (2)

Publication Number Publication Date
CN112926116A (en) 2021-06-08
CN112926116B (en) 2023-02-17

Family

ID=76172696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110224035.3A Active CN112926116B (en) 2021-03-01 2021-03-01 System and method for collecting fire evacuation behavior data of stadium based on virtual reality

Country Status (1)

Country Link
CN (1) CN112926116B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113485206A (en) * 2021-08-06 2021-10-08 时代云英(重庆)科技有限公司 Extensible Internet of things system and method
CN113887373B (en) * 2021-09-27 2022-12-16 中关村科学城城市大脑股份有限公司 Attitude identification method and system based on urban intelligent sports parallel fusion network


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200299521A1 (en) * 2002-09-09 2020-09-24 Reactive Surfaces, Ltd., Llp Peptide-containing antimicrobial coating compositions
US8107395B2 (en) * 2008-06-26 2012-01-31 Telcordia Technologies, Inc. Self-correcting adaptive tracking system (SATS)
DK2780022T3 (en) * 2011-11-14 2019-07-15 Astellas Inst For Regenerative Medicine PHARMACEUTICAL PREPARATIONS OF HUMAN RPE CELLS AND USES THEREOF
US9189766B2 (en) * 2013-09-10 2015-11-17 EnergySavvy Inc. Real time provisional evaluation of utility program performance
CN110110389B (en) * 2019-04-03 2022-10-21 河南城建学院 Virtual-real combined indoor and outdoor evacuation simulation method
CN110402841A (en) * 2019-08-12 2019-11-05 应急管理部四川消防研究所 A kind of escape and evacuation praxiology research method and simulator based on experimental animal
CN111047814B (en) * 2019-12-26 2022-02-08 山东科技大学 Intelligent evacuation system and method suitable for fire alarm condition of subway station

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8816817D0 (en) * 1988-07-14 1988-08-17 Ilsley D A Marine-rig escape pipeline
CN102693330A (en) * 2011-03-25 2012-09-26 上海日浦信息技术有限公司 Pedestrian evacuation simulation method based on extended BDI (base diffusion isolation) model
CN104517003A (en) * 2014-09-04 2015-04-15 上海市建筑科学研究院(集团)有限公司 Virtual experiment platform system and method for crowd evacuation feature testing and drilling
CN108883096A (en) * 2015-12-17 2018-11-23 林科杰诺米克斯股份有限公司 Choroid neovascularization inhibitors or glass-film wart inhibitor and its evaluation or screening technique
CN107131886A (en) * 2017-07-07 2017-09-05 四川云图瑞科技有限公司 The monitoring system of subway station evacuation guiding based on threedimensional model
CN107292064A (en) * 2017-08-09 2017-10-24 山东师范大学 A kind of crowd evacuation emulation method and system based on many ant colony algorithms
CN107333113A (en) * 2017-08-14 2017-11-07 长沙变化率信息技术有限公司 A kind of pipe gallery wireless supervisory control system
CN109377813A (en) * 2018-12-07 2019-02-22 天维尔信息科技股份有限公司 A kind of fire disaster simulation based on virtual reality and rescue drilling system
RU2733699C1 (en) * 2019-10-15 2020-10-06 Владимир Дмитриевич Романов Procedure for tests of means of respiratory system protection
CN112417754A (en) * 2020-11-10 2021-02-26 中山大学 Crowd evacuation simulation method based on scene semantic information under complex indoor structure

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"人群仿真在体育馆疏散设计中的应用";刘莹等;《数字技术与建筑进化》;20151001;第36-39页 *
"基于 BIM 技术的高层火灾应急疏散研究";钟炜等;《建筑防火设计》;20200630;第39卷(第6期);第790-793页 *
"基于烟气危害综合评价的建筑火灾虚拟疏散训练";袁静雨等;《建筑防火设计》;20170831;第36卷(第8期);第1049-1052页 *

Also Published As

Publication number Publication date
CN112926116A (en) 2021-06-08

Similar Documents

Publication Publication Date Title
CN101579238B (en) Human motion capture three dimensional playback system and method thereof
CN106648116B (en) Virtual reality integrated system based on motion capture
CN102323854B (en) Human motion capture device
CN112926116B (en) System and method for collecting fire evacuation behavior data of stadium based on virtual reality
CN102921162B (en) Self-help balance and gait training system and method
Wang et al. Inertial sensor-based analysis of equestrian sports between beginner and professional riders under different horse gaits
KR101751760B1 (en) Method for estimating gait parameter form low limb joint angles
Wang et al. Using wearable sensors to capture posture of the human lumbar spine in competitive swimming
CN105551059A (en) Power transformation simulation human body motion capturing method based on optical and inertial body feeling data fusion
CN203405772U (en) Immersion type virtual reality system based on movement capture
CN206497423U (en) A kind of virtual reality integrated system with inertia action trap setting
CN108338791A (en) The detection device and detection method of unstable motion data
Chen et al. Real-time human motion capture driven by a wireless sensor network
CN104656112B (en) Based on surface electromyogram signal and the used personal localization method and devices combined of MEMS
CN106843484B (en) Method for fusing indoor positioning data and motion capture data
Dinu et al. Accuracy of postural human-motion tracking using miniature inertial sensors
JP7107264B2 (en) Human Body Motion Estimation System
CN109284006B (en) Human motion capturing device and method
CN107260179A (en) Human body motion tracking method based on inertia and body-sensing sensing data quality evaluation
CN105892626A (en) Lower limb movement simulation control device used in virtual reality environment
Qiu et al. Ambulatory estimation of 3D walking trajectory and knee joint angle using MARG Sensors
Seifert et al. Pattern recognition in cyclic and discrete skills performance from inertial measurement units
CN115964933A (en) Construction method of virtual and real training device based on digital twins
CN112256125B (en) Laser-based large-space positioning and optical-inertial-motion complementary motion capture system and method
Chakravarthi et al. Real-time human motion tracking and reconstruction using imu sensors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant