CN115523917A - Indoor emergency environment personnel positioning system based on emergency lamp - Google Patents

Indoor emergency environment personnel positioning system based on emergency lamp

Info

Publication number
CN115523917A
Authority
CN
China
Prior art keywords
target person
positioning
module
visible light
positioning result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211388906.6A
Other languages
Chinese (zh)
Inventor
李增科
邵克凡
陶振强
张冲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology (CUMT)
Priority to CN202211388906.6A
Publication of CN115523917A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)

Abstract

The invention relates to an indoor emergency environment personnel positioning system based on emergency lamps, in the technical field of positioning. The system comprises a data acquisition module, an inertial positioning module, an emergency lamp module, a visible light positioning module and a filtering fusion module. The data acquisition module is used for acquiring sensor data at the current position point of a target person; the sensor data include illumination intensity data. The inertial positioning module is used for determining a target person positioning result according to the sensor data. The emergency lamp module is used for emitting visible light and storing the geographic positions and emission frequencies of all emergency lamps in a preset environment. The visible light positioning module is used for collecting visible light, positioning the target person according to the illumination intensity data and the emission frequency of the collected visible light, and outputting a visible light positioning result. The filtering fusion module is used for fusing the target person positioning result and the visible light positioning result by Kalman filtering to determine the fused positioning result of the target person. The invention improves indoor positioning accuracy.

Description

Indoor emergency environment personnel positioning system based on emergency lamp
Technical Field
The invention relates to the technical field of emergency indoor positioning, in particular to an indoor emergency environment personnel positioning system based on an emergency lamp.
Background
With the rapid advance of urbanization, high-rise buildings, shopping malls and factories have multiplied, and unregulated use of fire and electricity has increased the risk of indoor fires; serious fires endanger the lives of both ordinary people and rescue workers.
The indoor position is key information for search and rescue: an accurate indoor position can greatly shorten rescue time and allows the state of search and rescue personnel to be tracked in real time to ensure their safety. Under fire conditions, positioning technologies based on radio frequency identification, WiFi, Bluetooth and the like are easily disturbed by multipath effects, and ultra-wideband positioning is easily affected by non-line-of-sight propagation, so positioning accuracy degrades, hindering both the work of search and rescue personnel and the emergency escape of victims. Autonomous positioning based on inertial devices such as gyroscopes and accelerometers produces a continuous position result and can achieve continuous position tracking, but because of the various noises of these sensors, positioning errors accumulate over time and the position estimate drifts.
Disclosure of Invention
The invention aims to provide an indoor emergency environment personnel positioning system based on an emergency lamp, which improves indoor positioning accuracy.
In order to achieve the purpose, the invention provides the following scheme:
an indoor emergency environment personnel positioning system based on an emergency lamp comprises a data acquisition module, an inertial positioning module, an emergency lamp module, a visible light positioning module and a filtering fusion module; the data acquisition module is respectively connected with the inertial positioning module and the visible light positioning module, and the inertial positioning module and the visible light positioning module are both connected with the filtering fusion module;
the data acquisition module is used for acquiring sensor data of the current position point of the target person; the sensor data comprises illumination intensity data;
the inertial positioning module is used for determining a target person positioning result according to the sensor data;
the emergency lamp module is used for emitting visible light and storing the geographical positions and the emission frequencies of all emergency lamps in a preset environment;
the visible light positioning module is used for collecting visible light, positioning the target personnel according to the illumination intensity data and the collected emission frequency of the visible light, and outputting a visible light positioning result;
the filtering fusion module is used for fusing the target person positioning result and the visible light positioning result by adopting Kalman filtering to determine a fusion positioning result of the target person.
Optionally, the sensor data comprise geomagnetic field strength data, angular velocity data, and acceleration data.
Optionally, when the target person is a victim, the inertial positioning module includes a gait detection unit, a step length determination unit, a heading determination unit, and a navigation position determination unit;
the gait detection unit is used for determining one step of walking by adopting wave crest detection and zero crossing detection according to the acceleration data of the target person;
the step length determining unit is used for determining the step length of the current moment according to the acceleration data of the target person within one step;
the course determining unit is used for determining the attitude of the target person at the current moment according to the geomagnetic field intensity data, the angular acceleration data and the acceleration data in the sensor data of the current position point of the target person;
the navigation position determining unit is used for determining the navigation position of the target person at the current moment according to the step length and the posture of the target person at the current moment and the navigation position at the previous moment.
Optionally, when the target person is a search and rescue person, the inertial positioning module includes a pose arrangement unit, a zero-speed detection unit and a zero-speed correction unit;
the pose arrangement unit is used for determining the pose of the target person at the current moment according to a recurrence formula, wherein the recurrence formula is expressed as follows:
$$
\begin{aligned}
p^{n}_{k} &= p^{n}_{k-1} + v^{n}_{k-1}\,\Delta t \\
v^{n}_{k} &= v^{n}_{k-1} + \left(C^{n}_{b,k}\left(f^{b}_{k} - b_{a}\right) + g^{n}\right)\Delta t \\
C^{n}_{b,k} &= C^{n}_{b,k-1}\left(I + \left[\left(\omega^{b}_{k} - b_{g}\right)\Delta t\right]_{\times}\right)
\end{aligned}
$$

wherein $p^{n}_{k}$, $v^{n}_{k}$ and $C^{n}_{b,k}$ are the position, velocity and attitude matrix of the target person at time k; $p^{n}_{k-1}$, $v^{n}_{k-1}$ and $C^{n}_{b,k-1}$ are the position, velocity and attitude matrix of the target person at time k-1; $f^{b}_{k}$ represents the acceleration data of the target person at time k; $\omega^{b}_{k}$ represents the angular velocity data of the target person at time k; $b_{a}$ is the zero offset of the accelerometer acquiring the acceleration data; $b_{g}$ is the zero offset of the gyroscope acquiring the angular velocity data; $\Delta t$ is the time interval; $g^{n}$ is the gravity vector of the navigation coordinate system; and $[\,\cdot\,]_{\times}$ denotes the antisymmetric matrix of a vector;
the zero-speed detection unit is used for detecting whether the acceleration data is in a zero-speed state or not according to the acceleration data;
and the zero-speed correction unit is used for taking the speed of a target person in a zero-speed state as a speed observation value when the zero-speed detection unit detects that the acceleration data is in the zero-speed state, converting the speed observation value into a navigation coordinate system to obtain a zero-speed observation value, and inputting the zero-speed observation value into the filtering and fusing module.
Optionally, the system further comprises a data processing module, wherein the data processing module is configured to perform filtering, smoothing and denoising processing on the sensor data, and send the sensor data after data processing to the inertial positioning module.
Optionally, the filtering fusion module includes a state updating unit and a measurement updating unit;
the state updating unit is used for acquiring the error of the positioning result of the target person at the current moment;
when the target person is a victim, the measurement updating unit is used for constructing a position difference observation equation according to the error of the target person positioning result, the target person positioning result and the visible light positioning result, and determining a fusion positioning result of the target person according to the position difference observation equation;
when the target person is a search and rescue person, the measurement updating unit is used for constructing a position difference observation equation according to the error of the target person positioning result, the target person positioning result and the visible light positioning result, constructing a speed difference observation equation according to the target person speed and the output of the zero-speed correction unit, and determining the fusion positioning result of the target person according to the position difference observation equation and the speed difference observation equation.
Optionally, when the target person is a victim:
the position difference observation equation is expressed as x Dead reckoning -x Visible light =δx+n p
The fusion positioning result of the target person is represented as x Finally, the product is processed =x Dead reckoning -δx;
Wherein x is Dead reckoning Represent the stated purposeNavigation result of the target person, x Visible light Representing the visible light positioning result, δ x representing the target person position error, n p To observe noise, x Finally, the product is processed And representing the fusion positioning result of the target person.
Optionally, when the target person is a search and rescue person:
the position difference observation equation is expressed as x Mechanical choreography -x Visible light =δx+w p
The velocity difference observation equation is expressed as
Figure BDA0003931166170000041
The fusion positioning result of the target person is expressed as x Finally, the product is processed =x Mechanical choreography -δx;
Wherein x is Mechanical layout Representing the target person positioning result, x Visible light Representing the visible light positioning result, deltax representing the target person position error, deltav representing the target person velocity error,
Figure BDA0003931166170000042
indicating the attitude error of the target person, w p Representing position difference observation noise, w v Representing the observed noise of the velocity difference, x representing the antisymmetric sign, x Finally, the product is processed And representing the fusion positioning result of the target person.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the invention discloses an emergency lamp-based indoor emergency environment personnel positioning system, which not only inhibits the error accumulation of inertial positioning, improves the indoor positioning precision, but also improves the reliability of indoor positioning on the basis of not additionally increasing the emergency positioning cost by combining the inertial autonomous positioning and the visible light absolute positioning.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic structural diagram of an indoor emergency environment personnel positioning system based on an emergency lamp according to the present invention;
FIG. 2 is a schematic diagram of the specific structure of the emergency-lamp-based indoor emergency environment personnel positioning system when the target person is a victim;

FIG. 3 is a schematic diagram of the specific structure of the emergency-lamp-based indoor emergency environment personnel positioning system when the target person is a search and rescue person.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide an indoor emergency environment personnel positioning system based on an emergency lamp, which improves the indoor positioning precision.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
FIG. 1 is a schematic structural diagram of an indoor emergency environment personnel positioning system based on an emergency lamp according to the present invention; as shown in fig. 1, an indoor emergency environment personnel positioning system based on emergency lamps comprises a data acquisition module 101, an inertial positioning module 102, an emergency lamp module 103, a visible light positioning module 104 and a filtering fusion module 105; the data acquisition module 101 is respectively connected to the inertial positioning module 102 and the visible light positioning module 104, and both the inertial positioning module 102 and the visible light positioning module 104 are connected to the filtering and fusing module 105.
The data acquisition module 101 is used for acquiring sensor data of the current position point of a target person; the sensor data include geomagnetic field intensity data, angular velocity data, acceleration data, and illumination intensity data.
The inertial positioning module 102 is configured to determine a target person positioning result according to the sensor data.
The emergency light module 103 is configured to emit visible light and store geographical locations and emission frequencies of all emergency lights in a preset environment.
The emergency light module 103 includes an emergency light hardware unit and an emergency light software unit.
The emergency lamp hardware unit comprises an emergency lamp, and the emergency lamp is used for providing visible light under an indoor emergency environment and used as a positioning source. The emergency lamp hardware unit is also used for adjusting the emission frequency of the visible light of different emergency lamps.
And the emergency lamp software unit is used for storing the geographic positions and the light emitting frequencies of all the emergency lamps.
The emergency lamp module 103 is used for providing visible light from emergency lamps in an emergency environment as a positioning source; the light emission frequencies of the different emergency lamps are designed in advance, so that emergency lamps at different fixed positions can be distinguished by their emission frequencies.
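For illustration, the following is a minimal sketch (not part of the patent) of how a receiver could identify which emergency lamp it sees from its designed emission frequency: sample the light-intensity sensor at a fixed rate, take the dominant FFT peak, and look the frequency up in a stored table. The sampling rate, frequency values, lamp coordinates and function names are all illustrative assumptions.

```python
import numpy as np

# Hypothetical frequency-to-lamp table: emission frequency (Hz) -> lamp position (x, y) in metres.
LAMP_TABLE = {200.0: (0.0, 0.0), 250.0: (5.0, 0.0), 300.0: (0.0, 5.0)}

def identify_lamp(intensity: np.ndarray, fs: float):
    """Return (frequency, position) of the lamp whose flicker dominates the samples."""
    spectrum = np.abs(np.fft.rfft(intensity - intensity.mean()))  # remove DC, take magnitude spectrum
    freqs = np.fft.rfftfreq(len(intensity), d=1.0 / fs)
    peak = freqs[np.argmax(spectrum)]
    # Snap the measured peak to the nearest designed emission frequency.
    f_lamp = min(LAMP_TABLE, key=lambda f: abs(f - peak))
    return f_lamp, LAMP_TABLE[f_lamp]
```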
The visible light positioning module 104 is configured to collect visible light, position the target person according to the illumination intensity data and the collected emission frequency of the visible light, and output a visible light positioning result.
The visible light positioning module 104 includes an emergency light identification unit and a time difference of arrival positioning unit.
The emergency lamp identification unit is used for determining the emission frequency of the collected visible light and determining the geographic position of the corresponding emergency lamp according to the emission frequency.
The time difference of arrival positioning unit is used for determining the visible light positioning result of the user by the time difference of arrival (TDOA) positioning method, according to the illumination intensity data and the geographic positions of the different emergency lamps.
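The patent does not disclose its TDOA solver, so the following is only a sketch of a generic TDOA position solve by Gauss-Newton least squares, assuming arrival-time differences relative to a reference lamp are available; the anchor layout, iteration count and function names are illustrative (and note that optical TDOA needs extremely fine time resolution in practice).

```python
import numpy as np

def tdoa_solve(anchors: np.ndarray, dt: np.ndarray, c: float = 3e8,
               x0: np.ndarray = None, iters: int = 20) -> np.ndarray:
    """Gauss-Newton solve for a 2-D position from arrival-time differences.

    anchors: (m, 2) lamp positions; dt: (m-1,) time differences relative to anchors[0];
    c: propagation speed (speed of light for visible light signals).
    """
    x = anchors.mean(axis=0) if x0 is None else x0.astype(float)
    dd = c * dt                              # range differences d_i - d_0
    for _ in range(iters):
        d = np.linalg.norm(anchors - x, axis=1)
        r = (d[1:] - d[0]) - dd              # residuals of the range-difference model
        # Jacobian of (d_i - d_0) with respect to x
        J = (x - anchors[1:]) / d[1:, None] - (x - anchors[0]) / d[0]
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-6:        # converged
            break
    return x
```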
The filtering fusion module 105 is configured to fuse the target person positioning result and the visible light positioning result by using kalman filtering to determine a fusion positioning result of the target person. The filter fusion module 105 specifically estimates the state error using indirect kalman filtering.
An indoor emergency environment personnel positioning system based on emergency lamps further comprises a data processing module 106, which is used for filtering, smoothing and denoising the sensor data and sending the processed sensor data to the inertial positioning module 102. The data processing module 106 is specifically configured to smooth the magnetic field strength data, the acceleration data, and the angular velocity data using low-pass filtering.

The data acquisition module 101 includes an accelerometer, a gyroscope, a magnetometer, and a light intensity meter. The accelerometer collects acceleration data, the gyroscope collects angular velocity data, the magnetometer collects magnetic field strength data, and the light intensity meter collects illumination intensity data.
When the target person is a victim, as shown in fig. 2, the data collection module 101 includes, but is not limited to, the user's smartphone, smart watch, or tablet computer.

The current position point is the indoor position of the target person's intelligent terminal at the current moment. The intelligent terminal includes smartphones with a positioning function, smart watches, and tablet computers.
When the target person is a victim, the inertial positioning module 102 is a dead reckoning module, and the dead reckoning module includes a gait detection unit, a step length determination unit, a course determination unit, and a dead reckoning unit.
The dead reckoning module is used for determining the dead-reckoned position of the target person according to the sensor data. It estimates the gait, step length and heading of the target person using zero-crossing detection, peak detection, a step-length regression model and the sensor data of the current position point, and performs dead reckoning on the gait, step-length and heading estimates to obtain the current dead-reckoned position of the target person.

The gait detection unit is used for detecting the gait of the target person by zero-crossing detection and peak detection on the acceleration data in the sensor data of the current position point, i.e. for determining each walking step of the target person.
Zero-crossing detection records one step if the modulus of the acceleration data is detected to equal the local gravitational acceleration twice in succession, with the modulus increasing at the first crossing and decreasing at the second. Peak detection records one step if a maximum of the acceleration data is detected, that is, if the acceleration at the current time is the maximum over the preceding and following samples of the step. If zero-crossing detection and peak detection detect a step at the same time, one step is recorded and the step count is incremented by 1.
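A minimal sketch of the combined zero-crossing and peak step detector described above, operating on acceleration-modulus samples; the gravity constant and the peak tolerance `tol` are illustrative assumptions.

```python
import numpy as np

G = 9.81   # local gravitational acceleration (m/s^2), an assumed constant

def count_steps(acc_norm: np.ndarray, tol: float = 0.3) -> int:
    """Count steps where zero-crossing and peak detection agree.

    acc_norm: samples of the acceleration modulus ||a_k|| in m/s^2. A step is
    recorded when ||a|| rises through G, peaks clearly above G, then falls
    back through G.
    """
    steps, rise = 0, None
    for k in range(1, len(acc_norm)):
        if rise is None and acc_norm[k - 1] < G <= acc_norm[k]:
            rise = k                                  # first crossing: modulus increasing
        elif rise is not None and acc_norm[k - 1] >= G > acc_norm[k]:
            if acc_norm[rise:k].max() > G + tol:      # peak detection between the two crossings
                steps += 1                            # both detectors agree: record one step
            rise = None                               # second crossing: modulus decreasing
    return steps
```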
The step length determining unit is used for determining the step length at the current moment according to the acceleration data of the target person within one step.
The step length determining unit is specifically used for determining that the user walks one step according to the gait detecting unit and then determining the step length of the user at the current moment according to the acceleration data in one step of the user.
The step length determining unit specifically determines the user step length at the current moment according to a step length regression model, wherein the input of the step length regression model is acceleration data within one step, and the output of the step length regression model is the step length.
The step size regression model adopts a Kim model.
The Kim model is expressed as:
$$SL = K_{1} \cdot \sqrt[3]{\frac{1}{N}\sum_{i=1}^{N}\left|a_{i}\right|}$$

wherein $SL$ represents the step length, $N$ represents the number of acceleration samples within one step, $|a_{i}|$ represents the absolute value of the i-th acceleration sample, and $K_{1}$ is a model parameter.
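As a concrete illustration, a one-function sketch of the Kim model above; the value of $K_{1}$ is a placeholder, since this parameter must be calibrated per user and is not given in the patent.

```python
import numpy as np

def kim_step_length(acc_step: np.ndarray, k1: float = 0.43) -> float:
    """Kim step-length model: SL = K1 * cbrt(mean |a_i| over one step).

    acc_step: acceleration samples within one detected step; k1 is a
    per-user calibration parameter (the default here is an arbitrary placeholder).
    """
    return k1 * np.cbrt(np.mean(np.abs(acc_step)))
```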
The course determining unit is used for determining the attitude of the target person at the current moment by Mahony complementary filtering, according to the geomagnetic field intensity data, the angular velocity data and the acceleration data in the sensor data of the current position point of the target person. The attitude of the target person is represented by a pitch angle, a roll angle, and a heading angle.
Mahony complementary filtering estimates the attitude of the target person while avoiding the gimbal-lock problem of Euler angles. The attitude is represented by a quaternion $(q_{0}, q_{1}, q_{2}, q_{3})$, which gives the direction cosine matrix

$$
C^{n}_{b,k} =
\begin{bmatrix}
q_{0}^{2}+q_{1}^{2}-q_{2}^{2}-q_{3}^{2} & 2\left(q_{1}q_{2}-q_{0}q_{3}\right) & 2\left(q_{1}q_{3}+q_{0}q_{2}\right)\\
2\left(q_{1}q_{2}+q_{0}q_{3}\right) & q_{0}^{2}-q_{1}^{2}+q_{2}^{2}-q_{3}^{2} & 2\left(q_{2}q_{3}-q_{0}q_{1}\right)\\
2\left(q_{1}q_{3}-q_{0}q_{2}\right) & 2\left(q_{2}q_{3}+q_{0}q_{1}\right) & q_{0}^{2}-q_{1}^{2}-q_{2}^{2}+q_{3}^{2}
\end{bmatrix}
$$

where $C^{n}_{b,k}$ is the direction cosine matrix from the b system (carrier coordinate system) to the n system (navigation coordinate system) at time k. A quaternion is a hypercomplex number of four-dimensional space: $q_{0}$ is the real part and represents the cosine of half the overall rotation angle; $q_{1}, q_{2}, q_{3}$ are the imaginary parts and represent the sine of half the rotation angle about the x, y and z axes respectively.
The gyroscope data are low-pass filtered, and the quaternion of the current attitude is propagated as

$$\dot{q} = \frac{1}{2}\, q \otimes P\!\left(\omega^{b}_{\text{gyro}}\right)$$

The accelerometer and magnetometer outputs are low-pass filtered, and the gyroscope is calibrated:

$$\dot{q} = \frac{1}{2}\, q \otimes P\!\left(\omega^{b}_{\text{gyro}} + \delta\right)$$

wherein $q$ is the quaternion representation after correction, $P$ is the mapping of a three-dimensional vector to a quaternion, and $\omega^{b}_{\text{gyro}}$ is the angular velocity vector obtained from the gyroscope.

$\delta$ is the gyro compensation value generated by the PI regulator, $\delta = K_{P} \cdot e + K_{I} \int e\,\mathrm{d}t$. In the PI regulator, the parameter $K_{P}$ controls the crossover frequency between the accelerometer/magnetometer and the gyroscope, and the parameter $K_{I}$ corrects the error of the gyroscope; $e$ is the cross product of the measured inertial vector and the predicted vector, and $t$ represents time: $e = \hat{v} \times v$, where $\hat{v}$ is the actual measurement vector, comprising the locally measured gravitational acceleration and the measured magnetic field strength, and $v$ is the predicted vector, comprising the locally predicted gravitational acceleration and the predicted magnetic field strength.
INS mechanization (attitude update) is then carried out:

$$C^{n}_{b,k} = C^{n}_{b,k-1}\left(I + \left(\Delta\theta^{b}_{k} \times\right)\right)$$

wherein n represents the navigation coordinate system, i.e. the local horizontal coordinate system; $I$ is the identity matrix; $\Delta\theta^{b}_{k}$ is the angular increment measured by the gyroscope at time k; $\times$ is the antisymmetric sign; and $C^{n}_{b,k-1}$ is the direction cosine matrix from the b system (carrier coordinate system) to the n system (navigation coordinate system) at time k-1.
After the direction cosine matrix of the current time is obtained, the Euler angles are back-calculated:

$$\theta = \arcsin\left(C_{32}\right), \qquad \gamma = -\arctan\frac{C_{31}}{C_{33}}, \qquad \psi = \arctan\frac{C_{12}}{C_{22}}$$

wherein $C_{ij}$ denotes the (i, j) element of $C^{n}_{b,k}$, and $\theta$, $\gamma$ and $\psi$ are the pitch angle, roll angle and heading angle respectively (the exact element assignments depend on the rotation-order convention).
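As a consolidated illustration, the following is a minimal sketch of a Mahony-style complementary filter consistent with the steps above: quaternion propagation of gyroscope data with PI compensation computed from gravity and magnetic-field vector errors, plus the quaternion-to-DCM mapping given in the text. The gains, the ENU frame convention and the reference-vector handling are illustrative assumptions, not values disclosed by the patent.

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of quaternions (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def quat_to_dcm(q):
    """Direction cosine matrix C_b^n from a unit quaternion, as in the text."""
    w, x, y, z = q
    return np.array([
        [w*w + x*x - y*y - z*z, 2*(x*y - w*z),         2*(x*z + w*y)],
        [2*(x*y + w*z),         w*w - x*x + y*y - z*z, 2*(y*z - w*x)],
        [2*(x*z - w*y),         2*(y*z + w*x),         w*w - x*x - y*y + z*z],
    ])

class MahonyFilter:
    """Sketch of a Mahony complementary filter with PI gyro compensation."""

    def __init__(self, kp=1.0, ki=0.05):
        self.q = np.array([1.0, 0.0, 0.0, 0.0])  # identity attitude
        self.kp, self.ki = kp, ki                 # illustrative PI gains
        self.integral = np.zeros(3)

    def update(self, gyro, acc, mag, dt):
        C = quat_to_dcm(self.q)
        # Predicted gravity direction in the body frame (ENU: gravity reference is +z up).
        g_pred = C.T @ np.array([0.0, 0.0, 1.0])
        # Predicted magnetic-field direction: project the measured field into the
        # navigation frame and keep only its horizontal magnitude and vertical part.
        m_n = C @ (mag / np.linalg.norm(mag))
        b_n = np.array([np.hypot(m_n[0], m_n[1]), 0.0, m_n[2]])
        m_pred = C.T @ b_n
        # Error e = measured vector x predicted vector, summed over both references.
        e = (np.cross(acc / np.linalg.norm(acc), g_pred)
             + np.cross(mag / np.linalg.norm(mag), m_pred))
        self.integral += e * dt
        delta = self.kp * e + self.ki * self.integral   # PI gyro compensation value
        omega = gyro + delta
        dq = 0.5 * quat_mul(self.q, np.array([0.0, *omega]))
        self.q = self.q + dq * dt                       # first-order quaternion integration
        self.q /= np.linalg.norm(self.q)
        return self.q
```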
The navigation position determining unit is used for determining the navigation position of the target person at the current moment, namely the positioning result of the target person according to the step length and the posture of the target person at the current moment and the navigation position at the previous moment.
And the navigation position determining unit determines the navigation position of the target person at the current moment through a recurrence formula. The recurrence formula is expressed as:
$$
\begin{aligned}
E_{k} &= E_{k-1} + l \sin\alpha\\
N_{k} &= N_{k-1} + l \cos\alpha
\end{aligned}
$$

wherein $E_{k}$ and $N_{k}$ are the geographic positions of the target person in the east and north directions at time k (the current moment), namely the positioning result of the target person at time k; $E_{k-1}$ and $N_{k-1}$ are the geographic positions of the target person in the east and north directions at time k-1 (the previous moment); $l$ represents the target person's step length; and $\alpha$ represents the target person's heading angle.
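The dead-reckoning recurrence above amounts to a two-line update; a sketch, assuming the heading angle is measured clockwise from north (the convention consistent with E advancing by l·sin(α) and N by l·cos(α)):

```python
import numpy as np

def pdr_update(e: float, n: float, step_len: float, heading: float):
    """Advance the east/north position by one step of length step_len.

    heading: course angle alpha in radians, clockwise from north (assumed convention).
    """
    return e + step_len * np.sin(heading), n + step_len * np.cos(heading)
```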
The filtering fusion module 105 includes a state update unit and a measurement update unit.
The state updating unit is used for acquiring the error of the target person positioning result at the current moment, and determining a state error and a covariance updating equation thereof by considering error disturbance:
$$\delta x_{k}^{-} = F_{k}\, \delta x_{k-1}$$

$$P_{k}^{-} = F_{k}\, P_{k-1}\, F_{k}^{\mathsf{T}} + Q_{k}$$

wherein $\delta x_{k-1}$ is the state error of the last epoch (time k-1), either estimated or known; $P_{k-1}$ is the covariance of the state error of the last epoch (time k-1); $\delta x_{k}^{-}$ is the state error at time k predicted by Kalman filtering, here the east and north position errors; $F_{k}$ is the state transition matrix at time k; and $\omega$ is the process noise, whose covariance matrix is $Q_{k}$. The state errors include position errors, velocity errors, attitude errors, and the like.
The measurement updating unit is used for determining a position error observation equation at the current moment according to the current-moment target person navigation position, the current-moment visible light positioning result and the observation noise; and taking the difference between the current-time target person navigation position and the current-time Kalman filtering estimation state error as the fusion positioning result of the user at the current time.
The measurement update unit formula is:
$$K_{k} = P_{k}^{-} H_{k}^{\mathsf{T}}\left(H_{k} P_{k}^{-} H_{k}^{\mathsf{T}} + R_{k}\right)^{-1}$$

$$\delta \hat{x}_{k} = \delta x_{k}^{-} + K_{k}\left(Z_{k} - H_{k}\, \delta x_{k}^{-}\right)$$

$$\hat{P}_{k} = \left(I - K_{k} H_{k}\right) P_{k}^{-}$$

wherein $R_{k}$ represents the observation weight (noise covariance) matrix, $K_{k}$ denotes the gain matrix, $H_{k}$ represents the coefficient matrix, $Z_{k}$ represents the observed value, $I$ represents the identity matrix, and $\delta \hat{x}_{k}$ and $\hat{P}_{k}$ are the state error determined by the Kalman filter and its covariance matrix.
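A minimal sketch of the indirect (error-state) Kalman filter these equations describe; the F, Q, H, R values below are illustrative placeholders, not the patent's. The commented usage shows the victim case, where the observation is the dead-reckoning minus visible-light position difference and the fused result is $x_{\text{final}} = x_{\text{DR}} - \delta x$.

```python
import numpy as np

class ErrorStateKF:
    """Indirect Kalman filter over the state error delta_x (sketch)."""

    def __init__(self, n_states: int = 2):
        self.dx = np.zeros(n_states)   # state error estimate
        self.P = np.eye(n_states)      # state-error covariance

    def predict(self, F, Q):
        # delta_x_k^- = F_k delta_x_{k-1};  P_k^- = F_k P_{k-1} F_k^T + Q_k
        self.dx = F @ self.dx
        self.P = F @ self.P @ F.T + Q
        return self.dx, self.P

    def update(self, z, H, R):
        # K_k = P^- H^T (H P^- H^T + R)^-1, then correct the state error and covariance.
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.dx = self.dx + K @ (z - H @ self.dx)
        self.P = (np.eye(len(self.dx)) - K @ H) @ self.P
        return self.dx

# Victim-case usage (illustrative noise values):
# kf = ErrorStateKF()
# kf.predict(np.eye(2), 0.01 * np.eye(2))
# dx = kf.update(x_pdr - x_vlc, np.eye(2), 0.25 * np.eye(2))
# x_final = x_pdr - dx
```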
When the target person is a victim:
the position difference observation equation is expressed as x Dead reckoning -x Visible light =δx+n p
The fusion positioning result of the target person is represented as x Finally, the product is processed =x Dead reckoning -δx;
Wherein x is Dead reckoning Representing the result of the target person's position, x Visible light Representing the visible light positioning result, δ x representing the target person position error, n p To observe noise, x Finally, the product is processed And representing the fusion positioning result of the target person.
As shown in fig. 3, when the target person is a search and rescue person, the inertial positioning module 102 is a mechanical arrangement module, and the mechanical arrangement module includes a pose arrangement unit, a zero-speed detection unit, and a zero-speed correction unit.
When the target person is a search and rescue person, the data acquisition module 101 includes a calibrated high-precision gyroscope, an accelerometer, and a light intensity meter. The gyroscope and accelerometer are mounted on the foot (in the target person's shoe) for zero-velocity correction.
The mechanical arrangement module is used for deriving the pose of the target person at the current moment by mechanical arrangement (INS mechanization), according to the angular velocity data and acceleration data at the current moment and the pose of the user at the previous moment; it detects from the acceleration data whether the target person is in a zero-speed state at the current moment, and determines a zero-speed observation value when the target person is in the zero-speed state.
The pose arrangement unit is used for determining the pose of the target person at the current moment, namely a target person positioning result, according to a recursion formula, wherein the recursion formula is expressed as follows:
$$
\begin{aligned}
p^{n}_{k} &= p^{n}_{k-1} + v^{n}_{k-1}\,\Delta t \\
v^{n}_{k} &= v^{n}_{k-1} + \left(C^{n}_{b,k}\left(f^{b}_{k} - b_{a}\right) + g^{n}\right)\Delta t \\
C^{n}_{b,k} &= C^{n}_{b,k-1}\left(I + \left[\left(\omega^{b}_{k} - b_{g}\right)\Delta t\right]_{\times}\right)
\end{aligned}
$$

wherein $p^{n}_{k}$, $v^{n}_{k}$ and $C^{n}_{b,k}$ are the position, velocity and attitude matrix of the target person at time k; $p^{n}_{k-1}$, $v^{n}_{k-1}$ and $C^{n}_{b,k-1}$ are the position, velocity and attitude matrix of the target person at time k-1; $f^{b}_{k}$ represents the acceleration data of the target person at time k; $\omega^{b}_{k}$ represents the angular velocity data of the target person at time k; $b_{a}$ is the zero offset of the accelerometer acquiring the acceleration data; $b_{g}$ is the zero offset of the gyroscope acquiring the angular velocity data; $\Delta t$ is the time interval; $g^{n}$ is the gravity vector of the navigation coordinate system; and $[\,\cdot\,]_{\times}$ denotes the antisymmetric matrix of a vector.
The zero-speed detection unit is used for detecting, according to the acceleration data, whether the target person is in a zero-speed state.
The zero-speed correction unit is configured to, when the zero-speed detection unit detects a zero-speed state, take the speed of the target person in the zero-speed state as a speed observation value, convert the speed observation value into the navigation coordinate system to obtain a zero-speed observation value, and input the zero-speed observation value into the filtering and fusing module 105.
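A sketch of one mechanization step and a deliberately crude zero-velocity test, consistent with the recurrence above; the ENU gravity vector and the detection threshold are assumptions (practical detectors use windowed statistics rather than a single-sample test).

```python
import numpy as np

def skew(v):
    """Antisymmetric matrix [v x] of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def ins_step(p, v, C, f_b, w_b, b_a, b_g, dt,
             g_n=np.array([0.0, 0.0, -9.81])):
    """One step of the mechanization recurrence above (ENU navigation frame assumed).

    p, v: position/velocity in the navigation frame; C: attitude matrix C_b^n;
    f_b, w_b: accelerometer and gyroscope samples; b_a, b_g: sensor zero offsets.
    """
    p_new = p + v * dt
    v_new = v + (C @ (f_b - b_a) + g_n) * dt
    C_new = C @ (np.eye(3) + skew((w_b - b_g) * dt))
    return p_new, v_new, C_new

def is_zero_velocity(f_b, g=9.81, tol=0.2):
    """Crude stance-phase test: specific-force magnitude close to gravity."""
    return abs(np.linalg.norm(f_b) - g) < tol
```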
The filtering fusion module 105 includes a state updating unit and a measurement updating unit.
The state updating unit is used for acquiring the error of the positioning result of the target person at the current moment.
When the target person is a victim, the measurement updating unit is used for constructing a position difference observation equation according to the error of the target person positioning result, the target person positioning result and the visible light positioning result, and determining a fusion positioning result of the target person according to the position difference observation equation.
When the target person is a search and rescue person, the measurement updating unit is used for constructing a position difference observation equation and a speed difference observation equation according to the error of the target person positioning result, the target person positioning result and the visible light positioning result, and determining a fusion positioning result of the target person according to the position difference observation equation and the speed difference observation equation.
When the target person is a search and rescue person:
the position difference observation equation is expressed as x Mechanical choreography -x Visible light =δx+w p
The velocity difference observation equation is expressed as
Figure BDA0003931166170000114
The fusion positioning result of the target person is represented as x Finally, the product is processed =x Mechanical layout -δx;
Wherein x is Mechanical layout Representing the target person positioning result, x Visible light Representing the visible light positioning result, deltax representing the target person position error, deltav representing the target person velocity error,
Figure BDA0003931166170000115
indicating the attitude error, w, of the target person p Representing position difference observation noise, w v Representing the velocity difference observation noise, x representing the anti-symmetric sign, x Finally, the product is processed And representing the fusion positioning result of the target person.
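To make the search-and-rescue fusion concrete, the following sketch shows how the two observation equations above could map onto a single measurement update of the error-state filter sketched earlier; the 9-state error ordering [δp, δv, φ] and the noise values are assumptions, not the patent's.

```python
import numpy as np

def skew(v):
    """Antisymmetric matrix (v x)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def fuse_rescuer(kf, x_mech, v_mech, x_vlc):
    """One combined measurement update for the two observation equations above.

    kf: an ErrorStateKF instance (from the earlier sketch) constructed with 9 states,
    assumed ordered [delta_p (3), delta_v (3), phi (3)].
    """
    z = np.concatenate([x_mech - x_vlc,    # position difference vs. the visible light fix
                        v_mech])           # velocity difference; the ZUPT reference is zero
    H = np.zeros((6, 9))
    H[0:3, 0:3] = np.eye(3)                # position difference observes delta_p
    H[3:6, 3:6] = np.eye(3)                # velocity difference observes delta_v
    H[3:6, 6:9] = skew(v_mech)             # (v^n x) phi coupling from the velocity equation
    R = np.diag([0.25] * 3 + [0.01] * 3)   # illustrative observation noise covariances
    dx = kf.update(z, H, R)
    return x_mech - dx[0:3]                # x_final = x_mech - delta_x
```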
The visible light positioning module 104 is the same whether the target person is a search and rescue person or a victim; the difference is that a victim uses an intelligent terminal for dead reckoning, whereas a search and rescue person uses high-precision inertial navigation for mechanical arrangement. The visible light positioning module thus provides a uniform positioning source to correct the inertial positioning of different users.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (8)

1. An indoor emergency environment personnel positioning system based on an emergency lamp is characterized by comprising a data acquisition module, an inertial positioning module, an emergency lamp module, a visible light positioning module and a filtering fusion module; the data acquisition module is respectively connected with the inertial positioning module and the visible light positioning module, and the inertial positioning module and the visible light positioning module are both connected with the filtering fusion module;
the data acquisition module is used for acquiring sensor data of the current position point of the target person; the sensor data comprises illumination intensity data;
the inertial positioning module is used for determining a target person positioning result according to the sensor data;
the emergency lamp module is used for emitting visible light and storing the geographical positions and the emission frequencies of all emergency lamps in a preset environment;
the visible light positioning module is used for collecting visible light, positioning the target personnel according to the illumination intensity data and the collected emission frequency of the visible light, and outputting a visible light positioning result;
the filtering fusion module is used for fusing the target person positioning result and the visible light positioning result by adopting Kalman filtering to determine a fusion positioning result of the target person.
2. The emergency light-based indoor emergency environment personnel positioning system of claim 1, wherein the sensor data comprise geomagnetic field strength data, angular velocity data, and acceleration data.
3. The emergency light-based indoor emergency environment personnel positioning system of claim 1, wherein when the target personnel are victim personnel, the inertial positioning module comprises a gait detection unit, a step length determination unit, a heading determination unit and a dead-reckoning determination unit;
the gait detection unit is used for determining one walking step by adopting peak detection and zero-crossing detection on the acceleration data of the target person;
the step length determining unit is used for determining the step length of the current moment according to the acceleration data of the target person within one step;
the course determining unit is used for determining the attitude of the target person at the current moment according to the geomagnetic field intensity data, the angular velocity data and the acceleration data in the sensor data of the current position point of the target person;
the navigation position determining unit is used for determining the navigation position of the target person at the current moment according to the step length and the posture of the target person at the current moment and the navigation position at the previous moment.
4. The emergency light-based indoor emergency environment personnel positioning system of claim 1, wherein when the target personnel are search and rescue personnel, the inertial positioning module comprises a pose orchestration unit, a zero-speed detection unit, and a zero-speed correction unit;
the pose arrangement unit is used for determining the pose of the target person at the current moment according to a recurrence formula, wherein the recurrence formula is expressed as follows:
$$
\begin{aligned}
p^{n}_{k} &= p^{n}_{k-1} + v^{n}_{k-1}\,\Delta t \\
v^{n}_{k} &= v^{n}_{k-1} + \left(C^{n}_{b,k}\left(f^{b}_{k} - b_{a}\right) + g^{n}\right)\Delta t \\
C^{n}_{b,k} &= C^{n}_{b,k-1}\left(I + \left[\left(\omega^{b}_{k} - b_{g}\right)\Delta t\right]_{\times}\right)
\end{aligned}
$$

wherein $p^{n}_{k}$, $v^{n}_{k}$ and $C^{n}_{b,k}$ are the position, velocity and attitude matrix of the target person at time k; $p^{n}_{k-1}$, $v^{n}_{k-1}$ and $C^{n}_{b,k-1}$ are the position, velocity and attitude matrix of the target person at time k-1; $f^{b}_{k}$ represents the acceleration data of the target person at time k; $\omega^{b}_{k}$ represents the angular velocity data of the target person at time k; $b_{a}$ is the zero offset of the accelerometer acquiring the acceleration data; $b_{g}$ is the zero offset of the gyroscope acquiring the angular velocity data; $\Delta t$ is the time interval; $g^{n}$ is the gravity vector of the navigation coordinate system; and $[\,\cdot\,]_{\times}$ denotes the antisymmetric matrix of a vector;
the zero-speed detection unit is used for detecting, according to the acceleration data, whether the target person is in a zero-speed state;
and the zero-speed correction unit is used for taking the speed of a target person in a zero-speed state as a speed observation value when the zero-speed detection unit detects that the acceleration data is in the zero-speed state, converting the speed observation value into a navigation coordinate system to obtain a zero-speed observation value, and inputting the zero-speed observation value into the filtering and fusing module.
5. The emergency light-based indoor emergency environment personnel positioning system of claim 1, further comprising a data processing module for filtering, smoothing and de-noising the sensor data, sending the data processed sensor data to the inertial positioning module.
6. The emergency light-based indoor emergency environment personnel positioning system of claim 4, wherein the filtering fusion module comprises a status update unit and a metrology update unit;
the state updating unit is used for acquiring the error of the positioning result of the target person at the current moment;
when the target person is a victim, the measurement updating unit is used for constructing a position difference observation equation according to the error of the target person positioning result, the target person positioning result and the visible light positioning result, and determining a fusion positioning result of the target person according to the position difference observation equation;
when the target person is a search and rescue person, the measurement updating unit is used for constructing a position difference observation equation according to the error of the target person positioning result, the target person positioning result and the visible light positioning result, constructing a speed difference observation equation according to the target person speed and the output of the zero-speed correction unit, and determining the fusion positioning result of the target person according to the position difference observation equation and the speed difference observation equation.
7. The emergency light-based indoor emergency environment personnel positioning system of claim 6, wherein when the target person is a victim:

the position difference observation equation is expressed as $x_{\text{DR}} - x_{\text{VL}} = \delta x + n_{p}$;

the fusion positioning result of the target person is expressed as $x_{\text{final}} = x_{\text{DR}} - \delta x$;

wherein $x_{\text{DR}}$ represents the dead-reckoning result of the target person, $x_{\text{VL}}$ represents the visible light positioning result, $\delta x$ represents the target person position error, $n_{p}$ is the observation noise, and $x_{\text{final}}$ represents the fusion positioning result of the target person.
8. The emergency light-based indoor emergency environment personnel positioning system of claim 6, wherein when the target person is a search and rescue person:

the position difference observation equation is expressed as $x_{\text{mech}} - x_{\text{VL}} = \delta x + w_{p}$;

the velocity difference observation equation is expressed as $v_{\text{mech}} - v_{\text{ZUPT}} = \delta v + \left(v^{n} \times\right)\phi + w_{v}$;

the fusion positioning result of the target person is expressed as $x_{\text{final}} = x_{\text{mech}} - \delta x$;

wherein $x_{\text{mech}}$ represents the target person positioning result obtained by mechanical arrangement, $x_{\text{VL}}$ represents the visible light positioning result, $\delta x$ represents the target person position error, $\delta v$ represents the target person velocity error, $\phi$ represents the attitude error of the target person, $w_{p}$ represents the position difference observation noise, $w_{v}$ represents the velocity difference observation noise, $\times$ denotes the antisymmetric (cross-product) operator, $v_{\text{ZUPT}}$ is the zero-speed observation value, and $x_{\text{final}}$ represents the fusion positioning result of the target person.
CN202211388906.6A 2022-11-08 2022-11-08 Indoor emergency environment personnel positioning system based on emergency lamp Pending CN115523917A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211388906.6A CN115523917A (en) 2022-11-08 2022-11-08 Indoor emergency environment personnel positioning system based on emergency lamp

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211388906.6A CN115523917A (en) 2022-11-08 2022-11-08 Indoor emergency environment personnel positioning system based on emergency lamp

Publications (1)

Publication Number Publication Date
CN115523917A true CN115523917A (en) 2022-12-27

Family

ID=84704967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211388906.6A Pending CN115523917A (en) 2022-11-08 2022-11-08 Indoor emergency environment personnel positioning system based on emergency lamp

Country Status (1)

Country Link
CN (1) CN115523917A (en)

Similar Documents

Publication Publication Date Title
KR101851836B1 (en) Systems and methods for estimating the motion of an object
CN104296750B (en) Zero speed detecting method, zero speed detecting device, and pedestrian navigation method as well as pedestrian navigation system
CA2653622C (en) Method and system for locating and monitoring first responders
CN109827577A (en) High-precision inertial navigation location algorithm based on motion state detection
Romanovas et al. A study on indoor pedestrian localization algorithms with foot-mounted sensors
CN107490378B (en) Indoor positioning and navigation method based on MPU6050 and smart phone
Lee et al. An experimental heuristic approach to multi-pose pedestrian dead reckoning without using magnetometers for indoor localization
KR20130127991A (en) Method and system for estimating a path of a mobile element or body
Gädeke et al. Smartphone pedestrian navigation by foot-IMU sensor fusion
JP2013531781A (en) Method and system for detecting zero speed state of object
CN111024126B (en) Self-adaptive zero-speed correction method in pedestrian navigation positioning
US20160363448A1 (en) Determining Sensor Orientation in Indoor Navigation
Höflinger et al. Indoor-localization system using a micro-inertial measurement unit (imu)
CN110986997A (en) Method and system for improving indoor inertial navigation precision
EP2778709B1 (en) User assisted location method
CN110672095A (en) Pedestrian indoor autonomous positioning algorithm based on micro inertial navigation
EP3227634B1 (en) Method and system for estimating relative angle between headings
CN110260860B (en) Indoor movement measurement positioning and attitude determination method and system based on foot inertial sensor
Kim et al. Performance improvement and height estimation of pedestrian dead-reckoning system using a low cost MEMS sensor
AU2015201877A1 (en) Method and system for locating and monitoring first responders
CN115523917A (en) Indoor emergency environment personnel positioning system based on emergency lamp
Sang et al. A self-developed indoor three-dimensional pedestrian localization platform based on MEMS sensors
Liu et al. Track Your Foot Step: Anchor-free Indoor Localization based on Sensing Users' Foot Steps
Park et al. Robust pedestrian dead reckoning for indoor positioning using smartphone
Kröger et al. Method of pedestrian dead reckoning using speed recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination