Disclosure of Invention
The invention aims to provide a fire accident emergency treatment drilling system which can be used for simulating and drilling a fire accident and has the advantages of convenience in use and high reliability.
In order to achieve the purpose, the invention adopts the technical scheme that:
a fire accident emergency treatment drilling system comprises a calculation processing device, a modeling device, a 3D printing device and a head-mounted display. The modeling device models a petrochemical fire scene, the 3D printing device rebuilds the scene at a reduced scale, and the head-mounted display together with mixed reality technology superimposes the virtual fire scene on a real scene sand table, so that an emergency simulation drill of the petrochemical fire scene is carried out through three-dimensional interaction.
Preferably, the calculation processing device comprises a main control computer and a plurality of branch control computers; the main control computer is arranged in the headquarters of a petrochemical enterprise, the branch control computers are located in each branch company, and the branch control computers are connected with the main control computer through signal lines and transmit their signals to the main control computer.
Preferably, the modeling device comprises a plurality of scene scanning devices and a plurality of signal positioning devices; the signal positioning devices are installed in a plurality of target plants of a petrochemical enterprise, each signal positioning device is matched with one scene scanning device, the scene scanning devices transmit the scenes in the target plants to the calculation processing device, and positioning is realized through the signal positioning devices.
Preferably, the 3D printing device comprises a 3D printer, and the 3D printer is connected with the computing processing device through a line; the real scene sand table comprises a sand table model, and the 3D printer places the printed petrochemical enterprise model in a specified area of the sand table model.
Preferably, the head-mounted display is wirelessly connected with the computing processing device through a wireless signal transmission module; the head-mounted display comprises a display helmet, and the display helmet is provided with two light guide transparent holographic lenses, a micro projector, a depth camera, an infrared LED and a gesture displacement sensor.
Preferably, the two light guide transparent holographic lenses and the micro projector are arranged at the front end of the middle part of the display helmet, and the display helmet uses a near-eye 3D diffraction display technology;
virtual content is rendered with DLP projection technology: the micro projector at the front projects onto the light guide transparent holographic lenses, from which the image enters the human eye, while light from the real world is also admitted.
Preferably, a plurality of gesture displacement sensors are connected to the front, left and right parts of the display helmet; the gesture displacement sensors acquire depth information of the environment, which a program combines in real time with the helmet's own acceleration, angular velocity and similar motion data; when an object moves, the depth information changes, and these changes build an incremental map, completing spatial recognition and positioning.
Preferably, the depth camera and the infrared LED capture pictures from two different angles; the captured signals are transmitted to the calculation processing device, which analyzes and reconstructs the motion information of the palm in the three-dimensional space of the real world through an algorithm.
The invention also aims to provide a fire accident drilling processing method.
In order to achieve the purpose, the invention adopts the technical scheme that:
a fire accident drilling processing method adopts the fire accident emergency treatment drilling system, and specifically comprises the following steps:
the method comprises the steps of firstly determining a petrochemical typical fire emergency disposal process and an accident evolution process control method, carrying out three-dimensional projection process carding on a petrochemical enterprise typical fire accident emergency plan through the disposal process and the process control method, then carrying out 1:1 accurate three-dimensional modeling analysis through a typical fire accident scene, and developing a petrochemical typical fire accident emergency disposal simulation drilling system based on the research of augmented reality interaction and optimization algorithm.
Preferably, the petrochemical fire emergency disposal process method includes the following steps:
step 101, analyzing a plurality of typical petrochemical fire accidents and screening the typical fire accidents to be addressed;
step 102, determining post responsibilities, and clearly specifying the work responsibilities of each employee;
step 103, defining the operation specification, and transmitting the operation specification to the staff in the form of an electronic file or a paper file;
step 104, researching accident evolution process control;
and step 105, making a system development scheme and a system process control scheme.
Preferably, the development content of the petrochemical fire accident emergency handling simulation drilling system comprises space simulation positioning and gesture interaction positioning.
Preferably, the spatial simulation positioning comprises basic positioning and spatial positioning recognition. Basic positioning is realized with a near-eye 3D diffraction display technology and two light guide transparent holographic lenses: virtual content is rendered by DLP projection, projected from the front micro projector onto the light guide lenses and then into the human eye, while light from the real world is also transmitted;
spatial positioning recognition acquires depth information of the environment with related sensors and combines it with the device's own acceleration, angular velocity and other motion factors; as an object moves, a program operating in real time tracks the changing depth information and builds an incremental map from the changes, completing spatial recognition and positioning.
Preferably, the positioning function of the spatial positioning recognition is to acquire environmental depth information by using a depth-of-field camera or to scan point cloud data by using a laser radar.
Preferably, the gesture interactive positioning method includes the following steps:
step 141, capturing pictures from two different angles through a depth camera and an infrared LED, and analyzing and reconstructing motion information of a palm in a three-dimensional space of a real world through an algorithm;
step 142, the system establishes a rectangular coordinate system at the bottom of the sensor: the origin is the center of the sensor, the X axis is parallel to the sensor and points to the right of the screen, the Y axis points upward, and the Z axis points away from the screen;
step 143, the built-in sensor periodically transmits information about the motion of the hand; each such transmission is a frame, and each frame includes list information of all palms and list information of all fingers;
step 144, the sensor assigns a unique ID to each of these objects; the ID does not change as long as the palm, fingers and tools remain in the field of view. From this information, each moving object can be queried; the object information is transmitted to the program, which assigns the individual hand node information to the 3D model;
step 145, recognizing the gesture by comparing the system's built-in gestures with the current gesture.
Preferably, the interaction events that can occur in the gesture interaction positioning include:
a gaze focus event, triggered when the line of sight enters or leaves an object; a Hold gesture event; press and release events during a click; a click event; a gesture manipulation event, which returns a gesture offset; a gesture navigation event, which returns a gesture offset in the value range [-1, 1]; and monitoring of the state of the event source, for example when the gesture leaves the detection area.
The invention has the beneficial effects that:
the invention develops immersive petrochemical scene fire accident emergency disposal simulation drilling complete equipment by applying a 3D printing technology and a mixed reality technology, matching with an accident scene professional model and an emergency disposal plan flow and based on a real scene sand table and an augmented reality head-mounted display which are reduced in equal proportion. Compared with the existing training method, the petrochemical scene fire accident emergency treatment simulation drill based on the 3D printing and mixed reality technology is improved by the following steps:
(1) Based on a professional petrochemical scene three-dimensional model library and standardized emergency plans, the mixed reality interactive drilling simulation system can 3D print different accident scenes and develop different plan flows using mixed reality technology.
(2) Traditional text-plan and picture-based teaching is converted into a three-dimensional, visual, immersive learning mode, improving the learning effect.
(3) The method can be applied at large scale, reducing training cost.
(4) It is not limited by time and space, and is convenient and flexible.
(5) Its high realism arouses the initiative of participants.
(6) On-site risk is avoided, ensuring drilling safety.
Detailed Description
The invention provides a fire accident emergency treatment drilling system and a fire accident drilling processing method; in order to make the purpose, technical scheme and effects of the invention clearer, the invention is further described in detail below. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
The invention is described in detail below with reference to the accompanying drawings:
example 1
With reference to fig. 1 and 2, the fire accident emergency disposal drilling system comprises a calculation processing device, a modeling device, a 3D printing device and a head-mounted display. The modeling device models a petrochemical fire scene, the 3D printing device rebuilds the scene at a reduced scale, and the head-mounted display together with mixed reality technology superimposes the virtual fire scene on a real scene sand table; petrochemical fire accident emergency disposal simulation drilling is then performed through three-dimensional interaction.
The calculation processing equipment comprises a main control computer and a plurality of branch control computers; the main control computer is arranged in the headquarters of the petrochemical enterprise, the branch control computers are located in each branch company, and the branch control computers are connected with the main control computer through signal lines and transmit their signals to the main control computer.
The modeling equipment comprises a plurality of scene scanning devices and signal positioning devices; the signal positioning devices are installed in a plurality of target plants of petrochemical enterprises, each signal positioning device is matched with one scene scanning device, the scene scanning devices transmit the scenes in the target plants to the calculation processing device, and positioning is realized through the signal positioning devices.
The 3D printing equipment comprises a 3D printer, and the 3D printer is connected with the computing processing equipment through a circuit; the real scene sand table comprises a sand table model, and the 3D printer places the printed petrochemical enterprise model in a specified area of the sand table model.
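For illustration only, the proportional reduction that lets the printed enterprise model fit the sand table's specified area can be sketched as a uniform scale-factor computation. This is a hypothetical sketch, not the patent's actual software; the function name and the example dimensions are assumptions.

```python
def sandbox_scale(plant_size_m, sandbox_size_m, margin=0.9):
    """Return a uniform scale factor so a plant footprint
    (length, width, height in metres) fits inside the sand
    table's printable region, leaving a safety margin."""
    ratios = [s * margin / p for p, s in zip(plant_size_m, sandbox_size_m)]
    return min(ratios)  # uniform scaling preserves proportions

# e.g. a 120 m x 80 m x 30 m tank farm into a 1.2 m x 0.9 m x 0.4 m area
scale = sandbox_scale((120, 80, 30), (1.2, 0.9, 0.4))
```

Taking the minimum ratio over all three axes keeps the model proportionally correct while guaranteeing it fits the tightest dimension.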
The head-mounted display is wirelessly connected with the computing processing equipment through the wireless signal transmission module; the head-mounted display comprises a display helmet, and the display helmet is provided with two light guide transparent holographic lenses, a micro projector, a depth camera, an infrared LED and a gesture displacement sensor.
The two light guide transparent holographic lenses and the micro projector are arranged at the front end of the middle part of the display helmet, and the display helmet uses a near-eye 3D diffraction display technology;
virtual content is rendered with DLP projection technology: the micro projector at the front projects onto the light guide transparent holographic lenses, from which the image enters the human eye, while light from the real world is also admitted.
A plurality of gesture displacement sensors are connected to the front, left and right parts of the display helmet; the gesture displacement sensors acquire depth information of the environment, which a program combines in real time with the helmet's own acceleration, angular velocity and similar motion data; when an object moves, the depth information changes, and these changes build an incremental map, completing spatial recognition and positioning.
The depth camera and the infrared LED capture pictures from two different angles; the captured signals are transmitted to the calculation processing equipment, which analyzes and reconstructs the motion information of the palm in the three-dimensional space of the real world through an algorithm.
Example 2
A fire accident drilling processing method adopts the fire accident emergency treatment drilling system, and specifically comprises the following steps:
the method comprises the steps of firstly determining a petrochemical typical fire emergency disposal process and an accident evolution process control method, carrying out three-dimensional projection process carding on a petrochemical enterprise typical fire accident emergency plan through the disposal process and the process control method, then carrying out 1:1 accurate three-dimensional modeling analysis through a typical fire accident scene, and developing a petrochemical typical fire accident emergency disposal simulation drilling system based on the research of augmented reality interaction and optimization algorithm.
The petrochemical typical fire emergency disposal flow method comprises the following steps:
step 101, analyzing a plurality of typical petrochemical fire accidents and screening the typical fire accidents to be addressed; step 102, determining post responsibilities and clearly specifying the work responsibilities of each employee; step 103, defining the operation specification and transmitting it to the staff as an electronic or paper file; step 104, researching accident evolution process control; and step 105, making a system development scheme and a system process control scheme.
Development of the petrochemical fire accident emergency disposal simulation drilling system covers space simulation positioning and gesture interaction positioning. Space simulation positioning comprises basic positioning and spatial positioning recognition: basic positioning uses a near-eye 3D diffraction display technology with two light guide transparent holographic lenses; virtual content is rendered by DLP projection, projected by the front micro projector onto the light guide lenses and then into the human eyes, while light from the real world is also transmitted in;
spatial positioning recognition acquires depth information of the environment with related sensors and combines it with the device's own acceleration, angular velocity and other motion factors; as an object moves, a program operating in real time tracks the changing depth information and builds an incremental map from the changes, completing spatial recognition and positioning.
The positioning function of the spatial positioning identification is to acquire environmental depth information by using a depth-of-field camera or scan point cloud data by using a laser radar.
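The incremental map building described above can be sketched minimally as an occupancy grid that accumulates depth readings taken at the current pose, so the map grows as the wearer moves. This is an illustrative stand-in for full SLAM, not the patent's implementation; the class, grid resolution and sample readings are assumptions.

```python
import math

class IncrementalMap:
    """Minimal 2-D occupancy-grid sketch: depth readings taken at the
    current pose are accumulated into a map, so the map grows
    incrementally as the wearer moves (a stand-in for full SLAM)."""
    def __init__(self, cell=0.25):
        self.cell = cell        # grid resolution in metres
        self.occupied = set()   # cells observed as occupied

    def integrate(self, pose, depth_hits):
        """pose = (x, y, heading); depth_hits = [(angle, range), ...]
        measured relative to the sensor."""
        x, y, th = pose
        for ang, rng in depth_hits:
            gx = x + rng * math.cos(th + ang)
            gy = y + rng * math.sin(th + ang)
            self.occupied.add((round(gx / self.cell), round(gy / self.cell)))

m = IncrementalMap()
m.integrate((0.0, 0.0, 0.0), [(0.0, 1.0)])   # wall 1 m ahead
m.integrate((0.5, 0.0, 0.0), [(0.0, 0.5)])   # same wall, seen after moving
```

Because both observations land in the same grid cell, the map stays consistent as the pose changes, which is the essence of the incremental update.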
The gesture interactive positioning method comprises the following steps:
step 141, capturing pictures from two different angles through a depth camera and an infrared LED, and analyzing and reconstructing motion information of a palm in a three-dimensional space of a real world through an algorithm;
step 142, the system establishes a rectangular coordinate system at the bottom of the sensor: the origin is the center of the sensor, the X axis is parallel to the sensor and points to the right of the screen, the Y axis points upward, and the Z axis points away from the screen;
step 143, the built-in sensor periodically transmits information about the motion of the hand; each such transmission is a frame, and each frame includes list information of all palms and list information of all fingers;
step 144, the sensor assigns a unique ID to each of these objects; the ID does not change as long as the palm, fingers and tools remain in the field of view. From this information, each moving object can be queried; the object information is transmitted to the program, which assigns the individual hand node information to the 3D model;
step 145, recognizing the gesture by comparing the system's built-in gestures with the current gesture.
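The comparison in step 145 can be sketched as template matching against a library of built-in gestures. For illustration only: the five-flag finger encoding, the template names and the tolerance are assumptions, not the patent's actual recognizer.

```python
def match_gesture(current, library, tolerance=1):
    """Compare the current hand state against built-in gesture
    templates. Each state is a tuple of five flags, one per finger
    (1 = extended, 0 = curled); the closest template within the
    allowed number of differing fingers wins."""
    best, best_diff = None, tolerance + 1
    for name, template in library.items():
        diff = sum(a != b for a, b in zip(current, template))
        if diff < best_diff:
            best, best_diff = name, diff
    return best  # None if nothing is close enough

# hypothetical built-in gesture library
BUILT_IN = {
    "open_palm": (1, 1, 1, 1, 1),
    "fist":      (0, 0, 0, 0, 0),
    "point":     (0, 1, 0, 0, 0),
}
```

A small tolerance lets noisy finger tracking still resolve to the intended built-in gesture, while clearly ambiguous poses are rejected.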
The interactive events which can occur in the gesture interactive positioning include:
a gaze focus event, triggered when the line of sight enters or leaves an object; a Hold gesture event; press and release events during a click; a click event; a gesture manipulation event, which returns a gesture offset; a gesture navigation event, which returns a gesture offset in the value range [-1, 1]; and monitoring of the state of the event source, for example when the gesture leaves the detection area.
Example 3
Addressing the shortcomings of existing petrochemical enterprise accident emergency handling drill methods, and the actual need of petrochemical enterprises and administrative departments to improve the accident emergency handling capacity of enterprise personnel, this embodiment provides an immersive, complete petrochemical tank area fire accident emergency handling drill system based on mixed reality technology. Building on three-dimensional modeling, 3D printing, mixed reality and related technologies, a typical petrochemical fire scene is accurately modeled and then reconstructed at a proportionally reduced scale with 3D printing, thereby avoiding space limitations.
Using an augmented reality head-mounted display and mixed reality technology, the virtual fire scene and the plan deduction flow are superimposed on a real scene sand table, and petrochemical fire accident emergency handling simulation drilling is carried out through three-dimensional interaction. This simulates a real accident scene and creates a highly immersive, multi-sensory experience and operation drilling environment for enterprise emergency personnel, strengthening the participants' overall control of the plan, detail operation and sense of reality during the drill, and meeting application requirements of petrochemical enterprises such as accident emergency plan learning and emergency drilling. The main technical scheme comprises the following parts:
1. Petrochemical typical fire emergency disposal flow and accident evolution process control. The emergency disposal flow, post responsibilities and operation specifications are determined according to a typical fire accident scene, and accident evolution process control is researched, so that a system development scheme and system process control can be conveniently formulated;
2. Sorting the typical fire accident emergency plan of the petrochemical enterprise into a three-dimensional projection flow;
3. Accurate 1:1 three-dimensional modeling of a typical fire accident scene, with the drilling scene printed at a proportionally reduced scale using 3D printing, solving the space limitation of traditional drilling;
4. Interaction and optimization algorithm research based on augmented reality, covering algorithms such as registration tracking, gesture recognition, three-dimensional interaction and model optimization of augmented reality technology, and providing theoretical support for the display, matching, scene superposition and operation functions of the interactive equipment;
5. Development of a petrochemical typical fire accident emergency disposal simulation drilling system. A tank area fire accident emergency disposal system is developed based on the petrochemical typical fire emergency disposal flow and accident evolution process control, achieving system function development and three-dimensional simulation.
Example 4
As an example, the fire accident emergency treatment drilling system and the petrochemical typical fire emergency treatment flow method are implemented in an existing petrochemical enterprise; the specific implementation is as follows.
A simulation evaluation method is developed by combining gesture recognition technology, Inside-Out positioning technology and image recognition technology, and is used for the simulation and evaluation of the tank area explosion emergency disposal process. It specifically comprises the following parts:
(1) Positioning of content in the fire incident
Positioning of content in the fire incident includes basic positioning and spatial positioning recognition (SLAM). The basic principle of basic positioning is near-eye three-dimensional (3D) diffraction display technology: the helmet is provided with two light guide transparent holographic lenses, virtual content adopts Digital Light Processing (DLP) projection technology, and the front micro projector projects onto the light guide lenses, from which the image enters the human eye while light from the real world is also transmitted.
Spatial positioning recognition, SLAM (simultaneous localization and mapping): related sensors acquire the depth information of the environment and match it with the device's own acceleration, angular velocity and the like; as an object moves, a program operating in real time tracks how the depth information changes and builds an incremental map from these changes, completing spatial recognition and positioning.
SLAM data can be acquired in several ways: one positioning approach uses a depth-of-field camera to obtain environmental depth information, and another uses laser radar to scan point cloud data.
(2) Gesture interactive positioning
Pictures are captured from two different angles through the depth camera and the infrared LED, and the motion information of the palm in the three-dimensional space of the real world is analyzed and reconstructed through an algorithm.
The system establishes a rectangular coordinate system at the bottom of the sensor: the origin is the center of the sensor, the X axis is parallel to the sensor and points to the right of the screen, the Y axis points upward, and the Z axis points away from the screen.
The built-in sensor periodically transmits information about the motion of the hand; each such transmission is a frame, and each frame contains list information of all palms and list information of all fingers.
The sensor assigns a unique ID to each of these objects; the ID does not change as long as the palm, fingers or tools remain within the field of view. From this information, each moving object can be queried. The object information is transmitted to the program, which assigns each hand node's information to the 3D model. The gesture is recognized by comparing the system's built-in gestures with the current gesture.
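The frame-and-ID data model described above can be sketched with a few small types: each frame carries palm and finger lists, and a tracked object keeps its unique ID while it stays in view and can be queried by that ID. This is an illustrative sketch of the data flow, not the sensor's actual API; the class and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Finger:
    finger_id: int
    tip_position: tuple  # (x, y, z) in the sensor's coordinate system

@dataclass
class Hand:
    hand_id: int          # stays constant while the hand remains in view
    palm_position: tuple
    fingers: List[Finger] = field(default_factory=list)

@dataclass
class Frame:
    frame_id: int
    hands: List[Hand] = field(default_factory=list)

    def hand(self, hand_id: int) -> Optional[Hand]:
        """Query a tracked object by its unique ID, as described above."""
        return next((h for h in self.hands if h.hand_id == hand_id), None)

frame = Frame(frame_id=1, hands=[Hand(7, (0.0, 120.0, 0.0))])
```

The program would read such frames periodically and map each hand node's information onto the corresponding 3D model node.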
The interaction events include:
1. a gaze focus event, triggered when the line of sight enters or leaves an object;
2. a Hold gesture event;
3. press and release events during a click;
4. a click event;
5. a gesture manipulation event, which returns a gesture offset;
6. a gesture navigation event, which returns a gesture offset in the value range [-1, 1];
7. monitoring of the state of the event source, for example when the gesture leaves the detection area.
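The event list above can be sketched as a tiny dispatcher in which handlers subscribe by event name and receive the event payload. This is a hypothetical sketch of how such events might be wired up; the event names and bus API are assumptions, not the system's actual interface.

```python
class GestureEventBus:
    """Tiny dispatcher sketch for the interaction events listed above:
    handlers subscribe by event name ('gaze_enter', 'hold', 'press',
    'release', 'click', 'manipulate', 'navigate', 'source_lost') and
    receive the event payload, e.g. a gesture offset in [-1, 1]."""
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event, payload=None):
        for handler in self.handlers.get(event, []):
            handler(payload)

bus = GestureEventBus()
offsets = []
bus.on("navigate", offsets.append)       # navigation events carry an offset
bus.emit("navigate", (0.4, -0.2, 0.0))   # offset components in [-1, 1]
```

The same bus could route a `source_lost` event when the gesture leaves the detection area, letting the drill UI react appropriately.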
(3) Application of image recognition technology
The motion tracking part of the AR image recognition technique uses an AR camera to identify feature points and track the movement of these points over time. Combining the movement of these points with the readings of the head-mounted display's inertial sensors, the AR image recognition technique can determine the position and orientation of the device as it moves.
In addition to identifying feature points, AR image recognition techniques detect flat surfaces (e.g., tables or floors) and estimate the average illumination intensity of surrounding areas. Together, these functions allow the AR image recognition technique to build its own understanding of the surrounding world.
With this understanding of the real world, objects, annotations or other information can be added in a manner that is seamlessly integrated with the real world.
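The fusion of feature-point movement with inertial readings can be illustrated with a simple complementary filter: the pose predicted by integrating the inertial sensors is corrected toward the pose estimated from the tracked features. This is a deliberately simplified sketch of visual-inertial tracking, not the actual AR algorithm; the function and weighting are assumptions.

```python
def fuse_pose(imu_pose, visual_pose, alpha=0.8):
    """Complementary-filter sketch of visual-inertial tracking:
    the pose predicted by integrating the inertial sensors is
    corrected toward the pose estimated from tracked feature
    points; alpha weights the visual estimate."""
    return tuple(alpha * v + (1 - alpha) * i
                 for i, v in zip(imu_pose, visual_pose))

# IMU drift says we are at x = 1.10 m; the feature tracker says 1.00 m
fused = fuse_pose((1.10, 0.0, 0.0), (1.00, 0.0, 0.0))
```

Weighting the drift-free visual estimate heavily while keeping some inertial contribution gives smooth, low-latency tracking between camera frames.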
(4) Use of basic functions
1. Spatial scanning and positioning: the spatial position is displayed on inspection, and the current position is simulated through the positioning function.
2. Dynamic loading of the model: a loadable region is identified by image recognition, and the tank area model is dynamically loaded onto the 3D printed model.
3. Model identification: information display is supported for equipment such as the tank area, fire engines and fire facilities.
4. Background introduction: a 3D pop-up UI is supported to broadcast the tank field background information.
Secondly, simulating the fire accident, which comprises the following steps:
(5) sealing ring fire accident simulation
1. Leakage simulation: the crude oil leakage effect is expressed through a particle effect or by dynamically transforming the model UV.
2. Lightning strike simulation: a lightning strike on the tank top is simulated through a particle effect.
3. Flame simulation: after the simulated lightning strike on the tank top, the tank-top sealing ring is triggered to catch fire, rendered with a particle effect.
4. Sound effects: an alarm bell sound effect is played through the AudioSource, together with a highlight effect.
5. Voice broadcasting: playing of voice broadcast alarm events is supported.
(6) Emergency rescue simulation
1. Spray opening: the user is supported in simulating a tank field worker, and the spraying area is highlighted.
2. Click gesture: the user triggers the spray opening through a click gesture event.
3. Particle effect simulation: display of spray particle effects is supported.
4. Process flow handling: voice prompts are supported to guide the user in adjusting the process.
5. Fire-fighting animation simulation: the Animation system dynamically displays the fire alarm animation and simulates the driving route.
6. Vehicle deployment function: the fire trucks are animated to move and arrive in sequence at designated on-site positions according to the deployment drawing. The system directs the user to deploy the fire trucks (after a gesture click, the deployment position to which the vehicle drives is highlighted), then directs the user to begin the fire-fighting program (areas that can be suppressed or extinguished are highlighted).
7. Deployment scheme: the fire trucks take position according to the principles of effective suppression and convenient evacuation, and are deployed according to the tank field fire-fighting scheme.
8. Warning simulation: voice broadcasting to surrounding areas is supported.
9. Radiation range display: display of the radiation area after the fire occurs is supported.
10. Fire scene information display: display of the foam mixed liquid consumption of the crude oil storage tank is supported.
11. Accident extinguishing animation simulation: an animation of the floating roof's flames diminishing and gradually being extinguished is supported.
(7) Ending evacuation scene
1. Ending simulation: display of the closing stage after the simulated fire-fighting accident is supported.
2. Fire truck evacuation: the fire trucks can be evacuated automatically after being clicked.
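The drill flow of this example, from leakage through extinguishing to evacuation, can be sketched as a linear state machine that only advances when the participant completes the current stage. This is an illustrative sketch of the scenario sequencing, not the system's actual implementation; the stage names are assumptions drawn from the steps above.

```python
class DrillScenario:
    """Linear state-machine sketch of the drill flow described in
    this example: each stage must be completed in order, from the
    leakage through to the fire trucks' evacuation."""
    STAGES = ["leak", "lightning", "sealing_ring_fire", "alarm",
              "spray_open", "truck_deployment", "extinguish", "evacuate"]

    def __init__(self):
        self.index = 0

    @property
    def stage(self):
        return self.STAGES[self.index] if self.index < len(self.STAGES) else "done"

    def advance(self, completed_stage):
        """Advance only when the participant completes the current stage."""
        if completed_stage == self.stage:
            self.index += 1
            return True
        return False  # out-of-order action: stay in the current stage

drill = DrillScenario()
drill.advance("leak")
drill.advance("extinguish")   # rejected: lightning comes next
```

Rejecting out-of-order actions is what lets the system evaluate whether participants follow the emergency disposal flow correctly.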
The immersive petrochemical fire accident emergency disposal simulation drilling system is developed by applying 3D printing technology and mixed reality technology, together with professional accident scene models and emergency disposal plan flows, based on a proportionally reduced real scene sand table and an augmented reality head-mounted display.
Compared with existing training methods, the petrochemical fire accident emergency treatment simulation drill based on 3D printing and mixed reality technology offers the following improvements:
1) Based on a professional petrochemical scene three-dimensional model library and standardized emergency plans, the mixed reality interactive drilling simulation system can 3D print different accident scenes and develop different plan flows using mixed reality technology.
2) Traditional text-plan and picture-based teaching is converted into a three-dimensional, visual, immersive learning mode, improving the learning effect.
3) The method can be applied at large scale, reducing training cost.
4) It is not limited by time and space, and is convenient and flexible.
5) Its high realism arouses the initiative of participants.
6) On-site risk is avoided, ensuring drilling safety.
It is to be understood that the above description is not intended to limit the present invention, and the present invention is not limited to the above examples, and those skilled in the art may make modifications, alterations, additions or substitutions within the spirit and scope of the present invention.