EP3062297A1 - Emergency guidance system and method - Google Patents

Emergency guidance system and method

Info

Publication number
EP3062297A1
EP3062297A1 (application EP15275042.8A)
Authority
EP
European Patent Office
Prior art keywords
user
building
mixed reality
reality environment
data representative
Prior art date
Legal status
Ceased
Application number
EP15275042.8A
Other languages
German (de)
French (fr)
Inventor
The designation of the inventor has not yet been filed
Current Assignee
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date
Filing date
Publication date
Application filed by BAE Systems PLC
Priority application: EP15275042.8A
International application: PCT/GB2016/050366, published as WO2016135448A1
Publication of EP3062297A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B7/00 Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06 Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
    • G08B7/062 Signalling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, indicating emergency exits
    • G08B7/066 Signalling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, guiding along a path, e.g. evacuation path lighting strip

Definitions

  • control apparatus for a mixed reality guidance system as described above, said control apparatus comprising a storage module having stored therein a three-dimensional virtual model of a building or other structure, a processing module configured to receive, from a positioning module, location data representative of the current location of a user, determine a required location for said user relative to said building or structure and calculate a recommended route for said user from their current location to said required location and generate navigation data representative of said recommended route, the processing module being further configured to receive, from said positioning module, updated location data representative of the current location of the user as they move through said building or structure and generate updated navigation data representative of said recommended route accordingly.
  • the processing module may be configured to receive, from a plurality of positioning modules, location data representative of the respective current locations of a plurality of users, generate a required location for each said user, calculate a respective recommended route for each user from their current location to their required location, and generate respective navigation data representative of each recommended route, the processor being further configured to receive, from each said positioning module, updated location data representative of the current location of each respective user as they move through said building or structure and generate updated navigation data representative of their respective recommended route accordingly.
  • the processor may be further configured to receive sensor data from the current location of at least one of said users and use said sensor data in said calculation of one or more of said recommended routes.
  • the control apparatus may further include a storage module for storing data representative of the current occupants of said building or structure.
  • a mixed reality emergency guidance system for use within a building or other structure, the system comprising at least one headset for placing over a user's eyes, in use, the or each headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a respective user's field of view and display said mixed reality environment on said screen, the system further comprising control apparatus as described above.
  • a method of providing a guidance system for a building or other structure comprising providing a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the method further comprising providing a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, providing a positioning module for determining the current location of said user within said building or other structure, providing a processing module and configuring said processing module to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and providing an image processing module configured to generate image data representative of said
  • Virtual reality systems comprising a headset which, when placed over a user's eyes, creates and displays a three dimensional virtual environment in which a user feels immersed and with which the user can interact in a manner dependent on the application.
  • the virtual environment may comprise a game zone, within which the user can play a game.
  • augmented and mixed reality systems have been developed, wherein image data in respect of a user's real world environment can be captured, rendered and placed within a 3D virtual reality environment.
  • the user views their real world environment as a three dimensional virtual world generated using images captured from their surroundings.
  • a mixed reality display system may include a headset 100 comprising a visor 10 having a pair of arms 12 hingedly attached at opposing sides thereof in order to allow the visor to be secured onto a user's head, over their eyes, in use, by placing the curved ends of the arms 12 over and behind the user's ears, in a manner similar to conventional spectacles.
  • the headset is illustrated herein in the form of a visor, it may alternatively comprise a helmet for placing over a user's head, or even a pair of contact lenses or the like, and the present invention is not intended to be in any way limited in this regard.
  • a pair of image capture devices 14 for capturing images of the environment, such image capture devices being mounted as closely as possible aligned with the user's eyes, in use.
  • a typical mixed reality system further comprises a processor, which is communicably connected in some way to a screen which is provided inside the visor 10.
  • Such communicable connection may be a hard wired electrical connection, in which case the processor and associated circuitry will be mounted on the headset.
  • the processor may be configured to wirelessly communicate with the visor, for example, by means of Bluetooth or similar wireless communication protocol, in which case, the processor need not be mounted on the headset but can instead be located remotely from the headset, with the relative allowable distance between them being dictated, and limited only by, the wireless communication protocol being employed.
  • the processor could be mounted on, or formed integrally with, the user's clothing, or instead located remotely from the user, either as a stand-alone unit or as an integral part of a larger control unit, for example.
  • the processor receives image data from the image capture devices, and renders and blends such image data, in real time, into a displayed three dimensional virtual environment.
  • the concept of real time image blending for augmented or mixed reality is known, and several different techniques have been proposed.
  • the present invention is not necessarily intended to be limited in this regard.
  • one exemplary method for image blending will be briefly described.
  • a threshold function may be applied in order to extract that object from the background image. Its relative location and orientation may also be extracted and preserved by means of marker data.
  • the image and marker data is converted to a binary image, possibly by means of adaptive thresholding (although other methods are known).
  • the marker data and binary image are then transformed into a set of coordinates that match the location within the virtual environment in which they will be blended.
  • Such blending is usually performed using black and white image data.
  • colour data sampled from the source image can be backward warped, using homography, to each pixel in the resultant virtual scene. All of these computational steps require minimal processing and time and can, therefore, be performed quickly and in real (or near real) time.
  • image data within the mixed reality environment can be updated in real time.
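The blending pipeline outlined above (thresholding to a binary image, transformation into scene coordinates, and backward warping of colour data via a homography) can be sketched in a few lines of numpy. This is a generic illustration of the technique, not the patented implementation; in particular, a global-mean threshold stands in for the adaptive thresholding mentioned, and nearest-neighbour sampling is used for the warp:

```python
import numpy as np

def binarize(gray, thresh=None):
    """Threshold a captured greyscale frame to a binary image.

    The text mentions adaptive thresholding; a single global threshold
    (the frame mean) is used here for brevity.
    """
    if thresh is None:
        thresh = gray.mean()
    return (gray > thresh).astype(np.uint8)

def backward_warp(src, H, out_shape):
    """Backward-warp colour data into the virtual scene via homography H.

    For each destination pixel, the inverse homography gives the source
    pixel to sample (nearest-neighbour), i.e. colour is pulled from the
    source image into the resultant scene, as described above.
    """
    h, w = out_shape
    out = np.zeros((h, w, src.shape[2]), dtype=src.dtype)
    ys, xs = np.mgrid[0:h, 0:w]
    # Homogeneous destination coordinates, mapped back into the source image.
    dst = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    sx, sy, sw = np.linalg.inv(H) @ dst
    sx = np.round(sx / sw).astype(int)
    sy = np.round(sy / sw).astype(int)
    valid = (0 <= sx) & (sx < src.shape[1]) & (0 <= sy) & (sy < src.shape[0])
    out.reshape(-1, src.shape[2])[valid] = src[sy[valid], sx[valid]]
    return out
```

Because every destination pixel is computed independently from array operations, a sketch like this runs in (near) real time even without GPU help, consistent with the minimal-processing claim above.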
  • an emergency guidance system comprises at least one mixed reality headset 100 and an emergency response processing system 104, which may be integrated in or mounted on the headset 100 and/or provided at a fixed location within the building or structure and configured for wireless communication with the headset 100. It is envisaged that some exemplary embodiments may comprise a central emergency response processing system for communication with a single headset, or in parallel with a plurality of headsets.
  • the processing functionality of the emergency response processing system may be distributed, partly or fully, amongst individual processing units provided in or on the headsets, which may or may not be the same processing units used to provide, on the screen, the mixed reality images of the wearer's environment derived from image data captured by the image capture devices on the headset, and the present invention is not necessarily intended to be in any way limited in this regard.
  • the processing system 104 is configured to receive, from one or more external sources 108, 110, data representative of, for example, the structural status of the building or structure, the health and/or status of (at least) key equipment therein, the nature and/or status of an emergency situation, the location of hazardous elements of an emergency situation (i.e. the location of a fire, for example), the location of other occupants within the building or structure, etc.
  • the processing system 104 generally includes an interface to enable data to be transmitted therefrom and received thereby, in order that data that could potentially be changing dynamically is updated in real (or near real) time.
  • the processing system 104 may be configured to receive, or have stored therein, a three dimensional, virtual model 112 of the building or structure.
  • the processing functionality of the above-described emergency response processing system may be provided by means of more than one processor. Indeed, several processors may be required to facilitate embodiments of the present invention, some of which may be dedicated system processors, whether remote or on-board (i.e. mounted in or on the one or more headsets 100), and others of which may be processors or other data processing devices incorporated in the network infrastructure of the building, and the present invention is not necessarily intended to be limited in this regard.
  • the processing function may be provided by an entirely de-centralised network.
  • this functionality may be provided by a "mesh network", which is configured to self-initiate (in response to an emergency situation or otherwise) and build a network using distributed devices.
  • each node may pass data along to another available node, in the manner of a "daisy chain", such that not all nodes in the network need to be within communication range of each other.
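The daisy-chain behaviour can be sketched as a simple flood over device positions: a node joins the self-building mesh as soon as any already-joined node is within radio range, so end-to-end coverage does not require all nodes to be in direct range of each other. The node names, coordinates and radio range below are hypothetical:

```python
import math

def mesh_members(nodes, radio_range, source):
    """Flood outward from `source` over {name: (x, y)} device positions.

    A device joins the mesh if any already-joined device is within
    radio range (the 'daisy chain'), so data can be relayed hop by hop.
    """
    joined = {source}
    frontier = [source]
    while frontier:
        nxt = []
        for a in frontier:
            for b in nodes:
                if b not in joined and math.dist(nodes[a], nodes[b]) <= radio_range:
                    joined.add(b)
                    nxt.append(b)
        frontier = nxt
    return joined
```

With devices at (0, 0), (5, 0) and (10, 0) and a 6-unit radio range, all three form one mesh even though the end devices are 10 units apart, while an isolated device at (100, 0) is excluded.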
  • the or each headset 100 may include an internal geo-location system for generating data representative of the relative location, within the building or structure, of the wearer of the respective headset and transmit such data, continuously or otherwise, to the processing system 104.
  • a user places a headset 100 over their eyes, and the processing system 104 is configured, based on the current location of the wearer within the building or structure, to calculate, using the above-mentioned 3D virtual model 112 of the building or structure, the safest and/or quickest route from the wearer's location to a required destination. It will be appreciated that, in many cases, this may be an escape route, but it may also be a route toward the hazard or emergency, depending on the role of the wearer within the situation.
  • Calculation of the above-mentioned route may be performed in a similar manner to that used in in-car satellite navigation systems.
  • the processing system identifies the wearer's current location and the required destination. It then determines the current status of the connecting paths between those two locations, based on the 3D virtual model 112 (for permanent status aspects) and from data received from external sources (to take into account the dynamically changing environment).
  • Status parameters may include whether or not two proximal paths or corridors are physically connected (or separated by a wall or locked door) and actually navigable (i.e. not blocked by an obstacle or too narrow to pass through safely).
  • the processing system may also identify path to path 'cost', in terms of, for example, the number of turns and corners to be navigated, presence or absence of doors, etc. The processing system then identifies the shortest and/or simplest route having the lowest 'cost'.
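The lowest-'cost' search described above is analogous to in-car satellite navigation and can be sketched as a standard Dijkstra shortest-path search over the building's connectivity graph. The node names, edge costs and dict-of-adjacency-lists encoding below are invented for illustration and are not taken from the patent:

```python
import heapq

def recommended_route(graph, start, goal):
    """Dijkstra over {node: [(neighbour, cost), ...]}.

    Edge weights encode the path-to-path 'cost' (distance plus
    penalties for doors, turns, etc.). Returns (total_cost, path),
    or (inf, []) if no navigable route exists.
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    seen = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in seen:
            continue
        seen.add(node)
        if node == goal:
            path = [node]
            while node in prev:  # walk predecessors back to the start
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []
```

A corridor that is not navigable is simply omitted from the graph (or given a prohibitive cost), so the search naturally returns the cheapest route that remains.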
  • the processing system 104 generates appropriate navigation instructions, generates virtual representations of such navigational instructions and overlays them, or otherwise blends them, in the virtual environment displayed on the screen within the headset 100.
  • the wearer can see their immediate environment (derived from rendered and blended image data captured by the image capture devices on the headset) together with visual navigation aids directing them along the recommended route.
  • the navigational image data may include indications of areas through which the wearer cannot pass, for example, a locked door.
  • a still image of what the wearer may see on their screen, according to one exemplary embodiment of the invention, is illustrated in Figure 3 of the drawings.
  • the visual navigational aids may be supplemented, or even replaced, with voice guidance emitted through speakers provided within the headset.
  • the 3D environment displayed on the screen is continuously updated, in real time, using images captured by the image capture devices.
  • the processing system 104 is continuously updated with the wearer's current location, such that the displayed navigation data can also be updated accordingly.
  • structural sensors and equipment health reporting systems may be used to supply relevant data to the processing system 104, such that the wearer's route can be dynamically updated to take into account changing conditions.
  • data from heat sensors can be used to identify the location of the fire within the structure, such that the calculated route is configured to avoid it (or the route re-calculated, as required).
  • the processing system is configured to re-calculate the route accordingly, to ensure that the wearer avoids any hazard.
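One way the re-calculation might consume such sensor data is to prune hazardous locations from the connectivity graph before the planner's next pass, so any route computed afterwards automatically steers around the hazard. This helper is a hypothetical sketch; the dict-of-adjacency-lists encoding is an assumption:

```python
def prune_hazards(graph, hazard_nodes):
    """Drop hazardous nodes, and every corridor segment leading into
    them, from a {node: [(neighbour, cost), ...]} connectivity graph.
    """
    return {node: [(nbr, cost) for nbr, cost in edges if nbr not in hazard_nodes]
            for node, edges in graph.items() if node not in hazard_nodes}
```

For example, when heat sensors flag a corridor as burning, the route planner would be re-run on `prune_hazards(graph, {"corridor_2"})` (the node name is hypothetical) and the fresh route pushed to the headset display.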
  • the processing system may be configured to generate and insert a visual representation of an obstacle or hazard in a user's vicinity into the 3D virtual environment displayed on their screen.
  • the processing system 104 is configured to collate available sensor data from appropriate sources with the aim of ensuring that the wearer of the headset is guided around blockages, breaches, fire, heat, chemical spills and/or other potential hazards, as appropriate.
  • the system includes a 3D virtual model 112 of the building or structure.
  • the processing system may be configured to overlay image data representative of permanent features (such as walls, corners, stairs, etc.) onto the 3D mixed reality images displayed to the user (generated from the images captured by the image capture devices on their headsets), using knowledge of the user's absolute location and/or one of a number of image matching techniques.
  • the user 200 may be presented with a 3D virtual image 202 of their immediate environment including a visual representation 204 of the recommended route, an overlayed image 206 of permanent features of the building infrastructure, and known (or identified) hazards 208.
  • each headset may include a seal such that the visor can be sealed over the wearer's eyes, in use, thereby preventing smoke and other noxious substances from reaching their eyes and potentially compromising their vision.
  • the headset could include a respiration filter within a mask portion for covering the user's nose and mouth, to prevent inhalation of smoke or other irritant or toxic substances, and aid breathing.
  • embodiments of the present invention provide an emergency guidance system and method, wherein a mixed reality headset is provided with a live connection to the building infrastructure via, for example, a central control system which integrates several such headsets and employs distributed sensors, machine health information modules, and pre-existing emergency detection systems, functionally integrated therein, so as to provide a mixed reality system which generates and displays a live route in an emergency situation and which can also identify dangers and hazards in a dynamically changing environment.
  • the headset itself may be provided with sensors such as ambient temperature sensors, oxygen quality sensors and even health monitors in respect of the wearer. Multi-spectral cameras may also be provided to identify additional sources of heat, and even radiation sensors could be employed, depending on the environment for which the system is intended. It is envisaged, in accordance with some exemplary embodiments, that multiple headsets would be communicably coupled to a central control system and to each other to enable gathered data to be shared, thereby to increase the overall situational awareness of the system.
  • a single processing system can be used to generate dynamically updated, optimum routes in respect of a number of different users (and headsets), as illustrated schematically in Figure 2 , wherein the route calculated and updated for each wearer will be dependent on their individual respective location within the building or structure and their role within the emergency situation, using data from external, static sensors within the infrastructure of the environment and/or data from sensors mounted on-board their respective headsets.
  • sensor data from other headsets within the system may additionally be used by the processing system to identify data relevant to a particular user.
  • the processing system may be configured to coordinate multiple users' locations, movements and routes, such that each individual user's route can be calculated taking into account the location and movement of other users so as to ensure, for example, that localised crowds or bottle necks within the only or principal thoroughfares can be avoided or at least minimised.
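One simple way such coordination might be realised is to route users one at a time and surcharge corridor segments already assigned, so later users are steered away from likely bottlenecks. The following greedy sketch is illustrative only; the graph encoding, function names and penalty value are assumptions, not the patent's method:

```python
import heapq

def shortest_path(g, start, goal):
    """Plain Dijkstra over {node: [(neighbour, cost), ...]}, returning the path."""
    heap = [(0.0, start, [start])]
    best = {}
    while heap:
        d, node, path = heapq.heappop(heap)
        if node == goal:
            return path
        if best.get(node, float("inf")) < d:
            continue  # stale queue entry
        best[node] = d
        for nbr, cost in g.get(node, []):
            if d + cost < best.get(nbr, float("inf")):
                heapq.heappush(heap, (d + cost, nbr, path + [nbr]))
    return []

def coordinated_routes(graph, users, goal, congestion_penalty=2.0):
    """Assign each user ({name: start_node}) a route in turn, then
    surcharge every corridor segment on that route so subsequent
    users prefer emptier paths.
    """
    g = {n: list(edges) for n, edges in graph.items()}
    routes = {}
    for user, start in users.items():
        path = shortest_path(g, start, goal)
        routes[user] = path
        busy = set(zip(path, path[1:]))  # directed segments now in use
        for n in g:
            g[n] = [(m, c + congestion_penalty) if (n, m) in busy else (m, c)
                    for m, c in g[n]]
    return routes
```

With two users starting in the same lobby and two corridors to the exit, the first user is routed via the cheaper corridor and the second is diverted to the alternative once the first corridor carries a surcharge.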
  • the main processing system will be remote from, and wirelessly coupled to, the headsets 100, either in a fixed location within the building or structure or in or on one of the headsets (intended for example for use by a team leader or safety officer).
  • each headset may include a local processing system with similar (or possibly reduced) functionality, such that the headsets can still function adequately, in the event of a main system failure, to guide the wearer to safety.
  • an exemplary embodiment of the present invention may provide an emergency guidance system which can be used to coordinate the movements of several people, whereby each user's headset is communicably coupled to the processing system and also, optionally, to each other.
  • the size, location and nature of the group as a whole can additionally be taken into account, thus, for example, enabling the management and generation of alternative routes to allow emergency crews access to relevant areas.
  • the resultant system can potentially report safe routes that have provided others with a safe escape, provide an active list of all people still within the building or structure, and report building health and hazard locations to emergency crews for coordination purposes.
  • sensors worn on each user's person may be configured to transmit data representative of the respective user's vital signs and/or health status to the system, for provision to, for example, the emergency services, so as to potentially enable diagnosis of injuries such as burns, lung damage or other injuries.
  • Various exemplary embodiments of the present invention are envisaged for use in various different environments, including, but not limited to, buildings, surface and sub-surface marine vessels, offshore oil rigs, oil refineries, and other complex internal and external environments.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A mixed reality emergency guidance system, and method of providing same, for use within a building or other structure, the system comprising a headset (100) for placing over a user's eyes, in use, the headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of the real world environment into the three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display the mixed reality environment on said screen. A storage module (112) is provided, having stored therein a three-dimensional virtual model of the interior layout of the building or other structure. A positioning module determines the current location of the user within the building or other structure, and a processing module (104) is configured to calculate a recommended escape, or other, route from the current location of said user to a second location relative to the building or structure and generate navigation data representative of the recommended route. An image processing module generates image data representative of the navigation data and displays the image data within the mixed reality environment on the screen.

Description

  • This invention relates generally to an emergency guidance system and method and, more particularly but not necessarily exclusively, to a visual guidance system and method for use in assisting users to escape from, or evacuate, a building or other structure in an emergency situation.
  • There are many potential emergency situations in which occupants of a building or other structure would be required to escape therefrom, as quickly as possible and by means of the quickest, but also the safest, route. Statutory health and safety regulations specify that signs illustrating and explaining emergency procedures and escape routes are clearly displayed within all public and corporate buildings and structures, which are intended to inform occupants as to the emergency and evacuation procedures for a specific building or structure, and provide guidance and/or directions as to the quickest escape route from their current location (i.e. near the sign).
  • However, there are a number of issues associated with this type of passive information and guidance facility. Firstly, a user may not be familiar with their environment, and have difficulty, especially under pressure, in determining the correct escape route by reference to a two dimensional floor plan or map. Furthermore, once they have moved away from the sign, they do not have any ongoing reference. Still further, unknown hazards may exist or occur along the signposted exit route, of which a person may be unaware until they actually reach it, possibly causing injury and/or forcing them to take an alternative route with which they may be unfamiliar. Finally, during some types of emergency, smoke or other noxious substances may severely obscure a person's vision and/or affect their ability to safely navigate the exit route.
  • It would therefore be desirable to provide an emergency guidance system and method which provides more effective and intuitive emergency guidance and addresses at least some of the issues outlined above.
  • In accordance with a first aspect of the present invention, there is provided a mixed reality guidance system for use within a building or other structure, the system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the system further comprising a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, a positioning module for determining the current location of said user within said building or other structure, a processing module configured to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.
  • The image data may comprise navigational symbols, overlayed or blended into said mixed reality environment so as to be representative of said recommended route. Such navigational symbols may be updated within said mixed reality environment using updated location data from said positioning module as said user moves through the interior of the building or other structure.
  • The image processing module may be further configured to obtain, from said three-dimensional virtual model, image data representative of selected fixed features of the interior of the building or other structure within said real world environment in the vicinity of the user, and overlay or blend said image data into said mixed reality environment in respect of corresponding features therein.
  • In an exemplary embodiment of the present invention, the system may be configured to receive data from at least one external sensor indicative of a hazard or obstacle in or on said recommended route, and re-calculate said recommended route to circumnavigate said hazard or obstacle. In this case, the system may comprise an image processing module for generating image data representative of said hazard or obstacle, and overlaying or blending said image data into said mixed reality environment displayed on said screen.
  • The positioning module may be mounted in or on said headset.
  • The image capture means may comprise at least one image capture device, and preferably two image capture devices, mounted on said headset so as to be substantially aligned with a user's eyes, in use.
  • The processing module may be configured to receive, from remote sensors, data representative of the health or structural or environmental status of the building or other structure and/or equipment located therein.
  • The headset may comprise a face mask, configured to be worn over a user's nose and/or mouth, in use, and include a respirator. The face mask may be provided with a fume seal configured to form an air tight seal between said face mask and a user's face, in use.
  • In accordance with another aspect of the present invention, there is provided control apparatus for a mixed reality guidance system as described above, said control apparatus comprising a storage module having stored therein a three-dimensional virtual model of a building or other structure, a processing module configured to receive, from a positioning module, location data representative of the current location of a user, determine a required location for said user relative to said building or structure and calculate a recommended route for said user from their current location to said required location and generate navigation data representative of said recommended route, the processing module being further configured to receive, from said positioning module, updated location data representative of the current location of the user as they move through said building or structure and generate updated navigation data representative of said recommended route accordingly.
  • The processing module may be configured to receive, from a plurality of positioning modules, location data representative of the respective current locations of a plurality of users, generate a required location for each said user, calculate a respective recommended route for each user from their current location to their required location, and generate respective navigation data representative of each recommended route, the processor being further configured to receive, from each said positioning module, updated location data representative of the current location of each respective user as they move through said building or structure and generate updated navigation data representative of their respective recommended route accordingly.
  • The processor may be further configured to receive sensor data from the current location of at least one of said users and use said sensor data in said calculation of one or more of said recommended routes.
  • The control apparatus may further include a storage module for storing data representative of the current occupants of said building or structure.
  • Another aspect of the present invention extends to a mixed reality emergency guidance system, for use within a building or other structure, the system comprising at least one headset for placing over a user's eyes, in use, the or each headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a respective user's field of view and display said mixed reality environment on said screen, the system further comprising control apparatus as described above.
  • In accordance with yet another aspect of the present invention, there is provided a method of providing a guidance system for a building or other structure, the method comprising providing a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the method further comprising providing a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, providing a positioning module for determining the current location of said user within said building or other structure, providing a processing module and configuring said processing module to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and providing an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.
  • These and other aspects of the present invention will be apparent from the following specific description in which embodiments of the present invention are described, by way of examples only, and with reference to the accompanying drawings, in which:
    • Figure 1 is a front perspective view of a headset for use in a mixed reality system in respect of which a method and apparatus according to an exemplary embodiment of the present invention can be provided;
    • Figure 2 is a schematic block diagram illustrating the configuration of some principal elements of a mixed reality system for use in an exemplary embodiment of the present invention;
    • Figure 3 is a schematic illustration of a single image frame displayed on the screen of a mixed reality system according to an exemplary embodiment of the present invention;
    • Figure 4 is a schematic diagram illustrative of the configuration of a mixed reality emergency guidance system according to an exemplary embodiment of the present invention; and
    • Figure 5 is a schematic diagram illustrative of the configuration of a mixed reality emergency guidance system according to another exemplary embodiment of the present invention.
  • Virtual reality systems are known, comprising a headset which, when placed over a user's eyes, creates and displays a three dimensional virtual environment in which a user feels immersed and with which the user can interact in a manner dependent on the application. For example, the virtual environment may comprise a game zone, within which the user can play a game.
  • More recently, augmented and mixed reality systems have been developed, wherein image data in respect of a user's real world environment can be captured, rendered and placed within a 3D virtual reality environment. Thus, the user views their real world environment as a three dimensional virtual world generated using images captured from their surroundings.
  • Referring to Figure 1 of the drawings, a mixed reality display system may include a headset 100 comprising a visor 10 having a pair of arms 12 hingedly attached at opposing sides thereof in order to allow the visor to be secured onto a user's head, over their eyes, in use, by placing the curved ends of the arms 12 over and behind the user's ears, in a manner similar to conventional spectacles. It will be appreciated that, whilst the headset is illustrated herein in the form of a visor, it may alternatively comprise a helmet for placing over a user's head, or even a pair of contact lenses or the like, and the present invention is not intended to be in any way limited in this regard. Also provided on the headset is a pair of image capture devices 14 for capturing images of the environment, such image capture devices being mounted so as to be aligned as closely as possible with the user's eyes, in use.
  • A typical mixed reality system further comprises a processor, which is communicably connected in some way to a screen which is provided inside the visor 10. Such communicable connection may be a hard wired electrical connection, in which case the processor and associated circuitry will be mounted on the headset. Alternatively, however, the processor may be configured to wirelessly communicate with the visor, for example, by means of Bluetooth or similar wireless communication protocol, in which case, the processor need not be mounted on the headset but can instead be located remotely from the headset, with the relative allowable distance between them being dictated, and limited only by, the wireless communication protocol being employed. For example, the processor could be mounted on, or formed integrally with, the user's clothing, or instead located remotely from the user, either as a stand-alone unit or as an integral part of a larger control unit, for example.
  • In general, the processor receives image data from the image capture devices, and renders and blends such image data, in real time, into a displayed three dimensional virtual environment. The concept of real time image blending for augmented or mixed reality is known, and several different techniques have been proposed. The present invention is not necessarily intended to be limited in this regard. However, for completeness, one exemplary method for image blending will be briefly described. Thus, in respect of an object or portion of a real world image to be blended into the virtual environment, a threshold function may be applied in order to extract that object from the background image. Its relative location and orientation may also be extracted and preserved by means of marker data. Next, the image and marker data is converted to a binary image, possibly by means of adaptive thresholding (although other methods are known). The marker data and binary image are then transformed into a set of coordinates that match the location within the virtual environment in which they will be blended. Such blending is usually performed using black and white image data. Thus, if necessary, colour data sampled from the source image can be backward warped, using homography, to each pixel in the resultant virtual scene. All of these computational steps require minimal processing and time and can, therefore, be performed quickly and in real (or near real) time. Thus, as the user's field of view and/or external surroundings change, image data within the mixed reality environment can be updated in real time.
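The exemplary blending steps outlined above (threshold extraction, marker data, coordinate transformation) can be sketched in outline as follows. This is a minimal illustration using NumPy only; the function names, the fixed threshold value and the centroid-based marker are illustrative assumptions, not features of the described method:

```python
import numpy as np

def extract_object(gray, threshold=128):
    """Extract a foreground object from a grayscale frame by thresholding,
    returning a binary mask and marker data (here, the object centroid)
    recording its relative location for later placement."""
    mask = (gray > threshold).astype(np.uint8)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return mask, None
    marker = (float(xs.mean()), float(ys.mean()))  # object centre in frame coords
    return mask, marker

def transform_marker(marker, homography):
    """Map a frame-space marker into virtual-scene coordinates via a
    3x3 homography (backward warping would sample colour per pixel)."""
    x, y = marker
    v = homography @ np.array([x, y, 1.0])
    return (v[0] / v[2], v[1] / v[2])
```

In a real system an adaptive threshold and per-pixel colour warping would replace the fixed threshold and single-point transform shown here.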
  • Referring to Figure 2 of the drawings, an emergency guidance system according to a first exemplary embodiment of the present invention comprises at least one mixed reality headset 100 and an emergency response processing system 104, which may be integrated in or mounted on the headset 100 and/or provided at a fixed location within the building or structure and configured for wireless communication with the headset 100. It is envisaged that some exemplary embodiments may comprise a central emergency response processing system for communication with a single headset, or in parallel with a plurality of headsets. However, in alternative exemplary embodiments, the processing functionality of the emergency response processing system may be distributed, partly or fully, amongst individual processing units provided in or on the headsets, which may or may not be the same processing units used to provide on-screen mixed reality images of the wearer's environment derived from image data captured by the image capture devices on the headset, and the present invention is not necessarily intended to be in any way limited in this regard.
  • The processing system 104 is configured to receive, from one or more external sources 108, 110, data representative of, for example, the structural status of the building or structure, the health and/or status of (at least) key equipment therein, the nature and/or status of an emergency situation, the location of hazardous elements of an emergency situation (i.e. the location of a fire, for example), the location of other occupants within the building or structure, etc. Thus, the processing system 104 generally includes an interface to enable data to be transmitted therefrom and received thereby, in order that data that could potentially be changing dynamically is updated in real (or near real) time. Furthermore, the processing system 104 may be configured to receive, or have stored therein, a three dimensional, virtual model 112 of the building or structure.
  • It will be appreciated by a person skilled in the art that the processing functionality of the above-described emergency response processing system may be provided by means of more than one processor. Indeed, several processors may be required to facilitate embodiments of the present invention, some of which may be dedicated system processors, whether remote or on-board (i.e. mounted in or on the one or more headsets 100), and others of which may be processors or other data processing devices incorporated in the network infrastructure of the building, and the present invention is not necessarily intended to be limited in this regard. Indeed, the processing function may be provided by an entirely de-centralised network. For example, this functionality may be provided by a "mesh network", which is configured to self-initiate (in response to an emergency situation or otherwise) and build a network using distributed devices. Such a de-centralised network would continue to function even if, for example, the infrastructure of the building is damaged or destroyed: each node may pass data along to another available node, in the manner of a "daisy chain", such that not all nodes in the network need to be within communication range of each other.
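The "daisy chain" forwarding behaviour of such a de-centralised mesh network can be illustrated with a short sketch. This is an assumed simulation for illustration only (the `flood` name and adjacency-list representation are hypothetical); it shows that a message reaches every surviving node so long as a chain of intermediate nodes connects them, without all nodes being in direct range of the source:

```python
from collections import deque

def flood(links, source):
    """Simulate mesh-network forwarding: each node passes the message on to
    its direct neighbours, so nodes out of range of the source still receive
    it via intermediate hops, in the manner of a 'daisy chain'."""
    delivered = {source}
    frontier = deque([source])
    while frontier:
        node = frontier.popleft()
        for neighbour in links.get(node, ()):
            if neighbour not in delivered:
                delivered.add(neighbour)
                frontier.append(neighbour)
    return delivered
```

Removing a damaged node from the adjacency lists models infrastructure loss: delivery continues to whatever remains reachable.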
  • The or each headset 100 may include an internal geo-location system for generating data representative of the relative location, within the building or structure, of the wearer of the respective headset and transmit such data, continuously or otherwise, to the processing system 104. In the event of an emergency situation, a user places a headset 100 over their eyes, and the processing system 104 is configured, based on the current location of the wearer within the building or structure, to calculate, using the above-mentioned 3D virtual model 112 of the building or structure, the safest and/or quickest route from the wearer's location. It will be appreciated that, in many cases, this may be an escape route, but it may also be a route toward the hazard or emergency depending on the role of the wearer within the situation.
  • Calculation of the above-mentioned route may be performed in a similar manner to that used in in-car satellite navigation systems. Thus, in respect of data within the 3D virtual model 112, the processing system identifies the wearer's current location and the required destination. It then determines the current status of the connecting paths between those two locations, based on the 3D virtual model 112 (for permanent status aspects) and from data received from external sources (to take into account the dynamically changing environment). Status parameters may include whether or not two proximal paths or corridors are physically connected (or separated by a wall or locked door) and actually navigable (i.e. not blocked by an obstacle or too narrow to pass through safely). The processing system may also identify path-to-path 'cost', in terms of, for example, the number of turns and corners to be navigated, presence or absence of doors, etc. The processing system then identifies the shortest and/or simplest route having the lowest 'cost'.
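The lowest-'cost' route selection described above can be sketched with a standard shortest-path search over the building graph. This is a minimal illustration under assumptions not stated in the source (Dijkstra's algorithm as the search method, and edge weights that already combine distance with turn/door penalties); impassable connections are simply absent from the graph:

```python
import heapq

def recommended_route(edges, start, goal):
    """Lowest-cost route through a building graph. `edges` maps each node to
    {neighbour: cost}; costs combine distance with penalties such as turns
    or doors. Returns the node sequence, or None if no route exists."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, cost in edges.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(queue, (nd, nxt))
    if goal not in dist:
        return None
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path))
```

Re-weighting or deleting edges as sensor data arrives, then re-running the search, gives the dynamic re-calculation described later.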
  • Once a route has been identified, the processing system 104 generates appropriate navigation instructions, generates virtual representations of such navigational instructions and overlays them, or otherwise blends them, in the virtual environment displayed on the screen within the headset 100. Thus, the wearer can see their immediate environment (derived from rendered and blended image data captured by the image capture devices on the headset) together with visual navigation aids directing them along the recommended route. The navigational image data may include indications of areas through which the wearer cannot pass, for example, a locked door. A still image of what the wearer may see on their screen, according to one exemplary embodiment of the invention, is illustrated in Figure 3 of the drawings.
  • The visual navigational aids may be supplemented, or even replaced, with voice guidance emitted through speakers provided within the headset.
  • As the wearer moves within their environment, along the recommended route, the 3D environment displayed on the screen is continuously updated, in real time, using images captured by the image capture devices. In addition, the processing system 104 is continuously updated with the wearer's current location, such that the displayed navigation data can also be updated accordingly.
  • Furthermore, structural sensors and equipment health reporting systems, as well as other sensors, including, in some exemplary embodiments, sensors provided on the headset itself, may be used to supply relevant data to the processing system 104, such that the wearer's route can be dynamically updated to take into account changing conditions. Thus, for example, in the case of a fire, data from heat sensors (or other means) can be used to identify the location of the fire within the structure, such that the calculated route is configured to avoid it (or the route re-calculated, as required). Equally, if data from structural sensors indicates that a part of the structure has become unsafe, or an obstruction has been identified, the processing system is configured to re-calculate the route accordingly, to ensure that the wearer avoids any hazard. The processing system may be configured to generate and insert a visual representation of an obstacle or hazard in a user's vicinity into the 3D virtual environment displayed on their screen.
  • It will be apparent to a person skilled in the art that the nature, type and number of external data sensors required to detect, identify and classify key data relevant to the generation of an optimum route within a dynamically changing environment will be dependent on the building or structure itself, its infrastructure and equipment therein, the types of emergency situations envisaged, etc. However, in general, the processing system 104 is configured to collate available sensor data from appropriate sources with the aim of ensuring that the wearer of the headset is guided around blockages, breaches, fire, heat, chemical spills and/or other potential hazards, as appropriate.
  • As stated above, the system includes a 3D virtual model 112 of the building or structure. Thus, in accordance with some exemplary embodiments of the invention, the processing system may be configured to overlay image data representative of permanent features (such as walls, corners, stairs, etc.) onto the 3D mixed reality images displayed to the user (generated from the images captured by the image capture devices on their headsets), using knowledge of the user's absolute location and/or one of a number of image matching techniques. Thus, the wearer would still be able to see permanent features of the internal infrastructure within their field of view, even if thick smoke, for example, is obscuring the images captured by the image capture devices. Thus, referring to Figure 4 of the drawings, the user 200 may be presented with a 3D virtual image 202 of their immediate environment including a visual representation 204 of the recommended route, an overlayed image 206 of permanent features of the building infrastructure, and known (or identified) hazards 208.
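Selecting which permanent features from the 3D virtual model 112 to overlay for a given wearer can be illustrated with a simple proximity query against the model, using the wearer's absolute location. A hypothetical sketch only (the `nearby_features` name, the 2D positions and the fixed radius are illustrative assumptions):

```python
import math

def nearby_features(model_features, user_pos, radius=5.0):
    """Select permanent features (walls, corners, stairs, ...) from the
    stored building model that fall within the user's vicinity, so they
    can be overlaid on the display even when smoke obscures the images
    captured by the headset cameras."""
    ux, uy = user_pos
    return [f for f in model_features
            if math.hypot(f["x"] - ux, f["y"] - uy) <= radius]
```

The selected features would then be projected into the wearer's field of view using their tracked position and orientation, or registered by image matching where camera data permits.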
  • It is envisaged that, in an exemplary embodiment of the invention, each headset may include a seal such that the visor can be sealed over the wearer's eyes, in use, thereby preventing smoke and other noxious substances from reaching their eyes and potentially compromising their vision. Furthermore, the headset could include a respiration filter within a mask portion for covering the user's nose and mouth, to prevent inhalation of smoke or other irritant or toxic substances, and aid breathing.
  • In summary, embodiments of the present invention provide an emergency guidance system and method, wherein a mixed reality headset is provided with a live connection to the building infrastructure via, for example, a central control system which integrates several such headsets and employs distributed sensors, machine health information modules, and pre-existing emergency detection systems, functionally integrated therein, so as to provide a mixed reality system which generates and displays a live route in an emergency situation and which can also identify dangers and hazards in a dynamically changing environment. The headset itself may be provided with sensors such as ambient temperature sensors, oxygen quality sensors and even health monitors in respect of the wearer. Multi-spectral cameras may also be provided to identify additional sources of heat, and even radiation sensors could be employed, depending on the environment for which the system is intended. It is envisaged, in accordance with some exemplary embodiments, that multiple headsets would be communicably coupled to a central control system and to each other to enable gathered data to be shared, thereby to increase the overall situational awareness of the system.
  • It will further be appreciated, as briefly mentioned above, that a single processing system can be used to generate dynamically updated, optimum routes in respect of a number of different users (and headsets), as illustrated schematically in Figure 2, wherein the route calculated and updated for each wearer will be dependent on their individual respective location within the building or structure and their role within the emergency situation, using data from external, static sensors within the infrastructure of the environment and/or data from sensors mounted on-board their respective headsets. In addition, in some exemplary embodiments of the present invention, sensor data from other headsets within the system may additionally be used by the processing system to identify data relevant to a particular user.
  • In other exemplary embodiments of the invention, the processing system may be configured to coordinate multiple users' locations, movements and routes, such that each individual user's route can be calculated taking into account the location and movement of other users so as to ensure, for example, that localised crowds or bottle necks within the only or principal thoroughfares can be avoided or at least minimised. In this case, the main processing system will be remote from, and wirelessly coupled to, the headsets 100, either in a fixed location within the building or structure or in or on one of the headsets (intended for example for use by a team leader or safety officer). However, each headset may include a local processing system with similar (or possibly reduced) functionality, such that the headsets can still function adequately, in the event of a main system failure, to guide the wearer to safety.
  • Thus, an exemplary embodiment of the present invention, as illustrated schematically in Figure 5 of the drawings, may provide an emergency guidance system which can be used to coordinate the movements of several people, whereby each user's headset is communicably coupled to the processing system and also, optionally, to each other. Thus, in calculating the recommended route for each individual in a group, the size, location and nature of the group as a whole can additionally be taken into account, thus, for example, enabling the management and generation of alternative routes to allow emergency crews access to relevant areas.
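The coordination of multiple users' routes to avoid bottlenecks, as described in the two paragraphs above, can be sketched as a capacity-aware assignment. This is an assumed illustration (the `assign_staggered_routes` name, per-corridor capacities and the primary/alternative route pairs are hypothetical simplifications of whatever allocation scheme a real system would use):

```python
def assign_staggered_routes(corridor_capacity, requested_routes):
    """Coordinate several users: accept each user's preferred route only
    while corridor occupancy stays within capacity, otherwise divert that
    user onto their alternative route, so that localised crowds or
    bottlenecks in the principal thoroughfares are avoided or minimised.
    `requested_routes` maps user -> (primary, alternative), each a list of
    corridor names, processed in priority order."""
    load = {}
    assigned = {}
    for user, (primary, alternative) in requested_routes.items():
        fits = all(load.get(c, 0) < corridor_capacity.get(c, 1)
                   for c in primary)
        route = primary if fits else alternative
        for c in route:
            load[c] = load.get(c, 0) + 1
        assigned[user] = route
    return assigned
```

The same mechanism could reserve a thoroughfare outright, holding occupants on alternative routes to keep it clear for an emergency crew.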
  • In addition to the features of the individual systems described above, namely, and as an example, the provision of feedback between the stored 3D virtual model of the building or structure and the mixed reality depth information captured from the environment, to identify changes (such as blockages) which may affect a route, the resultant system can potentially report safe routes that have provided others with a safe escape, provide an active list of all people still within the building or structure, and report building health and hazard locations to emergency crews for coordination purposes. Still further, sensors worn on each user's person may be configured to transmit data representative of the respective user's vital signs and/or health status to the system, for provision to, for example, the emergency services, so as to potentially enable diagnosis of injuries such as burns, lung damage or other injuries.
  • Various exemplary embodiments of the present invention are envisaged for use in various different environments, including, but not limited to, buildings, surface and sub-surface marine vessels, offshore oil rigs, oil refineries, and other complex internal and external environments.
  • It will be appreciated by a person skilled in the art, from the foregoing description, that modifications and variations can be made to the described embodiments without departing from the scope of the invention as claimed.

Claims (17)

  1. A mixed reality guidance system for use within a building or other structure, the system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the system further comprising a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, a positioning module for determining the current location of said user within said building or other structure, a processing module configured to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.
  2. A system according to claim 1, wherein said image data comprises navigational symbols, overlayed or blended into said mixed reality environment so as to be representative of said recommended route.
  3. A system according to claim 2, wherein said navigational symbols are updated within said mixed reality environment using updated location data from said positioning module as said user moves through the interior of the building or other structure.
  4. A system according to any of the preceding claims, wherein said image processing module is further configured to obtain, from said three-dimensional virtual model, image data representative of selected fixed features of the interior of the building or other structure within said real world environment in the vicinity of the user, and overlay or blend said image data into said mixed reality environment in respect of corresponding features therein.
  5. A system according to any of the preceding claims, configured to receive data from at least one external sensor indicative of a hazard or obstacle in or on said recommended route, and re-calculate said recommended route to circumnavigate said hazard or obstacle.
  6. A system according to claim 5, comprising an image processing module for generating image data representative of said hazard or obstacle, and overlaying or blending said image data into said mixed reality environment displayed on said screen.
  7. A system according to any of the preceding claims, wherein said positioning module is mounted in or on said headset.
  8. A system according to any of the preceding claims, wherein said image capture means comprises at least one image capture device mounted on said headset so as to be substantially aligned with a user's eyes, in use.
  9. A system according to any of the preceding claims, wherein said processing module is configured to receive, from remote sensors, data representative of the health or structural or environmental status of the building or other structure and/or equipment located therein.
  10. A system according to any of the preceding claims, wherein the headset comprises a face mask, configured to be worn over a user's nose and/or mouth, in use, and including a respirator.
  11. A system according to claim 10, wherein said face mask is provided with a fume seal configured to form an air tight seal between said face mask and a user's face, in use.
  12. Control apparatus for a mixed reality guidance system according to any of the preceding claims, said control apparatus comprising a storage module having stored therein a three-dimensional virtual model of a building or other structure, a processing module configured to receive, from a positioning module, location data representative of the current location of a user, determine a required location for said user relative to said building or structure and calculate a recommended route for said user from their current location to said required location and generate navigation data representative of said recommended route, the processing module being further configured to receive, from said positioning module, updated location data representative of the current location of the user as they move through said building or structure and generate updated navigation data representative of said recommended route accordingly.
  13. Apparatus according to claim 12, wherein said processing module is configured to receive, from a plurality of positioning modules, location data representative of the respective current locations of a plurality of users, generate a required location for each said user, calculate a respective recommended route for each user from their current location to their required location, and generate respective navigation data representative of each recommended route, the processor being further configured to receive, from each said positioning module, updated location data representative of the current location of each respective user as they move through said building or structure and generate updated navigation data representative of their respective recommended route accordingly.
  14. Apparatus according to claim 13, wherein said processing module is further configured to receive sensor data from the current location of at least one of said users and use said sensor data in said calculation of one or more of said recommended routes.
  15. Apparatus according to claim 13 or claim 14, including a storage module for storing data representative of the current occupants of said building or structure.
  16. A mixed reality emergency guidance system, for use within a building or other structure, the system comprising at least one headset for placing over a user's eyes, in use, the or each headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a respective user's field of view and display said mixed reality environment on said screen, the system further comprising control apparatus according to any of claims 12 to 15.
  17. A method of providing a guidance system for a building or other structure, the method comprising providing a mixed reality system comprising a headset for placing over a user's eyes, in use, said headset including a screen, image capture means for capturing images of the real world in the vicinity of the user, and a processor configured to generate a three-dimensional virtual reality environment and blend images of said real world environment into said three-dimensional virtual reality environment to create a mixed reality environment representative of a user's field of view and display said mixed reality environment on said screen, the method further comprising providing a storage module having stored therein a three-dimensional virtual model of the interior layout of said building or other structure, providing a positioning module for determining the current location of said user within said building or other structure, providing a processing module and configuring said processing module to calculate a recommended route from said current location of said user to a second location relative to said building or structure and generate navigation data representative of said recommended route, and providing an image processing module configured to generate image data representative of said navigation data and display said image data within said mixed reality environment on said screen.
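Claims 12 to 14 describe a control apparatus that computes a recommended route through a building model, updates it as the user moves, and folds live sensor data into the calculation. A minimal sketch of that logic, assuming the three-dimensional building model has been reduced to a weighted graph of rooms and corridors: the node names, costs, and the choice of Dijkstra's algorithm are all illustrative assumptions — the claims do not specify a routing algorithm or a model representation.

```python
import heapq


def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over an undirected building graph.

    graph: {node: {neighbour: traversal_cost}}; returns a node list, or None
    if the goal is unreachable (e.g. every route is blocked).
    """
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    visited = set()
    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            # Walk the predecessor chain back to the start.
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return list(reversed(path))
        for neighbour, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(queue, (nd, neighbour))
    return None


class GuidanceController:
    """Recomputes a recommended route on each location or sensor update."""

    def __init__(self, building_graph, exit_node):
        # Copy the stored model so hazard updates do not mutate the original.
        self.graph = {n: dict(edges) for n, edges in building_graph.items()}
        self.exit_node = exit_node

    def report_hazard(self, a, b):
        # Sensor data marks the passage a <-> b impassable (cf. claim 14).
        self.graph.get(a, {}).pop(b, None)
        self.graph.get(b, {}).pop(a, None)

    def route_for(self, current_location):
        # Recommended route from the user's current location (cf. claim 12);
        # called per user for the multi-user case of claim 13.
        return shortest_route(self.graph, current_location, self.exit_node)


# Hypothetical single-floor layout, purely for illustration.
floor = {
    "lobby": {"corridor": 1, "stairwell": 5},
    "corridor": {"lobby": 1, "atrium": 1},
    "atrium": {"corridor": 1, "exit": 1},
    "stairwell": {"lobby": 5, "exit": 1},
    "exit": {"atrium": 1, "stairwell": 1},
}
controller = GuidanceController(floor, "exit")
controller.route_for("lobby")  # lobby -> corridor -> atrium -> exit
controller.report_hazard("corridor", "atrium")
controller.route_for("lobby")  # rerouted: lobby -> stairwell -> exit
```

The same recompute-on-update loop would be driven by the positioning module of the claims: each updated location triggers `route_for`, and each sensor report of a hazard prunes the graph before the next calculation.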
EP15275042.8A 2015-02-25 2015-02-25 Emergency guidance system and method Ceased EP3062297A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP15275042.8A EP3062297A1 (en) 2015-02-25 2015-02-25 Emergency guidance system and method
PCT/GB2016/050366 WO2016135448A1 (en) 2015-02-25 2016-02-15 Emergency guidance system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP15275042.8A EP3062297A1 (en) 2015-02-25 2015-02-25 Emergency guidance system and method

Publications (1)

Publication Number Publication Date
EP3062297A1 true EP3062297A1 (en) 2016-08-31

Family

ID=52595234

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15275042.8A Ceased EP3062297A1 (en) 2015-02-25 2015-02-25 Emergency guidance system and method

Country Status (1)

Country Link
EP (1) EP3062297A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030234725A1 * 2002-06-21 2003-12-25 Lemelson Jerome H. Intelligent building alarm
JP2005037181A (en) * 2003-07-17 2005-02-10 Pioneer Electronic Corp Navigation device, server, navigation system, and navigation method
US20080243385A1 (en) * 2005-01-26 2008-10-02 Kakuya Yamamoto Guiding Device and Guiding Method
US20140198017A1 (en) * 2013-01-12 2014-07-17 Mathew J. Lamb Wearable Behavior-Based Vision System

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018073377A (en) * 2016-11-04 2018-05-10 台湾國際物業管理顧問有限公司 Application system of total sensing positioning technique based on 3d information model
CN107131886A (en) * 2017-07-07 2017-09-05 四川云图瑞科技有限公司 The monitoring system of subway station evacuation guiding based on threedimensional model
US10410483B2 (en) 2017-12-15 2019-09-10 Honeywell International Inc. Systems and methods for interactive emergency response systems
JP6816909B1 (en) * 2020-07-15 2021-01-20 シンメトリー・ディメンションズ・インク Evacuation guidance system, evacuation guidance method, and eyeglass-type display
JP2022018410A (en) * 2020-07-15 2022-01-27 シンメトリー・ディメンションズ・インク Evacuation guidance system, evacuation guidance method, and spectacle type display
CN113034843A (en) * 2021-02-21 2021-06-25 深圳市九象数字科技有限公司 High formwork wireless automatic monitoring system
WO2024020460A1 (en) * 2022-07-21 2024-01-25 Johnson Controls Tyco IP Holdings LLP Systems and methods for providing security system information using augmented reality effects

Similar Documents

Publication Publication Date Title
EP3062297A1 (en) Emergency guidance system and method
GB2535723A (en) Emergency guidance system and method
US10650600B2 (en) Virtual path display
US10818088B2 (en) Virtual barrier objects
JP5553405B2 (en) Augmented reality-based system and method for indicating the location of personnel and sensors in a closed structure and providing enhanced situational awareness
US20020196202A1 (en) Method for displaying emergency first responder command, control, and safety information using augmented reality
KR101768012B1 (en) Smoke Fire Detecting System Using Drone with Thermal Image Camera
CA2884855C (en) Face mounted extreme environment thermal sensor system
KR101671981B1 (en) Method and system for providing a position of co-operated firemen by using a wireless communication, method for displaying a position of co-operated firefighter, and fire hat for performing the method
WO2007133209A1 (en) Advanced augmented reality system and method for firefighter and first responder training
GB2456610A (en) Infrared monitoring system for hazardous environments
EP3163407A1 (en) Method and apparatus for alerting to head mounted display user
KR101831874B1 (en) Apparatus and Method for Augmented reality based Escape from the Disaster Site
WO2016135448A1 (en) Emergency guidance system and method
KR101504612B1 (en) Emergency evacuation information system using augmented reality and information system thereof
GB2526575A (en) Communications system
GB2535728A (en) Information system and method
WO2022042802A1 (en) Escape route system comprising escape hood
TWI755834B (en) Visual image location system
Steingart et al. Augmented cognition for fire emergency response: An iterative user study
EP3062220A1 (en) Information system and method
WO2016135447A1 (en) Information system and method
Bretschneider et al. Head mounted displays for fire fighters
KR102328567B1 (en) Fire response and lifesaving monitoring system using helicopter
Streefkerk et al. Evaluating a multimodal interface for firefighting rescue tasks

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20160918