CN116721744A - Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium - Google Patents

Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium

Info

Publication number
CN116721744A
CN116721744A
Authority
CN
China
Prior art keywords
life
rescued
rescue
life body
help
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310514195.0A
Other languages
Chinese (zh)
Inventor
董杰 (Dong Jie)
王劲 (Wang Jin)
董继鹏 (Dong Jipeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huku Technology Co ltd
Zhejiang Geely Holding Group Co Ltd
Original Assignee
Shenzhen Huku Technology Co ltd
Zhejiang Geely Holding Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huku Technology Co., Ltd. and Zhejiang Geely Holding Group Co., Ltd.
Priority to CN202310514195.0A
Publication of CN116721744A
Legal status: Pending

Classifications

    • G16H 40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G06F 16/29 - Geographical information databases
    • G06V 20/17 - Terrestrial scenes taken from planes or by drones
    • G06V 40/168 - Feature extraction; Face representation
    • G06V 40/172 - Classification, e.g. identification
    • G16H 10/65 - ICT specially adapted for patient-specific data stored on portable record carriers, e.g. on smartcards, RFID tags or CD

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Epidemiology (AREA)
  • General Business, Economics & Management (AREA)
  • Remote Sensing (AREA)
  • Primary Health Care (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Alarm Systems (AREA)

Abstract

The application discloses a living-body rescue method and apparatus, an electronic device, and a readable storage medium. The method comprises the following steps: acquiring help-seeking related information sent by a wearable device, where the help-seeking related information comprises vital sign information and/or a position of the living body to be rescued corresponding to the wearable device; and determining a rescue plan for the vital sign information and/or the position based on a preset decision mechanism. The vital sign information is used to determine whether the living body to be rescued needs emergency treatment, and the position is used to determine whether it needs assisted rescue or whether supplies need to be provided to it. The application thereby makes the rescue of the living body to be rescued more targeted.

Description

Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of rescue, and in particular, to a method and apparatus for rescuing a living body, an electronic device, and a readable storage medium.
Background
When a living body encounters a disaster or another emergency (natural disasters, accidents, sudden dangerous events, and so on), such as an earthquake, a debris flow or a traffic accident, or when it becomes lost, it needs to be rescued. Here, living bodies include people, pets and the like, and rescue includes both life-saving and assistance.
At present, a rescuer acquires the position of the living body to be rescued and then travels to the scene. However, after arriving at the scene, the rescuer may have no idea how to carry out the rescue, so the rescue of the living body to be rescued is poorly targeted.
Therefore, in practical applications, a scheme is needed that assists the rescuer in rescuing the living body to be rescued.
Disclosure of Invention
In view of the above, the present application provides a living-body rescue method, apparatus, electronic device and readable storage medium, which aim to make the rescue of a living body to be rescued more targeted.
To achieve the above object, the present application provides a living-body rescue method applied to a first device, the method comprising the steps of:
acquiring help-seeking related information sent by a wearable device, where the help-seeking related information comprises vital sign information and/or a position of the living body to be rescued corresponding to the wearable device;
determining a rescue plan for the vital sign information and/or the position based on a preset decision mechanism, where the vital sign information is used to determine whether the living body to be rescued needs emergency treatment, and the position is used to determine whether it needs assisted rescue or whether supplies need to be provided to it.
Illustratively, the method further comprises:
pushing a rescue assistance request to terminal devices within a preset distance, so as to seek the assistance of the users corresponding to those terminal devices.
By seeking help from the users of terminal devices within the preset distance, the embodiment of the application improves the success rate of the rescue.
By way of example only, and not by way of limitation,
when the help-seeking related information includes at least the position and there are a plurality of living bodies to be rescued, the method further includes:
acquiring a map;
displaying, on the map, a beacon corresponding to each living body to be rescued, where each beacon marks the position of its corresponding living body on the map;
in response to a rescuer's selection operation on the beacons, sending a target position corresponding to a target beacon to an unmanned aerial vehicle, so that the unmanned aerial vehicle leads the rescuer to the target position; the target beacon is the beacon chosen through the selection operation.
In the embodiment of the application, the rescuer can choose which living body to rescue first, which improves the success rate of the rescue.
By way of example only, and not by way of limitation,
when the help-seeking related information further includes the vital sign information, displaying on the map a beacon corresponding to each living body to be rescued includes:
determining a rescue priority for each living body to be rescued based on the vital sign information and the position;
displaying, based on the rescue priority, the beacons corresponding to the living bodies to be rescued distinguishably on the map.
By displaying the beacons distinguishably, the embodiment of the application suggests a rescue order, which makes it easier for the rescuer to choose which living body to rescue first.
By way of example only, and not by way of limitation,
when the help-seeking related information includes at least the position, after the position sent by the wearable device is received for the first time, the method further includes:
receiving an image, captured by the unmanned aerial vehicle, of the environment in which the living body to be rescued is located, and extracting the depth of field of the living body from the image;
transmitting the depth of field to the wearable device, so that after the wearable device acquires positioning information, it couples the depth of field with the positioning information to obtain an updated position.
By improving the accuracy of the position, the embodiment of the application reduces the difficulty of the rescue.
Illustratively, when the living body to be rescued is a person, the method further comprises:
receiving an image, captured by the unmanned aerial vehicle, of the environment in which the person is located;
extracting the person's facial features from the image, and identifying the person based on those features.
In the embodiment of the application, once the person's identity is acquired, the rescuer can rescue the person in a more targeted way.
Illustratively, the wearable device sends the help-seeking related information through a target communication network:
when the distance between the execution subject and the wearable device is greater than a preset distance threshold, the target communication network is a mobile communication network;
when that distance is less than or equal to the preset distance threshold, the target communication network is WiFi.
The embodiment of the application thereby avoids the situation in which data cannot be transmitted when the execution subject is far from the wearable device; and, when the two are close, it reduces the wearable device's power consumption, prolonging the device's service time and thereby improving the success rate of the rescue.
To achieve the above object, the present application also provides a living-body rescue apparatus comprising:
a first acquisition module, used to acquire the help-seeking related information sent by the wearable device, where the help-seeking related information comprises vital sign information and/or a position of the living body to be rescued corresponding to the wearable device;
a determining module, used to determine a rescue plan for the vital sign information and/or the position based on a preset decision mechanism, where the vital sign information is used to determine whether the living body needs emergency treatment, and the position is used to determine whether it needs assisted rescue or whether supplies need to be provided to it.
To achieve the above object, the present application also provides an electronic device comprising a memory, a processor, and a living-body rescue program stored on the memory and executable on the processor, the program being configured to implement the steps of the living-body rescue method described above.
To achieve the above object, the present application also provides a computer-readable storage medium on which a living-body rescue program is stored; when executed by a processor, the program implements the steps of the living-body rescue method described above.
Compared with the prior art, in which the rescue carried out after a rescuer arrives at the scene is poorly targeted, the present application analyses the vital sign information and/or the position of the living body to be rescued and, through a preset decision mechanism, determines a rescue plan for that information; the vital sign information is used to determine whether the living body needs emergency treatment, and the position is used to determine whether it needs assisted rescue. Because the rescuer obtains the corresponding rescue plan before beginning the rescue, the rescuer can carry out a targeted rescue based on that plan, which makes the rescue of the living body to be rescued more targeted.
Drawings
FIG. 1 is a first flowchart of the first embodiment of the living-body rescue method of the application;
FIG. 2 is a second flowchart of the first embodiment of the living-body rescue method of the application;
FIG. 3 is a third flowchart of the first embodiment of the living-body rescue method of the application;
FIG. 4 is a fourth flowchart of the first embodiment of the living-body rescue method of the application;
FIG. 5 is a fifth flowchart of the first embodiment of the living-body rescue method of the application;
FIG. 6 is a sixth flowchart of the first embodiment of the living-body rescue method of the application;
FIG. 7 is a schematic diagram of device connections over WiFi according to the first embodiment of the living-body rescue method of the application;
FIG. 8 is a schematic structural diagram of a hardware running environment according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
For a better understanding of embodiments of the present application, the following is a brief description of the embodiments of the present application:
in practical application, the applicant found through research that, before arriving at the scene, a rescuer knows only the position of the living body to be rescued, not its situation, including whether the rescue will be difficult to carry out at the scene (so that assisted rescue is needed) or whether first aid is needed; as a result, the rescuer has little to go on when beginning the rescue after arriving at the scene.
To make the rescue more targeted, the embodiment of the application analyses the vital sign information and/or the position and, through a preset decision mechanism, determines a rescue plan for the vital sign information and/or the position of the living body to be rescued; the vital sign information is used to determine whether the living body needs emergency treatment, and the position is used to determine whether it needs assisted rescue.
Because the rescuer obtains the corresponding rescue plan before beginning the rescue, the rescuer can carry out a targeted rescue based on that plan, which makes the rescue of the living body to be rescued more targeted.
The application provides a living-body rescue method. Referring to fig. 1, fig. 1 is a flowchart of a first embodiment of the living-body rescue method of the application.
It should be noted that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here. The living-body rescue method comprises the following steps:
step S110, acquiring help-seeking related information sent by wearable equipment; the help-seeking related information comprises sign information and/or position of a life body to be rescued corresponding to the wearable equipment.
The life body rescue method provided by the application can be applied to terminal equipment such as a mobile terminal or a vehicle-mounted terminal, and a rescuer can hold the mobile terminal or drive a vehicle to rescue the life body to be rescued. In the rescue process, the terminal equipment acquires the help-seeking related information sent by the wearable equipment. It can be understood that the wearable device is worn on a body of a living body to be rescued, the body sign information is obtained by collecting body sign parameters of the living body to be rescued by a body sign information collecting module in the wearable device, and the position is determined by the positioning information collected by a positioning module in the wearable device.
The physical sign parameters comprise blood pressure, blood oxygen, heart rate and the like, and the physical sign information acquisition module comprises a blood pressure sensor, a blood oxygen sensor, a heart rate sensor and the like. The positioning module is a GNSS (Global Navigation Satellite System ) module, and the GNSS module comprises a GPS module, a BDS (Beidou Navigation Satellite System, beidou satellite navigation system) module and the like.
Step S120: determining a rescue plan for the vital sign information and/or the position based on a preset decision mechanism; the vital sign information is used to determine whether the living body to be rescued needs emergency treatment, and the position is used to determine whether it needs assisted rescue or whether supplies need to be provided to it.
In an embodiment, the preset decision mechanism is as follows: when the vital sign information satisfies a first-aid condition, the determined rescue plan is that the living body needs first aid; otherwise, it does not. When the position is one where rescue is difficult, the plan is that the living body needs assisted rescue; otherwise, it does not. When the position lies in a dangerous area, the plan is that no supplies are to be provided to the living body; when it does not, supplies are to be provided. The first-aid condition is that blood pressure is below a preset blood pressure, blood oxygen is below a preset blood oxygen, and/or heart rate is below a preset heart rate; the preset values can be set as needed and are not limited by this embodiment. A position where assisted rescue is needed is a preset one, such as being trapped deep in the mountains or sunk in a swamp.
It can be understood that if the living body is determined to need first aid, an emergency call can be made; if it needs assisted rescue, an auxiliary rescue team can be called; and if supplies need to be provided, a rescuer can carry them to the scene. Note that when a call is needed after the rescue plan is determined, the system can not only place the call directly but also inform a nearby user of the situation of the living body to be rescued and suggest that the user place the corresponding call or provide other help; provided the user agrees and the user's surroundings are not in a dangerous area, convenience services such as supplying basic materials like mineral water can be provided directly nearby.
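As a rough illustration, the decision mechanism above can be sketched in a few lines of Python. All thresholds and field names here (`blood_pressure`, `terrain`, `zone`, and the terrain and danger-zone sets) are illustrative assumptions, not values from the application:

```python
# Hypothetical sketch of the preset decision mechanism. Thresholds and the
# sets of hard-to-reach terrains / dangerous zones are illustrative only.

def decide_rescue_plan(signs, position,
                       bp_min=90, spo2_min=90, hr_min=50,
                       hard_to_reach=frozenset({"deep_mountain", "swamp"}),
                       danger_zones=frozenset({"landslide_area"})):
    """Map vital sign information and a position to a rescue plan."""
    return {
        # First aid if any vital sign falls below its preset threshold.
        "needs_first_aid": (signs.get("blood_pressure", 999) < bp_min
                            or signs.get("blood_oxygen", 100) < spo2_min
                            or signs.get("heart_rate", 999) < hr_min),
        # Assisted rescue if the position is one where rescue is difficult.
        "needs_assisted_rescue": position.get("terrain") in hard_to_reach,
        # Supplies are provided only when the position is NOT in a dangerous area.
        "provide_supplies": position.get("zone") not in danger_zones,
    }

plan = decide_rescue_plan({"blood_oxygen": 85, "heart_rate": 72},
                          {"terrain": "swamp", "zone": "safe"})
print(plan)  # low blood oxygen and swamp terrain: first aid + assisted rescue
```

Each boolean in the returned plan maps onto one branch of the mechanism described above: placing an emergency call, calling an auxiliary rescue team, or carrying supplies to the scene.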
Illustratively, referring to FIG. 2, the method further comprises:
step S130, a salvation assisting request is pushed to terminal equipment within a preset distance so as to seek assistance of a user corresponding to the terminal equipment.
The purpose of pushing a rescue assistance request to terminal devices within a preset distance is to seek assistance from nearby personnel. I.e. the preset distance should not be too large, e.g. the preset distance is 1km, 2km etc. It can be appreciated that a far person cannot provide help in time even if receiving a rescue assistance request, and therefore, the rescue assistance request does not need to be pushed to a terminal device greater than a preset distance.
The terminal device in the embodiment of the application can be a mobile terminal or a vehicle-mounted terminal.
The pushing mode may be to display the information corresponding to the rescue assistance request on the map of the terminal device, or to display the information corresponding to the rescue assistance request on the terminal device in a notification mode. The information corresponding to the help assisting request comprises introduction of the situation of the life body to be helped and content for assisting request.
It can be appreciated that by seeking help to the user of the terminal device within a predetermined distance, the success rate of the rescue is improved.
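The distance filter of step S130 can be sketched as follows. The haversine great-circle distance and the device records are assumptions of this illustration; the application does not specify how the distance is computed:

```python
# Illustrative sketch: select terminal devices within a preset distance (e.g.
# 2 km) of the distress position using the haversine great-circle distance.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_to_notify(distress_pos, devices, preset_km=2.0):
    """Return ids of devices whose last known position is within preset_km."""
    return [d["id"] for d in devices
            if haversine_km(distress_pos[0], distress_pos[1],
                            d["lat"], d["lon"]) <= preset_km]

devices = [{"id": "car-1", "lat": 30.001, "lon": 120.001},  # roughly 150 m away
           {"id": "phone-7", "lat": 30.5, "lon": 120.5}]    # tens of km away
print(devices_to_notify((30.0, 120.0), devices))  # ['car-1']
```

Only the ids returned here would receive the pushed rescue assistance request.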
In order to make it easier for the rescuer and the users of nearby terminal devices to rescue the living body to be rescued, the position can be displayed on the map of the terminal device in the form of a beacon, and the vital sign information of the living body can be displayed next to the beacon.
Illustratively, referring to fig. 3, when the help-seeking related information includes at least the position and there are a plurality of living bodies to be rescued, the method further includes:
Step S310: acquiring a map.
Step S320: displaying, on the map, a beacon corresponding to each living body to be rescued; each beacon marks the position of its corresponding living body on the map.
Step S330: in response to a rescuer's selection operation on the beacons, sending a target position corresponding to a target beacon to an unmanned aerial vehicle, so that the unmanned aerial vehicle leads the rescuer to the target position; the target beacon is the beacon chosen through the selection operation.
When there are a plurality of living bodies to be rescued, each corresponds to one beacon, and the user can select a target beacon on the map so that the corresponding living body is rescued first. The living body corresponding to the target beacon may be the one nearest the rescuer, the one easiest to rescue, the one in the most dangerous situation, and so on. How dangerous a situation is follows from whether the living body needs first aid or assisted rescue: the most endangered living body is the one most in need of first aid or assisted rescue, while one that needs neither is the least endangered. It can be understood that letting the rescuer choose which living body to rescue first improves the success rate of the rescue.
In the embodiment of the application, the target position is sent to the unmanned aerial vehicle, and the unmanned aerial vehicle leads the rescuer there, which shortens the time the rescuer needs to reach the target position.
For example, referring to fig. 4, when the help-seeking related information further includes the vital sign information, displaying on the map a beacon corresponding to each living body to be rescued includes:
Step S321: determining a rescue priority for each living body to be rescued based on the vital sign information and the position.
As described above, the vital sign information is used to determine whether the living body needs first aid, and the position is used to determine whether it needs assisted rescue and whether supplies need to be provided to it. Whether first aid or assisted rescue is needed reflects how endangered the living body is, so the rescue priority can be set according to the degree of danger: a living body that needs first aid or assisted rescue has the highest priority; one that merely needs supplies has a lower priority; and one that needs neither first aid, nor assisted rescue, nor supplies has the lowest priority.
Step S322: displaying the beacons corresponding to the living bodies to be rescued distinguishably on the map based on the rescue priority.
The beacons may be distinguished by colour, for example red for the highest rescue priority and green for the lowest; or by display size, for example the smallest beacon for the lowest priority and the largest for the highest. It can be understood that displaying the beacons distinguishably suggests a rescue order, which makes it easier for the rescuer to choose which living body to rescue first.
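The three-tier priority of step S321 and the colour-coded display of step S322 can be sketched together. Red for the highest tier and green for the lowest follow the text; the middle colour and the data layout are assumptions of this illustration:

```python
# Sketch of steps S321/S322: rank each living body to be rescued and choose a
# beacon colour per priority tier. Yellow for the middle tier is an assumption.

def rescue_priority(needs_first_aid, needs_assisted_rescue, needs_supplies):
    """Higher number = rescue sooner."""
    if needs_first_aid or needs_assisted_rescue:
        return 2   # most endangered: first aid or assisted rescue needed
    if needs_supplies:
        return 1   # only supplies needed
    return 0       # needs neither first aid, assisted rescue, nor supplies

BEACON_COLOUR = {2: "red", 1: "yellow", 0: "green"}

victims = [
    {"id": "A", "first_aid": True,  "assist": False, "supplies": True},
    {"id": "B", "first_aid": False, "assist": False, "supplies": True},
    {"id": "C", "first_aid": False, "assist": False, "supplies": False},
]
for v in victims:
    v["priority"] = rescue_priority(v["first_aid"], v["assist"], v["supplies"])
    v["colour"] = BEACON_COLOUR[v["priority"]]

# Highest-priority beacons listed (and drawn) first.
ordered = sorted(victims, key=lambda v: -v["priority"])
print([(v["id"], v["colour"]) for v in ordered])  # A red, B yellow, C green
```

Varying the drawn size with the same priority value would implement the size-based variant described above.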
For example, referring to fig. 5, when the help-seeking related information includes at least the position, after the position sent by the wearable device is received for the first time, the method further includes:
Step S410: receiving an image, captured by the unmanned aerial vehicle, of the environment in which the living body to be rescued is located, and extracting the depth of field of the living body from the image.
The image is captured by a camera mounted on the unmanned aerial vehicle, and the depth of field can be extracted based on SLAM (Simultaneous Localization and Mapping) technology.
Step S420: transmitting the depth of field to the wearable device; after the wearable device acquires positioning information, it couples the depth of field with the positioning information to obtain an updated position.
Because the wearable device's positioning information comes from the GNSS module, whose positioning error is large (about 10 metres), such an error would greatly increase the difficulty of the rescue. Coupling the depth of field with the positioning information therefore improves the accuracy of the resulting position. The coupling may average, or take a weighted sum of, the first coordinates corresponding to the positioning information and the second coordinates corresponding to the depth of field, the weights being empirical values. For example, if the first coordinates are (110, 213) and the second coordinates are (120, 220), averaging gives (115, 216.5); if the first coordinates are (110, 210) and the second coordinates are (120, 220), with weights 0.4 and 0.6 respectively, the weighted sum gives (116, 216).
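The coupling step reduces to a per-coordinate weighted sum. The function name `couple` is a label of this sketch; the equal weights and the 0.4/0.6 weights come from the two worked examples in the text:

```python
# Fuse the GNSS fix (first coordinates) with the drone-derived position
# (second coordinates) by a weighted sum; the weights are empirical values.

def couple(gnss_xy, depth_xy, w_gnss=0.4, w_depth=0.6):
    assert abs(w_gnss + w_depth - 1.0) < 1e-9  # weights must sum to one
    return tuple(w_gnss * g + w_depth * d for g, d in zip(gnss_xy, depth_xy))

# Equal weights reproduce the averaging example in the text:
print(couple((110, 213), (120, 220), 0.5, 0.5))  # (115.0, 216.5)
# The default 0.4/0.6 weights reproduce the weighted-sum example:
print(couple((110, 210), (120, 220)))            # (116.0, 216.0)
```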
Illustratively, referring to FIG. 6, the method further comprises:
step S510, receiving an image of the environment where the life to be rescued is located, which is shot by the unmanned aerial vehicle.
Step S520, extracting face features of the life to be saved from the image, and identifying the life to be saved based on the face features.
The image is obtained by shooting through a camera arranged on the unmanned aerial vehicle. The extraction of the face features and the identification of the life to be rescued can be realized by algorithms such as CNN (Convolutional Neural Network ), eigenface (feature face method) and the like.
In many cases, the identity of the life body to be rescued is unknown to the rescuer, which makes it impossible for the rescuer to carry out a targeted rescue, for example, to determine whether the life body to be rescued has a companion or an underlying disease. After the identity of the life body to be rescued is obtained, rescue personnel can rescue the life body in a more targeted manner.
It should be noted that when rescue is assisted by an unmanned aerial vehicle in practical applications, the unmanned aerial vehicle can only acquire the position of a single positioning device (for example, a GPS module arranged in the wearable device), and when the unmanned aerial vehicle connects to the positioning device, the remote controller must serve as a relay. The execution subject of the embodiment of the present application runs the control software that controls the unmanned aerial vehicle; clearly, the communication mode between the unmanned aerial vehicle and the positioning device used in practice cannot realize the embodiment of the present application, namely displaying a plurality of beacons on a map.
Therefore, to solve this problem, in the embodiment of the present application the wearable device and the execution subject are connected to a target communication network. The execution subject includes, but is not limited to, terminal devices such as a mobile terminal and a vehicle-mounted terminal. Specifically, the wearable device sends the help-seeking related information through the target communication network; when the distance between the wearable device and the access point of the target communication network is greater than a preset distance threshold, the target communication network is a mobile communication network; when that distance is less than or equal to the preset distance threshold, the target communication network is WiFi. The mobile communication network includes the fourth-generation mobile communication technology network (4G), the fifth-generation mobile communication technology network (5G), and the like. The preset distance threshold is an empirical value, which can be determined by measuring the coverage range of the WiFi signal in various scenarios.
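The network-selection rule above can be sketched minimally as follows; the 50 m default threshold is an illustrative assumption, since the application leaves the threshold as an empirical value:

```python
def select_network(distance_to_hotspot_m, threshold_m=50.0):
    """Choose the target communication network for the wearable device.

    Within the (empirical) WiFi coverage threshold, prefer WiFi for its
    lower power draw; beyond it, fall back to the mobile network (4G/5G).
    """
    return "wifi" if distance_to_hotspot_m <= threshold_m else "mobile"

print(select_network(10.0))   # → wifi
print(select_network(200.0))  # → mobile
```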
For mobile communication networks, wireless devices access the network by establishing a connection with a base station.
Referring to fig. 7, in one embodiment of the application, for a WiFi network, a wireless device accesses the network by establishing a connection with a remote control device 703. The wireless devices in the embodiment of the application include a terminal device 704, an unmanned aerial vehicle 701, and a wearable device 702. That is, the remote control device 703 communicates with the terminal device 704, the unmanned aerial vehicle 701, and the wearable device 702 respectively to perform data transmission. It will be appreciated that the remote control device 703 is a signal transmitter used to control the unmanned aerial vehicle 701; the remote control device 703 includes a remote controller, a handle, and the like, and the terminal device 704 includes, but is not limited to, a vehicle-mounted terminal or a mobile terminal. WiFi is implemented based on WLAN (Wireless Local Area Network) technology, which interconnects the wireless devices, namely the remote control device 703, the unmanned aerial vehicle 701, and the wearable device 702, through a radio-wave-based wireless transmission technology. Specifically, the remote control device 703 serves as the center of the WiFi network, so that multiple wireless devices can access the network wirelessly within the hotspot range. When multiple wireless devices connect to the same WiFi hotspot, each sends a wireless data request to the remote control device 703, and after receiving the requests, the remote control device 703 assigns each wireless device a unique IP (Internet Protocol) address. Through these IP addresses, the wireless devices can interconnect, share network resources, and complete real-time communication.
It should be noted that when the wearable device 702 cannot connect to WiFi because it is far from the remote control device 703, it can transmit data to the execution subject of the embodiment of the present application through a signal base station using the 4G or 5G network, so that data transmission between the wearable device 702 and the execution subject is not interrupted. When the wearable device 702 is close enough to the remote control device 703 to connect to WiFi, data is transmitted over WiFi. It will be appreciated that a WiFi connection consumes less power than a mobile communication network connection. Considering that the battery capacity of the wearable device 702 is small, its power consumption should be reduced as much as possible to prolong its service time and thereby improve the success rate of rescue.
In addition, the present application also provides a life-saving device including:
the first acquisition module is used for acquiring the help-seeking related information sent by the wearable equipment; the help-seeking related information comprises sign information and/or position of a life body to be rescued corresponding to the wearable equipment;
the determining module is used for determining a rescue scheme aiming at the sign information and/or the position based on a preset decision mechanism; the body sign information is used for determining whether the life body to be rescued needs emergency treatment, and the position is used for determining whether the life body to be rescued needs auxiliary rescue or whether the life body to be rescued needs to be provided with materials.
Illustratively, the life support device further comprises:
and the pushing module is used for pushing the rescue assistance request to the terminal equipment within a preset distance so as to seek assistance of a user corresponding to the terminal equipment.
Illustratively, when the help-seeking related information includes at least the location and the life body to be rescued includes a plurality of life bodies, the life body rescue device further includes:
the second acquisition module is used for acquiring the map;
the display module is used for displaying beacons corresponding to the life bodies to be rescued on the map; each beacon is used for marking the position of the corresponding life body to be rescued on the map;
the first sending module is used for responding to the selection operation of the rescue personnel on each beacon, and sending a target position corresponding to a target beacon to the unmanned aerial vehicle, so that the unmanned aerial vehicle leads the rescue personnel to the target position; the target beacon is a beacon selected by the user through the selection operation.
Illustratively, when the help-seeking related information further includes the sign information, the display module includes:
a determining unit, configured to determine a rescue priority of each of the life bodies to be rescued based on the sign information and the location;
and the display unit is used for displaying the beacons corresponding to the life bodies to be rescued on the map in a distinguishing mode based on the rescue priority.
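The priority determination performed by the determining unit above could be sketched as follows; the scoring formula, the weights, and the victim data are illustrative assumptions, not specified by the application:

```python
def rescue_priority(heart_rate_bpm, distance_m):
    """Illustrative priority score: abnormal vital signs and greater distance
    from the rescuers both raise the priority (higher score = rescue sooner)."""
    vitals_penalty = abs(heart_rate_bpm - 75) / 75  # deviation from a resting norm
    distance_penalty = min(distance_m / 1000, 1.0)  # normalized, capped at 1 km
    return 0.7 * vitals_penalty + 0.3 * distance_penalty

# Each hypothetical victim: (heart rate in bpm, distance from rescuers in m).
victims = {"A": (40, 200), "B": (80, 900), "C": (130, 500)}
ordered = sorted(victims, key=lambda v: rescue_priority(*victims[v]), reverse=True)
print(ordered)  # → ['C', 'A', 'B']
```

Beacons could then be rendered in this order with distinct colors or sizes on the map to realize the differentiated display.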
Illustratively, when the help-seeking related information includes at least the location, after the location sent by the wearable device is received for the first time, the life body rescue device further includes:
the first receiving module is used for receiving an image of the environment where the life body to be rescued is located, which is shot by the unmanned aerial vehicle, and extracting the depth of field of the life body to be rescued from the image;
a second sending module, configured to send the depth of field to the wearable device; after the wearable device acquires the positioning information, the depth of field and the positioning information are coupled to obtain the updated position.
Illustratively, the life support device further comprises:
the second receiving module is used for receiving images of the environment where the life body to be rescued is located, which are shot by the unmanned aerial vehicle;
the extraction module is used for extracting the face characteristics of the life body to be rescued from the image and carrying out identity recognition on the life body to be rescued based on the face characteristics.
Illustratively, the wearable device sends the help-seeking related information through a target communication network;
when the distance between the target communication network and the wearable equipment is larger than a preset distance threshold value, the target communication network is a mobile communication network;
and when the distance between the target communication network and the wearable equipment is smaller than or equal to a preset distance threshold value, the target communication network is WiFi.
The specific implementation of the life-saving device is basically the same as the above-mentioned examples of the life-saving method, and will not be repeated here.
In addition, the application also provides electronic equipment. As shown in fig. 8, fig. 8 is a schematic structural diagram of a hardware running environment according to an embodiment of the present application.
Fig. 8 is an exemplary schematic diagram of a hardware operating environment of an electronic device.
As shown in fig. 8, the electronic device may include a processor 801, a communication interface 802, a memory 803, and a communication bus 804, where the processor 801, the communication interface 802, and the memory 803 complete communication with each other through the communication bus 804, and the memory 803 is used to store a computer program; the processor 801 is configured to execute the program stored in the memory 803 to implement the steps of the life support method.
The communication bus 804 of the electronic device may be a peripheral component interconnect standard (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. The communication bus 804 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or one type of bus.
The communication interface 802 is used for communication between the electronic device and other devices described above.
The memory 803 may include a random access memory (Random Access Memory, RAM) or a non-volatile memory (Non-Volatile Memory, NVM), such as at least one disk memory. Optionally, the memory 803 may also be at least one storage device located remotely from the processor 801.
The processor 801 may be a general-purpose processor including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
The specific implementation manner of the electronic device of the present application is basically the same as the above embodiments of the life-saving method, and will not be repeated here.
In addition, the embodiment of the application also provides a computer readable storage medium, wherein the computer readable storage medium stores a life body rescue program, and the life body rescue program realizes the steps of the life body rescue method when being executed by a processor.
The specific implementation of the computer readable storage medium of the present application is basically the same as the above embodiments of the life-saving method, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware, although in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method according to the embodiments of the present application.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the application, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (10)

1. A method of life support, the method comprising the steps of:
acquiring help-seeking related information sent by wearable equipment; the help-seeking related information comprises sign information and/or position of a life body to be rescued corresponding to the wearable equipment;
determining a rescue scheme for the sign information and/or the position based on a preset decision mechanism; the body sign information is used for determining whether the life body to be rescued needs emergency treatment, and the position is used for determining whether the life body to be rescued needs auxiliary rescue or whether the life body to be rescued needs to be provided with materials.
2. The life support method of claim 1, wherein the life support method further comprises:
and pushing the rescue assistance request to terminal equipment within a preset distance to seek assistance of a user corresponding to the terminal equipment.
3. The method of life support according to claim 1,
when the help-seeking related information at least comprises the position and the life body to be rescued comprises a plurality of life bodies, the life body rescue method further comprises:
acquiring a map;
displaying beacons corresponding to the life bodies to be rescued on the map; each beacon is used for marking the position of the corresponding life body to be rescued on the map;
responding to the selection operation of the rescue personnel on each beacon, and sending a target position corresponding to a target beacon to the unmanned aerial vehicle, so that the unmanned aerial vehicle leads the rescue personnel to the target position; the target beacon is a beacon selected by the user through the selection operation.
4. A method of life support according to claim 3,
when the help-seeking related information further includes the sign information, the displaying, on the map, a beacon corresponding to each life body to be rescued includes:
determining the rescue priority of each life body to be rescued based on the sign information and the position;
and based on the rescue priority, distinguishing and displaying beacons corresponding to the life bodies to be rescued on the map.
5. The method of life support according to claim 1,
after the help-seeking related information at least comprises the position and the position sent by the wearable device is received for the first time, the life-saving method further comprises the following steps:
receiving an image of the environment where the life body to be rescued is located, which is shot by the unmanned aerial vehicle, and extracting the depth of field of the life body to be rescued from the image;
transmitting the depth of field to the wearable device; after the wearable device acquires the positioning information, the depth of field and the positioning information are coupled to obtain the updated position.
6. The life-saving method according to claim 1, wherein the life body to be rescued is a person, the life-saving method further comprising:
receiving an image of the environment where the person is located, which is shot by the unmanned aerial vehicle;
and extracting the face characteristics of the person from the image, and identifying the person based on the face characteristics.
7. The life-saving method of claim 1, wherein the wearable device transmits the help-seeking related information over a target communication network;
when the distance between the target communication network and the wearable equipment is larger than a preset distance threshold value, the target communication network is a mobile communication network;
and when the distance between the target communication network and the wearable equipment is smaller than or equal to a preset distance threshold value, the target communication network is WiFi.
8. A life-saving device, characterized in that the life-saving device comprises:
the first acquisition module is used for acquiring the help-seeking related information sent by the wearable equipment; the help-seeking related information comprises sign information and/or position of a life body to be rescued corresponding to the wearable equipment;
the determining module is used for determining a rescue scheme aiming at the sign information and/or the position based on a preset decision mechanism; the physical sign information is used for determining whether the life body to be rescued needs emergency treatment or not, and the position is used for determining whether the life body to be rescued needs auxiliary rescue or not.
9. An electronic device, the electronic device comprising: a memory, a processor, and a living body rescue program stored on the memory and executable on the processor, the living body rescue program configured to implement the steps of the living body rescue method of any one of claims 1 to 7.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a life support program, which when executed by a processor, implements the steps of the life support method according to any one of claims 1 to 7.
CN202310514195.0A 2023-05-08 2023-05-08 Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium Pending CN116721744A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310514195.0A CN116721744A (en) 2023-05-08 2023-05-08 Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN116721744A true CN116721744A (en) 2023-09-08

Family

ID=87866874

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310514195.0A Pending CN116721744A (en) 2023-05-08 2023-05-08 Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116721744A (en)

Similar Documents

Publication Publication Date Title
AU2019219744B2 (en) Systems and methods for monitoring on-route transportations
JP6771193B2 (en) Mobile call system and mobile dedicated device
US10856127B2 (en) Method and system for an emergency location information service (E-LIS) for water-based network devices
CN107810452B (en) The system and method that remote distributed for UAV controls
CN107064978B (en) First aid helicopter navigation positioning system and its navigation locating method based on Beidou
CN109788242A (en) Rescue system, rescue mode and its used server
CN111522030B (en) Mountain area missing person search and rescue system based on unmanned aerial vehicle group and Beidou positioning
JP2018094983A (en) Flying device, reporting method and program
KR20120037633A (en) Climbing support system and method
KR101334672B1 (en) Method and device for rescue service
JP2001109977A (en) Rescue system
CN107529136B (en) Traffic safety service method and system based on shared information
CN113836342A (en) Search and rescue method and system based on Beidou short message and image recognition technology
CN116721744A (en) Life entity rescue method, life entity rescue device, electronic equipment and readable storage medium
JP4669680B2 (en) Mobile terminal and mobile object display system
CN108124246B (en) People stream positioning method and system for shopping mall
CN207252893U (en) A kind of intelligence first aid helmet
CN110840421A (en) SOS mutual rescue system
CN212940234U (en) Wheel chair
CN112437109A (en) Health cloud guarding system
JP2007133718A (en) Emergency reporting system
Magar et al. Ambuitec: ambulance booking application for emergency health response, blood inventory
US20230058208A1 (en) Mobility support device, information processing method, and storage medium
JP7338550B2 (en) relief system
KR102574463B1 (en) Traffic accident reporting system and method using application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination