WO2021213737A1 - Automatic navigation system for fire fighting robot - Google Patents

Automatic navigation system for fire fighting robot

Info

Publication number
WO2021213737A1
WO2021213737A1 (PCT/EP2021/056767)
Authority
WO
WIPO (PCT)
Prior art keywords
main body
robot main
yaw angle
environment image
destination
Prior art date
Application number
PCT/EP2021/056767
Other languages
English (en)
Inventor
Zhan Bin YANG
Jiang Bo LIU
Zhao Jun SUN
Qi Yu
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Publication of WO2021213737A1

Classifications

    • G – PHYSICS
    • G05 – CONTROLLING; REGULATING
    • G05D – SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 – Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0011 – Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D 1/0038 – Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G – PHYSICS
    • G05 – CONTROLLING; REGULATING
    • G05D – SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 – Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 – Control of position or course in two dimensions
    • G05D 1/021 – Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 – Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 – Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • A – HUMAN NECESSITIES
    • A62 – LIFE-SAVING; FIRE-FIGHTING
    • A62C – FIRE-FIGHTING
    • A62C 27/00 – Fire-fighting land vehicles

Definitions

  • the present invention mainly relates to the field of robots, in particular to a navigation method and navigation apparatus for a fire fighting robot.
  • Fire fighting robots can assist fire fighting personnel in the process of extinguishing fires, and are generally arranged close to the site of a fire to perform a fire extinguishing operation.
  • When a fire occurs, a fire fighter must operate a remote controller manually in order to move the robot to the best fire extinguishing position.
  • the best fire extinguishing position is generally quite far from the operator, and the mobility of the fire fighting operation falls significantly in a manual control mode.
  • Robots having automatic navigation functionality have been developed in the industry; in these robots, an outdoor mobile platform generally carries a series of sensors, such as a binocular vision camera, radar and a global positioning system (GPS), in order to realize navigation based on SLAM (simultaneous localization and mapping).
  • the present invention provides a navigation method and navigation apparatus for a fire fighting robot, to realize automatic navigation of a fire fighting robot, adapt to rapidly changing fire situations, and improve fire extinguishing efficiency.
  • the present invention proposes a navigation method for a fire fighting robot, the fire fighting robot comprising a robot main body capable of moving to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation, and a remote controller wirelessly connected to the robot main body, the robot main body having a binocular vision camera and an attitude sensor, and the remote controller having an input part and a display;
  • the navigation method comprising: acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor; wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; and controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.
  • automatic navigation of the fire fighting robot is achieved by an operator selecting a destination in an environment image acquired by a binocular vision camera, in conjunction with a yaw angle sensed by an attitude sensor, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation, to improve fire extinguishing efficiency.
  • the step of calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision according to screen coordinates of the destination, the first environment image and the second environment image.
  • a target yaw angle of the destination can be calculated by the principle of binocular vision of the binocular vision camera, thereby realizing control of movement of the robot main body to the destination, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation, to improve fire extinguishing efficiency.
  • the step of controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.
  • the navigation method further comprises: detecting whether an obstacle is present in the environment in which the robot main body is currently travelling; and causing the robot main body to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling.
  • the robot main body is caused to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling; it is thus possible to prevent the robot main body from striking obstacles and being damaged, to improve the stability of the fire fighting robot.
  • the navigation method further comprises: detecting the ambient temperature of the environment in which the robot main body is currently travelling, and causing the robot main body to stop moving when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature.
  • when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, the robot main body is caused to stop moving; it is thus possible to prevent damage to the robot main body due to an excessively high temperature, and to improve the stability of the fire fighting robot.
  • the present invention also proposes a navigation apparatus for a fire fighting robot, the fire fighting robot comprising a robot main body capable of moving to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation, and a remote controller wirelessly connected to the robot main body, the robot main body having a binocular vision camera and an attitude sensor, and the remote controller having an input part and a display;
  • the navigation apparatus comprising: an acquisition unit, for acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor; wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; an input determining unit, for sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; a calculating unit, for calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; and a control unit, for controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.
  • the step of the calculating unit calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision according to screen coordinates of the destination, the first environment image and the second environment image.
  • the step of the control unit controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.
  • the navigation apparatus further comprises an obstacle detection unit; the obstacle detection unit detects whether an obstacle is present in the environment in which the robot main body is currently travelling, and upon detecting that an obstacle is present in the environment in which the robot main body is currently travelling, causes the robot main body to stop moving.
  • the navigation apparatus further comprises a temperature detection unit; the temperature detection unit detects the ambient temperature of the environment in which the robot main body is currently travelling, and upon detecting that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, causes the robot main body to stop moving.
  • the present invention also proposes a fire fighting robot, comprising a processor, a memory, and a computer program that is stored on the memory and capable of being run on the processor; when executed by the processor, the computer program implements the fire fighting robot navigation method described above.
  • the present invention also proposes a computer readable medium, wherein a computer program is stored on the computer readable storage medium; when executed by a processor, the computer program implements the fire fighting robot navigation method described above.
  • Fig. 1 is a three-dimensional structural schematic diagram of a robot main body according to an embodiment of the present invention.
  • Fig. 2 is a functional block diagram of a fire fighting robot according to an embodiment of the present invention.
  • Fig. 3 is a flow chart of a fire fighting robot navigation method according to an embodiment of the present invention.
  • Fig. 4 is a block diagram of a fire fighting robot navigation apparatus according to an embodiment of the present invention.
  • Fig. 1 is a three-dimensional structural schematic diagram of a robot main body 100 according to an embodiment of the present invention.
  • the robot main body 100 can move to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation.
  • the robot main body 100 comprises a support platform 101, a motion part 102, a first support 103, a water cannon 104, a second support 105, a binocular vision camera 106, an obstacle sensor 107 and an infrared temperature sensor 108.
  • the support platform 101 is configured to provide support for part of the structure of the robot main body 100, e.g. the water cannon 104 and the binocular vision camera 106, etc.
  • the motion part 102 is configured to cause the robot main body 100 to move; the motion part 102 may be a continuous track driven by a drive electric motor, as shown in Fig. 1, or a motion wheel driven by a drive electric motor.
  • the water cannon 104 is disposed on the support platform 101 by means of the first support 103; the first support 103 can move under the control of a water cannon controller (not shown in the figure), thereby adjusting the direction of the water cannon 104, so that the water cannon sprays water towards different fire sites.
  • the binocular vision camera 106 is disposed on the support platform 101 by means of the second support 105.
  • the binocular vision camera 106 can capture a first environment image and a second environment image of an environment in which the robot main body 100 is currently travelling; the first environment image and second environment image are binocular vision images.
  • the obstacle sensor 107 is disposed at a front end of the support platform 101, and is configured to detect whether an obstacle is present in the environment in which the robot main body 100 is currently travelling.
  • the infrared temperature sensor 108 is disposed at the front end of the support platform 101, above the obstacle sensor 107, and is configured to detect the ambient temperature of the environment in which the robot main body 100 is currently travelling.
  • Fig. 2 is a functional block diagram of a fire fighting robot according to an embodiment of the present invention.
  • the robot comprises the robot main body 100 and a remote controller 200.
  • the remote controller 200 and the robot main body 100 are separated by a certain distance, and can communicate via a wireless connection.
  • the distance can be such that the robot main body 100 remains within the visual range of the remote controller 200.
  • the wireless connection may be a Bluetooth connection, infrared connection or near field communication connection, etc.
  • the robot main body 100 may have the three-dimensional structure shown in Fig. 1.
  • the robot main body 100 also has a processor 109, a transceiver 110, an attitude sensor 111, an electric motor driver 112 and an electric motor 113.
  • the processor 109 may be a single-core processor, a multi-core processor, or a processor group formed of multiple processors, with the multiple processors being connected to each other via a bus.
  • the processor 109 may further comprise a graphics processing unit, for processing images and video.
  • the transceiver 110 is configured to send image data to the remote controller 200 and receive instructions from the remote controller 200.
  • the attitude sensor 111 is configured to sense a yaw angle of the robot main body 100.
  • the attitude sensor 111 may be an inertial measurement unit.
  • the electric motor driver 112 is configured to control the electric motor 113 according to an output signal of the processor 109, so that the robot main body 100 moves in different directions.
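As an illustration of how such a drive chain might behave, the following sketch maps a forward-speed and yaw-rate command, of the kind the processor 109 could output, onto per-track speeds of a differential-drive (tracked) platform. The function name, the interface and the track width are assumptions for illustration, not details from the patent.

```python
def track_speeds(forward: float, yaw_rate: float,
                 track_width: float = 0.6) -> tuple[float, float]:
    """Map a forward speed (m/s) and yaw rate (rad/s, positive = turn left)
    to left/right track speeds of a differential-drive platform.
    The 0.6 m track width is an illustrative assumption."""
    left = forward - yaw_rate * track_width / 2.0
    right = forward + yaw_rate * track_width / 2.0
    return left, right

# Example: creep forward at 0.5 m/s while yawing left at 0.3 rad/s.
left, right = track_speeds(0.5, 0.3)   # -> (0.41, 0.59)
```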
  • the remote controller 200 has an input part 201; an operator can input instructions via the input part 201.
  • the input part 201 may be a physical button or a virtual button for receiving an action of the operator, or a microphone for receiving a speech sound of the operator.
  • the remote controller 200 also has a display 202, configured to display received image data.
  • the display 202 may be a liquid crystal display, a light emitting diode display or an organic light emitting diode display, etc.
  • Fig. 3 is a flow chart of a fire fighting robot navigation method 300 according to an embodiment of the present invention.
  • the navigation method 300 can control the fire fighting robot shown in Figs. 1 and 2.
  • the fire fighting robot navigation method 300 in this embodiment comprises:
  • Step 310: acquiring a first environment image and a second environment image of an environment in which a robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by an attitude sensor.
  • a binocular vision camera 106 is provided on the robot main body 100; the binocular vision camera 106 captures a first environment image and a second environment image of the environment in which the robot main body 100 is currently travelling. This step acquires the first environment image and second environment image.
  • the first environment image and second environment image are binocular vision images captured by the binocular vision camera 106; the first environment image and second environment image have regions which overlap each other.
  • the first environment image and second environment image of the environment in which the robot main body 100 is currently travelling are views of a region in front of the robot main body 100.
  • the binocular vision camera 106 may be a high-definition binocular vision camera; correspondingly, high-definition images captured by the high-definition binocular vision camera can be acquired.
  • the step of acquiring the first environment image and second environment image, captured by the binocular vision camera 106, of the environment in which the robot main body 100 is currently travelling may further comprise subjecting the first environment image and second environment image to image processing.
  • the image processing to which the first environment image and second environment image are subjected may be noise reduction, enhancement, sharpening or stitching, etc.
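The patent does not fix a particular image processing pipeline; the following is a minimal sketch of the named operations (noise reduction, enhancement and sharpening) using OpenCV, assuming 8-bit BGR frames from the camera. Stitching is omitted.

```python
import cv2
import numpy as np

def preprocess(image: np.ndarray) -> np.ndarray:
    """Illustrative pipeline: denoise, enhance contrast, then sharpen."""
    # Noise reduction (non-local means on a colour image).
    denoised = cv2.fastNlMeansDenoisingColored(image, None, 10, 10, 7, 21)
    # Enhancement: equalize the luma channel only, leaving colour intact.
    ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    enhanced = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
    # Sharpening by unsharp masking: subtract a blurred copy to
    # boost the high-frequency detail.
    blurred = cv2.GaussianBlur(enhanced, (0, 0), sigmaX=3)
    return cv2.addWeighted(enhanced, 1.5, blurred, -0.5, 0)
```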
  • An attitude sensor 111 is also provided on the robot main body 100; the attitude sensor 111 senses the real-time yaw angle of the robot main body 100. This step acquires the real-time yaw angle.
  • the attitude sensor 111 may be an inertial measurement unit (IMU).
  • Step 320: sending the first environment image to a display to be displayed, and determining a destination selected in the first environment image by an operator via an input part.
  • After the first environment image and second environment image, captured by the binocular vision camera 106, of the environment in which the robot main body 100 is currently travelling have been acquired in step 310, the first environment image is sent via a wireless connection to the remote controller 200, and displayed on the display 202 of the remote controller 200. After viewing the first environment image displayed on the display 202, the operator selects in the first environment image the destination to which the robot main body is to move.
  • the input part 201 may be a physical button, a virtual button or a microphone; correspondingly, the operator can select the destination to which the robot main body is to move by pressing the physical button, touching the virtual button or issuing a speech command, respectively.
  • the operator may be a fire fighter.
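A minimal sketch of the remote controller side of step 320 follows: a tap on the displayed first environment image is scaled from display coordinates to image pixel coordinates and sent back over the wireless link. The `link` transport and the message format are assumptions for illustration.

```python
def on_tap(tap_x: int, tap_y: int, screen_w: int, screen_h: int,
           img_w: int, img_h: int, link) -> None:
    """Convert a touch position on the display into pixel coordinates of
    the first environment image and send them to the robot main body."""
    u = int(tap_x * img_w / screen_w)
    v = int(tap_y * img_h / screen_h)
    link.send({"type": "destination", "u": u, "v": v})
```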
  • Step 330: calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image.
  • In step 320, the destination of the robot main body 100 was determined; in this step, the target yaw angle of the destination and the distance from the robot main body to the destination are calculated according to the screen coordinates of the destination, the first environment image and the second environment image.
  • the step of calculating the target yaw angle of the destination and the distance from the robot main body to the destination according to the screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision.
  • After the operator has selected the destination in the first environment image via the input part 201, the remote controller 200 sends the screen coordinates of the destination to the robot main body 100, and the robot main body 100 calculates a target yaw angle A of the destination and the distance from the robot main body 100 to the destination according to the screen coordinates of the destination, the first environment image and the second environment image, based on the principle of binocular vision.
  • For example, the target yaw angle of the destination may be obtained as 30 degrees east of north, and the distance from the robot main body 100 to the destination as 20 metres.
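The calculation in step 330 can be sketched as follows under standard simplifying assumptions: a calibrated, rectified stereo pair, with the matching pixel in the second image already found (e.g. by block matching along the epipolar line). All names and parameters are illustrative, not from the patent.

```python
import math

def target_from_pixels(u_left: float, u_right: float,
                       cx: float, f: float, b: float) -> tuple[float, float]:
    """Return (relative yaw in degrees, distance in metres) to the point
    selected at column u_left in the first image and matched at column
    u_right in the second image.

    cx: principal point column (pixels), f: focal length (pixels),
    b: stereo baseline (metres). Vertical offset is ignored."""
    disparity = u_left - u_right          # pixels; > 0 for a valid match
    depth = f * b / disparity             # Z: distance along the optical axis
    lateral = (u_left - cx) * depth / f   # X: offset to the right of the axis
    yaw = math.degrees(math.atan2(lateral, depth))
    return yaw, math.hypot(lateral, depth)
```

The returned yaw is relative to the camera's optical axis; added to the robot's current heading, it yields an absolute target yaw such as the 30 degrees east of north in the example above.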
  • Step 340: controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.
  • After the target yaw angle and the distance have been calculated in step 330, movement of the robot main body to the destination is controlled in conjunction with the real-time yaw angle acquired in step 310.
  • the step of controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.
  • the yaw angle sensed by the attitude sensor 111 at the initial moment is an initial yaw angle B.
  • For example, the attitude sensor 111 senses the initial yaw angle B as due east, i.e. 90 degrees east of north.
  • the difference value between the target yaw angle A and initial yaw angle B is determined; the difference value is sent to an electric motor driver 112, and the electric motor driver 112 drives an electric motor 113 to rotate left or right so as to adjust the difference value.
  • In this example, the difference value between the initial yaw angle B (90 degrees east of north) and the target yaw angle A (30 degrees east of north) is 60 degrees, so the electric motor driver 112 drives the electric motor 113 to yaw 60 degrees to the left, and travelling begins.
  • a real-time yaw angle C of the robot main body 100 that is sensed by the attitude sensor 111 is acquired, the difference value C - B between the real-time yaw angle C and the initial yaw angle B is calculated in real time, and if the difference value C - B is equal to the target yaw angle A, the robot main body 100 is kept travelling in a straight line.
  • the electric motor driver 112 drives the electric motor 113 to perform adjustment until the difference value C - B is equal to the target yaw angle A, at which time the robot main body 100 is kept travelling in a straight line, until the robot main body 100 reaches the vicinity of the destination, i.e. the distance between the robot main body 100 and the destination is less than a preset value.
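The control law just described can be summarised in the following sketch, where A is the target yaw angle, B the initial yaw angle and C the real-time yaw angle; the `imu`, `drive` and `range_to_destination` interfaces are assumptions for illustration.

```python
import time

def navigate(A: float, imu, drive, range_to_destination,
             yaw_tol: float = 2.0, stop_dist: float = 0.5) -> None:
    """Turn until the yaw change since the start equals the target yaw
    angle A, then keep travelling in a straight line until the robot is
    within stop_dist of the destination."""
    B = imu.yaw()                        # initial yaw angle
    while range_to_destination() > stop_dist:
        C = imu.yaw()                    # real-time yaw angle
        error = A - (C - B)              # remaining heading correction
        if abs(error) > yaw_tol:
            drive.turn(error)            # e.g. yaw 60 degrees left, as above
        else:
            drive.forward()              # C - B equals A: go straight
        time.sleep(0.05)
    drive.stop()
```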
  • An optional scenario comprises detecting whether an obstacle is present in the environment in which the robot main body is currently travelling; and causing the robot main body to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling. It is possible to detect whether an obstacle is present in the environment in which the robot main body is currently travelling by means of an obstacle sensor 107 disposed on the robot main body 100, and cause the robot main body to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling, so as to prevent the robot main body 100 from striking obstacles and being damaged.
  • When it is detected that the obstacle in the environment in which the robot main body is currently travelling has been removed, the robot main body is caused to begin moving again.
  • the obstacle in the environment in which the robot main body is currently travelling may be removed automatically or manually.
  • the ambient temperature of the environment in which the robot main body is currently travelling is detected; and when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, the robot main body is caused to stop moving.
  • the ambient temperature of the environment in which the robot main body is currently travelling may be detected by means of an infrared temperature sensor 108 disposed on the robot main body 100, and when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, the robot main body is caused to stop moving, so as to prevent damage to the robot main body 100 due to an excessively high temperature.
  • When it is detected that the ambient temperature of the environment in which the robot main body is currently travelling has fallen below the alarm temperature, the robot main body is caused to begin moving again.
  • the ambient temperature of the environment in which the robot main body is currently travelling may fall below the alarm temperature of its own accord, or fall below the alarm temperature due to manual intervention.
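A combined sketch of the two safeguards (obstacle stop/resume and temperature stop/resume) might look as follows; the sensor and drive interfaces are assumed, since the patent specifies only the stop and resume conditions.

```python
import time

def safety_monitor(obstacle_sensor, temp_sensor, drive,
                   alarm_temp: float) -> None:
    """Stop while an obstacle is present or the ambient temperature
    exceeds the alarm temperature; resume once both conditions clear."""
    stopped = False
    while True:
        blocked = obstacle_sensor.detects_obstacle()
        too_hot = temp_sensor.ambient() > alarm_temp
        if (blocked or too_hot) and not stopped:
            drive.stop()                 # halt before striking the obstacle
            stopped = True
        elif stopped and not blocked and not too_hot:
            drive.resume()               # obstacle removed / temperature fell
            stopped = False
        time.sleep(0.1)
```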
  • This embodiment of the present invention provides a fire fighting robot navigation method, in which automatic navigation of the fire fighting robot is achieved by an operator selecting a destination in an environment image acquired by a binocular vision camera, in conjunction with a yaw angle sensed by an attitude sensor, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation, to improve fire extinguishing efficiency.
  • a flow chart has been used here to explain the operations performed in the method according to an embodiment of the present application. It should be understood that the abovementioned operations are not necessarily performed precisely in order. On the contrary, the various steps may be processed in reverse order or simultaneously. Moreover, other operations may be added to these processes, or one or more operation steps may be removed from these processes.
  • Fig. 4 is a block diagram of a fire fighting robot navigation apparatus 400 according to an embodiment of the present invention.
  • the navigation apparatus 400 can control the fire fighting robot shown in Figs. 1 and 2.
  • the navigation apparatus 400 in this embodiment comprises: an acquisition unit 410, for acquiring a captured first environment image and a captured second environment image of an environment in which a robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by an attitude sensor; wherein the first environment image and second environment image are binocular vision images captured by a binocular vision camera; an input determining unit 420, for sending the first environment image to a display to be displayed, and determining a destination selected in the first environment image by an operator via an input part; a calculating unit 430, for calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; and a control unit 440, for controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.
  • the step of the calculating unit 430 calculating the target yaw angle of the destination and the distance from the robot main body to the destination according to the screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle of the destination and the distance from the robot main body to the destination on the basis of the principle of binocular vision according to the screen coordinates of the destination, the first environment image and the second environment image.
  • the step of the control unit 440 controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.
  • the navigation apparatus 400 further comprises an obstacle detection unit 450; the obstacle detection unit 450 detects whether an obstacle is present in the environment in which the robot main body is currently travelling, and upon detecting that an obstacle is present in the environment in which the robot main body is currently travelling, causes the robot main body to stop moving.
  • the navigation apparatus 400 further comprises a temperature detection unit 460; the temperature detection unit 460 detects the ambient temperature of the environment in which the robot main body is currently travelling, and upon detecting that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, causes the robot main body to stop moving.
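Putting the four units and the two detection units together, an illustrative skeleton of the navigation apparatus 400 might look as follows; the class layout and all method names are assumptions, with only the unit boundaries taken from the description above.

```python
class NavigationApparatus:
    """Skeleton mirroring units 410-460; all interfaces are assumed."""

    def __init__(self, camera, attitude_sensor, remote, drive):
        self.camera, self.imu = camera, attitude_sensor
        self.remote, self.drive = remote, drive

    def acquire(self):                           # acquisition unit 410
        left, right = self.camera.capture_pair()
        return left, right, self.imu.yaw()

    def determine_destination(self, left):       # input determining unit 420
        self.remote.show(left)
        return self.remote.wait_for_selection()  # screen coordinates (u, v)

    def calculate(self, uv, left, right):        # calculating unit 430
        ...                                      # triangulation, as in step 330

    def control(self, target_yaw, distance):     # control unit 440
        ...                                      # yaw-difference loop, step 340

    def monitor(self, obstacle_sensor, temp_sensor, alarm_temp):
        ...                                      # units 450/460: stop and resume
```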
  • For details of the operation of the navigation apparatus 400, the description of the navigation method 300 can be referred to; no further description is given here.
  • the present invention also proposes a fire fighting robot, comprising a processor, a memory, and a computer program that is stored on the memory and capable of being run on the processor; when executed by the processor, the computer program implements the fire fighting robot navigation method described above.
  • the present invention also proposes a computer readable medium, with a computer program being stored on the computer readable storage medium; when executed by a processor, the computer program implements the fire fighting robot navigation method described above.
  • Some aspects of the method and apparatus of the present invention may be implemented wholly by hardware, wholly by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software. All of the above hardware or software may be referred to as a "data block", "module", "engine", "unit", "component" or "system".
  • the processor may be one or more of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or a combination thereof.
  • computer readable media may comprise, but are not limited to, magnetic storage devices (e.g. hard disks, floppy disks, magnetic tapes, etc.), optical disks (e.g. compact disks (CD), digital versatile disks (DVD), etc.), smart cards and flash memory devices (e.g. cards, sticks, key drives).
  • the computer readable medium might comprise a propagation data signal containing computer program code, e.g. on a baseband or as part of a carrier wave.
  • the propagation signal might be manifested in many forms, including electromagnetic and optical forms, etc., or a suitable combined form.
  • the computer readable medium may be any computer readable medium other than a computer readable storage medium, and may be connected to an instruction execution system, apparatus or device to realize communication, propagation or transmission of a program for use.
  • Program code located on the computer readable medium may be propagated via any suitable medium, including radio, cable, optic fibre cable, RF signal or a similar medium, or a combination of any of the above media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Emergency Management (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Public Health (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a navigation method for a fire fighting robot, the navigation method comprising: acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor, wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; and controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.
PCT/EP2021/056767 2020-04-22 2021-03-17 Automatic navigation system for fire fighting robot WO2021213737A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010322223.5A CN113552868A (zh) 2020-04-22 2020-04-22 Navigation method and navigation apparatus for fire fighting robot
CN202010322223.5 2020-04-22

Publications (1)

Publication Number Publication Date
WO2021213737A1 (fr)

Family

ID=75143608

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/056767 WO2021213737A1 (fr) Automatic navigation system for fire fighting robot

Country Status (2)

Country Link
CN (1) CN113552868A (fr)
WO (1) WO2021213737A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114296470A (zh) * 2022-01-21 2022-04-08 河南牧原智能科技有限公司 Robot navigation method, apparatus and medium
CN117140536A (zh) * 2023-10-30 2023-12-01 北京航空航天大学 Robot control method and apparatus, and robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001084260A2 (fr) * 2000-05-01 2001-11-08 Irobot Corporation Method and system for remote control of a mobile robot
CN108815754A (zh) * 2018-06-20 2018-11-16 中国船舶重工集团应急预警与救援装备股份有限公司 Hydraulically driven intelligent fire extinguishing and reconnaissance robot
US20190011921A1 (en) * 2015-09-15 2019-01-10 SZ DJI Technology Co., Ltd. Systems and methods for uav interactive instructions and control
CN110860057A (zh) * 2019-11-18 2020-03-06 燕山大学 Fire fighting reconnaissance robot and reconnaissance method
WO2020076610A1 (fr) * 2018-10-08 2020-04-16 R-Go Robotics Ltd. System and method for geometric user interactions via three-dimensional mapping

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150109781A (ko) * 2014-03-21 2015-10-02 국방과학연구소 Cooperative method for target position estimation by robots carrying distributed heterogeneous sensors, and robot therefor
CN108205315A (zh) * 2016-12-19 2018-06-26 广东技术师范学院 Robot automatic navigation method based on binocular vision
CN108536145A (zh) * 2018-04-10 2018-09-14 深圳市开心橙子科技有限公司 Robot system for intelligent following using machine vision, and operating method thereof
CN109582038B (zh) * 2018-12-28 2021-08-27 中国兵器工业计算机应用技术研究所 Unmanned aerial vehicle path planning method
CN110597272A (zh) * 2019-10-23 2019-12-20 安徽理工大学 Intelligent unmanned forklift system and method based on visual navigation
CN110898353A (zh) * 2019-12-09 2020-03-24 国网智能科技股份有限公司 Panoramic monitoring and linkage control method and system for a substation fire fighting robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001084260A2 (fr) * 2000-05-01 2001-11-08 Irobot Corporation Method and system for remote control of a mobile robot
US20190011921A1 (en) * 2015-09-15 2019-01-10 SZ DJI Technology Co., Ltd. Systems and methods for uav interactive instructions and control
CN108815754A (zh) * 2018-06-20 2018-11-16 中国船舶重工集团应急预警与救援装备股份有限公司 Hydraulically driven intelligent fire extinguishing and reconnaissance robot
WO2020076610A1 (fr) * 2018-10-08 2020-04-16 R-Go Robotics Ltd. System and method for geometric user interactions via three-dimensional mapping
CN110860057A (зh) * 2019-11-18 2020-03-06 燕山大学 Fire fighting reconnaissance robot and reconnaissance method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114296470A (zh) * 2022-01-21 2022-04-08 河南牧原智能科技有限公司 Robot navigation method, apparatus and medium
CN117140536A (zh) * 2023-10-30 2023-12-01 北京航空航天大学 Robot control method and apparatus, and robot
CN117140536B (zh) * 2023-10-30 2024-01-09 北京航空航天大学 Robot control method and apparatus, and robot

Also Published As

Publication number Publication date
CN113552868A (zh) 2021-10-26

Similar Documents

Publication Publication Date Title
US11573562B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
AU2019208265B2 (en) Moving robot, method for controlling the same, and terminal
US11347217B2 (en) User interaction paradigms for a flying digital assistant
EP2972462B1 (fr) Digital modem function for tracking using an autonomous aerial robot
US9481087B2 (en) Robot and control method thereof
CN110174903B (zh) 用于在环境内控制可移动物体的***和方法
WO2018218516A1 (fr) Procédé et appareil de planification d'itinéraire de retour de véhicule aérien sans pilote
US20180164801A1 (en) Method for operating unmanned aerial vehicle and electronic device for supporting the same
CN111033561A (zh) 用于利用语义信息来导航机器人设备的***和方法
WO2021213737A1 (fr) Automatic navigation system for fire fighting robot
TWI686686B Control method and apparatus for an aircraft
JP6302660B2 Information acquisition system and unmanned aerial vehicle control device
WO2016168722A1 (fr) Magic wand interface and other user interaction paradigms for a flying digital assistant
JP5595030B2 Mobile body control system, control device, control method, program and recording medium
KR20200015880A Station apparatus and mobile robot system
KR20170086482A Control device and control method for a flying bot
US10232519B2 (en) Robot and method of controlling the same
EP2523062B1 (fr) Time-phased imagery for an artificial point of view
JP6025814B2 Operating device and autonomous movement system
CN108629842B Method and device for providing motion information of, and controlling the motion of, unmanned equipment
US12007763B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
JP5969903B2 Method for controlling an unmanned mobile body
WO2012096282A1 (fr) Control device, model device and control method
KR101874212B1 Mobile body capable of moving along a backtracked path, and radio control device for wirelessly controlling it
CN109050404A Wide-viewing-angle imaging monitoring and automatic walking system for a vehicle with a robotic arm

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21713607

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21713607

Country of ref document: EP

Kind code of ref document: A1