WO2021196529A1 - Air-ground cooperative intelligent inspection robot and inspection method - Google Patents
- Publication number
- WO2021196529A1 (PCT/CN2020/115072)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- sensor
- uav
- air
- ground
- Prior art date
Classifications
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
- B25J19/02—Sensing devices
- B25J19/023—Optical sensing devices including video camera means
- B25J19/061—Safety devices with audible signals
Definitions
- The invention belongs to the technical field of robotics and specifically relates to an air-ground cooperative intelligent inspection robot and an inspection method.
- Inspection robots at this stage are generally single-robot-type equipment, namely unmanned ground vehicles and unmanned aerial vehicles, and their intelligence is limited. They mainly have the following shortcomings:
- A single unmanned ground vehicle is restricted by its own working principle: it demands a smooth road surface and cannot inspect rugged roads, stairways, or high-altitude environments. A single inspection drone is limited by its endurance: its flight time and payload capacity are very limited, so it cannot carry multiple sensors or fly long distances.
- When performing a task, an inspection drone still needs an operator to transport it to the designated site.
- This inspection method therefore remains inefficient and wastes human and material resources.
- The difficulty of path optimization is maintaining search speed while keeping the path as close to optimal as possible.
- Current path-planning techniques usually do not incorporate the robot's kinematic constraints into global path planning, so real-time performance cannot be guaranteed while the robot actually operates.
- In air-ground cooperative path planning, a key difficulty is ensuring mutual cooperation and satisfying mutual constraints while keeping each vehicle's own path feasible.
- The technical difficulty of local path planning is to track the global path as closely as possible while avoiding obstacles based on real-time sensor data.
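Global path planning of the kind discussed above is commonly built on grid search such as A*. The sketch below is a minimal, illustrative A* planner on an occupancy grid; it deliberately omits the kinematic constraints that the text identifies as the open problem, and the 4-connected grid and unit step cost are assumptions for illustration.

```python
import heapq

def astar(grid, start, goal):
    """Plain A* on an occupancy grid (0 = free, 1 = obstacle).

    4-connected, unit step cost, Manhattan-distance heuristic.
    Returns the path as a list of (row, col) cells, or None.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                  # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                       # reconstruct path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None
```

A local planner would then refine segments of this global path against real-time sensor data, as the text describes.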
- An object of the present invention is to provide an air-ground collaborative intelligent inspection robot for safety inspection work in the chemical industry.
- An air-ground collaborative intelligent inspection robot includes a robot platform and an unmanned aerial vehicle.
- The robot platform includes a vehicle body; wheels and drive components arranged on the bottom of the vehicle body; and a robotic arm, an environment-sensing component, a communicator, a robot controller, and a power supply component. The communicator provides the communication link between the robot platform, the unmanned aerial vehicle, and the base station.
- The robotic arm includes a base provided on the vehicle body, a joint assembly rotatably connected to the base, a mechanical claw rotatably connected to the joint assembly, and rotary drive parts that drive the rotation of each component. The joint assembly includes one or more connecting joints rotatably connected end to end.
- The robotic arm can adjust the landing position of the drone to facilitate wireless charging of the drone; it can also swap sensors for the drone; and it can perform operations such as grasping, rotating, and pressing in dangerous environments, for example valve switching.
- The joint assembly includes a first connecting joint rotatably connected to the base, a second connecting joint rotatably connected to the first, a third rotatably connected to the second, a fourth rotatably connected to the third, and a fifth rotatably connected to the fourth. The mechanical claw is rotatably connected to the fifth connecting joint. Each rotary drive part is a motor: six rotating joints driven by six motors give the robotic arm six degrees of freedom and drive the mechanical claw to perform grasping operations.
- The mechanical claw includes a claw body rotatably connected to the joint assembly, a pair of claw fingers connected to the claw body, and a drive assembly that makes the fingers perform the grasping action. The finger drive assembly includes a worm mounted on the claw body, a worm wheel connected to one end of each finger and meshing with the worm, and a drive part that rotates the worm; the drive part may be a motor that drives the mechanical claw to realize the grasping function.
- The robot platform further includes a platform main body arranged on the vehicle body for the takeoff, landing, and charging of the drone; the platform main body is connected to the power supply assembly and provides wireless charging for the drone.
- The environment-sensing component includes a lidar, a sensor assembly, and a camera assembly.
- the sensor assembly includes a gas concentration sensor and a humidity temperature sensor.
- The gas concentration sensor detects the composition and concentration of gases in the air to determine whether hazardous gases are leaking, such as toxic gases (carbon monoxide, vinyl chloride, hydrogen sulfide, etc.) and flammable gases (hydrogen, methane, ethane, etc.). Once the ambient temperature and humidity, or the concentration of a harmful or combustible gas, is found to exceed its safety threshold, it is immediately reported to the staff for handling. The humidity-temperature sensor obtains the temperature and humidity of the actual inspection area.
- The camera assembly includes a visible-light high-definition camera, an infrared camera, and a monocular camera.
- the infrared camera is mainly used for night patrols and shooting night video images.
- the monocular camera can acquire and process image information of the working environment.
- The robot platform further includes a touch-screen display arranged on the vehicle body; the display provides a human-computer interaction interface so that the user can modify control parameters and view the collected monitoring data.
- a microphone and a speaker are integrated in the touch screen display, so that the robot platform has functions of sound data collection and audio playback.
- inertial navigation equipment and GPS equipment are integrated in the communicator.
- the unmanned aerial vehicle includes a fuselage, an unmanned aerial vehicle control component, a landing gear, a propeller set, an unmanned aerial vehicle driving and power supply assembly, and a camera and sensing assembly arranged on the fuselage.
- The drone control assembly includes a flight control module and a data/image transmission module. The flight control module includes a flight-control sealed box arranged on the fuselage and, inside it, a flight controller, a power manager, and a parameter-adjustment interface; the flight controller is connected to the power manager and to the parameter-adjustment interface. The data/image transmission module includes a data/image transmission sealed box arranged on the fuselage, a drone-side data/image transmission terminal inside the sealed box, and a ground-side data/image transmission terminal wirelessly linked to the drone-side terminal; the drone-side terminal is connected to the flight controller.
- The UAV drive and power supply assembly includes a power-source explosion-proof box, a battery housed in the box, electronic speed controllers, a power-distribution hub, and motors on motor mounts. The battery is connected to the power manager and, through the hub, to the electronic speed controllers; each speed controller's input is connected to the parameter-adjustment interface and its output is connected to a motor.
- the camera and sensor assembly includes a camera, a sensor sealing box, a sensor control board arranged in the sensor sealing box, and a gas sensor connected to the sensor control board.
- Four propeller groups are provided, each with two propellers arranged one above the other. This dual-propeller structure provides stronger flight power, so the drone can carry more sensing components.
- An object of the present invention is to provide an air-ground collaborative intelligent inspection method.
- An air-ground collaborative intelligent inspection method including:
- Air-ground coordinated multi-robot positioning and mapping including perception positioning calculation, map creation, and multi-information fusion positioning, including:
- Perception positioning calculation uses the sensor components to collect information about the surrounding environment; after the collected data is processed, effective environment perception and detection data are produced by analysis.
- Map creation includes modeling and scanning the environment: data from the sensor components are used to create local 3D point-cloud maps in their respective reference coordinate systems; the motion trajectories of the drone and the robot platform are initially aligned; the ground-plane parts of the two local maps are extracted to optimize their alignment; and the trajectories and local maps obtained by the drone and the robot platform are globally optimized and adjusted.
- Multi-information fusion positioning fuses relative position information with absolute position information such as GPS global coordinates, and registers the prior map against the current perception data to obtain the robot's position and attitude.
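Fusing relative motion with absolute fixes as described above is classically done with a Kalman filter. The following one-dimensional sketch shows one predict/update cycle; the noise variances are illustrative assumptions, not the patent's parameters.

```python
def fuse_position(x_est, p_est, delta_odom, q_odom, z_gps, r_gps):
    """One predict/update cycle of a 1-D Kalman filter that fuses
    relative odometry (prediction) with an absolute GPS fix (update).

    x_est, p_est : prior position estimate and its variance
    delta_odom   : relative displacement since the last cycle
    q_odom       : variance added by the odometry step
    z_gps, r_gps : absolute measurement and its variance
    Returns the posterior (position, variance) pair.
    """
    # Predict: apply the relative displacement; uncertainty grows.
    x_pred = x_est + delta_odom
    p_pred = p_est + q_odom
    # Update: blend in the absolute measurement via the Kalman gain.
    k = p_pred / (p_pred + r_gps)
    x_new = x_pred + k * (z_gps - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new
```

In the full system the same scheme extends to the 3-D pose, with map registration supplying additional absolute corrections.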
- Air-ground coordinated tracking and control, including UAV flight control system design, robot platform trajectory tracking control, and UAV autonomous landing control, specifically:
- The UAV flight control system design includes establishing dynamic model equations for the UAV's six degrees of freedom, analyzing the UAV actuator model composed of the motor model and the propeller aerodynamic model, and computing the UAV's aerodynamic parameters from measured thrust and torque curves. Based on the UAV's kinematic and dynamic model, the model is divided into two parts: attitude and position.
- the motion control method is divided into two parts: position control and attitude control.
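The position/attitude split above is commonly realized as a cascaded loop: an outer position controller produces a desired tilt angle, and an inner attitude controller produces the torque command. The PD gains and the small-angle relation in this sketch are illustrative assumptions, not the patent's values.

```python
def position_controller(pos_err, vel, kp=1.0, kd=0.6, g=9.81):
    """Outer loop: horizontal position error -> desired tilt angle.

    Simple PD law; under the small-angle assumption the commanded
    horizontal acceleration maps to a tilt angle of accel / g (rad).
    """
    accel_cmd = kp * pos_err - kd * vel
    return accel_cmd / g

def attitude_controller(angle_err, rate, kp=4.0, kd=1.2):
    """Inner loop: attitude error -> torque command (PD law)."""
    return kp * angle_err - kd * rate
```

The inner loop runs much faster than the outer loop, which is what makes treating the commanded tilt as an instantaneously tracked setpoint reasonable.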
- A planned pose of the drone is called a target point. The drone's path is a collection of target points in space; the drone must reach the planned target points in their given order.
- The trajectory tracking control of the robot platform uses the DDPG algorithm to control the platform to track the planned path, based on the platform's state information and feedback from the environment.
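A DDPG agent learns from an environment that supplies states and rewards. The class below is a minimal sketch of such a tracking environment for a unicycle-model platform; the state vector, waypoint-switch radius, and negative-distance reward are illustrative assumptions, not the patent's exact design, and the DDPG networks themselves are omitted.

```python
import math

class TrackingEnv:
    """Minimal trajectory-tracking environment a DDPG agent would
    interact with: state = (dx, dy, yaw) relative to the current
    target point, action = (linear, angular) velocity."""

    def __init__(self, path):
        self.path = path                       # list of (x, y) waypoints
        self.x, self.y, self.yaw = path[0][0], path[0][1], 0.0
        self.i = 0                             # index of current target

    def state(self):
        tx, ty = self.path[self.i]
        return (tx - self.x, ty - self.y, self.yaw)

    def step(self, v, omega, dt=0.1):
        """Apply the velocity action; return (next state, reward)."""
        self.yaw += omega * dt
        self.x += v * math.cos(self.yaw) * dt
        self.y += v * math.sin(self.yaw) * dt
        tx, ty = self.path[self.i]
        dist = math.hypot(tx - self.x, ty - self.y)
        if dist < 0.2 and self.i < len(self.path) - 1:
            self.i += 1                        # advance to next target
        return self.state(), -dist             # reward: negative error
```

The agent's actor network would map `state()` to `(v, omega)`, and training maximizes the accumulated (negative-error) reward, i.e. minimizes tracking error.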
- The autonomous landing control of the UAV includes flying to the unmanned vehicle along the planned route, searching for the visual marker with the visual guidance system, and, once the marker is detected, starting the automatic guided-landing procedure to realize autonomous landing.
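Once the marker is detected, the guided landing typically reduces to centring the marker in the downward camera image while descending. The proportional scheme below is a sketch; the gain, pixel tolerance, descent rate, and axis mapping are assumptions that depend on the actual camera mounting.

```python
def landing_correction(marker_px, image_size, descend_rate=0.3, k=0.002):
    """Map the detected marker's pixel position to a velocity command
    (vx, vy, vz) that centres the UAV over the pad before descending."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    ex, ey = marker_px[0] - cx, marker_px[1] - cy
    # Proportional correction toward the image centre (the sign/axis
    # mapping depends on how the downward camera is mounted).
    vx, vy = -k * ex, -k * ey
    # Only descend once the marker is roughly centred.
    vz = -descend_rate if abs(ex) < 10 and abs(ey) < 10 else 0.0
    return vx, vy, vz
```

Running this at the camera frame rate gives a simple visual-servoing loop: lateral corrections dominate until the marker is centred, then the descent begins.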
- The method further includes accident detection and early warning: based on the faults and accidents that may occur in the actual system, a mapping from faults to sensor events and a mapping from accidents to sensor-event sequences are established, and a state-tree-based system diagnoser is constructed from these mappings.
- The diagnoser performs fault detection and accident prediction online, in real time, by observing system events. When a system fault is detected, the diagnoser issues a system warning; it also computes the probability of an accident in real time and issues a warning if that probability exceeds the threshold set by the system.
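The fault-to-event and accident-to-sequence mappings can be sketched as a small online diagnoser. The mappings, the fraction-matched "probability", and the threshold below are illustrative placeholders for the patent's state-tree construction, not its actual model.

```python
class Diagnoser:
    """Sketch of an online diagnoser: known faults map to single
    sensor events; known accidents map to ordered event sequences."""

    def __init__(self, fault_events, accident_seqs, threshold=0.8):
        self.fault_events = fault_events      # event -> fault name
        self.accident_seqs = accident_seqs    # accident -> event sequence
        self.threshold = threshold
        self.history = []

    def observe(self, event):
        """Record one sensor event; return any warnings raised."""
        self.history.append(event)
        warnings = []
        if event in self.fault_events:        # direct fault detection
            warnings.append(("fault", self.fault_events[event]))
        for name, seq in self.accident_seqs.items():
            # Crude accident "probability": fraction of the event
            # sequence already observed in order (subsequence match).
            matched, it = 0, iter(self.history)
            for e in seq:
                if any(h == e for h in it):
                    matched += 1
            if matched / len(seq) >= self.threshold:
                warnings.append(("accident", name))
        return warnings
```

A real state-tree diagnoser would additionally weight each transition by its estimated likelihood rather than counting matched events uniformly.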
- the present invention has the following advantages compared with the prior art:
- The invention builds an all-weather, autonomously navigating, multi-type robot platform that can perform given navigation and inspection tasks in all conditions. It comprehensively applies the Internet of Things, artificial intelligence, cloud computing, big data, and other technologies, and integrates environmental perception, dynamic decision-making, behavior control, and alarm devices. With capabilities such as autonomous perception, autonomous movement, autonomous protection, and interactive communication, this multi-functional integrated intelligent equipment can help humans complete basic, repetitive, and dangerous security tasks, promote security-service upgrades, and reduce security operating costs.
- Figure 1 is a schematic structural diagram of this embodiment
- Figure 2 is a schematic diagram of the structure of the robot platform in this embodiment (showing the platform main body);
- Figure 3 is a schematic diagram of the structure of the robotic arm in this embodiment
- Figure 4 is a schematic diagram of the structure of the mechanical claw in this embodiment
- Figure 5 is a schematic diagram of the structure of the drone in this embodiment.
- Fig. 6 is a schematic block diagram of the structure of this embodiment.
- Figure 7 is a schematic diagram of the relationship between the inspection system in this embodiment.
- Fig. 8 is a work flow chart for realizing image perception
- Fig. 9 is a schematic block diagram of air-ground cooperative multi-robot positioning and mapping in this embodiment.
- FIG. 10 is a schematic block diagram of the sensing and positioning calculation in this embodiment.
- Fig. 11 is a schematic block diagram of map creation in this embodiment.
- Fig. 12 is a schematic block diagram of multi-information fusion positioning in this embodiment.
- Figures 13a and 13b show the dynamic model of the UAV rotor in this embodiment;
- Figure 14 is a schematic block diagram of the UAV tracking control system in this embodiment.
- Figure 15 is a schematic diagram of the state machine of the drone controller in this embodiment.
- Figure 16 is a schematic block diagram of the design of the DDPG trajectory tracking controller in this embodiment.
- Figure 17 is a schematic diagram of the tracking-error design in this embodiment: P is a point at a specified distance L in front of the vehicle, q is a target point on the trajectory, and pq is perpendicular to L;
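The Figure 17 geometry can be computed directly: project the vehicle's pose a lookahead distance L ahead to get P, then take the component of the vector from P to the target q perpendicular to the heading. The sign convention below (positive = target to the left) is an illustrative assumption.

```python
import math

def tracking_error(x, y, yaw, q, L=1.0):
    """Cross-track error per Figure 17: P is the point a lookahead
    distance L ahead of the vehicle at (x, y) with heading yaw;
    the error is the signed perpendicular component of P->q."""
    px = x + L * math.cos(yaw)
    py = y + L * math.sin(yaw)
    dx, dy = q[0] - px, q[1] - py
    # Signed component perpendicular to the heading direction.
    return -dx * math.sin(yaw) + dy * math.cos(yaw)
```

This scalar error is exactly the quantity a tracking controller (such as the DDPG controller of Figure 16) would drive to zero.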
- Figure 18 is a schematic diagram of the UAV/unmanned vehicle collaborative inspection process in this embodiment.
- Fig. 19 is a schematic diagram of extracting event information from sensor data in this embodiment.
- Fig. 20 is a schematic block diagram of an accident prediction related model and architecture in this embodiment.
- FIG. 21 is a schematic block diagram of video recognition of a person's action behavior in this embodiment.
- Reference numerals: 1. robot platform; 10. vehicle body; 11. wheels; 12. robotic arm; 120. base; 121. mechanical claw; 1210. claw body; 1211. claw finger; 1212. worm; 1213. worm wheel; 122. first connecting joint; 130. lidar; 131. gas concentration sensor; 132. humidity-temperature sensor.
- An air-ground collaborative intelligent inspection robot as shown in Figures 1 and 2 includes a robot platform 1 and an unmanned aerial vehicle 2.
- the robot platform 1 and UAV 2 will be described in detail below.
- The robot platform 1 includes a vehicle body 10; wheels and driving components arranged on the bottom of the vehicle body 10; and a robotic arm 12, an environment-sensing component, a communicator 14, a touch-screen display 15, a robot controller 16, and a power supply assembly 17, wherein:
- The wheel and drive assembly uses four motors with reducers to drive the wheels 11. The robot controller 16 controls the four wheels 11 independently, and steering is realized through the speed differential between the two sides.
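Differential steering as described here maps a desired body velocity to per-side wheel speeds. The track width and wheel radius in this sketch are illustrative values, not the platform's actual dimensions.

```python
def wheel_speeds(v, omega, track_width=0.5, wheel_radius=0.1):
    """Differential steering: body velocity (v m/s, omega rad/s) ->
    (left, right) wheel angular speeds in rad/s.  On the four-wheel
    platform both wheels on a side would receive the same command."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left / wheel_radius, v_right / wheel_radius
```

Driving straight gives equal wheel speeds; a pure rotation command gives equal and opposite ones, which is the speed differential the controller exploits to steer.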
- The robotic arm 12 includes a base 120 arranged on the vehicle body 10, a joint assembly rotatably connected to the base 120, a mechanical claw 121 rotatably connected to the joint assembly, and rotary drive parts that drive the rotation of each component.
- The joint assembly includes one or more connecting joints rotatably connected end to end. Specifically, it includes a first connecting joint 122 rotatably connected to the base 120, a second connecting joint 123 rotatably connected to the first connecting joint 122, a third connecting joint 124 rotatably connected to the second connecting joint 123, a fourth connecting joint 125 rotatably connected to the third connecting joint 124, and a fifth connecting joint 126 rotatably connected to the fourth connecting joint 125.
- The mechanical claw 121 is connected to the fifth connecting joint 126.
- Each rotary drive part is a motor: six rotating joints driven by six motors give the robotic arm 12 six degrees of freedom and drive the mechanical claw 121 to perform grasping operations.
- The robotic arm 12 can adjust the landing position of the drone 2 to facilitate its wireless charging; it can also replace sensors for the drone 2; and it can perform operation tasks such as grasping, rotating, and pressing in dangerous environments, for example valve switching operations.
- The mechanical claw 121 includes a claw body 1210 rotatably connected to the fifth connecting joint 126 of the joint assembly, a pair of claw fingers 1211 connected to the claw body 1210, and a drive assembly that makes the fingers 1211 perform the grasping action.
- The finger drive assembly includes a worm 1212 arranged on the claw body 1210, a worm wheel 1213 connected to one end of each finger 1211 and meshing with the worm 1212, and a drive part that rotates the worm 1212.
- The fingers 1211 are serrated to help ensure reliable grasping.
- The finger drive part may use, for example, a motor to drive the mechanical claw 121 and realize the grasping function.
- The environment-sensing component includes the lidar 130, the sensor assembly, and the camera assembly, wherein:
- the lidar 130 is a radar system that emits a laser beam to detect characteristic quantities such as the position and speed of a target.
- the robot platform 1 uses the lidar 130 to detect the dynamic environment, and collect parameters such as target distance, azimuth, height, speed, posture, and shape.
- The lidar 130 is used to detect obstacle information around the robot platform 1. After the raw lidar data is obtained, preprocessing is required: the initial laser data is noisy, containing outliers caused by measurement and accidental errors, so it must be filtered, usually with low-pass or Gaussian filtering. The amount of raw data, determined by the sensor's resolution, is usually huge, so the data is also down-sampled for practical use. The filtered, down-sampled lidar data is then used for obstacle recognition, object detection, and other perception functions.
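The filter-then-downsample pipeline just described can be sketched for a 1-D range scan. The spike threshold and decimation factor below are illustrative; a real implementation would use the low-pass or Gaussian filtering the text mentions.

```python
def preprocess_scan(ranges, max_jump=0.5, keep_every=2):
    """Lidar preprocessing sketch: replace isolated outliers (points
    that jump away from both neighbours by more than max_jump) with
    the neighbour average, then down-sample by decimation."""
    filtered = list(ranges)
    for i in range(1, len(ranges) - 1):
        prev_r, cur, next_r = ranges[i - 1], ranges[i], ranges[i + 1]
        if abs(cur - prev_r) > max_jump and abs(cur - next_r) > max_jump:
            filtered[i] = (prev_r + next_r) / 2.0   # smooth the spike
    return filtered[::keep_every]                    # down-sample
```

The reduced scan then feeds obstacle recognition and object detection, as in the pipeline described above.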
- the sensor assembly includes a gas concentration sensor 131 and a humidity temperature sensor 132.
- The gas concentration sensor 131 detects the composition and concentration of gases in the air to determine whether hazardous gases are leaking, such as toxic gases (carbon monoxide, vinyl chloride, hydrogen sulfide, etc.) and flammable gases (hydrogen, methane, ethane, etc.). Once the ambient temperature and humidity, or the concentration of a harmful or combustible gas, is found to exceed its safety threshold, it is reported to the staff immediately for handling.
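The threshold check itself is straightforward. Note that the ppm limits below are illustrative placeholders only; actual safety thresholds must come from the applicable chemical-industry standards.

```python
# Illustrative thresholds (ppm) -- placeholders, NOT real safety limits.
THRESHOLDS = {"CO": 50, "H2S": 10, "CH4": 5000, "H2": 4000}

def check_readings(readings):
    """Return the list of gases whose measured concentration exceeds
    its safety threshold, for immediate reporting to the staff."""
    return [gas for gas, ppm in readings.items()
            if gas in THRESHOLDS and ppm > THRESHOLDS[gas]]
```

On each sensor cycle the robot would call this with the latest readings and raise an alarm for any non-empty result.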
- the humidity temperature sensor 132 can obtain the temperature and humidity conditions of the actual inspection area environment.
- The camera assembly includes a visible-light high-definition camera 133, an infrared camera 134, and a monocular camera.
- the infrared camera 134 is mainly used for night patrols and shooting night video images.
- a monocular camera can acquire and process image information of the working environment.
- the inspection requirements of the chemical environment usually include the following categories: whether the protective equipment of the staff is fully worn; whether the appearance of the production equipment is intact; whether the key indicator is normal, etc.
- This embodiment uses a deep learning method to recognize images. First, image samples of objects in the work scene are collected and given corresponding labels; then a deep neural network is built and trained on the samples. Through training, the network can perform image recognition, classification, and other functions, as shown in Figure 8.
- a rotatable structure 135 is provided on the vehicle body 10.
- the laser radar 130, the visible light high-definition camera 133, and the infrared camera 134 can all be set on the rotatable structure 135.
- the communicator 14 is a wireless communicator, and the communicator 14 is mainly responsible for information transmission with the UAV 2 and the base station to ensure smooth information transmission. Inertial navigation equipment and GPS equipment are integrated in the communicator 14.
- the touch screen display 15 provides a human-computer interaction interface for the user to facilitate the user to modify control parameters and collect relevant monitoring data. Through the human-computer interaction interface and the high-intelligence robot system, the system operation process is simplified to reduce labor costs.
- the touch screen display 15 is integrated with a microphone and a speaker, so that the robot platform 1 has the functions of sound data collection and audio playback.
- The power supply assembly 17 includes an explosion-proof box, with a battery and a motor arranged inside it.
- In a chemical environment, explosion-proof capability is essential: the robot's high currents can produce electric sparks during operation, and a spark reaching combustible gas would cause an explosion with unimaginable consequences.
- Explosion-proof performance is therefore the most important technical feature of the system, and the explosion-proof box structure is adopted to ensure it.
- The robot platform 1 also includes a platform main body 18 arranged on the vehicle body 10 for the takeoff, landing, and charging of the drone 2; the platform main body 18 is connected to the power supply assembly 17.
- The platform main body 18 provides a takeoff and landing platform for the drone 2. It is circular, which better accommodates the random error in the drone's landing position.
- The platform main body 18 also provides wireless charging: the drone 2 must be recharged promptly after performing an inspection mission, so the platform charges it wirelessly once it returns to the takeoff and landing platform.
- The high-power wireless power supply provided by the platform main body 18 requires no physical connection; charging is completed through non-radiative wireless energy transfer, which eliminates manual handling of the drone 2 during charging and greatly improves unmanned operation.
- UAV 2 includes a fuselage 20, UAV control components, landing gear 21, propeller sets, UAV drive and power components, and camera and sensor components arranged on the fuselage 20.
- the drone control components include a flight control module and a data and image transmission module.
- the flight control module includes a flight control sealed box arranged on the fuselage 20, and a flight controller, a power manager and a parameter adjustment interface arranged in the flight control sealed box.
- the flight controller is connected to the power manager and the parameter adjustment interface respectively.
- the flight controller contains a barometer, an inertial navigation system and an attitude stabilization system.
- the data and image transmission module includes a data/image transmission sealed box arranged on the fuselage 20, a data/image transmission UAV terminal arranged in the sealed box, and a data/image transmission ground terminal connected to the UAV terminal; the data/image transmission UAV terminal is connected to the flight controller.
- the UAV drive and power components include a power-source explosion-proof box, a battery, an electronic speed controller (ESC) and a hub arranged in the explosion-proof box, a motor base, and a motor arranged on the motor base. The battery is connected to the power manager and, through the hub, to the ESC; the input of the ESC is connected to the parameter adjustment interface and its output is connected to the motor.
- the camera and sensor assembly includes a camera 23, a sensor sealing box, a sensor control board arranged in the sensor sealing box, and a gas sensor connected to the sensor control board.
- four propeller sets are provided; each propeller set has two propellers 22 arranged one above the other.
- the coaxial double-propeller design provides greater flight power for the UAV, allowing it to carry more sensor components.
- a robot platform can carry multiple drones, and the configuration of the robot platform is not limited: variations such as the sensor components, the mounting position of the manipulator, the number of wheels, and similar explosion-proof designs all fall within the scope of protection originally applied for.
- Perception and positioning calculation is realized by three parts: a sensor unit, a clock synchronization device and a computer unit.
- the sensor unit uses industrial cameras (color and grayscale), three-dimensional lidar, an inertial navigation unit, GPS and other equipment to collect information about the robot's surroundings; the raw data generated by each sensor is clock-synchronized and sent to the computer unit; after collecting the sensor data, the computer unit preprocesses it.
- the preprocessing includes point cloud noise filtering, normal vector analysis, feature point extraction, feature descriptor calculation, etc.
- computer units are installed on both the drone and the robot platform, and valid environmental perception and chemical detection data are transmitted back to the ground workstation for comprehensive analysis and processing.
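As an illustrative sketch (not code from the patent), the preprocessing steps named above — point cloud noise filtering and normal vector analysis — might look like the following; the neighbor count `k` and outlier ratio are assumed values:

```python
import numpy as np

def filter_noise(points, k=8, std_ratio=2.0):
    """Statistical outlier removal: drop points whose mean distance to
    their k nearest neighbours is far above the cloud-wide average."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    knn_mean = np.sort(d, axis=1)[:, :k].mean(axis=1)
    keep = knn_mean < knn_mean.mean() + std_ratio * knn_mean.std()
    return points[keep]

def estimate_normals(points, k=8):
    """Normal vector analysis: PCA over each point's k-neighbourhood;
    the right singular vector of the smallest singular value
    approximates the surface normal."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    normals = np.empty_like(points)
    for i, nb in enumerate(idx):
        q = points[nb] - points[nb].mean(axis=0)
        _, _, vt = np.linalg.svd(q, full_matrices=False)
        normals[i] = vt[-1]
    return normals
```

Feature point extraction and descriptor calculation would follow on the filtered, normal-annotated cloud.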
- air-ground collaborative multi-robot map creation in the chemical environment mainly solves the problems of map holes and missing perspectives caused by the limited viewpoint of a single robot, and is the fundamental guarantee for building high-precision, full-coverage environmental maps.
- the three-dimensional geometric model of the environment of the chemical plant where the robot is located is reconstructed based on the perception data sent by the computer unit on the drone and the robot platform.
- the map creation steps include:
- the UAV and the robot platform model and scan the chemical plant environment under remote control, with the motion trajectory of one following the other; each collects the sensor data of its own computer unit and creates a local three-dimensional point cloud map in its own reference coordinate system;
- Multi-information fusion positioning introduces an extended Kalman filter framework that fuses relative position information, such as inertial navigation integration, wheel odometry and laser odometry, with absolute position information, such as GPS global coordinates and the registration of the prior map against current perception data. Effectively fusing this position information yields high-precision, low-latency robot position and attitude estimates.
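The fusion scheme can be illustrated with a minimal planar Kalman filter: relative odometry drives the prediction step, absolute fixes drive the correction step. A linear filter suffices for this sketch, while the patent's extended Kalman filter framework generalizes it to nonlinear motion and measurement models; the noise values `q` and `r` are illustrative assumptions.

```python
import numpy as np

class FusionEKF:
    """Minimal fusion sketch. State is planar position [x, y]."""
    def __init__(self, q=0.01, r=0.25):
        self.x = np.zeros(2)          # position estimate
        self.P = np.eye(2)            # estimate covariance
        self.Q = q * np.eye(2)        # odometry (process) noise
        self.R = r * np.eye(2)        # absolute-fix (measurement) noise

    def predict(self, delta):
        """Relative information: integrate an odometry increment."""
        self.x = self.x + np.asarray(delta)
        self.P = self.P + self.Q

    def update(self, z):
        """Absolute information: correct with a GPS/prior-map fix."""
        S = self.P + self.R
        K = self.P @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ (np.asarray(z) - self.x)
        self.P = (np.eye(2) - K) @ self.P
        return self.x
```

Each `predict` corresponds to an odometry tick and each `update` to an absolute fix; the fused estimate drifts far less than dead reckoning alone.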
- Air-ground coordinated tracking and control:
- this mainly includes UAV flight control system design, robot platform trajectory tracking control, and UAV autonomous landing control. Specifically:
- given the input airflow velocity and the rotational speed and angular velocity of the four UAV propeller sets, the actuator model calculates the thrust and torque on each actuator, enabling simulation verification of the model, as shown in Figures 13a and 13b.
- the model can be divided into two parts: attitude and position.
- likewise, the motion control method is divided into two parts: position control and attitude control.
- a position together with a UAV attitude is called a target point. The path control of the UAV is then a sequence of target points in space; the UAV must reach the planned target points in order, as shown in Figure 14.
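The target-point scheme above can be sketched as a simple waypoint follower that visits points in the planned order; the step size and arrival tolerance are assumed values, not from the patent:

```python
import math

def follow_waypoints(start, targets, step=0.5, tol=0.25):
    """Visit planned target points in order: head toward the current
    target; once within `tol`, advance to the next one. Returns the
    target indices in the order they were reached."""
    x, y = start
    reached = []
    for i, (tx, ty) in enumerate(targets):
        while True:
            d = math.hypot(tx - x, ty - y)
            if d <= tol:
                break
            s = min(step, d)            # do not overshoot the target
            x += s * (tx - x) / d
            y += s * (ty - y) / d
        reached.append(i)
    return reached
```

A full controller would carry the target attitude alongside each position, but the sequencing logic is the same.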
- an active disturbance rejection controller is used to control the UAV's altitude, yaw, pitch and roll motion parameters.
- a position controller based on the backstepping method is designed so that the UAV can track the target trajectory.
- a state machine is designed on top of these controllers; it automatically adjusts the UAV's flight attitude and position according to the environmental conditions the UAV encounters, as shown in Figure 15.
- Reinforcement learning is a method of learning a controller without prior control or mechanical knowledge; it emphasizes interaction with the environment and is a dynamic learning process.
- the Deep Deterministic Policy Gradient (DDPG) algorithm is a deep reinforcement learning algorithm that inherits characteristics of both policy gradient and actor-critic methods. The DDPG algorithm controls the wheeled robot to track the planned path based on the robot's state information and environmental feedback.
- the driving error between the robot's actual position and the target point on the planned trajectory is computed by an error function and passed to the DDPG network. The DDPG network perceives the environment through the state description, makes the optimal decision based on the current environmental state, and is guided toward self-learning by a reward function, finally achieving high-precision tracking of the planned path, as shown in Figure 16.
- the error function can be designed using the lateral error of the robot.
- the robot obtains the current position through the sensing system, and the path planning system obtains the preset trajectory position.
- the distance between a hypothetical point p in front of the robot and the target point q is calculated as the value of the error function; the lateral error continually guides the robot to walk along the trajectory.
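A minimal sketch of this error function, assuming a fixed look-ahead distance for the hypothetical point p in front of the robot (the look-ahead value is illustrative):

```python
import math

def lateral_error(pose, q, lookahead=0.5):
    """Driving error for the DDPG reward: distance between the point p
    a fixed look-ahead in front of the robot and the target point q on
    the planned trajectory."""
    x, y, heading = pose               # robot position and heading (rad)
    px = x + lookahead * math.cos(heading)
    py = y + lookahead * math.sin(heading)
    return math.hypot(q[0] - px, q[1] - py)
```

When the robot is on the trajectory and pointing along it, the error vanishes; any heading or position deviation increases it.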
- Before using the DDPG algorithm, the network must be pre-trained, usually by simulating the real environment on a simulator. In the traditional DDPG algorithm, too few training samples make training very inefficient and prevent the network from converging quickly. By improving the strategy for returning samples to the experience replay pool, no network training is performed while the sample count is small; the robot keeps exploring to fill the pool, which accelerates training. At the same time, in complex environments the robot's exploration cost is high, and the large amount of early trial and error is wasted effort.
- This embodiment therefore uses a transfer learning method: the DDPG network is first pre-trained in a simple environment, the trained network is then placed in a complex environment, and the complexity of the environment is gradually increased, so that the network gains the ability to generate motion strategies in complex environments.
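The two training adjustments — withholding gradient updates while the experience pool is under-filled, and transferring the same network through environments of increasing complexity — can be sketched as follows; the capacities, thresholds and level names are illustrative assumptions:

```python
import random
from collections import deque

class GatedReplayPool:
    """Experience pool that withholds training until enough samples
    exist, so the robot keeps exploring instead of fitting noise."""
    def __init__(self, capacity=10000, min_samples=64):
        self.pool = deque(maxlen=capacity)
        self.min_samples = min_samples

    def add(self, transition):
        self.pool.append(transition)

    def ready(self):
        return len(self.pool) >= self.min_samples

    def sample(self, batch_size=32):
        if not self.ready():
            return []                  # too few samples: skip training
        return random.sample(list(self.pool), batch_size)

def curriculum(trainer, levels):
    """Transfer-style schedule: pre-train in the simplest environment,
    then reuse the same network in progressively harder ones."""
    for level in levels:
        trainer(level)
```

In a full system, `trainer` would run DDPG episodes in the given environment while keeping the network weights from the previous level.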
- the robot platform (unmanned ground vehicle, UGV) serves as the carrier of the unmanned aerial vehicle (UAV), enabling the UAV's automatic take-off and automatic vision-guided landing.
- the unmanned vehicle performs inspection work along preset inspection routes. In some scenarios the unmanned vehicle cannot reach an inspection point directly; the drone is then used to reach those inspection points.
- autonomous landing requires controlling the UAV's altitude and attitude adjustment, which in turn requires a directional visual marker of defined specifications.
- the UAV first flies over the unmanned vehicle along the path produced by the planning system, then searches for the visual marker with the visual guidance system. Once the marker is detected, the automatic guided landing procedure starts, realizing the UAV's autonomous landing.
- a fuzzy controller can be used as the landing controller to increase the smoothness of the landing process, as shown in Figure 18.
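The guided-descent loop can be sketched as follows, with a simple proportional law standing in for the fuzzy controller; the gains, the "well centred" band and the descent rate are all illustrative assumptions:

```python
def guided_landing(offset, altitude, kp=0.4, descend=0.3, centered=0.1,
                   max_steps=500):
    """Vision-guided landing sketch: steer the marker offset toward the
    image centre, and only descend while the marker is well centred."""
    ox, oy = offset                    # marker offset in the image
    steps = 0
    while altitude > 0.0 and steps < max_steps:
        ox -= kp * ox                  # lateral correction toward marker
        oy -= kp * oy
        if abs(ox) < centered and abs(oy) < centered:
            altitude = max(0.0, altitude - descend)
        steps += 1
    return (ox, oy), altitude
```

A fuzzy controller replaces the fixed gains with rule-based ones so that corrections stay smooth near touchdown.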
- a discrete event system model will be established for the operating dynamics of the chemical plant, and the sensor data will be analyzed based on the theory of the discrete event system.
- For differently structured data in the system, such as personnel behavior, temperature and gas concentration, a sensor data dictionary is built from the data structure and format, a sensor data packet parsing method is established, and logic judgment, deep learning and other methods are integrated to extract the required feature information from the data; combined with information theory, a strong mapping relationship between feature information and event information is established, as shown in Figure 19.
- mapping relationships from faults to sensor events and from accidents to sensor event sequences are established; that is, system faults are modeled as fault events in the system, and system accidents are modeled as event sequences.
- Based on the established mapping relationships, a state-tree-based system diagnoser is built; by observing system events, the diagnoser performs online, real-time fault detection and accident prediction.
- when a system fault is detected, the diagnoser issues a system warning; for accident prediction, the diagnoser computes the probability of an accident in real time and issues a warning when the probability exceeds the system-set threshold, as shown in Figure 20.
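A minimal sketch of the diagnoser idea: fault events raise immediate warnings, while progress through an accident event sequence stands in for the accident probability. The event names, sequence and threshold here are illustrative assumptions, not from the patent:

```python
FAULT_EVENTS = {"valve_stuck", "sensor_dropout"}          # illustrative
ACCIDENT_SEQUENCE = ["gas_rise", "temp_rise", "pressure_spike"]

def diagnose(events, threshold=0.6):
    """State-tree flavoured sketch: each observed event may advance the
    accident sequence; the matched fraction stands in for the accident
    probability. Returns (fault_warnings, accident_warning_raised)."""
    faults = [e for e in events if e in FAULT_EVENTS]
    depth = 0
    warned = False
    for e in events:
        if depth < len(ACCIDENT_SEQUENCE) and e == ACCIDENT_SEQUENCE[depth]:
            depth += 1
        prob = depth / len(ACCIDENT_SEQUENCE)
        if prob > threshold:
            warned = True              # probability exceeded the threshold
    return faults, warned
```

A real diagnoser would track a state tree of many fault and accident hypotheses at once; this sketch shows only the observe-match-warn loop.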
- the present invention designs a wheeled robot platform that carries drones; it meets the requirements for carrying, take-off and landing of inspection drones, removes the restriction that workers must carry drones to the scene, and greatly improves the autonomy of the man-machine system. The platform can carry the drone and a large high-energy battery to recharge the drone when its power runs low;
- the wheeled robot platform can also carry different types of sensor modules, including visible-light cameras, infrared cameras, lidars, satellite navigation receivers and other devices, to make up for the drone's limited payload capacity. The drone can select the required sensor modules on the platform, and the robotic arm swaps sensor modules for the drone, greatly improving the drone's inspection capability and efficiency.
- the invention uses an explosion-proof box to isolate instantaneous high-voltage currents from the outside, giving the inspection robot explosion-proof performance.
- the invention integrates a variety of sensors to perceive and understand the chemical production environment, and proposes training a deep neural network on picture samples collected on site to recognize and detect people or equipment in the chemical production environment. Compared with traditional recognition methods, the detection results are more reliable and the model generalizes better.
- the present invention introduces an air-ground collaborative multi-robot SLAM method, breaks through full-coverage environment modeling for large-scale chemical plant areas, masters multi-sensor fusion positioning technology, and realizes high-precision modeling and safe, reliable positioning of the inspection robot in complex chemical environments.
- the invention applies a hybrid algorithm to path optimization, making the algorithm more efficient; kinematic constraints are integrated into global planning, making trajectories more reasonable and easier to track; air-ground cooperative path planning is completed, ensuring that each path is feasible while the two vehicles cooperate in motion.
- the present invention designs a heterogeneous-data-fusion accident prediction algorithm based on discrete event system theory that runs stably on the air-ground collaborative intelligent inspection robot platform, enabling the inspection robot to complete safety inspections in chemical operating environments, oversee accident safety in chemical production, greatly reduce safety accidents, and protect the production property and personnel of chemical enterprises.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Claims (18)
- An air-ground collaborative intelligent inspection robot, characterized by comprising a robot platform and a UAV, wherein the robot platform comprises a vehicle body, wheels and a drive assembly arranged at the bottom of the vehicle body, and a mechanical arm, an environment perception assembly, a communicator, a robot controller and a power supply assembly arranged on the vehicle body, the communicator establishing communication connections between the UAV and a base station.
- The air-ground collaborative intelligent inspection robot according to claim 1, characterized in that the mechanical arm comprises a base arranged on the vehicle body, a joint assembly rotatably connected to the base, a mechanical claw rotatably connected to the joint assembly, and rotary drives that rotate the respective parts, the joint assembly comprising one or more connecting joints rotatably connected end to end in sequence.
- The air-ground collaborative intelligent inspection robot according to claim 2, characterized in that the mechanical claw comprises a claw body rotatably connected to the joint assembly, a pair of claw fingers connected to the claw body, and a claw drive assembly that drives the claw fingers to perform grasping actions, the claw drive assembly comprising a worm arranged on the claw body, a worm gear connected to one end of the claw fingers and meshing with the worm, and a claw drive that rotates the worm.
- The air-ground collaborative intelligent inspection robot according to claim 2, characterized in that the joint assembly comprises a first connecting joint rotatably connected to the base, a second connecting joint rotatably connected to the first connecting joint, a third connecting joint rotatably connected to the second connecting joint, a fourth connecting joint rotatably connected to the third connecting joint, and a fifth connecting joint rotatably connected to the fourth connecting joint, the mechanical claw being rotatably connected to the fifth connecting joint.
- The air-ground collaborative intelligent inspection robot according to claim 1, characterized in that the robot platform further comprises a platform main body arranged on the vehicle body for take-off, landing and charging of the UAV, the platform main body being connected to the power supply assembly.
- The air-ground collaborative intelligent inspection robot according to claim 1, characterized in that the environment perception assembly comprises a lidar, a sensor assembly, and a camera and photography assembly.
- The air-ground collaborative intelligent inspection robot according to claim 6, characterized in that the sensor assembly comprises a gas concentration sensor and a humidity-temperature sensor.
- The air-ground collaborative intelligent inspection robot according to claim 6, characterized in that the camera and photography assembly comprises a visible-light high-definition camera, an infrared camera and a monocular camera.
- The air-ground collaborative intelligent inspection robot according to claim 1, characterized in that the robot platform further comprises a touch screen display arranged on the vehicle body.
- The air-ground collaborative intelligent inspection robot according to claim 9, characterized in that a microphone and a speaker are integrated in the touch screen display.
- The air-ground collaborative intelligent inspection robot according to claim 1, characterized in that the UAV comprises a fuselage, UAV control components, landing gear arranged on the fuselage, propeller sets, UAV drive and power components, and a camera and sensing assembly.
- The air-ground collaborative intelligent inspection robot according to claim 11, characterized in that the UAV control components comprise a flight control module and a data and image transmission module; the flight control module comprises a flight control sealed box arranged on the fuselage, and a flight controller, a power manager and a parameter adjustment interface arranged in the flight control sealed box, the flight controller being connected to the power manager and the parameter adjustment interface respectively; the data and image transmission module comprises a data/image transmission sealed box arranged on the fuselage, a data/image transmission UAV terminal arranged in the sealed box, and a data/image transmission ground terminal connected to the UAV terminal, the data/image transmission UAV terminal being connected to the flight controller.
- The air-ground collaborative intelligent inspection robot according to claim 11, characterized in that the UAV drive and power components comprise a power-source explosion-proof box, a battery, an electronic speed controller and a hub arranged in the explosion-proof box, a motor base, and a motor arranged on the motor base; the battery is connected to the power manager and, through the hub, to the electronic speed controller; the input of the electronic speed controller is connected to the parameter adjustment interface and its output is connected to the motor.
- The air-ground collaborative intelligent inspection robot according to claim 11, characterized in that the camera and sensing assembly comprises a camera, a sensor sealed box, a sensor control board arranged in the sensor sealed box, and a gas sensor connected to the sensor control board.
- The air-ground collaborative intelligent inspection robot according to claim 11, characterized in that four propeller sets are provided, each propeller set having two propellers arranged one above the other.
- An air-ground collaborative intelligent inspection robot, characterized by comprising a robot platform and a UAV, wherein: the robot platform comprises a vehicle body, wheels and a drive assembly arranged at the bottom of the vehicle body, a mechanical arm, an environment perception assembly, a communicator, a robot controller and a power supply assembly arranged on the vehicle body, a platform main body for take-off, landing and charging of the UAV, and a touch screen display with an integrated microphone and speaker; the communicator establishes communication connections between the UAV and a base station, and the platform main body is connected to the power supply assembly; wherein: the mechanical arm comprises a base arranged on the vehicle body, a joint assembly rotatably connected to the base, a mechanical claw rotatably connected to the joint assembly, and rotary drives that rotate the respective parts; the joint assembly comprises one or more connecting joints rotatably connected end to end in sequence; the mechanical claw comprises a claw body rotatably connected to the joint assembly, a pair of claw fingers connected to the claw body, and a claw drive assembly that drives the claw fingers to perform grasping actions; the claw drive assembly comprises a worm arranged on the claw body, a worm gear connected to one end of the claw fingers and meshing with the worm, and a claw drive that rotates the worm; the joint assembly comprises a first connecting joint rotatably connected to the base, a second connecting joint rotatably connected to the first connecting joint, a third connecting joint rotatably connected to the second connecting joint, a fourth connecting joint rotatably connected to the third connecting joint, and a fifth connecting joint rotatably connected to the fourth connecting joint, the mechanical claw being rotatably connected to the fifth connecting joint; the environment perception assembly comprises a lidar, a sensor assembly, and a camera and photography assembly; the sensor assembly comprises a gas concentration sensor and a humidity-temperature sensor; the camera and photography assembly comprises a visible-light high-definition camera, an infrared camera and a monocular camera; the UAV comprises a fuselage, UAV control components, landing gear arranged on the fuselage, propeller sets, UAV drive and power components, and a camera and sensing assembly, wherein: the UAV control components comprise a flight control module and a data and image transmission module; the flight control module comprises a flight control sealed box arranged on the fuselage, and a flight controller, a power manager and a parameter adjustment interface arranged in the flight control sealed box, the flight controller being connected to the power manager and the parameter adjustment interface respectively; the data and image transmission module comprises a data/image transmission sealed box arranged on the fuselage, a data/image transmission UAV terminal arranged in the sealed box, and a data/image transmission ground terminal connected to the UAV terminal, the data/image transmission UAV terminal being connected to the flight controller; the UAV drive and power components comprise a power-source explosion-proof box, a battery, an electronic speed controller and a hub arranged in the explosion-proof box, a motor base, and a motor arranged on the motor base; the battery is connected to the power manager and, through the hub, to the electronic speed controller; the input of the electronic speed controller is connected to the parameter adjustment interface and its output is connected to the motor; the camera and sensing assembly comprises a camera, a sensor sealed box, a sensor control board arranged in the sensor sealed box, and a gas sensor connected to the sensor control board.
- An air-ground collaborative intelligent inspection method, characterized in that it uses the air-ground collaborative intelligent inspection robot according to any preceding claim and comprises: 1) air-ground collaborative multi-robot localization and mapping, including perception-localization calculation, map creation and multi-information fusion localization, wherein: perception-localization calculation uses the sensor assembly to collect information about the surrounding environment, processes the collected data, and analyzes valid environmental perception and detection data; map creation includes modeling and scanning the environment, collecting sensor assembly data and creating local three-dimensional point cloud maps in the respective reference coordinate systems, initially aligning the motion trajectories of the UAV and the robot platform, extracting the ground-plane portions of the two local maps, optimizing the alignment of the two local maps, and globally optimizing the motion trajectories and local maps obtained by the UAV and the robot platform; multi-information fusion localization fuses relative position information with absolute position information, such as GPS global coordinates and the registration of the prior map against current perception data, to obtain robot position and attitude information; 2) air-ground collaborative tracking and control, including UAV flight control system design, robot platform trajectory tracking control and UAV autonomous landing control, wherein: UAV flight control system design includes establishing six-degree-of-freedom dynamic model equations for the UAV, analyzing the UAV actuator model composed of the motor model and the propeller aerodynamic model, and computing the UAV's aerodynamic parameters from actually measured thrust and torque curves; according to the UAV's kinematic and dynamic models, the model is divided into attitude and position parts, and motion control into position control and attitude control; a position together with a UAV attitude is called a target point, the UAV's path control is a set of target points in space, and the UAV reaches the planned target points in sequence; robot platform trajectory tracking control uses the DDPG algorithm to control the robot platform to track the planned path based on the platform's state information and environmental feedback; UAV autonomous landing control includes flying over the unmanned vehicle along a planned path, searching for the visual marker with the visual guidance system, and, upon detecting the marker, starting the automatic guided landing procedure to achieve the UAV's autonomous landing.
- The air-ground collaborative intelligent inspection method according to claim 17, characterized in that the method further comprises accident detection and early warning, including: establishing, according to the faults and accidents the actual system may undergo, mapping relationships from faults to sensor events and from accidents to sensor event sequences; building a state-tree-based system diagnoser according to the established mapping relationships, the diagnoser performing online, real-time fault detection and accident prediction by observing system events; when a system fault is detected, the diagnoser issues a system warning; the diagnoser computes the probability of an accident in real time and issues a warning when the probability exceeds the system-set threshold.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010255196.4 | 2020-04-02 | ||
CN202010255196.4A CN111300372A (zh) | 2020-04-02 | 2020-04-02 | 空地协同式智能巡检机器人及巡检方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021196529A1 true WO2021196529A1 (zh) | 2021-10-07 |
Family
ID=71155458
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/115072 WO2021196529A1 (zh) | 2020-04-02 | 2020-09-14 | 空地协同式智能巡检机器人及巡检方法 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111300372A (zh) |
WO (1) | WO2021196529A1 (zh) |
Cited By (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113741544A (zh) * | 2021-10-08 | 2021-12-03 | 北京硬石头科技有限公司 | 基于激光导航的农业室内无人机巡检***及巡检方法 |
CN113911103A (zh) * | 2021-12-14 | 2022-01-11 | 北京理工大学 | 一种混合动力履带车辆速度与能量协同优化方法及*** |
CN113946128A (zh) * | 2021-11-29 | 2022-01-18 | 中国人民解放军国防科技大学 | 一种无人机集群半实物仿真控制*** |
CN114043491A (zh) * | 2021-10-11 | 2022-02-15 | 北京天玛智控科技股份有限公司 | 视频巡检机器人 |
CN114115289A (zh) * | 2021-12-07 | 2022-03-01 | 湖南大学 | 一种自主无人集群侦察*** |
CN114102617A (zh) * | 2021-11-11 | 2022-03-01 | 山东新一代信息产业技术研究院有限公司 | 一种协同安防机器人 |
CN114161452A (zh) * | 2021-12-30 | 2022-03-11 | 山东省科学院自动化研究所 | 一种巡检机器人控制*** |
CN114187722A (zh) * | 2021-11-24 | 2022-03-15 | 河北金锁安防工程股份有限公司 | 智能安防巡查装置 |
CN114179110A (zh) * | 2021-12-31 | 2022-03-15 | 国家石油天然气管网集团有限公司 | 一种巡检机器人 |
CN114200858A (zh) * | 2021-11-23 | 2022-03-18 | 国网新疆电力有限公司克州供电公司 | 多功能人工智能变电站巡检装置 |
CN114247072A (zh) * | 2021-12-02 | 2022-03-29 | 湖南航天磁电科技有限公司 | 一种多功能巡检机器人 |
CN114265427A (zh) * | 2021-12-06 | 2022-04-01 | 江苏方天电力技术有限公司 | 一种基于红外图像匹配的巡检无人机辅助导航***及方法 |
CN114280238A (zh) * | 2021-12-22 | 2022-04-05 | 山西三合盛智慧科技股份有限公司 | 一种基于夜间作业的可视对讲气体检测机器人及其检测*** |
CN114330978A (zh) * | 2021-11-11 | 2022-04-12 | 深圳大学 | 一种空地机器人任务动态分配方法、存储介质及终端设备 |
CN114325219A (zh) * | 2021-12-03 | 2022-04-12 | 陕西省地方电力(集团)有限公司延安供电分公司 | 对配网设备运行状态进行检测识别的方法 |
CN114313884A (zh) * | 2022-02-24 | 2022-04-12 | 枣庄矿业集团新安煤业有限公司 | 一种输送机皮带智能巡检机器人 |
CN114313037A (zh) * | 2021-12-24 | 2022-04-12 | 中国兵器工业计算机应用技术研究所 | 地空协同无人自动设备 |
CN114355969A (zh) * | 2021-12-03 | 2022-04-15 | 武汉捷成电力科技有限公司 | 一种利用无人机巡检的智能供热管网检漏方法和*** |
CN114347053A (zh) * | 2021-12-28 | 2022-04-15 | 国网新疆电力有限公司吐鲁番供电公司 | 用于电缆管道的狭小廊道巡检机器人 |
CN114373042A (zh) * | 2021-12-14 | 2022-04-19 | 南京杰图空间信息技术有限公司 | 一种基于电力巡检三维场景快速建模方法 |
CN114407048A (zh) * | 2022-03-02 | 2022-04-29 | 深圳深海智人机器人技术有限公司 | 一种正压型履带式全自主巡检机器人 |
CN114434465A (zh) * | 2022-03-13 | 2022-05-06 | 国网新疆电力有限公司阿克苏供电公司 | 一种电力运检用仿真运行巡检机器人 |
CN114463968A (zh) * | 2021-12-14 | 2022-05-10 | 江苏齐物信息科技有限公司 | 一种基于物联网平台的巡检机器人通讯***及通讯方法 |
CN114489112A (zh) * | 2021-12-13 | 2022-05-13 | 深圳先进技术研究院 | 一种智能车-无人机的协同感知***及方法 |
CN114485660A (zh) * | 2021-12-27 | 2022-05-13 | 江苏集萃未来城市应用技术研究所有限公司 | 一种空地机器人协同控制与感知定位方法 |
CN114509115A (zh) * | 2022-02-24 | 2022-05-17 | 宋进涛 | 一种矿山安全智能巡检机器人 |
CN114536297A (zh) * | 2022-03-31 | 2022-05-27 | 中国矿业大学 | 一种煤矿巷道空地巡检机器人及巡检方法 |
CN114572845A (zh) * | 2022-01-24 | 2022-06-03 | 杭州大杰智能传动科技有限公司 | 用于智能塔吊工况检测的智能辅助机器人及其控制方法 |
CN114625121A (zh) * | 2022-01-24 | 2022-06-14 | 成都理工大学 | 基于多传感器融合的自主巡检探索小车***及导航方法 |
CN114843927A (zh) * | 2022-02-18 | 2022-08-02 | 华能新疆能源开发有限公司新能源东疆分公司 | 一种电缆沟道巡检机器人 |
CN114935939A (zh) * | 2022-05-09 | 2022-08-23 | 北京航天发射技术研究所 | 一种基于伴飞无人机的实时路径规划***及规划方法 |
CN114944014A (zh) * | 2022-05-30 | 2022-08-26 | 国网江苏省电力有限公司徐州供电分公司 | 一种基于3d姿态的端到端手势识别设备 |
CN115256415A (zh) * | 2022-08-01 | 2022-11-01 | 国核信息科技有限公司 | 基于安全运动的风电机舱多感融合小型化机器人及方法 |
CN115284305A (zh) * | 2022-06-15 | 2022-11-04 | 淮浙煤电有限责任公司凤台发电分公司 | 一种多功能智能巡检机器人 |
CN115338845A (zh) * | 2022-09-19 | 2022-11-15 | 安徽信息工程学院 | 一种智能化铁路巡检机器人装置 |
CN115421505A (zh) * | 2022-11-04 | 2022-12-02 | 北京卓翼智能科技有限公司 | 一种无人机集群***及无人机 |
CN115484692A (zh) * | 2022-09-08 | 2022-12-16 | 山东新一代信息产业技术研究院有限公司 | 一种无人机与机器人集群协同通信的方法、设备及介质 |
CN115488877A (zh) * | 2022-07-05 | 2022-12-20 | 港珠澳大桥管理局 | 自动巡检设备及其巡检方法 |
CN115569224A (zh) * | 2022-10-14 | 2023-01-06 | 国网山东省电力公司 | 用于配电物资仓库的清洁消毒机器人、***及方法 |
CN115847425A (zh) * | 2022-12-30 | 2023-03-28 | 安徽安天利信工程管理股份有限公司 | 一种基于bim数据管理的管廊巡检机器人及其操作方法 |
CN115933750A (zh) * | 2023-01-06 | 2023-04-07 | 国网浙江省电力有限公司嵊州市供电公司 | 基于数据处理的电力巡检方法及电力巡检*** |
CN115981355A (zh) * | 2023-02-06 | 2023-04-18 | 山东融瓴科技集团有限公司 | 一种可快速精准降落的无人机自动巡航方法及*** |
CN116423471A (zh) * | 2023-06-13 | 2023-07-14 | 中国农业科学院蔬菜花卉研究所 | 一种用于通量实验操作的智能协作机器人 |
CN116466733A (zh) * | 2023-04-25 | 2023-07-21 | 广州天勤数字科技有限公司 | 一种用于无人机起降的智能避障***及方法 |
CN116540784A (zh) * | 2023-06-28 | 2023-08-04 | 西北工业大学 | 一种基于视觉的无人***空地协同导航与避障方法 |
CN116603845A (zh) * | 2023-07-18 | 2023-08-18 | 北京建工环境修复股份有限公司 | 一种土壤修复用便于更换电池的巡检机器人 |
CN116766237A (zh) * | 2023-08-24 | 2023-09-19 | 中国电建集团北京勘测设计研究院有限公司 | 一种人工湿地巡检用机器人及巡检方法 |
CN116912749A (zh) * | 2023-09-13 | 2023-10-20 | 杭州义益钛迪信息技术有限公司 | 告警事件处理方法、装置、设备及存储介质 |
CN117039725A (zh) * | 2023-10-09 | 2023-11-10 | 广东立信电力服务有限公司 | 一种电力检查用子母巡检机器人 |
CN117021050A (zh) * | 2023-10-10 | 2023-11-10 | 北京炎凌嘉业机电设备有限公司 | 正压复合防爆机器人 |
CN117119500A (zh) * | 2023-10-25 | 2023-11-24 | 国网山东省电力公司东营供电公司 | 基于智能cpe模组的巡检机器人数据传输优化方法 |
CN117109598A (zh) * | 2023-10-23 | 2023-11-24 | 中冶建筑研究总院(深圳)有限公司 | 一种地空协同多旋翼无人机巡检路径规划方法和*** |
CN117519216A (zh) * | 2024-01-08 | 2024-02-06 | 中建八局检测科技有限公司 | 一种基于传感器联合导航检测避障的运料小车 |
CN117697760A (zh) * | 2024-01-03 | 2024-03-15 | 佛山科学技术学院 | 一种机器人安全运动控制方法及*** |
CN117723068A (zh) * | 2024-02-08 | 2024-03-19 | 清华大学 | 巡检无人车、视觉定位优化***、方法、电子设备及介质 |
CN117921621A (zh) * | 2024-03-21 | 2024-04-26 | 福州亿得隆电气技术有限公司 | 一种适应多样地形的机器人及机器人控制*** |
WO2024093420A1 (zh) * | 2022-11-04 | 2024-05-10 | 新特能源股份有限公司 | 一种无人机与地面巡检机器人协同作业的巡检方法及装置 |
WO2024093030A1 (zh) * | 2022-11-04 | 2024-05-10 | 广东电网有限责任公司 | 无人机输电线路巡检***及方法 |
Families Citing this family (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111300372A (zh) * | 2020-04-02 | 2020-06-19 | 同济人工智能研究院(苏州)有限公司 | 空地协同式智能巡检机器人及巡检方法 |
CN111930140B (zh) * | 2020-07-15 | 2023-02-03 | 招商局国际信息技术有限公司 | 一种基于无人机机器人的码头智能点检方法及*** |
CN111880537A (zh) * | 2020-07-28 | 2020-11-03 | 上海交通大学 | 一种隧道电缆智能巡检机器人*** |
CN112102514A (zh) * | 2020-08-05 | 2020-12-18 | 佛山职业技术学院 | 一种变电站子母巡检机器人巡检***及巡检方法 |
CN112276966A (zh) * | 2020-10-21 | 2021-01-29 | 苏州索斯曼电气有限公司 | 一种用于巡检电力仪表的智能机器人 |
CN112650272B (zh) * | 2020-11-24 | 2022-11-01 | 太原理工大学 | 基于5g的煤矿井下无人机巡视信息感知方法及其感知*** |
CN112476461A (zh) * | 2020-11-26 | 2021-03-12 | 云南电网有限责任公司昆明供电局 | 一种搭载无人机的变电站巡检机器人及巡检方法 |
CN112598813A (zh) * | 2020-12-01 | 2021-04-02 | 易瓦特科技股份公司 | 一种智能巡检***及其巡检方法 |
CN112558608B (zh) * | 2020-12-11 | 2023-03-17 | 重庆邮电大学 | 一种基于无人机辅助的车机协同控制及路径优化方法 |
CN112667717B (zh) * | 2020-12-23 | 2023-04-07 | 贵州电网有限责任公司电力科学研究院 | 变电站巡检信息处理方法、装置、计算机设备和存储介质 |
CN113012431B (zh) * | 2021-02-25 | 2022-06-10 | 青岛海信网络科技股份有限公司 | 一种高速公路交通事件检测方法及装置 |
CN114993261A (zh) * | 2021-02-26 | 2022-09-02 | 中国科学院宁波材料技术与工程研究所 | 无人自主避障空间探测***及方法 |
CN113034718A (zh) * | 2021-03-01 | 2021-06-25 | 启若人工智能研究院(南京)有限公司 | 一种基于多智能体的地铁管道巡检*** |
CN113358658B (zh) * | 2021-04-25 | 2022-08-30 | 上海工程技术大学 | 一种实现高铁箱梁缺陷自动化检测的方法 |
CN113271357B (zh) * | 2021-05-17 | 2023-04-18 | 南京邮电大学 | 一种地空协同组网***及控制方法 |
CN113428253A (zh) * | 2021-06-09 | 2021-09-24 | 大连海事大学 | 一种地空协同检测机器人及船舱检测方法 |
CN113485414A (zh) * | 2021-06-25 | 2021-10-08 | 国网山东省电力公司济宁市任城区供电公司 | 一种变电所计算机监控装置故障处理***及方法 |
CN113313078B (zh) * | 2021-07-02 | 2022-07-08 | 昆明理工大学 | 一种基于模型优化的轻量化夜间红外图像行人检测方法及*** |
CN113306625A (zh) * | 2021-07-09 | 2021-08-27 | 浙江博城机器人科技有限公司 | 一种分炼机器人底盘差速*** |
CN113500579B (zh) * | 2021-07-13 | 2023-06-06 | 广东电网有限责任公司 | 一种巡检方法及巡检装置 |
CN113306653A (zh) * | 2021-07-14 | 2021-08-27 | 辽宁工程技术大学 | 一种搭载无人机与机械臂的井下巡检与研究双用途无人车 |
CN114035562B (zh) * | 2021-07-20 | 2024-05-28 | 新兴际华集团有限公司 | 一种用于***性环境的多信息融合采集机器人 |
CN113371218B (zh) * | 2021-07-21 | 2023-01-31 | 衢州市庭源隆科技有限公司 | 一种无人机后勤保障车载*** |
CN113371200B (zh) * | 2021-08-03 | 2023-01-13 | 鲁东大学 | 针对规模化种植农作物的无人机精准喷药*** |
CN114020016B (zh) * | 2021-10-29 | 2022-06-21 | 哈尔滨工业大学 | 一种基于机器学习的空地协同通信服务方法及*** |
CN114097450A (zh) * | 2021-11-23 | 2022-03-01 | 南方电网电力科技股份有限公司 | 一种用于电力线路树障清理的机器人 |
CN114115287B (zh) * | 2021-12-06 | 2023-09-22 | 西安航空学院 | 一种无人车-无人机空地协同巡逻和引导*** |
CN114383576B (zh) * | 2022-01-19 | 2023-04-07 | 西北大学 | 一种空地一体化滑坡监测方法及其监测装置 |
CN114475848B (zh) * | 2022-01-26 | 2022-12-06 | 中国电建集团福建省电力勘测设计院有限公司 | 一种用于变电站巡检用的四足机器人及无人机组件 |
CN114485619A (zh) * | 2022-01-26 | 2022-05-13 | 清华大学 | 基于空地协同的多机器人定位和导航方法及装置 |
CN114599013B (zh) * | 2022-01-28 | 2023-06-30 | 中国人民解放军东部战区总医院 | 无人异构平台通信***和通信方法 |
CN114348142B (zh) * | 2022-02-22 | 2023-11-24 | 湖南工程学院 | 一种室内服务型智能四足仿生机械狗 |
CN114779766B (zh) * | 2022-04-07 | 2023-05-30 | 北京理工大学重庆创新中心 | 一种自主避障陆空两栖装置及其控制方法 |
CN115122339A (zh) * | 2022-08-19 | 2022-09-30 | 中电科机器人有限公司 | 面向装卸机器人的控制*** |
CN115171360A (zh) * | 2022-09-02 | 2022-10-11 | 国网山东省电力公司费县供电公司 | 组合式电力厂区报警***、报警平台及使用方法 |
CN115793093B (zh) * | 2023-02-02 | 2023-05-16 | 水利部交通运输部国家能源局南京水利科学研究院 | 堤坝隐伏病险诊断空地一体化装备 |
CN116300975B (zh) * | 2023-05-19 | 2023-07-28 | 深圳市云帆自动化技术有限公司 | 一种海上平台配电间机器人巡检*** |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160018224A1 (en) * | 2013-09-27 | 2016-01-21 | Regents Of The University Of Minnesota | Symbiotic Unmanned Aerial Vehicle and Unmanned Surface Vehicle System |
WO2016143806A1 (ja) * | 2015-03-11 | 2016-09-15 | 学校法人千葉工業大学 | ヘリポートを備えた搬送体 |
CN207256259U (zh) * | 2017-09-21 | 2018-04-20 | 上海合时安防技术有限公司 | 机场排爆机器人 |
CN207360445U (zh) * | 2017-07-31 | 2018-05-15 | 华南理工大学 | 一种履带式自动探索机器人 |
CN207415378U (zh) * | 2017-10-30 | 2018-05-29 | 中国石油大学(华东) | 一种空-地协同多功能智能机器人 |
CN109227527A (zh) * | 2018-10-16 | 2019-01-18 | 同济大学 | 一种基于首尾双头蛇形机械臂的无人机搜救装置及其应用 |
CN208656901U (zh) * | 2018-08-17 | 2019-03-26 | 郑州丰嘉科技有限公司 | 一种远程控制自动摄像装置 |
DE102018205880B3 (de) * | 2018-04-18 | 2019-07-25 | Volkswagen Aktiengesellschaft | Vorrichtung für ein Fahrzeug zur Übernahme eines Objektes bei einer Drohnenlieferung |
WO2019211558A1 (fr) * | 2018-05-02 | 2019-11-07 | Octopus Robots | Dispositif mobile de support de drones |
KR20200011719A (ko) * | 2018-07-25 | 2020-02-04 | 남 영 김 | 듀얼 배터리 자동교환시스템 |
CN111300372A (zh) * | 2020-04-02 | 2020-06-19 | 同济人工智能研究院(苏州)有限公司 | 空地协同式智能巡检机器人及巡检方法 |
CN211890820U (zh) * | 2020-04-02 | 2020-11-10 | 同济人工智能研究院(苏州)有限公司 | 空地协同式智能巡检机器人 |
2020
- 2020-04-02 CN CN202010255196.4A patent/CN111300372A/zh active Pending
- 2020-09-14 WO PCT/CN2020/115072 patent/WO2021196529A1/zh active Application Filing
Cited By (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113741544A (zh) * | 2021-10-08 | 2021-12-03 | 北京硬石头科技有限公司 | Laser-navigation-based indoor agricultural UAV inspection system and inspection method |
CN114043491A (zh) * | 2021-10-11 | 2022-02-15 | 北京天玛智控科技股份有限公司 | Video inspection robot |
CN114330978A (zh) * | 2021-11-11 | 2022-04-12 | 深圳大学 | Dynamic task allocation method for air-ground robots, storage medium and terminal device |
CN114102617A (zh) * | 2021-11-11 | 2022-03-01 | 山东新一代信息产业技术研究院有限公司 | Collaborative security robot |
CN114200858A (zh) * | 2021-11-23 | 2022-03-18 | 国网新疆电力有限公司克州供电公司 | Multifunctional artificial-intelligence substation inspection device |
CN114187722A (zh) * | 2021-11-24 | 2022-03-15 | 河北金锁安防工程股份有限公司 | Intelligent security patrol device |
CN113946128A (zh) * | 2021-11-29 | 2022-01-18 | 中国人民解放军国防科技大学 | Hardware-in-the-loop simulation control system for UAV swarms |
CN114247072A (zh) * | 2021-12-02 | 2022-03-29 | 湖南航天磁电科技有限公司 | Multifunctional inspection robot |
CN114355969A (zh) * | 2021-12-03 | 2022-04-15 | 武汉捷成电力科技有限公司 | Intelligent heat-supply pipe network leak detection method and system using UAV inspection |
CN114325219A (zh) * | 2021-12-03 | 2022-04-12 | 陕西省地方电力(集团)有限公司延安供电分公司 | Method for detecting and identifying the operating state of distribution network equipment |
CN114355969B (zh) * | 2021-12-03 | 2024-04-30 | 武汉捷成电力科技有限公司 | Intelligent heat-supply pipe network leak detection method and system using UAV inspection |
CN114265427A (zh) * | 2021-12-06 | 2022-04-01 | 江苏方天电力技术有限公司 | Inspection UAV auxiliary navigation system and method based on infrared image matching |
CN114265427B (zh) * | 2021-12-06 | 2024-02-02 | 江苏方天电力技术有限公司 | Inspection UAV auxiliary navigation system and method based on infrared image matching |
CN114115289A (zh) * | 2021-12-07 | 2022-03-01 | 湖南大学 | Autonomous unmanned swarm reconnaissance system |
CN114489112A (zh) * | 2021-12-13 | 2022-05-13 | 深圳先进技术研究院 | Collaborative perception system and method for an intelligent vehicle and a UAV |
CN114373042A (зh) * | 2021-12-14 | 2022-04-19 | 南京杰图空间信息技术有限公司 | Rapid 3D-scene modeling method for power inspection |
CN114463968A (zh) * | 2021-12-14 | 2022-05-10 | 江苏齐物信息科技有限公司 | Inspection robot communication system and communication method based on an IoT platform |
CN114373042B (zh) * | 2021-12-14 | 2024-05-03 | 南京杰图空间信息技术有限公司 | Rapid 3D-scene modeling method for power inspection |
CN113911103A (zh) * | 2021-12-14 | 2022-01-11 | 北京理工大学 | Speed and energy co-optimization method and system for hybrid tracked vehicles |
CN114280238A (zh) * | 2021-12-22 | 2022-04-05 | 山西三合盛智慧科技股份有限公司 | Video-intercom gas detection robot for night operation and its detection system |
CN114313037A (zh) * | 2021-12-24 | 2022-04-12 | 中国兵器工业计算机应用技术研究所 | Ground-air collaborative unmanned automatic equipment |
CN114485660A (zh) * | 2021-12-27 | 2022-05-13 | 江苏集萃未来城市应用技术研究所有限公司 | Collaborative control and perception-localization method for air-ground robots |
CN114347053A (zh) * | 2021-12-28 | 2022-04-15 | 国网新疆电力有限公司吐鲁番供电公司 | Narrow-corridor inspection robot for cable ducts |
CN114161452A (zh) * | 2021-12-30 | 2022-03-11 | 山东省科学院自动化研究所 | Inspection robot control system |
CN114179110A (zh) * | 2021-12-31 | 2022-03-15 | 国家石油天然气管网集团有限公司 | Inspection robot |
CN114625121A (zh) * | 2022-01-24 | 2022-06-14 | 成都理工大学 | Autonomous inspection and exploration cart system based on multi-sensor fusion, and navigation method |
CN114572845B (zh) * | 2022-01-24 | 2023-06-02 | 杭州大杰智能传动科技有限公司 | Intelligent auxiliary robot for tower-crane working-condition detection and its control method |
CN114572845A (zh) * | 2022-01-24 | 2022-06-03 | 杭州大杰智能传动科技有限公司 | Intelligent auxiliary robot for tower-crane working-condition detection and its control method |
CN114843927B (zh) * | 2022-02-18 | 2023-12-26 | 华能新疆能源开发有限公司新能源东疆分公司 | Cable trench inspection robot |
CN114843927A (зh) * | 2022-02-18 | 2022-08-02 | 华能新疆能源开发有限公司新能源东疆分公司 | Cable trench inspection robot |
CN114509115A (зh) * | 2022-02-24 | 2022-05-17 | 宋进涛 | Intelligent mine-safety inspection robot |
CN114509115B (зh) * | 2022-02-24 | 2023-10-20 | 宋进涛 | Intelligent mine-safety inspection robot |
CN114313884A (зh) * | 2022-02-24 | 2022-04-12 | 枣庄矿业集团新安煤业有限公司 | Intelligent conveyor-belt inspection robot |
CN114313884B (зh) * | 2022-02-24 | 2023-12-01 | 枣庄矿业集团新安煤业有限公司 | Intelligent conveyor-belt inspection robot |
CN114407048A (зh) * | 2022-03-02 | 2022-04-29 | 深圳深海智人机器人技术有限公司 | Positive-pressure tracked fully autonomous inspection robot |
CN114434465B (зh) * | 2022-03-13 | 2024-02-20 | 国网新疆电力有限公司阿克苏供电公司 | Simulated-operation inspection robot for power operation and maintenance |
CN114434465A (зh) * | 2022-03-13 | 2022-05-06 | 国网新疆电力有限公司阿克苏供电公司 | Simulated-operation inspection robot for power operation and maintenance |
CN114536297A (зh) * | 2022-03-31 | 2022-05-27 | 中国矿业大学 | Air-ground inspection robot for coal-mine roadways and inspection method |
CN114935939A (зh) * | 2022-05-09 | 2022-08-23 | 北京航天发射技术研究所 | Real-time path planning system and method based on an escort UAV |
CN114944014A (зh) * | 2022-05-30 | 2022-08-26 | 国网江苏省电力有限公司徐州供电分公司 | End-to-end gesture recognition device based on 3D pose |
CN114944014B (зh) * | 2022-05-30 | 2024-04-30 | 国网江苏省电力有限公司徐州供电分公司 | End-to-end gesture recognition device based on 3D pose |
CN115284305A (зh) * | 2022-06-15 | 2022-11-04 | 淮浙煤电有限责任公司凤台发电分公司 | Multifunctional intelligent inspection robot |
CN115488877A (зh) * | 2022-07-05 | 2022-12-20 | 港珠澳大桥管理局 | Automatic inspection equipment and inspection method |
CN115488877B (зh) * | 2022-07-05 | 2024-04-02 | 港珠澳大桥管理局 | Automatic inspection equipment and inspection method |
CN115256415A (зh) * | 2022-08-01 | 2022-11-01 | 国核信息科技有限公司 | Multi-sensor-fusion miniaturized robot for wind-turbine nacelles based on safe motion, and method |
CN115256415B (зh) * | 2022-08-01 | 2024-01-30 | 国核信息科技有限公司 | Multi-sensor-fusion miniaturized robot for wind-turbine nacelles based on safe motion, and method |
CN115484692B (зh) * | 2022-09-08 | 2024-03-22 | 山东新一代信息产业技术研究院有限公司 | Method, device and medium for collaborative communication between a UAV and a robot swarm |
CN115484692A (зh) * | 2022-09-08 | 2022-12-16 | 山东新一代信息产业技术研究院有限公司 | Method, device and medium for collaborative communication between a UAV and a robot swarm |
CN115338845A (зh) * | 2022-09-19 | 2022-11-15 | 安徽信息工程学院 | Intelligent railway inspection robot device |
CN115569224B (зh) * | 2022-10-14 | 2023-09-19 | 国网山东省电力公司 | Cleaning and disinfection robot, system and method for power-distribution material warehouses |
CN115569224 (зh) * | 2022-10-14 | 2023-01-06 | 国网山东省电力公司 | Cleaning and disinfection robot, system and method for power-distribution material warehouses |
CN115421505B (зh) * | 2022-11-04 | 2023-03-17 | 北京卓翼智能科技有限公司 | UAV swarm system and UAV |
CN115421505A (зh) * | 2022-11-04 | 2022-12-02 | 北京卓翼智能科技有限公司 | UAV swarm system and UAV |
WO2024093030A1 (зh) * | 2022-11-04 | 2024-05-10 | 广东电网有限责任公司 | UAV power transmission line inspection system and method |
WO2024093420A1 (зh) * | 2022-11-04 | 2024-05-10 | 新特能源股份有限公司 | Inspection method and device for collaborative operation of a UAV and a ground inspection robot |
CN115847425 (зh) * | 2022-12-30 | 2023-03-28 | 安徽安天利信工程管理股份有限公司 | Pipe-gallery inspection robot based on BIM data management and its operation method |
CN115933750 (зh) * | 2023-01-06 | 2023-04-07 | 国网浙江省电力有限公司嵊州市供电公司 | Power inspection method and power inspection system based on data processing |
CN115981355 (зh) * | 2023-02-06 | 2023-04-18 | 山东融瓴科技集团有限公司 | Automatic UAV cruising method and system with rapid and precise landing |
CN116466733 (зh) * | 2023-04-25 | 2023-07-21 | 广州天勤数字科技有限公司 | Intelligent obstacle-avoidance system and method for UAV takeoff and landing |
CN116466733 B (зh) * | 2023-04-25 | 2023-10-31 | 广州天勤数字科技有限公司 | Intelligent obstacle-avoidance system and method for UAV takeoff and landing |
CN116423471 B (зh) * | 2023-06-13 | 2023-08-15 | 中国农业科学院蔬菜花卉研究所 | Intelligent collaborative robot for high-throughput experimental operations |
CN116423471 (зh) * | 2023-06-13 | 2023-07-14 | 中国农业科学院蔬菜花卉研究所 | Intelligent collaborative robot for high-throughput experimental operations |
CN116540784 (зh) * | 2023-06-28 | 2023-08-04 | 西北工业大学 | Vision-based air-ground collaborative navigation and obstacle-avoidance method for unmanned systems |
CN116540784 B (зh) * | 2023-06-28 | 2023-09-19 | 西北工业大学 | Vision-based air-ground collaborative navigation and obstacle-avoidance method for unmanned systems |
CN116603845 B (зh) * | 2023-07-18 | 2023-09-08 | 北京建工环境修复股份有限公司 | Inspection robot with easily replaceable batteries for soil remediation |
CN116603845 (зh) * | 2023-07-18 | 2023-08-18 | 北京建工环境修复股份有限公司 | Inspection robot with easily replaceable batteries for soil remediation |
CN116766237 B (зh) * | 2023-08-24 | 2023-10-27 | 中国电建集团北京勘测设计研究院有限公司 | Robot for constructed-wetland inspection and inspection method |
CN116766237 (зh) * | 2023-08-24 | 2023-09-19 | 中国电建集团北京勘测设计研究院有限公司 | Robot for constructed-wetland inspection and inspection method |
CN116912749 B (зh) * | 2023-09-13 | 2024-01-05 | 杭州义益钛迪信息技术有限公司 | Alarm event processing method, apparatus, device and storage medium |
CN116912749 (зh) * | 2023-09-13 | 2023-10-20 | 杭州义益钛迪信息技术有限公司 | Alarm event processing method, apparatus, device and storage medium |
CN117039725 B (зh) * | 2023-10-09 | 2024-01-16 | 广东立信电力服务有限公司 | Mother-child inspection robot for power inspection |
CN117039725 (зh) * | 2023-10-09 | 2023-11-10 | 广东立信电力服务有限公司 | Mother-child inspection robot for power inspection |
CN117021050 B (зh) * | 2023-10-10 | 2024-04-02 | 北京炎凌嘉业机电设备有限公司 | Positive-pressure composite explosion-proof robot |
CN117021050 (зh) * | 2023-10-10 | 2023-11-10 | 北京炎凌嘉业机电设备有限公司 | Positive-pressure composite explosion-proof robot |
CN117109598 B (зh) * | 2023-10-23 | 2024-01-23 | 中冶建筑研究总院(深圳)有限公司 | Ground-air collaborative multi-rotor UAV inspection path planning method and system |
CN117109598 (зh) * | 2023-10-23 | 2023-11-24 | 中冶建筑研究总院(深圳)有限公司 | Ground-air collaborative multi-rotor UAV inspection path planning method and system |
CN117119500 B (зh) * | 2023-10-25 | 2024-01-12 | 国网山东省电力公司东营供电公司 | Inspection robot data transmission optimization method based on an intelligent CPE module |
CN117119500 (зh) * | 2023-10-25 | 2023-11-24 | 国网山东省电力公司东营供电公司 | Inspection robot data transmission optimization method based on an intelligent CPE module |
CN117697760 (зh) * | 2024-01-03 | 2024-03-15 | 佛山科学技术学院 | Robot safe-motion control method and system |
CN117697760 B (зh) * | 2024-01-03 | 2024-05-28 | 佛山科学技术学院 | Robot safe-motion control method and system |
CN117519216 B (зh) * | 2024-01-08 | 2024-03-08 | 中建八局检测科技有限公司 | Material-transport cart with obstacle avoidance based on combined sensor navigation and detection |
CN117519216 (зh) * | 2024-01-08 | 2024-02-06 | 中建八局检测科技有限公司 | Material-transport cart with obstacle avoidance based on combined sensor navigation and detection |
CN117723068 (зh) * | 2024-02-08 | 2024-03-19 | 清华大学 | Unmanned inspection vehicle, visual-localization optimization system and method, electronic device and medium |
CN117921621 (зh) * | 2024-03-21 | 2024-04-26 | 福州亿得隆电气技术有限公司 | Robot adapted to diverse terrain, and robot control system |
Also Published As
Publication number | Publication date |
---|---|
CN111300372A (zh) | 2020-06-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021196529A1 (zh) | Air-ground collaborative intelligent inspection robot and inspection method | |
CN211890820U (zh) | Air-ground collaborative intelligent inspection robot | |
WO2018103242A1 (zh) | Motion-learning-based power tower inspection method for a quadrotor UAV | |
Montambault et al. | On the application of VTOL UAVs to the inspection of power utility assets | |
Ollero et al. | Control and perception techniques for aerial robotics | |
CN113325837A (zh) | Control system and method for a multi-information fusion acquisition robot | |
CN107329487A (zh) | Aerial coordinated operation platform for a UAV and a robot | |
CN109509320A (zh) | Substation fire early-warning inspection robot | |
CN112109090A (zh) | Multi-sensor fusion search-and-rescue robot system | |
CN114115296B (zh) | Intelligent inspection and early-warning system and method for key areas | |
CN113134187A (zh) | Multi-firefighting-inspection collaborative robot system based on integral reinforcement learning | |
Lee et al. | Artificial intelligence and Internet of Things for robotic disaster response | |
Chehri et al. | Accelerating power grid monitoring with flying robots and artificial intelligence | |
Choutri et al. | A fully autonomous search and rescue system using quadrotor UAV | |
Wang et al. | Image-based visual servoing of quadrotors to arbitrary flight targets | |
Budiyono et al. | A review of the latest innovations in uav technology | |
CN114397909B (zh) | Automatic inspection method using a small UAV for large aircraft | |
Phang et al. | Autonomous tracking and landing on moving ground vehicle with multi-rotor UAV | |
Gao et al. | Design and experimental verification of an intelligent fire-fighting robot | |
Luo et al. | Air-ground multi-agent robot team coordination | |
CN206515108U (zh) | Air sampling system based on a quadrotor aircraft | |
Sutera et al. | A multi-robot system for thermal vision inspection | |
Chen | Design of trajectory planning and 3D modeling for crane inspection based on UAV | |
Longo et al. | A mixed terrestrial aerial robotic platform for volcanic and industrial surveillance | |
Ma et al. | Mission Capability Evaluation Testing System for Unmanned Aerial Vehicle Swarm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20928700 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20928700 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31/03/2023) |