CN117032235A - Mobile robot inspection and remote monitoring method under complex indoor scene - Google Patents

Mobile robot inspection and remote monitoring method under complex indoor scene

Info

Publication number
CN117032235A
CN117032235A (application CN202311019366.9A)
Authority
CN
China
Prior art keywords
mobile robot
quadruped
lane line
distance
laser radar
Prior art date
Legal status
Pending
Application number
CN202311019366.9A
Other languages
Chinese (zh)
Inventor
黄家才
薛源
张杨桂
王徐寅
常国卫
梅贤慧
高祝欢
茅飞
Current Assignee
Nanjing Institute of Technology
Original Assignee
Nanjing Institute of Technology
Priority date
Filing date
Publication date
Application filed by Nanjing Institute of Technology
Priority to CN202311019366.9A
Publication of CN117032235A
Legal status: Pending

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a mobile robot inspection and remote monitoring method for complex indoor scenes, comprising the following steps: during motion, the upper computer receives, in real time, lane line images from the industrial cameras mounted on both sides of the quadruped mobile robot, processes the images, and fits the straight line that best matches the actual lane line, taking that line as the inspection route of the quadruped mobile robot; meanwhile, the upper computer detects obstacles or walls ahead using the laser radar mounted at the center of the robot's head, and switches the quadruped mobile robot into an obstacle-avoidance mode or a turning mode according to the detection result. The invention can effectively overcome interference from wear and contamination of indoor yellow lane lines, can accurately adjust the motion posture where no lane line is present, realizes remote monitoring and control, and is suitable for inspection of large indoor environments.

Description

Mobile robot inspection and remote monitoring method under complex indoor scene
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a mobile robot inspection and remote monitoring method under a complex indoor scene.
Background
At present, robot technology has made remarkable progress and is widely applied in many fields. In inspection and monitoring in particular, robots can efficiently perform heavy, dangerous, or long-duration tasks. However, existing robotic systems still face challenges in mobility and stability under different terrain and environmental conditions. For the path planning and navigation problems of inspection robots, researchers have proposed various algorithms and methods. For example, autonomous navigation algorithms based on sensor data and map information use SLAM (Simultaneous Localization and Mapping) techniques for real-time map construction and position estimation, and generate optimal paths with planning algorithms. To support comprehensive inspection tasks, multiple sensors (such as cameras, laser radars, and infrared sensors) allow a robot to perceive its surroundings and build an accurate environment model for task execution and decision making.
In large indoor environments, the most widely used mobile robot is the AGV (automated guided vehicle). An AGV generally adopts magnetic navigation, following magnetic strips attached to the floor; the strips can be mechanically damaged by hard objects, such as metal, passing over the route, which affects navigation, so this approach suits only specific, unchanging environments. In other indoor scenes such as large warehouses, the accuracy of SLAM mapping is hard to guarantee, and frequent movement of large cargo, large equipment, and personnel keeps the spatial structure from staying fixed, so SLAM mapping is also unsuitable for such environments. In addition, magnetic navigation in a large scene requires laying a large number of magnetic strips, which interferes with the activities already taking place there.
Compared with common wheeled and tracked mobile robots, a quadruped mobile robot has stronger locomotion performance and is better suited to inspection in large indoor environments, and applying it to indoor inspection tasks can improve efficiency. However, because a quadruped mobile robot cannot obtain odometry data from wheel encoders, a suitable strategy must be designed, according to the actual environment, to guide its motion under the condition that encoders cannot be used.
For example, the invention with patent publication No. CN111538338B proposes a robot wall-hugging motion control system and method; CN113863195B proposes an edge cleaning method and cleaning vehicle; CN115993820A addresses maze robot path planning based on the ROS system; CN116175561A is a control method, chip, and robot for a drawing robot; and so on. Although these patents use laser radar to detect objects such as walls, their function is limited: they cannot switch between line inspection and wall following, are suitable only for fixed settings, and are not suited to complex indoor scenes.
Disclosure of Invention
The technical problems to be solved are as follows: the invention discloses a mobile robot inspection and remote monitoring method for complex indoor scenes. It solves the inspection problem of a quadruped mobile robot when no encoder is available to provide odometry, effectively overcomes interference from wear and contamination of indoor yellow lane lines, accurately adjusts the motion posture where no lane line is present, realizes remote monitoring and control, and is suitable for inspection of large indoor environments.
The technical scheme is as follows:
a mobile robot inspection and remote monitoring method under a complex indoor scene comprises the following steps:
the method comprises the steps that a laser radar, an industrial camera and an ultrasonic sensor are additionally arranged on a quadruped mobile robot body, the quadruped mobile robot body is connected to an upper computer through a USB (universal serial bus) docking station, the upper computer receives a control instruction and a task instruction issued by a remote control terminal, the quadruped mobile robot is switched into a line inspection mode, an ROS (remote operation system) robot operation system is utilized to process information of each sensor to generate a motion instruction of the quadruped mobile robot, the motion instruction is sent to a lower computer through a UDP (user datagram protocol), and the lower computer controls the quadruped mobile robot to move according to the motion instruction, so that the quadruped mobile robot moves along a set line inspection route;
in the motion process of the four-foot robot, the upper computer receives lane line images of two sides of the four-foot mobile robot sent by industrial cameras positioned at two sides of the four-foot mobile robot in real time, processes the lane line images, fits to obtain a straight line which is most attached to an actual lane line, and takes the straight line as a routing inspection route of the four-foot mobile robot; meanwhile, the upper computer detects an obstacle or a wall positioned in front by adopting a laser radar positioned in the center of the head of the four-foot mobile robot, and the four-foot mobile robot is switched to an obstacle avoidance mode or a turning mode according to a detection result;
When the inspection task is complete and the quadruped mobile robot has moved to the origin warehouse area, the robot is switched into warehouse-in mode. The ultrasonic sensor at the tail of the robot measures the distance between the tail and the baffle at the bottom of the origin warehouse area, and the robot backs up until this distance is smaller than a first preset warehouse-in distance threshold; the laser radar then measures the distance between the robot and the side wall of the origin warehouse area, and the robot translates toward the side wall until this distance is smaller than a second preset warehouse-in distance threshold.
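The upper-to-lower-computer link described above sends each motion instruction as a UDP datagram. The sketch below illustrates that pattern with the standard library; the command vocabulary and port number are hypothetical, since the patent does not specify them.

```python
import socket

# Hypothetical command strings; the patent does not list the exact protocol.
COMMANDS = {"forward", "backward", "stop", "turn_left", "turn_right",
            "translate_left", "translate_right"}

def send_motion_command(cmd: str, addr=("127.0.0.1", 43893)) -> None:
    """Send one motion instruction to the lower computer as a single UDP datagram."""
    if cmd not in COMMANDS:
        raise ValueError(f"unknown command: {cmd}")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(cmd.encode("utf-8"), addr)
```

In practice the lower computer would bind a UDP socket on a known port, decode each datagram, and translate it into joint-motor actions.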
Further, the laser radar is a two-dimensional laser radar; the information obtained by one laser radar scan is a one-dimensional distance array [d_0, d_1, d_2, ..., d_359], where each element d_i is the distance between the laser radar and an obstacle and i is the scan angle in degrees.
Further, the process by which the upper computer detects an obstacle or wall ahead with the laser radar at the center of the quadruped mobile robot's head, and switches the robot into obstacle-avoidance or turning mode according to the result, comprises the following steps:
during motion, the 360-element distance array detected by the laser radar is sliced to obtain [d_141, d_142, d_143, ..., d_220]; when 0.2 ≤ min[d_141, d_142, d_143, ..., d_220] ≤ 1, a stop instruction is generated and an obstacle-avoidance ROS topic is published;
the elapsed time is used to estimate whether the quadruped mobile robot has reached the vicinity of a turning node; if so, the 360-element distance array detected by the laser radar is sliced to obtain [d_178, d_179, d_180, d_181, d_182]. With the distance d_x between the turning node and the adjacent wall determined in advance, when min[d_178, d_179, d_180, d_181, d_182] ≤ d_x, a turn-90-degrees instruction is generated and a turning ROS topic is published.
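The two lidar checks above can be sketched directly on the 360-element distance array. The sector indices and thresholds are the ones quoted in the text; the helper name and event strings are illustrative.

```python
def detect_events(scan, d_x=0.5, near_turn=False):
    """Return event strings for one lidar scan (list of 360 distances in metres).

    Front sector d_141..d_220: stop when the nearest return is within [0.2, 1] m.
    Straight-ahead sector d_178..d_182: turn 90 degrees when closer than d_x,
    but only near a turning node (estimated from elapsed time in the patent).
    """
    events = []
    front = scan[141:221]                 # d_141 ... d_220 inclusive
    if 0.2 <= min(front) <= 1.0:
        events.append("stop")             # obstacle-avoidance topic
    if near_turn and min(scan[178:183]) <= d_x:
        events.append("turn_90")          # turning topic
    return events
```

A scan with nothing within a metre produces no events; an object 0.6 m ahead triggers a stop, and only below d_x near a turning node does a turn fire as well.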
Further, the process of using the laser radar to measure the distance between the quadruped robot and the side wall of the origin warehouse area, and controlling the quadruped mobile robot to translate toward the side wall until this distance is smaller than the second preset warehouse-in distance threshold, comprises:
taking [d_89, d_90] from the 360-element laser radar distance array and presetting the in-warehouse distance d_z between the quadruped mobile robot and the right side wall; when the side distance given by [d_89, d_90] is less than or equal to d_z, the quadruped mobile robot stops moving rightward, stands still for 2 seconds, and then lies down to await the next task instruction.
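The two-stage warehouse return (reverse until the ultrasonic tail distance drops below the first threshold, then translate toward the side wall until the lidar side distance drops below the second) can be simulated as below. Distances are in whole centimetres and the thresholds and step size are hypothetical, not the patent's calibrated values.

```python
def dock(rear_cm, side_cm, d_back=15, d_z=20, step=5):
    """Return the command sequence for the two-stage docking manoeuvre.

    rear_cm: ultrasonic distance from tail to rear baffle.
    side_cm: lidar distance to the side wall (from [d_89, d_90]).
    """
    cmds = []
    while rear_cm >= d_back:              # stage 1: reverse toward the baffle
        cmds.append("backward")
        rear_cm -= step
    while side_cm >= d_z:                 # stage 2: translate toward the side wall
        cmds.append("translate_right")
        side_cm -= step
    cmds += ["stand_2s", "lie_down"]      # stop, stand 2 s, lie down, await task
    return cmds
```

Each iteration stands in for one control cycle; a real implementation would re-read the sensors instead of decrementing a simulated distance.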
Further, the mobile robot inspection and remote monitoring method further comprises the following steps:
during motion, if the upper computer cannot fit a straight line that matches an actual lane line, the quadruped mobile robot is switched into wall-following mode: the laser radar acquires the distances from the quadruped mobile robot to the front wall and to the side wall, the front-wall distance is used to decide whether to turn, and the side-wall distance is used to adjust the robot until it is parallel to the side wall.
Further, the process of adjusting itself to be parallel to the side wall according to the distance information from the side wall includes the steps of:
the 360-element distance array detected by the laser radar is sliced to obtain two sectors toward the side wall, [d_60, d_61, d_62, d_63] and [d_119, d_120, d_121, d_122]; their averages D_1 and D_2 are computed, along with the difference Δd = D_1 - D_2, and the adjustment speed of the quadruped mobile robot is then obtained from formula (1):
where v_r is the actual adjustment speed and v_0 is the preset adjustment speed; positive v_0 indicates a left-turn adjustment and negative v_0 a right-turn adjustment.
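Formula (1) itself appears only as an image in the original, so the sketch below assumes a simple deadband rule consistent with the surrounding description: average the two side-wall sectors, take their difference Δd, and command a left- or right-turn adjustment speed when the robot is out of parallel.

```python
def adjust_speed(scan, v0=0.1, deadband=0.02):
    """Parallelism adjustment speed from one lidar scan (assumed deadband rule).

    scan: 360-element distance array; v0: preset adjustment speed.
    Positive return = left-turn adjustment, negative = right-turn, 0 = parallel.
    """
    d1 = sum(scan[60:64]) / 4.0       # D_1: average of [d_60 .. d_63]
    d2 = sum(scan[119:123]) / 4.0     # D_2: average of [d_119 .. d_122]
    dd = d1 - d2                      # Δd = D_1 - D_2
    if dd > deadband:
        return v0                     # positive v_0: left-turn adjustment
    if dd < -deadband:
        return -v0                    # negative v_0: right-turn adjustment
    return 0.0                        # within tolerance: keep heading
```

The deadband keeps the robot from oscillating on sensor noise; the patent's actual formula (1) may instead scale the speed continuously with Δd.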
Further, if the upper computer cannot fit a straight line matching the actual lane line, the quadruped mobile robot is switched into wall-following mode, and it is judged whether the current area belongs to a pre-declared lineless section; if so, the quadruped mobile robot moves along the wall until it leaves the current area. If not, timing starts: if no lane line is detected after a preset waiting time, the quadruped mobile robot stops moving and a task-termination instruction is sent to the remote terminal; if a lane line is detected again within the waiting time, the robot switches back to line-inspection mode.
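The mode-switching logic above reduces to a small decision function. The mode names and the 5-second timeout below are illustrative; the patent leaves the waiting time as a preset.

```python
def next_mode(lane_found, in_lineless_section, seconds_without_lane, timeout=5.0):
    """Decide the robot's mode from lane-detection state (sketch of the fallback logic)."""
    if lane_found:
        return "line_inspection"          # lane visible: (re)enter line inspection
    if in_lineless_section:
        return "wall_following"           # pre-declared section with no lane line
    if seconds_without_lane >= timeout:
        return "stopped_notify_remote"    # give up, report task termination
    return "waiting"                      # keep timing before declaring failure
```

The upper computer would call this once per image frame and publish a topic whenever the returned mode changes.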
Further, in line-inspection mode, for each section of the route the quadruped mobile robot presets either the left or the right industrial camera view according to the position of the lane line relative to the robot.
Further, the process of processing the lane line image and fitting the straight line that best matches the actual lane line comprises the following steps:
the RGB image captured by the industrial camera is cropped to mask out the quadruped robot body's intrusion into the field of view, and a region of interest (ROI) is extracted; the ROI is a three-channel RGB image used as the input image. With R, G, B the pixel values of the three channels and R' = R/255, G' = G/255, B' = B/255, the corresponding H, S, V channel values are computed by formulas (2) to (4) to obtain the HSV image:
H = 60° × ((G'-B')/Δ mod 6) if C_max = R'; H = 60° × ((B'-R')/Δ + 2) if C_max = G'; H = 60° × ((R'-G')/Δ + 4) if C_max = B' (2)
S = Δ/C_max, with S = 0 when C_max = 0 (3)
V = C_max (4)
where C_max = max(R', G', B'), C_min = min(R', G', B'), Δ = C_max - C_min;
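Formulas (2) to (4) are the standard RGB-to-HSV conversion, which can be written out in a few lines. This sketch returns H in degrees [0, 360) and S, V in [0, 1]; note that OpenCV's 8-bit HSV images instead store H/2 in [0, 180), so thresholds there are halved. A pure yellow lane-line pixel comes out at H = 60°.

```python
def rgb_to_hsv(r, g, b):
    """Standard RGB->HSV: H in degrees [0, 360), S and V in [0, 1]."""
    rp, gp, bp = r / 255.0, g / 255.0, b / 255.0
    cmax, cmin = max(rp, gp, bp), min(rp, gp, bp)
    delta = cmax - cmin                    # Δ = C_max - C_min
    if delta == 0:
        h = 0.0                            # achromatic: hue undefined, use 0
    elif cmax == rp:
        h = 60.0 * (((gp - bp) / delta) % 6)
    elif cmax == gp:
        h = 60.0 * ((bp - rp) / delta + 2)
    else:
        h = 60.0 * ((rp - gp) / delta + 4)
    s = 0.0 if cmax == 0 else delta / cmax # S = Δ / C_max
    return h, s, cmax                      # V = C_max
```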
the HSV image is binarized according to formula (5) to output the original binary image:
dst(I) = 255 if lowerb(I) ≤ src(I) ≤ upperb(I) for every channel, and dst(I) = 0 otherwise (5)
where dst(I) is the output pixel value, lowerb(I) and upperb(I) are the lower and upper thresholds of the corresponding channel, and src(I) is the input image's pixel value in that channel;
multiple sets of HSV color-space thresholds for the ground lane line are extracted according to the on-site wear and contamination of the lane lines, masks are fused according to formula (6), and color filtering with the original binary image as input yields the fused binary image, giving the approximate lane line region:
dst = mask_1 | mask_2 | ... | mask_j (6)
where dst is the fused binary image, mask_j is the mask produced by the j-th color range, and | denotes bitwise OR;
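The in-range test of formula (5) and the mask fusion of formula (6) can be shown on plain Python lists standing in for pixel rows; the threshold values here are illustrative, not the patent's calibrated ranges.

```python
def in_range(pixel, lowerb, upperb):
    """Formula (5): 255 if every channel of 'pixel' lies in [lowerb, upperb], else 0."""
    ok = all(lo <= p <= hi for p, lo, hi in zip(pixel, lowerb, upperb))
    return 255 if ok else 0

def fuse_masks(*masks):
    """Formula (6): element-wise bitwise OR of 0/255 masks (max == OR for 0/255)."""
    return [max(vals) for vals in zip(*masks)]
```

With one threshold set per observed wear/contamination state of the yellow line, the OR of the masks recovers pixels that any single range would miss.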
opening and closing operations are applied to the fused binary image to remove the interference of fine contamination and wear and obtain a complete lane line region. The image is first opened: erosion separates small adhering parts and removes burrs and isolated points, and dilation then restores the original shape of the region. Specifically, with A the initial image and B an m×n structuring element, erosion and dilation are given by formulas (7) and (8):
A Θ B = {z | (B)_z ⊆ A} (7)
A ⊕ B = {z | (B̂)_z ∩ A ≠ ∅} (8)
where z denotes a translation of B over A, ⊕ is the dilation operator, and Θ is the erosion operator; the opening operation is then formula (9):
A ∘ B = (A Θ B) ⊕ B (9)
the closing operation eliminates tiny noise holes, smooths the lane line contour, and fills breaks and gaps; it is computed as formula (10):
A • B = (A ⊕ B) Θ B (10)
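Formulas (7) to (10) can be implemented set-theoretically: treat the binary image A as a set of (row, col) foreground coordinates and the structuring element B as a set of offsets centred at (0, 0). This is a didactic sketch; a real pipeline would use an image library's morphology routines.

```python
def dilate(a, b):
    """A ⊕ B: every translate of A by an offset of B (formula (8))."""
    return {(i + di, j + dj) for (i, j) in a for (di, dj) in b}

def erode(a, b):
    """A Θ B: points whose whole B-neighbourhood lies inside A (formula (7))."""
    return {(i, j) for (i, j) in a
            if all((i + di, j + dj) in a for (di, dj) in b)}

def opening(a, b):
    """(A Θ B) ⊕ B, formula (9): removes burrs and isolated points."""
    return dilate(erode(a, b), b)

def closing(a, b):
    """(A ⊕ B) Θ B, formula (10): fills small holes and gaps."""
    return erode(dilate(a, b), b)
```

Opening a 5×5 solid block plus one isolated stray pixel with a 3×3 cross element deletes the stray pixel while keeping the block's interior.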
taking the optimal binary image as input, Canny edge detection is performed; the process mainly comprises: Gaussian filtering of each pixel (x, y) with the Gaussian kernel G(x, y) = (1/(2πσ²)) exp(-(x² + y²)/(2σ²)) to remove image noise; computing, with the Sobel operators S_x and S_y, the gradient magnitude G = sqrt(G_x² + G_y²) and direction θ = arctan(G_y/G_x) of every pixel of the Gaussian-filtered image I; non-maximum suppression according to each pixel's gradient direction and magnitude, removing pixels unlikely to form edges; and double thresholding, which keeps strong and weak edges, then examines their connectivity and retains only the weak edges connected to strong edges;
Hough line transformation is performed on the edge-detected image, fitting several candidate lines close to the actual lane line; an angle limit θ is set, tan θ is taken as the slope limit, and only lines with slope between -tan θ and tan θ are retained;
the midpoint ordinates [y_1, y_2, ..., y_a] of the candidate lines parallel to the actual lane line are compared to find y_max = max(y_1, y_2, ..., y_a); the line corresponding to y_max is the lowest line in the ROI, and it is taken as the line that best matches the actual lane line and as the line-inspection reference for the quadruped mobile robot;
the slope k and midpoint pixel coordinates (x, y) of the best-matching line at the bottom of the view are computed. When k ≤ -0.016 the robot adjusts with a left turn, and when k ≥ 0.016 with a right turn; with the left camera view, the robot translates right when y ≤ 400 px and left when y ≥ 480 px, while with the right camera view it translates left when y ≤ 400 px and right when y ≥ 480 px. This keeps the mobile robot parallel to the lane line at roughly 0.3 m from it, and a line-inspection adjustment ROS topic is published.
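The final selection and pose-adjustment step can be sketched as follows. Candidate lines are (slope, midpoint) pairs as produced by the Hough stage; the slope and pixel thresholds are the ones quoted above, while the function names and command strings are illustrative.

```python
def pick_line(lines):
    """Keep the line whose midpoint ordinate y is largest (lowest in the ROI)."""
    return max(lines, key=lambda ln: ln[1][1])

def adjust(k, y, side):
    """Pose command for the best-fit lane line; side is 'left' or 'right' camera view."""
    if k <= -0.016:
        return "turn_left"
    if k >= 0.016:
        return "turn_right"
    # Translation direction depends on which side's camera is in use.
    low_cmd, high_cmd = (("translate_right", "translate_left") if side == "left"
                         else ("translate_left", "translate_right"))
    if y <= 400:
        return low_cmd         # midpoint too high in the image
    if y >= 480:
        return high_cmd        # midpoint too low in the image
    return "keep_course"       # parallel, about 0.3 m from the lane line
```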
The invention also discloses a mobile robot inspection platform under the complex indoor scene, which comprises a laser radar, an upper computer, a lower computer, a left industrial camera, a right industrial camera, an ultrasonic sensor, a quadruped mobile robot body, a joint motor, a USB expansion dock, a power supply, a head industrial camera and a remote monitoring terminal;
a laser radar is placed at the front end of a vertical cut surface of the central axis of the surface of the rectangular body of the quadruped mobile robot body, a left industrial camera, a right industrial camera and a head industrial camera are respectively placed in the head space of the quadruped mobile robot body on the left side and the right side of a square frame above the rectangular body, the central axis of the square frame coincides with the central axis of the quadruped mobile robot body, and an ultrasonic sensor is placed at the tail of the rectangular body of the quadruped mobile robot body;
an upper computer and a lower computer are arranged in the quadruped mobile robot body, the upper computer is connected with the lower computer through a network cable, and the upper computer comprises a WiFi module;
the remote monitoring terminal is in wireless connection with the upper computer, and the four-foot mobile robot is controlled by adopting the mobile robot inspection and remote monitoring method under the complex indoor scene.
The beneficial effects are that:
the mobile robot patrol and remote monitoring method in the complex indoor scene can enable the quadruped mobile robot to complete patrol and inspection tasks in the complex indoor environment under the condition that the encoder does not output mileage data.
Drawings
FIG. 1 is a schematic frame diagram of a mobile robot inspection and remote monitoring method in a complex indoor scene according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a four-foot mobile robot according to an embodiment of the present invention;
fig. 3 is a flowchart of a mobile robot inspection and remote monitoring method in a complex indoor scene according to an embodiment of the invention.
Detailed Description
The following examples will provide those skilled in the art with a more complete understanding of the invention, but are not intended to limit the invention in any way.
Referring to fig. 3, the invention discloses a mobile robot inspection and remote monitoring method in a complex indoor scene, which comprises the following steps:
the method comprises the steps that a laser radar, an industrial camera and an ultrasonic sensor are additionally arranged on a quadruped mobile robot body, the quadruped mobile robot body is connected to an upper computer through a USB (universal serial bus) docking station, the upper computer receives a control instruction and a task instruction issued by a remote control terminal, the quadruped mobile robot is switched into a line inspection mode, an ROS (remote operation system) robot operation system is utilized to process information of each sensor to generate a motion instruction of the quadruped mobile robot, the motion instruction is sent to a lower computer through a UDP (user datagram protocol), and the lower computer controls the quadruped mobile robot to move according to the motion instruction, so that the quadruped mobile robot moves along a set line inspection route;
in the motion process of the four-foot robot, the upper computer receives lane line images of two sides of the four-foot mobile robot sent by industrial cameras positioned at two sides of the four-foot mobile robot in real time, processes the lane line images, fits to obtain a straight line which is most attached to an actual lane line, and takes the straight line as a routing inspection route of the four-foot mobile robot; meanwhile, the upper computer detects an obstacle or a wall positioned in front by adopting a laser radar positioned in the center of the head of the four-foot mobile robot, and the four-foot mobile robot is switched to an obstacle avoidance mode or a turning mode according to a detection result;
when the inspection task is completed and the quadruped mobile robot moves to the origin warehouse area, switching the quadruped mobile robot into a warehouse-in mode, detecting the distance between the tail of the quadruped mobile robot and a baffle at the bottom of the origin warehouse area by adopting an ultrasonic sensor positioned at the tail of the quadruped mobile robot, controlling the quadruped mobile robot to retreat until the distance between the tail of the quadruped mobile robot and the baffle at the bottom of the origin warehouse area is smaller than a first preset warehouse-in distance threshold value, detecting the distance between the quadruped robot and the side wall of the origin warehouse area by adopting a laser radar, and controlling the quadruped mobile robot to translate towards the side wall until the distance between the quadruped robot and the side wall of the origin warehouse area is smaller than a second preset warehouse-in distance threshold value.
As shown in fig. 1 and fig. 2, the invention also discloses a mobile robot inspection platform in a complex indoor scene, which comprises a laser radar, an upper computer, a lower computer, a left industrial camera, a right industrial camera, an ultrasonic sensor, a quadruped mobile robot body, a joint motor, a USB expansion dock, a power supply, a head industrial camera and a remote monitoring terminal;
a laser radar is placed at the front end of a vertical cut surface of the central axis of the surface of the rectangular body of the quadruped mobile robot body, a left industrial camera, a right industrial camera and a head industrial camera are respectively placed in the head space of the quadruped mobile robot body on the left side and the right side of a square frame above the rectangular body, the central axis of the square frame coincides with the central axis of the quadruped mobile robot body, and an ultrasonic sensor is placed at the tail of the rectangular body of the quadruped mobile robot body;
an upper computer and a lower computer are arranged in the quadruped mobile robot body, the upper computer is connected with the lower computer through a network cable, and the upper computer comprises a WiFi module;
the remote monitoring terminal is in wireless connection with the upper computer, and the four-foot mobile robot is controlled by adopting the mobile robot inspection and remote monitoring method under the complex indoor scene.
The quadruped robot body 7 comprises a rectangular body and four limbs. The rectangular body is a layered transverse bracket structure housing the upper computer 2, the lower computer 3, and the power supply 10. The laser radar 1 is mounted on the front section of the central axis of the bracket's upper layer and connected to the USB docking station 9 via a serial port; a square frame sits in the middle of the upper layer, and the ultrasonic sensor 6 is mounted at the tail, also connected to the USB docking station 9 via a serial port. The left industrial camera 4 and right industrial camera 5 are mounted at the midpoints of the uppermost crossbeams on the left and right sides of the square frame, parallel to the frame's central axis, and connected to the USB docking station 9, which is placed inside the square frame to provide space for cable routing; the USB docking station 9 connects to the upper computer 2 over USB. Two joint motors 8 are fixed on the left of the front end of the rectangular body, with two identical joint motors 8 arranged symmetrically on the right of the front end; likewise, two joint motors 8 are fixed on the left of the rear end, with two identical joint motors 8 arranged symmetrically on the right of the rear end. Each outermost joint motor 8 connects to one of the four leg structures, and the four legs together with the rectangular body form the quadruped robot body 7.
The laser radar 1 has a range of 0.2 m to 16 m and scans 360 degrees around the body. The left industrial camera 4, right industrial camera 5, and head industrial camera are all distortion-free with a 100-degree field of view, and their angle to the ground is adjustable. The ultrasonic sensor 6 has a detection range of 0.03 m to 5 m and performs straight-line measurement. The laser radar 1 and ultrasonic sensor 6 acquire distance information; the laser radar coordinate system takes the mobile robot's central axis, pointing straight ahead, as the polar axis. During inspection, the laser radar 1 detects obstacles within 0.2 m to 1 m in front of the mobile robot; it also measures the distance to the front wall and transmits the readings to the upper computer 2 over the serial port, and this distance serves as the reference for turning decisions. When the site requires wall-following motion, the laser radar measures the distance between the robot's side and the wall and transmits it to the upper computer 2 over the serial port; the differences among the distance readings in the three sampled angle ranges are used to judge whether the robot is parallel to the wall, and an adjustment is made if it is not. The ultrasonic sensor 6 measures the straight-line distance from the robot's tail to the wall and baffle, used when the robot returns to the origin. The left industrial camera 4 and right industrial camera 5, fixed on the robot's two sides, detect the lane lines on both sides of the road.
The upper and lower computers are microcomputers capable of running Ubuntu. The lower computer 3 receives information from the upper computer 2, processes it, judges the type of inspection task, and sends action instructions to the joint motors 8; it also sends start/stop instructions for the laser radar 1, ultrasonic sensor 6, left industrial camera 4, right industrial camera 5, and other sensors, together with inspection-task-completion instructions, to the upper computer 2 via UDP. The upper computer 2 acquires the detection data of the laser radar 1, ultrasonic sensor 6, left industrial camera 4, and right industrial camera 5, and after processing sends task and control instructions to the lower computer 3; the upper computer 2 also communicates with the remote monitoring terminal.
The mobile robot inspection and remote monitoring method realizes remote communication of the intelligent mobile platform based on the ROS operating system, specifically: Ubuntu is installed on the upper and lower computers and the ROS environment is configured. The upper computer installs the following function packages under ROS: the publish/subscribe service package, the radar driver package, the serial communication package, the Flask web framework, and mjpg-streamer. Under the upper computer's ROS environment, a launch file runs the following: the web framework program, the upper/lower computer UDP communication program, the laser radar ranging program, the line-inspection program, the warehouse-in program, and mjpg-streamer. The upper computer connects to the local area network to determine its IP address and port, and its connections to the lower computer and the remote terminal are configured; the lower computer likewise connects to the LAN, determines its IP address and port, and configures its connection to the upper computer. The mobile robot motion program is written on the lower computer and compiled to an executable, and a startup script is written and set to run at boot, so these programs start automatically once the upper and lower computers power on. The remote terminal connects to the LAN, opens a browser, and enters the address configured on the upper computer to open the remote control terminal web page. On that page, a task mode can be selected: the mobile robot patrols along the lane line or wall surface on the preset route, performs inspection on reaching the designated place, reports task completion to the remote terminal web page through the upper computer, and returns to the origin warehouse to stand by. A control mode can also be selected on the web page, which shows real-time video, a virtual keyboard, and the running state; virtual keys drive the mobile robot forward, backward, in left/right translation, and in left/right turns, and a window shows the current real-time image in front of the robot.
The web application program starts the task framework, configures the task interface, receives task requests sent by the remote terminal, and passes them to the UDP program in the form of ROS topics. When the laser radar ranging program runs, the upper computer receives laser radar distance information during the motion of the mobile robot, makes judgments on obstacle detection and wall detection, generates stop, left-turn and right-turn control instructions, and passes them to the UDP sending program as ROS topics. In the line patrol program, the upper computer receives detection information from the left industrial camera 4 and the right industrial camera 5 during the motion of the mobile robot, generates translate-left, translate-right, turn-left and turn-right control instructions according to the detected lane-line conditions, judges whether to switch to the wall-following motion state according to whether a lane line exists, and passes the information to the UDP sending program in the form of ROS topics.
When the warehouse-entry program runs, the upper computer receives detection information from the ultrasonic sensor 6 during the motion of the mobile robot; when the mobile robot returns to the origin to perform the warehouse-entry action, a backward or forward control instruction is generated according to the detected distance from the tail of the mobile robot to the baffle in the origin warehouse area, and passed to the UDP sending program in the form of an ROS topic. The upper computer receives these ROS topics, generates UDP data frames from the tasks and control instruction sets described in S31-S34, sends them to the lower computer via UDP, and simultaneously receives the data frames sent by the lower computer via UDP. The head industrial camera 11 is invoked for remote video transmission within the local area network through mjpg-streamer.
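The patent does not disclose the actual UDP frame layout exchanged between the upper and lower computers, so the following loopback sketch only illustrates the general pattern: the upper computer packs an instruction into a datagram and the lower computer unpacks it. The one-byte task id and ASCII command string are assumptions for illustration.

```python
import socket

def make_frame(task_id: int, command: str) -> bytes:
    # Hypothetical frame layout (1 task-id byte + ASCII command);
    # the patent does not disclose the real UDP frame format.
    return bytes([task_id]) + command.encode("ascii")

def parse_frame(frame: bytes):
    # Inverse of make_frame: split the task id from the command text.
    return frame[0], frame[1:].decode("ascii")

# Loopback demo standing in for the upper-computer -> lower-computer link.
lower = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
lower.bind(("127.0.0.1", 0))   # "lower computer" listens on an ephemeral port
lower.settimeout(2.0)
upper = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

upper.sendto(make_frame(2, "TURN_LEFT"), lower.getsockname())
task_id, command = parse_frame(lower.recv(64))
upper.close()
lower.close()
```

In the described system the sender would run inside an ROS node that subscribes to the instruction topics and forwards each message as one datagram.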
The laser radar detects the distance between the front end of the mobile robot and an obstacle; when, during motion, an object appears in the region with detection-angle range 140°-220° and distance range 0.2-1 m, the program makes a stop judgment and publishes an ROS topic. The laser radar detects the distance between the front end of the mobile robot and a wall over the detection-angle range 178°-182°, with the distance range adjusted according to the patrol-site environment; when the mobile robot, working in task mode, moves to a turning position on the route, the distance to the wall ahead is measured by the laser radar detection program, and when it reaches the set turning-point distance a turning instruction is generated and an ROS topic is published. The laser radar also detects the distance between the right side of the mobile robot and the wall; when the mobile robot needs to move along the wall in task mode, adjustment instructions are published as ROS topics.
Specifically, the laser radar adopted by the invention is a two-dimensional laser radar. The information obtained by a laser radar scan comprises a one-dimensional distance array [d_0, d_1, d_2, …, d_359], where element d_i is the distance from the laser radar to an obstacle at scanning angle i; the changes in the d_i values provide the distances required from the radar.
The laser radar detection program realizes obstacle avoidance, turning point judgment and wall detection.
The laser radar obstacle avoidance method comprises the following steps: during the motion of the mobile robot, the sector [d_141, d_142, d_143, …, d_220] is intercepted from the 360 frames of distance data detected by the laser radar; when 0.2 ≤ min[d_141, d_142, d_143, …, d_220] ≤ 1, a stop instruction is generated and an obstacle-avoidance ROS topic is published;
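The obstacle-avoidance test above reduces to taking the minimum over an 80-degree front sector of the scan array and checking it against the 0.2-1 m band. A minimal sketch, assuming the scan is a plain 360-element list of distances in metres:

```python
def obstacle_stop(scan, lo=0.2, hi=1.0):
    """Return True when the lidar scan requires a stop.

    scan: list of 360 distances [d_0 .. d_359], one per degree.
    The front sector d_141..d_220 is checked; a stop is issued when
    the nearest return in that sector lies between lo and hi metres
    (the lower bound follows the patent's condition and, presumably,
    filters out spurious near-zero returns).
    """
    front = scan[141:221]          # d_141 .. d_220 inclusive
    return lo <= min(front) <= hi

# A scan that is clear everywhere except one return 0.5 m ahead.
scan = [5.0] * 360
scan[180] = 0.5
```

With this scan, `obstacle_stop(scan)` is true and the node would publish the stop instruction as an ROS topic.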
the laser radar turning-point judging method comprises the following steps: whether the quadruped mobile robot has reached the vicinity of a turning node is estimated from elapsed time; if so, [d_178, d_179, d_180, d_181, d_182] is intercepted from the 360 frames of distance data detected by the laser radar at that moment, and the actual distance d_x between the turning node and the adjacent wall is determined in advance; when min[d_178, d_179, d_180, d_181, d_182] ≤ d_x, a 90° turning instruction is generated, the turning direction is set according to site requirements, and a turning ROS topic is published;
the laser radar wall detection method comprises the following steps: in a lane-line-free area, the robot switches to the wall-following motion mode; the distances between the right side surface of the quadruped mobile robot and the wall, [d_60, d_61, d_62, d_63] and [d_119, d_120, d_121, d_122], are intercepted from the 360 frames of distance data detected by the laser radar, the averages D_1 = (d_60 + d_61 + d_62 + d_63)/4 and D_2 = (d_119 + d_120 + d_121 + d_122)/4 are calculated together with the difference Δd = D_1 − D_2, and the adjustment speed is obtained from formula (1).
In formula (1), v_r is the actual adjustment speed and v_0 is the preset adjustment speed; a positive v_0 indicates a left-turn adjustment and a negative v_0 a right-turn adjustment. The wall-following ROS topic is published, keeping the quadruped mobile robot parallel to the wall surface.
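The exact form of formula (1) mapping Δd to the adjustment speed v_r is not reproduced in this text, so the sketch below assumes a simple proportional law saturated at the preset speed v0; the gain k and the sign convention (positive output = left-turn adjustment) are assumptions, not the patent's disclosed formula.

```python
def wall_adjust_speed(scan, v0=0.1, k=0.5):
    """Turn-rate adjustment keeping the robot parallel to the right wall.

    scan: 360-element lidar distance array. D1 averages d_60..d_63 and
    D2 averages d_119..d_122 (two patches of the right-hand wall);
    their difference is zero when the robot is parallel to the wall.
    The Δd -> v_r mapping is an assumed proportional law clipped to
    the preset speed v0 (formula (1) itself is not given here).
    """
    d1 = sum(scan[60:64]) / 4.0      # front-right patch average
    d2 = sum(scan[119:123]) / 4.0    # rear-right patch average
    delta = d1 - d2
    v_r = k * delta
    return max(-v0, min(v0, v_r))    # positive = left turn, negative = right
```

When the front of the robot drifts away from the wall (D_1 > D_2), the positive output commands a left-turn correction back toward parallel.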
When the mobile robot needs to move along the lane line in task mode, the left industrial camera 4 view or the right industrial camera 5 view is selected according to the relative position of the lane line and the mobile robot, ensuring that the mobile robot stays parallel to the lane line at a spacing of about 0.3 m.
In task mode, if lane lines exist, the left industrial camera 4 view or the right industrial camera 5 view is preset for each section of the route according to the relative positions of the lane lines and the mobile robot;
the RGB image acquired by the industrial camera is divided into regions, the visual-field interference from the quadruped robot body 7 is masked out, and a region of interest (ROI) is intercepted;
the cut-out ROI area is a three-channel RGB image, which is used as an input image, R, G, B respectively represents pixel values of three channels, and corresponding H, S, V-channel pixel values are calculated according to formulas (2) to (4), so as to obtain an HSV image:
V=C max (4)
wherein,C max =max(R′,G′,B′);C min =min(R′,G′,B′);Δ=C max -C min
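Formulas (2)-(4) can be sketched as a per-pixel conversion; this is the standard RGB-to-HSV transform consistent with V = C_max, returning H in degrees and S, V in [0, 1]:

```python
def rgb_to_hsv(r, g, b):
    """Convert one 8-bit RGB pixel to HSV per formulas (2)-(4).

    Returns (H, S, V) with H in degrees [0, 360) and S, V in [0, 1];
    this is the standard conversion consistent with V = C_max.
    """
    rp, gp, bp = r / 255.0, g / 255.0, b / 255.0   # R', G', B'
    c_max, c_min = max(rp, gp, bp), min(rp, gp, bp)
    delta = c_max - c_min
    if delta == 0:
        h = 0.0                                    # hue undefined on greys
    elif c_max == rp:
        h = 60.0 * (((gp - bp) / delta) % 6)
    elif c_max == gp:
        h = 60.0 * ((bp - rp) / delta + 2)
    else:
        h = 60.0 * ((rp - gp) / delta + 4)
    s = 0.0 if c_max == 0 else delta / c_max       # formula (3)
    return h, s, c_max                             # V = C_max, formula (4)
```

In practice the same conversion is applied to every pixel of the ROI (e.g. with OpenCV's `cvtColor`, which scales H to [0, 180] for 8-bit images).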
binarization is performed on the HSV image according to formula (5), outputting the original binary image:

dst(I) = 255 if lowerb(I) ≤ src(I) ≤ upperb(I) for every channel, otherwise 0 (5)

wherein dst(I) is the output pixel value, lowerb(I) is the lower threshold of the corresponding channel, upperb(I) is the upper threshold of the corresponding channel, and src(I) is the pixel value of the corresponding channel of the input image; when the pixel value of every channel lies within the set upper and lower thresholds, dst(I) is 255, otherwise dst(I) is 0;
several groups of ground lane-line HSV color-space thresholds are extracted according to on-site lane-line wear, contamination and similar conditions, and fused according to formula (6) to generate a mask; color filtering with the original binary image as input yields the fused binary image and hence the approximate lane-line region:

dst = mask_1 | mask_2 | … | mask_i (6)

wherein dst is the fused binary image, mask_i is the mask produced under each color range, and | denotes bitwise OR;
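Formulas (5) and (6) together amount to a per-pixel range test followed by a bitwise OR over several HSV ranges; the sketch below shows both for a single pixel (the concrete threshold values are illustrative assumptions, not the patent's calibration):

```python
def in_range(pixel, lowerb, upperb):
    """Formula (5): 255 when every channel of the HSV pixel lies
    inside [lowerb, upperb], else 0 (like OpenCV's inRange)."""
    ok = all(lo <= p <= hi for p, lo, hi in zip(pixel, lowerb, upperb))
    return 255 if ok else 0

def fuse_masks(pixel, ranges):
    """Formula (6): bitwise-OR the masks produced by several HSV
    ranges (e.g. for worn, stained and clean patches of lane line)."""
    dst = 0
    for lowerb, upperb in ranges:
        dst |= in_range(pixel, lowerb, upperb)
    return dst

# Illustrative (assumed) HSV ranges for a yellow lane line,
# one for clean paint and one for a faded patch.
lane_ranges = [((20, 100, 100), (40, 255, 255)),
               ((15, 30, 180), (45, 90, 255))]
```

A pixel matching any one range survives the color filter, which is what makes the fusion robust to wear and contamination.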
opening and closing operations are performed on the fused binary image to remove the interference of fine contamination and wear and obtain the complete lane-line area;
the open operation is carried out on the image firstly to corrode and separate the adhered fine parts, remove burrs and isolated points, and then expand to maintain the original shape of the area. When A is the initial image and B is the m×n structural element, the corrosion and expansion operation processes are as shown in formulas (7) and (8):
where z denotes the translation of B over A,representing the expansion operator formula, Θ represents the corrosion operator, then the open operation is as in formula (9):
and eliminating tiny noise holes through closed operation, smoothing the contour of the lane line, and complementing the fracture and the gap. The closed-loop calculation process is shown in formula (10):
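Formulas (7)-(10) can be sketched directly on small binary arrays; this naive NumPy implementation (zero padding at the borders, symmetric structuring element) is for illustration only, since a real pipeline would use a library routine such as OpenCV's `morphologyEx`:

```python
import numpy as np

def dilate(A, B):
    """Formula (8): output pixel is 1 where B, centred there,
    hits at least one 1 in A."""
    h, w = A.shape
    m, n = B.shape
    mh, nh = m // 2, n // 2
    pad = np.pad(A, ((mh, mh), (nh, nh)))
    out = np.zeros_like(A)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.any(pad[i:i + m, j:j + n] & B)
    return out

def erode(A, B):
    """Formula (7): output pixel is 1 where B, centred there,
    fits entirely inside the 1-region of A."""
    h, w = A.shape
    m, n = B.shape
    mh, nh = m // 2, n // 2
    pad = np.pad(A, ((mh, mh), (nh, nh)))
    out = np.zeros_like(A)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.all(pad[i:i + m, j:j + n][B == 1])
    return out

def opening(A, B):          # formula (9): erode, then dilate
    return dilate(erode(A, B), B)

def closing(A, B):          # formula (10): dilate, then erode
    return erode(dilate(A, B), B)
```

Opening with a 3×3 element deletes an isolated noise pixel, while closing fills a 1-pixel hole inside a lane-line blob, which is exactly the cleanup described above.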
taking the optimal binary image as input, Canny edge detection is performed; the main process is: Gaussian filtering of the pixels (x, y) with the Gaussian kernel G(x, y) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²)) to remove image noise; using the Sobel operators S_x and S_y to compute, for each pixel of the Gaussian-filtered image I, the gradient magnitude G = √(G_x² + G_y²) and direction θ = arctan(G_y/G_x); non-maximum suppression according to the gradient direction and magnitude of each pixel, removing pixels unlikely to form edges; setting a double threshold, retaining strong edges and weak edges, judging their connectivity, and retaining the weak edges connected to strong edges;
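The Sobel gradient step of the Canny pipeline can be sketched as follows; this computes only the magnitude/direction stage for interior pixels (non-maximum suppression and double thresholding are omitted, and a real pipeline would simply call a library's Canny routine):

```python
import numpy as np

# Sobel operators: SX responds to horizontal change, SY to vertical.
SX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]])
SY = SX.T

def sobel_gradient(img):
    """Return per-pixel gradient magnitude and direction (radians)
    for the interior of a 2-D image, as used before non-maximum
    suppression and double thresholding in Canny."""
    h, w = img.shape
    mag = np.zeros((h, w))
    ang = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            gx = np.sum(SX * win)          # G_x
            gy = np.sum(SY * win)          # G_y
            mag[i, j] = np.hypot(gx, gy)   # sqrt(G_x^2 + G_y^2)
            ang[i, j] = np.arctan2(gy, gx) # gradient direction
    return mag, ang

# A vertical step edge: left half dark, right half bright.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
mag, ang = sobel_gradient(img)
```

On this step edge the gradient magnitude peaks along the boundary column and the direction is horizontal, which is what lets non-maximum suppression thin the edge to one pixel.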
Hough line transformation is performed on the edge-detected image, fitting several groups of straight lines suspected to fit the actual lane line; an included-angle limit θ is set, tan θ is taken as the slope limit, and only the lines with slope between −tan θ and tan θ are retained;
from the midpoint ordinates [y_1, y_2, …, y_i] of the several groups of straight lines parallel to the actual lane line, y_max = max(y_1, y_2, …, y_i) is found; the straight line corresponding to y_max is the lowest line in the ROI and the line most closely fitting the actual lane line, and it is taken as the line-patrol reference of the quadruped mobile robot;
the slope k and midpoint pixel coordinates (x, y) of the straight line most closely fitting the actual lane line at the bottom of the field of view are calculated: when k ≤ −0.016 a left-turn adjustment is made, and when k ≥ 0.016 a right-turn adjustment; when the left camera view is used, the robot translates right when y ≤ 400 px and left when y ≥ 480 px; when the right camera view is used, it translates left when y ≤ 400 px and right when y ≥ 480 px. This keeps the mobile robot parallel to the lane line at a distance of about 0.3 m, and the line-patrol adjustment ROS topics are published.
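The line selection and the threshold-based control decision above can be sketched as two small functions; the candidate-line representation (slope plus midpoint) and the command names are assumptions for illustration, while the numeric thresholds are the ones stated in the text:

```python
import math

def pick_reference_line(lines, theta_deg=45.0):
    """From Hough candidates [(slope k, (x, y) midpoint), ...] keep
    those with |k| <= tan(theta) and return the one with the largest
    midpoint ordinate y, i.e. the lowest line in the ROI."""
    limit = math.tan(math.radians(theta_deg))
    kept = [l for l in lines if -limit <= l[0] <= limit]
    return max(kept, key=lambda l: l[1][1]) if kept else None

def patrol_command(k, y, camera="left"):
    """Adjustment decision from the stated thresholds: |k| >= 0.016
    triggers a rotation; otherwise the midpoint ordinate y drives a
    translation whose direction depends on which camera is in use."""
    if k <= -0.016:
        return "turn_left"
    if k >= 0.016:
        return "turn_right"
    near, far = (("translate_right", "translate_left") if camera == "left"
                 else ("translate_left", "translate_right"))
    if y <= 400:
        return near
    if y >= 480:
        return far
    return "hold"
```

The 400-480 px dead band keeps the robot from oscillating when the lane line sits roughly where it should in the camera image.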
When a lane line exists, the upper computer processes the on-site lane-line video images collected by the left and right industrial cameras; image processing effectively overcomes the interference of lane-line wear and contamination, a straight line fitting the lane line is obtained, and the control strategy judges and outputs adjustment instructions that keep the quadruped mobile robot parallel to the lane line. When no lane line exists, the laser radar detects the side wall and the control strategy judges and outputs adjustment instructions to keep the quadruped mobile robot parallel to the wall. The laser radar acquires the distance to the obstacle ahead for obstacle-avoidance judgment, and detects the distance between a turning node and the wall to judge the turning action. When the task is about to be completed, the warehouse-entry posture is adjusted by the laser radar and the ultrasonic sensor, and the warehouse-entry action is completed. All turning nodes, task-designated points, line-patrol areas and wall-following areas are predetermined, and the lane-line-free areas comprise: the line-free section areas set by the user, and areas where the line patrol program detects no lane line. For example, during long straight motion, when no lane line is detected in 20 consecutive frames of the industrial camera's field of view, the state is switched and the robot moves for 20 s using the wall-following adjustment method; during this period, if the lane line is detected again the robot switches back to line-patrol mode, and if no lane line has been detected after 20 s, motion stops and a task-termination instruction is sent to the remote terminal.
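The mode switching described above (20 lost frames to enter wall-following, a 20 s budget to reacquire the lane line, otherwise task termination) is a small state machine; in this sketch time is passed in explicitly so the logic is testable, and the mode names are illustrative:

```python
class PatrolStateMachine:
    """Line-patrol / wall-following switching as described: after 20
    consecutive camera frames with no lane line the robot follows the
    wall; seeing a lane line again switches back; if none reappears
    within the 20 s budget the task is terminated."""

    LOST_FRAMES = 20
    WALL_TIMEOUT_S = 20.0

    def __init__(self):
        self.mode = "line_patrol"
        self.missed = 0          # consecutive frames without a lane line
        self.wall_since = None   # time wall-following started

    def update(self, lane_visible, now):
        if self.mode == "line_patrol":
            self.missed = 0 if lane_visible else self.missed + 1
            if self.missed >= self.LOST_FRAMES:
                self.mode, self.wall_since = "wall_follow", now
        elif self.mode == "wall_follow":
            if lane_visible:
                self.mode, self.missed = "line_patrol", 0
            elif now - self.wall_since >= self.WALL_TIMEOUT_S:
                self.mode = "terminated"   # report to the remote terminal
        return self.mode
```

Feeding it one camera frame per tick reproduces the behaviour of the example: switch, recover, or terminate.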
The warehouse entry program features include:
the time when the quadruped mobile robot is about to complete the inspection task and move near the origin warehouse area is predicted; deceleration then begins, and the laser radar performs turning judgment by the turning-point judgment method of claim 9;
after turning, the quadruped mobile robot is controlled to back up and the ultrasonic sensor is started to detect the distance d_y between the tail and the baffle at the bottom of the warehouse; when d_y ≤ 168 mm, backing stops and rightward translation begins;
during the rightward translation, the robot is kept parallel to the right wall surface by the wall-following adjustment method of claim 9; [d_89, d_90] is taken from the laser radar 360-frame distance data and the distance d_z between the quadruped mobile robot and the right side wall inside the warehouse is preset; when (d_89 + d_90)/2 ≤ d_z, the quadruped mobile robot stops moving right, stands for 2 seconds, then lies down and waits for the next task instruction.
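The back-up-then-translate docking sequence can be sketched as a per-tick controller; the state and command names are illustrative, the 168 mm tail threshold is the one stated above, and the side-distance condition is assumed to be the mean of d_89 and d_90 against a site-dependent preset d_z (0.3 m here is only a placeholder):

```python
def docking_step(state, d_tail_mm, d_side_mean, d_z=0.3):
    """One control tick of the warehouse-entry sequence: back up until
    the ultrasonic tail distance d_y <= 168 mm, then translate right
    until the lidar side distance (mean of d_89, d_90) <= d_z, then
    stop, stand 2 s and lie down.  Returns (next_state, command)."""
    if state == "backing":
        return (("backing", "move_backward") if d_tail_mm > 168
                else ("translating", "translate_right"))
    if state == "translating":
        return (("translating", "translate_right") if d_side_mean > d_z
                else ("parked", "stop_then_lie_down"))
    return ("parked", "idle")
```

The caller would run this in a loop, feeding in fresh ultrasonic and lidar readings until the robot reaches the parked state.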
The above is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above examples, and all technical solutions belonging to the concept of the present invention belong to the protection scope of the present invention. It should be noted that modifications and adaptations to the invention without departing from the principles thereof are intended to be within the scope of the invention as set forth in the following claims.

Claims (10)

1. The mobile robot inspection and remote monitoring method under the complex indoor scene is characterized by comprising the following steps of:
the method comprises the steps that a laser radar, industrial cameras and an ultrasonic sensor are mounted on a quadruped mobile robot body; the quadruped mobile robot body is connected to an upper computer through a USB docking station; the upper computer receives control instructions and task instructions issued by a remote control terminal, switches the quadruped mobile robot into the line patrol mode, uses the ROS robot operating system to process the information of each sensor to generate motion instructions for the quadruped mobile robot, and sends the motion instructions to a lower computer through the UDP protocol; the lower computer controls the quadruped mobile robot to move according to the motion instructions, so that the quadruped mobile robot moves along the set line-patrol route;
during the motion of the quadruped robot, the upper computer receives in real time the lane-line images of the two sides of the quadruped mobile robot sent by the industrial cameras located on its two sides, processes the lane-line images, fits the straight line most closely matching the actual lane line, and takes it as the inspection route of the quadruped mobile robot; meanwhile, the upper computer detects obstacles or walls ahead using the laser radar located at the center of the head of the quadruped mobile robot, and switches the quadruped mobile robot to the obstacle-avoidance mode or turning mode according to the detection result;
when the inspection task is completed and the quadruped mobile robot moves to the origin warehouse area, switching the quadruped mobile robot into a warehouse-in mode, detecting the distance between the tail of the quadruped mobile robot and a baffle at the bottom of the origin warehouse area by adopting an ultrasonic sensor positioned at the tail of the quadruped mobile robot, controlling the quadruped mobile robot to retreat until the distance between the tail of the quadruped mobile robot and the baffle at the bottom of the origin warehouse area is smaller than a first preset warehouse-in distance threshold value, detecting the distance between the quadruped robot and the side wall of the origin warehouse area by adopting a laser radar, and controlling the quadruped mobile robot to translate towards the side wall until the distance between the quadruped robot and the side wall of the origin warehouse area is smaller than a second preset warehouse-in distance threshold value.
2. The method for mobile robot inspection and remote monitoring in a complex indoor scene according to claim 1, wherein the lidar is a two-dimensional lidar; the information obtained by a laser radar scan comprises a one-dimensional distance array [d_0, d_1, d_2, …, d_359], wherein element d_i in the array is the distance from the laser radar to the obstacle and i represents the angle of the radar scan.
3. The method for inspecting and remotely monitoring the mobile robot in the complex indoor scene according to claim 2, wherein the process of using the laser radar positioned in the center of the head of the quadruped mobile robot to detect the obstacle or the wall positioned in front and switching the quadruped mobile robot to the obstacle avoidance mode or the turning mode according to the detection result comprises the following steps:
during the motion of the mobile robot, [d_141, d_142, d_143, …, d_220] is intercepted from the 360 frames of distance data detected by the laser radar; when 0.2 ≤ min[d_141, d_142, d_143, …, d_220] ≤ 1, a stop instruction is generated and an obstacle-avoidance ROS topic is published;
whether the quadruped mobile robot has reached the vicinity of a turning node is estimated from elapsed time; if so, [d_178, d_179, d_180, d_181, d_182] is intercepted from the 360 frames of distance data detected by the laser radar, and the distance d_x between the turning node and the adjacent wall is determined in advance; when min[d_178, d_179, d_180, d_181, d_182] ≤ d_x, a 90° turning instruction is generated and a turning ROS topic is published.
4. The method for inspecting and remotely monitoring a mobile robot in a complex indoor scene according to claim 2, wherein the process of detecting the distance between the quadruped robot and the side wall of the origin warehouse area by using the laser radar, and controlling the quadruped mobile robot to translate toward the side wall until that distance is smaller than a second preset warehouse-entry distance threshold, comprises:
taking [d_89, d_90] from the laser radar 360-frame distance data and presetting the distance d_z between the quadruped mobile robot and the right side wall inside the warehouse; when (d_89 + d_90)/2 ≤ d_z, the quadruped mobile robot stops moving right, stands for 2 seconds, then lies down and waits for the next task instruction.
5. The method for mobile robot inspection and remote monitoring in a complex indoor scene according to claim 1, further comprising the steps of:
in the motion process of the quadruped robot, if the upper computer cannot be fitted to obtain a straight line which is most attached to an actual lane line, the quadruped mobile robot is switched to a wall-following motion mode, the laser radar is adopted to acquire distance information of the quadruped mobile robot, a front wall and a side wall, whether turning is carried out or not is judged according to the distance information of the front wall, and the quadruped mobile robot is adjusted to be parallel to the side wall according to the distance information of the side wall.
6. The method for mobile robot inspection and remote monitoring in a complex indoor scene according to claim 5, wherein the process of adjusting itself to be parallel to the side wall according to the distance information from the side wall comprises the steps of:
the distance d between the left side surface and the right side surface of the quadruped mobile robot and the wall is obtained by intercepting 360 frames of distance data detected by the laser radar 60 ,d 61 ,d 62 ,d 63 ]And [ d ] 119 ,d 120 ,d 121 ,d 122 ]Calculate the average valueAnd (3) withDifference Δd=d 1 -D 2 Calculating by adopting the formula (1) to obtain the adjustment speed of the quadruped mobile robot:
in the formula, v r To actually adjust the speed v 0 To preset the speed of adjustment v 0 For positive indication of left turn adjustment, v 0 Negative indicates right turn adjustment.
7. The method for inspecting and remotely monitoring the mobile robot in the complex indoor scene according to claim 5, wherein, if the upper computer cannot fit a straight line most closely matching the actual lane line, the quadruped mobile robot is switched to the wall-following motion mode and it is judged whether the current area belongs to an autonomously set line-free section area; if so, the quadruped mobile robot is driven to move along the wall until it leaves the current area; if not, timing starts while it moves along the wall: if no lane line is detected after a preset waiting time, the quadruped mobile robot stops moving and a task-termination instruction is sent to the remote terminal; if the lane line is detected again within the waiting time, the robot switches back to the line-patrol mode.
8. The method for mobile robot inspection and remote monitoring in a complex indoor scene according to claim 1, wherein, in the inspection mode, the quadruped mobile robot adopts the left industrial camera view or the right industrial camera view for each section of the route according to the relative positions of the lane lines and the quadruped mobile robot.
9. The method for mobile robot inspection and remote monitoring in complex indoor scene according to claim 1, wherein the process of processing the lane line image and fitting the straight line most fitting with the actual lane line comprises the following steps:
the RGB image acquired by the industrial camera is divided into regions, the visual-field interference from the quadruped robot body is masked out, and a region of interest (ROI) is intercepted; the intercepted ROI area is a three-channel RGB image used as the input image; R, G and B denote the pixel values of the three channels, with normalized values R′ = R/255, G′ = G/255, B′ = B/255; the corresponding H, S and V channel pixel values are calculated according to formulas (2) to (4), giving the HSV image:
H = 60° × ((G′ − B′)/Δ mod 6) if C_max = R′; 60° × ((B′ − R′)/Δ + 2) if C_max = G′; 60° × ((R′ − G′)/Δ + 4) if C_max = B′; 0 if Δ = 0 (2)
S = Δ/C_max if C_max ≠ 0, otherwise 0 (3)
V = C_max (4)
wherein C_max = max(R′, G′, B′); C_min = min(R′, G′, B′); Δ = C_max − C_min;
binarization is performed on the HSV image according to formula (5), outputting the original binary image:
dst(I) = 255 if lowerb(I) ≤ src(I) ≤ upperb(I) for every channel, otherwise 0 (5)
wherein dst(I) is the output pixel value, lowerb(I) is the lower threshold of the corresponding channel, upperb(I) is the upper threshold of the corresponding channel, and src(I) is the pixel value of the corresponding channel of the input image; when the pixel value of every channel lies within the set upper and lower thresholds, dst(I) is 255, otherwise dst(I) is 0;
several groups of ground lane-line HSV color-space thresholds are extracted according to the wear and contamination conditions of the on-site lane lines and fused according to formula (6) to generate a mask; color filtering with the original binary image as input yields the fused binary image and hence the approximate lane-line region:
dst = mask_1 | mask_2 | … | mask_j (6)
wherein dst is the fused binary image, mask_j is the mask produced under each color range, and | denotes bitwise OR;
opening and closing operations are performed on the fused binary image to remove the interference of fine contamination and wear and obtain the complete lane-line area; in the opening operation, the image is first eroded, separating stuck-together fine parts and removing burrs and isolated points, and then dilated to restore the original shape of the region; specifically, with A the initial image and B an m×n structuring element, the erosion and dilation operations are as in formulas (7) and (8):
A Θ B = {z | (B)_z ⊆ A} (7)
A ⊕ B = {z | (B)_z ∩ A ≠ ∅} (8)
where z denotes a translation of B over A, ⊕ denotes the dilation operator and Θ the erosion operator; the opening operation is then formula (9):
A ∘ B = (A Θ B) ⊕ B (9)
tiny noise holes are eliminated by the closing operation, the lane-line contour is smoothed, and breaks and gaps are filled; the closing operation is computed as formula (10):
A • B = (A ⊕ B) Θ B (10)
taking the optimal binary image as input, Canny edge detection is performed; the main process is: Gaussian filtering of the pixels (x, y) with the Gaussian kernel G(x, y) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²)) to remove image noise; using the Sobel operators S_x and S_y to compute, for each pixel of the Gaussian-filtered image I, the gradient magnitude G = √(G_x² + G_y²) and direction θ = arctan(G_y/G_x); non-maximum suppression according to the gradient direction and magnitude of each pixel, removing pixels unlikely to form edges; setting a double threshold, retaining strong edges and weak edges, judging their connectivity, and retaining the weak edges connected to strong edges;
Hough line transformation is performed on the edge-detected image, fitting several groups of straight lines suspected to fit the actual lane line; an included-angle limit θ is set, tan θ is taken as the slope limit, and only the lines with slope between −tan θ and tan θ are retained;
from the midpoint ordinates [y_1, y_2, …, y_a] of the several groups of straight lines parallel to the actual lane line, y_max = max(y_1, y_2, …, y_a) is found; the straight line corresponding to y_max is the lowest line in the ROI area, and it is taken as the line most closely fitting the actual lane line and as the line-patrol reference of the quadruped mobile robot;
the slope k and midpoint pixel coordinates (x, y) of the straight line most closely fitting the actual lane line at the bottom of the field of view are calculated: when k ≤ −0.016 a left-turn adjustment is made, and when k ≥ 0.016 a right-turn adjustment; when the left camera view is used, the robot translates right when y ≤ 400 px and left when y ≥ 480 px; when the right camera view is used, it translates left when y ≤ 400 px and right when y ≥ 480 px. This keeps the mobile robot parallel to the lane line at a distance of about 0.3 m, and the line-patrol adjustment ROS topics are published.
10. The mobile robot inspection platform under the complex indoor scene is characterized by comprising a laser radar, an upper computer, a lower computer, a left industrial camera, a right industrial camera, an ultrasonic sensor, a four-foot mobile robot body, a joint motor, a USB (universal serial bus) docking station, a power supply, a head industrial camera and a remote monitoring terminal;
a laser radar is placed at the front end of the vertical section through the central axis of the surface of the rectangular body of the quadruped mobile robot body; a left industrial camera, a right industrial camera and a head industrial camera are respectively placed in the head space of the quadruped mobile robot body on the left and right sides of a square frame above the rectangular body; the central axis of the square frame coincides with the central axis of the quadruped mobile robot body; and an ultrasonic sensor is placed at the tail of the rectangular body of the quadruped mobile robot body;
an upper computer and a lower computer are arranged in the quadruped mobile robot body, the upper computer is connected with the lower computer through a network cable, and the upper computer comprises a WiFi module;
the remote monitoring terminal is in wireless connection with the upper computer, and the four-foot mobile robot is controlled by adopting the mobile robot inspection and remote monitoring method in the complex indoor scene as set forth in any one of claims 1 to 9.
CN202311019366.9A 2023-08-11 2023-08-11 Mobile robot inspection and remote monitoring method under complex indoor scene Pending CN117032235A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311019366.9A CN117032235A (en) 2023-08-11 2023-08-11 Mobile robot inspection and remote monitoring method under complex indoor scene

Publications (1)

Publication Number Publication Date
CN117032235A true CN117032235A (en) 2023-11-10

Family

ID=88642670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311019366.9A Pending CN117032235A (en) 2023-08-11 2023-08-11 Mobile robot inspection and remote monitoring method under complex indoor scene

Country Status (1)

Country Link
CN (1) CN117032235A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118163880A (en) * 2024-05-14 2024-06-11 中国海洋大学 Building disease detection quadruped robot and detection method
CN118163880B (en) * 2024-05-14 2024-07-30 中国海洋大学 Building disease detection quadruped robot and detection method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination