CN113915449B - Robot system and method suitable for detecting defects of inner wall of pipeline

Info

Publication number: CN113915449B
Authority: CN (China)
Prior art keywords: robot, pipeline, data, chip, detection
Legal status: Active (granted)
Application number: CN202111235057.6A
Other languages: Chinese (zh)
Other versions: CN113915449A (en)
Inventors: 艾列富, 陈春生, 赵明康, 陈少川, 张�浩, 王一宾
Assignee: Anqing Normal University
Application filed by Anqing Normal University
Priority to CN202111235057.6A (priority date 2021-10-22, filing date 2021-10-22)
Publication of CN113915449A: 2022-01-11
Application granted; publication of CN113915449B: 2023-01-17

Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16L - PIPES; JOINTS OR FITTINGS FOR PIPES; SUPPORTS FOR PIPES, CABLES OR PROTECTIVE TUBING; MEANS FOR THERMAL INSULATION IN GENERAL
    • F16L55/00 - Devices or appurtenances for use in, or in connection with, pipes or pipe systems
    • F16L55/26 - Pigs or moles, i.e. devices movable in a pipe or conduit with or without self-contained propulsion means
    • F16L55/28 - Constructional aspects
    • F16L55/30 - Constructional aspects of the propulsion means, e.g. towed by cables
    • F16L55/32 - Constructional aspects of the propulsion means being self-contained
    • F16L55/40 - Constructional aspects of the body
    • F16L2101/00 - Uses or applications of pigs or moles
    • F16L2101/30 - Inspecting, measuring or testing
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/95 - Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/954 - Inspecting the inner surface of hollow bodies, e.g. bores

Abstract

The invention discloses a robot system suitable for detecting defects of the inner wall of a pipeline. The system consists of three layers of hardware equipment plus a cloud network system and provides two inspection modes, manually controlled inspection and automatic inspection. By monitoring its inclination angle while detecting, the robot is kept running stably. After the robot is connected to the cloud system, video data of the pipeline inner wall is transmitted to the data receiving end software. Once motion speed parameters have been set according to the characteristics of the pipeline, the robot enters the pipeline for detection, automatically generates a detection report of cracks and fissures on the inner wall, and stores the original inspection video data. The detection report contains the marked crack pictures, the position of each crack in the pipeline, and the specific time of detection. The method can replace the traditional, time-consuming and labor-intensive way of detecting cracks and fissures on the inner wall of a pipeline, and helps improve detection efficiency and accuracy.

Description

Robot system and method suitable for detecting defects of inner wall of pipeline
Technical Field
The invention relates to the field of pipeline inner wall defect detection, in particular to a robot system suitable for detecting cracks and fissures of a pipeline inner wall.
Background
Pipelines are the most efficient, economical and safest way to transport oil and gas. Worldwide, the construction and operation of oil and gas pipelines goes back more than a century. The pipeline industry plays a crucial role in national economies, employment and energy supply, which is particularly evident in the Asia-Pacific region, Europe, North America and elsewhere. On the other hand, every oil or gas leak, or even casualty, caused by a pipeline failure raises public doubts about the impact of pipelines on the environment, ecology, climate and community safety, and the debate about the sustainability of oil and gas pipeline development has never stopped.
At present, pipeline inspection devices on the market fall into two categories: handheld telescopic probes and robot-assisted detection. A handheld telescopic probe mainly consists of a miniature camera and a telescopic rod. In use, an operator must hold it at the pipe opening and extend the probe rod into the pipeline while the images are inspected with the naked eye at the other end. This is not only somewhat dangerous, but the detectable range is severely limited by the length of the telescopic rod, and the approach is labor-intensive.
In robot-assisted detection, a high-definition camera sends images from inside the pipeline to inspection personnel: the robot enters the pipeline through the pipe opening, films the inner wall, and transmits a real-time picture to a background station, where inspectors watch the video and make judgments before maintenance. The disadvantages of this method are: first, the identification of cracks and fissures depends excessively on the subjective judgment of technicians, which hinders large-scale adoption; second, visual video patrol is strenuous, inspectors tire after long working hours, and the miss rate rises; in particular, inspection efficiency drops further when the pipeline is long and the detection range is large.
Disclosure of Invention
The invention aims to provide a robot system and method suitable for detecting defects of the inner wall of a pipeline, which can replace the traditional, time-consuming and labor-intensive way of detecting cracks and fissures on the pipeline inner wall and help improve detection efficiency and accuracy.
In order to solve this technical problem, the robot system suitable for detecting defects of the inner wall of a pipeline comprises a front-end robot platform and a back-end PC platform which are in communication connection;
the front-end robot platform comprises a camera shooting acquisition layer, a motion control layer and a chip control layer; the camera shooting acquisition layer comprises a camera mounted on a pan-tilt head; the motion control layer comprises motion tracks on both sides and a power device; the chip control layer comprises a drive control chip, a signal receiving chip, a drive information processing chip, an image information processing chip and a magnetic induction chip; the signal receiving chip transmits the received information to the drive information processing chip, the drive information processing chip sends a signal to the drive control chip, and the drive control chip controls the motion control layer to realize the overall motion of the front-end robot platform; a pipeline path detection device and a power supply device are also included;
the back end PC platform includes:
the configuration module is used for setting an operation environment;
the receiving module is used for receiving data fed back from the front-end robot platform;
the analysis module is used for analyzing the received video data frame by frame and judging potential cracks and fissures;
and the report module is used for outputting the patrol report to the user.
The pan-tilt head of the camera shooting acquisition layer comprises two servo motors arranged on the upper layer of the front-end robot platform: one servo motor is arranged along the width direction of the platform, the other is fixed on the main shaft of the platform along the length direction, and the two servo motors together form a dual-servo pan-tilt head;
The drive control chip of the chip control layer is an L298n relay; the signal receiving chip comprises two inclination sensors and a Bluetooth module; the drive information processing chip is an Arduino development board; the image information processing chip is a Raspberry Pi development board; the magnetic induction chip is a Hall sensor; the power supply device comprises a 12V battery pack and a Raspberry Pi storage battery;
The front-end robot platform further comprises the following components: a PCF8591 analog-to-digital converter, an Arduino expansion board and a magnetic core;
the two servo motors are connected to an Arduino development board; specifically, signal lines of the two servo motors, VCC pins and GND pins are respectively connected with signal S pins, VCC pins and GND pins corresponding to No. 10 and No. 11 pins of the Arduino expansion board;
the 12V battery pack is electrically connected with the L298n relay, the Arduino development board is connected with the Arduino development board, and the raspberry group development board is electrically connected with the raspberry group storage battery;
the two inclination sensors are respectively connected with the Arduino expansion board and used for collecting attitude information of the front-end robot;
the D0, VCC and GND interfaces of the Hall sensor are sequentially connected with the G17 pin, VCC and GND functional interfaces of the BCM coding mode of the raspberry group development board, and the SDA, SCL, VCC and GND ends of the PCF8591 analog-to-digital converter are connected with the SDA and SCL functional pins and the VCC and GND ports of the raspberry group development board; an AINO port on the PCF8591 analog-to-digital converter is connected with an A0 port of the Hall sensor;
the signal lines, the VCC pins and the GND pins of the two servo motors are respectively connected with signal S pin positions, VCC pins and GND pin positions corresponding to No. 10 and No. 11 pins of the Arduino expansion board;
the camera is connected with a USB interface of the raspberry development board through a data line, and the raspberry development board performs data transmission after establishing contact with a back-end PC through a WIFI module arranged on the raspberry development board;
the Bluetooth module is connected with the Arduino expansion board;
the power device comprises two 370 motors arranged on the bottom plate; the moving crawler is arranged on two sides of the bottom plate, and the two 370 motors are respectively in transmission connection with the moving crawler; a magnetic core is attached to a wheel disc connected with the caterpillar track by one 370 motor, and the 370 motors are respectively connected with the L298n relay through driving wires; the Hall sensor is connected with the magnetic core in a magnetic induction mode.
The pipe path detecting device includes: the ultrasonic ranging sensor and the ultrasonic servo motor are arranged on the base;
The VCC, GND, ECHO and TRIG ports of the ultrasonic ranging sensor are connected to the three pins at position No. 7 and to pin No. 8 of the Arduino expansion board, and the signal line, VCC and GND pins of the ultrasonic servo motor are respectively connected to the signal S, VCC and GND pins corresponding to pin No. 9 of the Arduino expansion board.
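By way of illustration, the Raspberry Pi side of this wiring could be read with a short Python sketch such as the one below; the I2C address 0x48, the bus number 1 and the polling loop are assumptions for the sketch and are not specified in the patent.

```python
# Minimal sketch: read the Hall sensor's digital D0 line on BCM GPIO 17 and its
# analogue A0 output through the PCF8591 over I2C, matching the wiring above.
import time

import RPi.GPIO as GPIO
from smbus2 import SMBus

HALL_D0_PIN = 17            # BCM numbering, as wired above
PCF8591_ADDR = 0x48         # assumed default PCF8591 I2C address
PCF8591_CTRL_AIN0 = 0x40    # control byte selecting analogue input channel 0

GPIO.setmode(GPIO.BCM)
GPIO.setup(HALL_D0_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

with SMBus(1) as bus:                              # I2C bus 1 on the Raspberry Pi
    for _ in range(100):
        digital = GPIO.input(HALL_D0_PIN)          # thresholded Hall output (0/1)
        bus.write_byte(PCF8591_ADDR, PCF8591_CTRL_AIN0)
        bus.read_byte(PCF8591_ADDR)                # first read returns the previous conversion
        analog = bus.read_byte(PCF8591_ADDR)       # 8-bit reading of the Hall A0 output
        print(f"D0={digital} A0={analog}")
        time.sleep(0.1)
GPIO.cleanup()
```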
The invention also provides a method suitable for detecting the defects of the inner wall of the pipeline, which comprises the following steps:
step A, starting a front-end robot and accessing a cloud system;
a1, a front-end robot power supply is turned on, a raspberry type storage battery is turned on, so that the front-end robot is connected with a specified WIFI, a detection video service is automatically started, and a data sending end program is called to send configuration information;
in this step, the sending the configuration information includes sending an own IP and port number to the designated email box, and also includes sending the configuration information of the bluetooth module;
step A2, an administrator opens the software of the data receiving end, completes the software configuration of the data receiving end according to the relevant configuration information received in the step A1 and generates different authority accounts; in this step, the receiving end software configuration comprises a bluetooth configuration;
step A3, an operator logs in data receiving end software by using the corresponding account generated by the administrator in the step A2, sets motion voltage related parameters according to the environment, and then selects a motion control mode;
in the step, the motion control mode comprises automatic inspection and manual control detection;
step B, starting the inspection operation
B1, an operator logs in data receiving end software to perform motion-assisted operation on the front-end robot; an observer uses related account login software to perform real-time manual auxiliary monitoring;
step B2, front-end data acquisition
The front-end robot starts to run continuously after being started, and the camera collects the inner wall images of each position of the pipeline in the running process and transmits real-time video stream data to the PC end;
in this step, each full revolution of the 370 motor's wheel disc produces one magnetic induction pulse between the Hall sensor and the magnetic core, and an accumulator automatically increments its count and transmits it to the PC end;
step B3, back end data analysis
B, the back-end PC receives the information in the step B2 and starts to analyze frame by frame;
image preprocessing: taking out each frame of picture of video stream data in real time, and then sequentially carrying out contrast enhancement, graying processing and Gaussian blur processing;
edge detection: performing edge detection on the preprocessed image, wherein the detection method is Canny edge detection, and the threshold values are respectively as follows: 75 255, 255; the detected edge is a suspected crack in the image;
morphological operation: performing closed operation on the image subjected to edge detection, namely expanding and then corroding, wherein fine cracks are merged in the step;
judging and storing: traversing a connected domain of the image subjected to the morphological operation and calculating the size, when the size exceeds a set threshold value of a PC (personal computer) end, marking the crack position of the image and storing the crack position to a specified path, storing time node information, and calculating and storing position information according to feedback data of an accumulator and the diameter of a wheel disc connected to a 370 motor spindle;
step C, generating an inspection report
Step C1, after the inspection task is finished, the software automatically generates a pipeline inner wall crack and fissure detection report according to the preset parameters, and the original inspection video data are stored;
and step C2, the operator and observer manually audit the inspection report and add remark information; once this is finished, the inspection report and the original operation video data are packaged and sent to the administrator account.
Preferably, the step B further comprises:
step B20, adjusting the operation posture
In the normal driving process of the front-end robot, the two inclination sensors continuously acquire data, and the chip control layer judges from this feedback whether the robot has deviated from its original center-of-gravity horizontal line; if a deviation is judged, one of the 370 motors is controlled through the L298n relay to compensate until the robot returns to the original center-of-gravity horizontal line.
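A hardware-free Python sketch of this balance logic is given below for illustration; the tolerance, gain and PWM values are assumed, and the actual mapping from tilt readings to the L298n PWM outputs is not specified in the patent.

```python
# Illustrative posture-correction logic (step B20): given the two inclination
# readings, rebalance the PWM duties of the two 370 motors. All constants are
# assumptions for the sketch.

TILT_TOLERANCE_DEG = 2.0    # acceptable deviation before correcting
CORRECTION_GAIN = 5.0       # PWM counts removed per degree of deviation
MAX_CORRECTION = 60

def posture_correction(tilt_left_deg, tilt_right_deg, base_pwm=180):
    """Return (left_pwm, right_pwm) that steers the body back to level."""
    deviation = tilt_left_deg - tilt_right_deg
    if abs(deviation) <= TILT_TOLERANCE_DEG:
        return base_pwm, base_pwm                  # already on the horizontal line
    correction = min(int(abs(deviation) * CORRECTION_GAIN), MAX_CORRECTION)
    if deviation > 0:                              # leaning toward the left track
        return base_pwm - correction, base_pwm
    return base_pwm, base_pwm - correction         # leaning toward the right track

# Example: the body is tilted 6 degrees toward the left track
print(posture_correction(8.0, 2.0))                # -> (150, 180)
```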
Preferably, the step B further comprises:
step B21, over-bending adjustment
In the normal running process of the robot, the ultrasonic ranging sensor scans ahead in real time to measure distance; if the measured distance is smaller than a preset value, the ultrasonic servo motor turns the ultrasonic ranging sensor to scan again until a direction is found in which the distance exceeds the preset value, and once the new advancing direction is determined, the two 370 motors are controlled through the L298n relay to turn the robot.
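The decision part of this bend-passing behaviour can be illustrated with the following hardware-free Python sketch; the clearance threshold and the set of scanned headings are assumptions, and on the robot the distances would come from the ultrasonic ranging sensor via the Arduino.

```python
# Illustrative heading selection for step B21: the servo sweeps a few headings,
# the ranging sensor reports one distance per heading, and the robot turns
# toward the first heading whose clearance exceeds the preset value.

CLEARANCE_THRESHOLD_CM = 40.0        # assumed preset value

def pick_heading(distances_cm, current_heading=0):
    """distances_cm maps servo heading (degrees) -> measured distance (cm)."""
    if distances_cm.get(current_heading, 0.0) >= CLEARANCE_THRESHOLD_CM:
        return current_heading       # straight ahead is clear, keep going
    for heading, dist in sorted(distances_cm.items()):
        if dist >= CLEARANCE_THRESHOLD_CM:
            return heading           # first clear direction found by the sweep
    return None                      # no clear direction: stop and wait

# Example: a right-hand bend where clearance opens up at +45 degrees
scan = {-45: 22.0, 0: 18.0, 45: 95.0}
print(pick_heading(scan))            # -> 45
```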
The advantages of the invention are as follows: A. The front-end robot carries a pan-tilt head and performs 360-degree panoramic scanning with the camera; the resulting video is transmitted to the back-end PC platform for analysis, which can replace manual work, discover cracks and fissures on the pipeline inner wall in time, and helps improve detection efficiency and accuracy. B. Several operating modes, including automatic inspection and manually controlled inspection, are provided and can be switched between, which makes it convenient to re-examine suspect areas. C. Through the cooperation of the magnetic core and the Hall sensor, the position of a potential fault point can be estimated. D. The front-end robot platform travels on tracks, and the cooperation of the inclination sensors and the controller prevents the robot from tilting, so its posture is self-maintained. E. The front-end robot platform is equipped with an ultrasonic ranging sensor and an ultrasonic servo motor, and can pass bends through the cooperation of the control system and the moving parts.
Drawings
FIG. 1 is a system architecture diagram of the present invention;
FIG. 2 is a top level structural view of the front end robot of the present invention;
FIG. 3 is a diagram of a middle level architecture of the front end robot of the present invention;
FIG. 4 is a bottom level structural view of the front end robot of the present invention;
FIG. 5 is an overall workflow diagram of the present invention;
FIG. 6 is an analysis flow diagram of the back end PC platform of the present invention;
FIG. 7 is a flowchart of front end robot travel control according to the present invention;
FIG. 8 is a flow chart of front end robot over-bending control in the present invention;
fig. 9 is a flowchart of the pose maintenance of the front-end robot in the present invention.
Detailed Description
The following further describes the embodiments of the present invention with reference to the drawings.
As shown in FIG. 1, the robot system suitable for detecting the defects of the inner wall of the pipeline comprises a front-end robot platform and a back-end PC platform which are in communication connection.
In terms of function, the front-end robot platform comprises a camera shooting acquisition layer, a motion control layer and a chip control layer; the camera shooting acquisition layer comprises a camera mounted on a pan-tilt head; the motion control layer comprises tracks on both sides and a power device; the chip control layer comprises a drive control chip, a signal receiving chip, a drive information processing chip, an image information processing chip and a magnetic induction chip; a pipeline path detection device and a power supply device are further included. The signal receiving chip transmits the received information to the drive information processing chip, the drive information processing chip sends a signal to the drive control chip, and the drive control chip controls the motion control layer to realize the overall motion of the front-end robot platform.
The specific parts of the front-end robot platform include: camera 1, two servo motors 2, Hall sensor 3, two inclination sensors 4, 12V battery pack 5, Bluetooth module 6, Raspberry Pi development board 7, L298n relay 8, PCF8591 analog-to-digital converter 9, Arduino development board 10, Arduino expansion board 11, ultrasonic ranging sensor 12, magnetic core 13, two 370 motors 14, bottom plate 15, motion tracks 16, ultrasonic servo motor 17, Raspberry Pi battery 18.
The drive control chip is the L298n relay 8; the signal receiving chip comprises the Bluetooth module 6; the drive information processing chip is the Arduino development board 10; the image information processing chip is the Raspberry Pi development board 7; the magnetic induction chip is the Hall sensor 3; the pipeline path detection device comprises the ultrasonic ranging sensor 12; the power supply device comprises the 12V battery pack 5 and the Raspberry Pi battery 18.
The camera shooting acquisition layer is mainly responsible for controlling the camera 1, which acquires data through the pan-tilt head and transmits them to the middle layer. As shown in fig. 2 and 3, a pan-tilt head is disposed at the upper center of the front-end robot, and the camera 1 is mounted on it. In this embodiment, the two servo motors 2 form a dual-servo pan-tilt head that can rotate about its axes in the horizontal plane and the vertical plane respectively; the two servo motors 2 are connected to the Arduino development board 10, and the pan-tilt head can perform 360-degree panoramic scans.
The chip control layer is mainly responsible for controlling each part and for receiving the data collected by the camera 1 and transmitting it to the back-end PC platform.
As shown in fig. 3, on the middle-layer central axis of the front-end robot, the 12V battery pack 5, the Raspberry Pi battery 18, the L298n relay 8, the PCF8591 analog-to-digital converter 9, the ultrasonic ranging sensor 12 and the ultrasonic servo motor 17 are arranged in sequence from back to front; the Bluetooth module 6 and the Arduino expansion board 11 are arranged on the left side of the central axis, with the Arduino development board 10 at the front end of the Arduino expansion board 11; the Raspberry Pi development board 7 is arranged in the middle of the right side of the central axis, with the Hall sensor 3 at its rear end; the two inclination sensors 4 are arranged on either side of the central axis.
The 12V battery pack 5 is electrically connected with the L298n relay 8, the Arduino development board 10 is electrically connected with the L298n relay 8 to obtain power, the Arduino expansion board 11 draws power from the Arduino development board 10, and the Raspberry Pi development board 7 is electrically connected with the Raspberry Pi battery 18. The specific connections are as follows:
The positive electrode and the negative electrode of the 12V battery pack 5 are connected to the 12V interface and the GND interface of the L298n relay 8 respectively, so that the L298n relay 8 is energized. Two wires, one from the +5V and one from the GND of the L298n relay 8, are connected to the VCC and GND ports of the Arduino expansion board 11, which powers up the Arduino development board 10. Two wires are led out from ENA and ENB of the L298n relay 8 to interfaces No. 5 and No. 6 of the Arduino expansion board 11 for controlling the PWM output values. Four further wires are connected from IN1, IN2, IN3 and IN4 of the L298n relay 8 to ports A2, A3, A4 and A5 of the Arduino expansion board 11 respectively. Meanwhile, the Raspberry Pi development board 7 is connected with the Raspberry Pi battery 18 through its Micro USB interface, so that the Raspberry Pi development board 7 is powered.
The two inclination sensors 4 are respectively connected with the Arduino expansion board 11 and are used for collecting the posture information of the front-end robot. Specifically, the OUT, VCC and GND ports of the two tilt sensors 4 on either side are connected in sequence to the corresponding three pins at positions No. 3 and No. 4 of the Arduino expansion board 11.
The D0, VCC and GND interfaces of the Hall sensor 3 are sequentially connected with the G17 pin (BCM numbering), VCC and GND functional interfaces of the Raspberry Pi development board 7, and the SDA, SCL, VCC and GND ends of the PCF8591 analog-to-digital converter 9 are connected with the SDA, SCL, VCC and GND ports of the Raspberry Pi development board 7. The AIN0 port of the PCF8591 analog-to-digital converter 9 is connected with the A0 port of the Hall sensor 3; the converter is placed in front of the L298n relay 8 and fixed above the middle-layer structure.
The VCC, GND, ECHO and TRIG ports of the ultrasonic ranging sensor 12 are connected to the three pins at position No. 7 and to pin No. 8 of the Arduino expansion board 11, and the signal lines, VCC and GND pins of the two servo motors 2 and of the ultrasonic servo motor 17 are connected respectively to the signal S, VCC and GND pins corresponding to positions No. 10 and No. 11 and to position No. 9 of the Arduino expansion board. The ultrasonic ranging sensor 12 acquires information about obstacles, bends and the like ahead and provides guidance at bends; the ultrasonic servo motor 17 is used to turn the ultrasonic ranging sensor 12.
The camera 1 of the upper-layer structure is connected through a data line to a USB interface of the Raspberry Pi development board 7 in the middle-layer structure; a PCF data acquisition device is provided on the Raspberry Pi development board 7 to acquire images through the camera. The Raspberry Pi development board 7 establishes contact with the back-end PC through its on-board WIFI module and then transmits the data.
The motion control layer is mainly responsible for locomotion; in addition, through the cooperation of the magnetic core 13 on the bottom layer and the Hall sensor 3 in the middle layer, a simulated measurement of the front-end robot's position is obtained.
As shown in fig. 4, the bottom layer of the front-end robot comprises two 370 motors 14 and a bottom plate 15; moving tracks 16 are arranged on both sides of the bottom plate 15, and the two 370 motors 14 are respectively in transmission connection with the moving tracks 16. A magnetic core 13 is attached to the wheel disc of the 370 motor 14 that connects to its track, and the four driving wires led out from the 370 motors 14 are connected respectively to the OUT1, OUT2, OUT3 and OUT4 pins of the L298n relay 8, which completes the bottom-layer drive control connections. The Hall sensor 3 of the middle-layer structure is positioned so that it is magnetically coupled to the magnetic core 13 on the bottom layer.
The basic working principle is as follows: the two 370 motors drive the front-end robot along the moving tracks 16 on both sides; the two servo motors 2 drive the dual-servo pan-tilt head so that the camera 1 scans while rotating; the video signal obtained is collected and converted by the PCF data acquisition device on the Raspberry Pi development board 7 and transmitted over WIFI to the back-end PC platform, where frame-by-frame analysis is completed and potential cracks and fissures are judged.
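As an illustration of this video path (not the patent's own software), the sketch below grabs frames from the USB camera with OpenCV on the Raspberry Pi, JPEG-encodes them and sends each one as a length-prefixed message over a plain TCP connection; the address, port, JPEG quality and framing are assumptions.

```python
import socket
import struct

import cv2

PC_ADDR = ("192.168.1.100", 9001)      # assumed back-end PC address and port

cap = cv2.VideoCapture(0)              # USB camera attached to the Raspberry Pi
with socket.create_connection(PC_ADDR) as sock:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
        if not ok:
            continue
        payload = jpeg.tobytes()
        # 4-byte big-endian length prefix, then the JPEG bytes
        sock.sendall(struct.pack(">I", len(payload)) + payload)
cap.release()
```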
The chip control layer is provided with an accumulator: each time the 370 motor 14 completes one revolution, the magnetic core 13 induces the Hall sensor 3 once and the accumulator increments automatically. By combining the induction count, the diameter of the track wheel disc, the direction of motion and other data, the current position of the front-end robot in the pipeline can be calculated by simulation; this position information is finally matched to the video signal on the back-end PC, so that pipeline defects can be judged from the video signal and the position at which the video was acquired can also be estimated.
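For illustration, the PC-side position estimate amounts to multiplying the pulse count by the wheel-disc circumference, as in the short sketch below; the wheel-disc diameter used here is an assumed value, not taken from the patent.

```python
import math

WHEEL_DISC_DIAMETER_M = 0.06          # assumed diameter of the tracked wheel disc

def position_along_pipe(pulse_count, diameter_m=WHEEL_DISC_DIAMETER_M):
    """Distance (metres) from the pipe entrance after pulse_count revolutions."""
    return pulse_count * math.pi * diameter_m

print(round(position_along_pipe(53), 2))   # e.g. 53 revolutions -> 9.99 m
```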
Repeated detection: the Bluetooth module 6 is connected with the Arduino expansion board 11 to receive signals. The specific connection is as follows: the Bluetooth module 6 is connected to the VCC, GND, RXD and TXD pins of the Bluetooth area of the Arduino expansion board 11. In practical application, an operator can issue an intervention instruction through the Bluetooth module 6. In the invention, the defect detection modes comprise automatic inspection and manually controlled detection. Automatic inspection is fast and highly automated; if there is any doubt about a detected section, an instruction can be given through the Bluetooth module 6 to switch to the manual mode for local repeated detection.
Posture adjustment: the object detected by the invention is a pipeline whose bottom is an upward-curving arc, so the front-end robot tends to tilt while travelling. When this happens, the posture change is detected by the two inclination sensors 4, and a correction instruction is sent to the 370 motors 14 through the control system, so that the correct posture is always kept.
Bend passing and obstacle avoidance: while travelling, the ultrasonic servo motor 17 turns the ultrasonic ranging sensor 12, which continuously acquires information about obstacles, bends and the like ahead; correction instructions are sent to the 370 motors 14 through the control system, realizing obstacle avoidance, bend passing and similar operations.
There are multiple choices for the control system and chips. As one alternative: Arduino development board: manufacturer: Shanghai Lang translation electronics technology Limited, model: Arduino Uno R3; Raspberry Pi development board: manufacturer: Sony UK, model: 4B; Arduino expansion board: manufacturer: Taiwan intelligent sensing technology Ltd, model: Arduino Sensor Shield V5.0; Hall sensor: manufacturer: Shenzhen Tegat Technologies, model: A3144E.
As can be seen from fig. 5, 6 and 7, the work flow of the present invention is specifically as follows:
step A, starting a front-end robot and accessing a cloud-end system;
step A1, a power supply of the front-end robot is turned on, a raspberry type storage battery is turned on, the front-end robot is connected with a designated WIFI, the detection video service is automatically started, and a data sending end program is called to send configuration information. In this step, the sending the configuration information includes sending an own IP and a port number to the specified email box, and also includes sending the configuration information of the bluetooth module 6.
And step A2, the administrator opens the data receiving end software, completes the configuration of the data receiving end software according to the relevant configuration information received in the step A1, and generates different authority accounts. In this step, the receiving end software configuration includes bluetooth configuration.
And step A3, an operator logs in data receiving end software by using the corresponding account generated by the administrator in the step A2, sets motion voltage related parameters according to the environment, and then selects a motion control mode.
In this step, the motion control mode includes automatic inspection and manual control detection.
Step B, starting the inspection operation
B1, an operator logs in data receiving end software to perform motion-assisted operation on the front-end robot; and the observer uses the related account login software to perform real-time manual auxiliary monitoring.
The operator and the observer can intervene manually in the automatic inspection operation process to deal with emergency.
Step B2, front-end data acquisition
The front-end robot starts to run continuously after being started, and the camera collects the inner wall images of all positions of the pipeline in the running process and transmits real-time video stream data to the PC end.
In this step, each time the wheel disc of the 370 motor 14 completes one revolution, the Hall sensor 3 and the magnetic core 13 produce one magnetic induction pulse, and the accumulator automatically increments and transmits the count to the PC end.
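As a purely illustrative sketch of this counting step (not the patent's firmware), the following Python snippet counts falling edges on the Hall sensor's D0 line, assuming it sits on BCM GPIO 17 as wired earlier, and pushes the running count to the back-end PC over a plain TCP socket at an assumed address and port.

```python
import socket
import time

import RPi.GPIO as GPIO

HALL_D0_PIN = 17                     # assumed: Hall D0 on BCM GPIO 17
PC_ADDR = ("192.168.1.100", 9000)    # assumed back-end PC address and port

revolutions = 0

def on_pulse(channel):
    """Called once per magnetic-core pass, i.e. once per wheel-disc revolution."""
    global revolutions
    revolutions += 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(HALL_D0_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(HALL_D0_PIN, GPIO.FALLING, callback=on_pulse, bouncetime=20)

with socket.create_connection(PC_ADDR) as sock:
    while True:
        sock.sendall(f"{revolutions}\n".encode())   # running accumulator value
        time.sleep(0.5)
```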
Step B3, back-end data analysis
As can be seen from fig. 6, the back-end PC receives the information of step B2 and starts to perform frame-by-frame analysis.
Image preprocessing: and taking out each frame of picture of the video stream data in real time, and then sequentially carrying out contrast enhancement, graying processing and Gaussian blur processing so as to facilitate subsequent crack detection.
Edge detection: performing edge detection on the preprocessed image, wherein the detection method is Canny edge detection, and the threshold values are respectively as follows: 75, 255. The detected edge is a suspected crack in the image.
Morphological operation: a closing operation, namely dilation followed by erosion, is performed on the edge-detected image; fine cracks are merged in this step.
Judging and storing: the connected domains of the image after the morphological operation are traversed and their sizes calculated; when a size exceeds the threshold set at the PC end, the crack position is marked on the image and saved to the appointed path, the time node information is stored, and the position information is calculated and stored from the accumulator's feedback data and the diameter of the wheel disc connected to the main shaft of the 370 motor. The captured key picture is stored under the robot's position information and the time, with a file name of the form year-month-day + hour-minute-second + position information, for example, "2021-04-07 10.
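As an illustration of this naming scheme, a small Python sketch follows; since the example in the text is truncated, the exact separators, the unit of the position field and the file extension are assumptions.

```python
from datetime import datetime

def key_frame_name(position_m, when=None):
    """Build a file name from capture time (to the second) and pipe position."""
    when = when or datetime.now()
    return f"{when.strftime('%Y-%m-%d %H-%M-%S')}_{position_m:.2f}m.jpg"

print(key_frame_name(12.34, datetime(2021, 4, 7, 10, 15, 30)))
# -> '2021-04-07 10-15-30_12.34m.jpg'
```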
Step C, generating a patrol report
Step C1, after the inspection task is finished, the software automatically generates a pipeline inner wall crack and fissure detection report according to the preset parameters, and the original inspection video data are stored.
And step C2, the operator and the observer carry out manual examination and review on the inspection report and add remark information, and the inspection report and the original operation video data are packaged and sent to an administrator account after the examination is finished.
Example one
To verify the survey effect of the present invention, we performed simulation experiments.
Test subjects: a 20-meter transport pipeline containing cracks;
Detection target: cracks and fissures inside the transport pipeline.
The test method comprises the following steps: cracks and fissures are arranged inside the transport pipeline, the robot of the system is placed into the pipeline and connected to the system, detection is carried out, and the detection report is received.
Step A, starting a robot and accessing a cloud system;
step B, starting the inspection operation
Step B1, an operator logs into the data receiving end software to perform motion-assisted operation on the robot, using the test account: test, password: t123456. In this embodiment, the role of the observer is not set separately, and the observer's work is done by the operator.
Step B2, front-end data acquisition
The robot starts to continuously run under the control of an operator after being started, the camera collects the inner wall images of all positions of the pipeline in the running process, and real-time video stream data are transmitted to the PC end.
Step C, generating an inspection report
In the inspection process of step B, the operator observes the real-time video detection picture and intervenes manually in time, correcting and improving the results of automatic crack detection, including modifying the position of a pipeline fault point and the detection time point. After the inspection task formally ends, the software automatically generates an inspection report based on the detection results and the operator's manual intervention; the report records in detail the marked fault-point pictures, together with the time of each fault point accurate to the second and the robot's position in the pipeline accurate to the centimeter.
Example two
As shown in fig. 8, in this embodiment, the step B further includes:
step B20, adjusting the operation posture
In the normal driving process of the front-end robot, the two inclination sensors 4 continuously acquire data, and the chip control layer judges from this feedback whether the robot has deviated from its original center-of-gravity horizontal line; if a deviation is judged, one of the 370 motors is controlled through the L298n relay to compensate until the robot returns to the original center-of-gravity horizontal line.
Test subjects: a long straight transport pipeline with the length of 20 meters and the diameter of more than 50 cm;
Detection target: the ability of the robot system and method for detecting defects of the inner wall of a pipeline to keep the robot running balanced inside the pipeline.
The test method comprises the following steps: the running robot is disturbed with a long rod so that it deviates from its original center-of-gravity horizontal line; cracks are arranged inside the transport pipeline, the robot of the system is placed into the pipeline and connected to the system, detection is carried out, and the detection report is received.
Step A, starting the robot and accessing a cloud system;
step B, starting the inspection operation
Step B1, an operator logs into the data receiving end software to perform motion-assisted operation on the robot, using the test account: test, password: t123456. In this embodiment, the role of the observer is not set separately, and the observer's work is done by the operator.
Step B2, front-end data acquisition
After starting, the robot runs continuously under the operator's control; during running the camera collects images of the inner wall at every position of the pipeline, and real-time video stream data are transmitted to the PC end. The time of each shot and the robot's position in the pipe are transmitted simultaneously with the video stream.
During normal driving, a long straight rigid rod is used to disturb the robot so that its center of gravity deviates from the original center-of-gravity horizontal line. One of the two inclination sensors 4 on either side of the robot senses the loss of level; the chip control layer of the robot then adjusts the robot's heading through the L298n relay 8, so that the robot's center of gravity returns to the original center-of-gravity horizontal line.
Step C, generating an inspection report
In the inspection process of step B, the operator observes the real-time video detection picture and intervenes manually in time, correcting and improving the results of automatic crack detection, including modifying the position of a pipeline fault point and the detection time point. After the inspection task formally ends, the software automatically generates an inspection report based on the detection results and the operator's manual intervention; the report records in detail the marked fault-point pictures, together with the time of each fault point accurate to the second and the robot's position in the pipeline accurate to the centimeter.
EXAMPLE III
As shown in fig. 9, in this embodiment, the step B further includes:
step B21, over-bending adjustment
In the normal running process of the robot, the ultrasonic ranging sensor 12 scans ahead in real time to measure distance; if the measured distance is smaller than a preset value, the ultrasonic servo motor 17 turns the ultrasonic ranging sensor 12 to scan again until a direction is found in which the distance exceeds the preset value, and once the new forward direction is determined, the two 370 motors 14 are controlled through the L298n relay 8 to turn the robot.
Test subjects: a 20-meter transport pipeline, more than 50 cm in diameter, containing a bend;
Detection target: the bend-passing capability of the robot system and method for detecting defects of the inner wall of a pipeline.
Test method of this example: the robot is placed five meters before the curve, cracks are arranged in the transport pipeline, the robot in the system is placed in the pipeline, the system is connected, detection is carried out, and a detection report is received.
Step A, starting the robot and accessing a cloud system;
step B, starting the inspection operation
Step B1, an operator logs into the data receiving end software to perform motion-assisted operation on the robot, using the test account: test, password: t123456. In this embodiment, the role of the observer is not set separately, and the observer's work is done by the operator.
Step B2, front-end data acquisition
After starting, the robot runs continuously under the operator's control; during running the camera collects images of the inner wall at every position of the pipeline, and real-time video stream data are transmitted to the PC end. The time of each shot and the robot's position in the pipe are transmitted simultaneously with the video stream.
Step B21, over-bending test
The robot is placed 5 m in front of the bend and allowed to run autonomously. The ultrasonic ranging sensor 12 measures the distance ahead of the robot; if the distance to the pipeline inner wall at the bend is smaller than the preset value, the surroundings are scanned for a direction in which the distance exceeds the preset value, and once the direction is determined the two 370 motors 14 are controlled through the L298n relay 8 to achieve the turn.
Step C, generating an inspection report
In the inspection process of step B, the operator observes the real-time video detection picture and intervenes manually in time, correcting and improving the results of automatic crack detection, including modifying the position of a pipeline fault point and the detection time point. After the inspection task formally ends, the software automatically generates an inspection report based on the detection results and the operator's manual intervention; the report records in detail the marked fault-point pictures, together with the time of each fault point accurate to the second and the robot's position in the pipeline accurate to the centimeter.
The embodiments of the present invention include, but are not limited to, the above-mentioned embodiments, and those skilled in the art can make various corresponding changes and modifications according to the present invention without departing from the spirit and the substance of the present invention, and still fall into the scope of the present invention.

Claims (3)

1. A method for detecting defects of the inner wall of a pipeline by using a robot system, wherein the robot system comprises a front-end robot platform and a rear-end PC platform which are in communication connection;
the front-end robot platform comprises a camera shooting acquisition layer, a motion control layer and a chip control layer; the camera shooting and collecting layer comprises a camera (1) arranged through a holder; the motion control layer comprises two side motion tracks (16) and a power device; the chip control layer comprises a drive control chip, a signal receiving chip, a drive information processing chip, an image information processing chip and a magnetic induction chip; the signal receiving chip transmits the received information to the driving information processing chip, the driving information processing chip sends a signal to the driving control chip, and the driving control chip controls the motion control layer to realize the integral motion of the front-end robot platform; the pipeline path detection device and the power supply device are also included;
the back end PC platform includes:
the configuration module is used for setting an operation environment;
the receiving module is used for receiving data fed back from the front-end robot platform;
the analysis module is used for analyzing the received video data frame by frame and judging potential cracks and fissures;
the report module is used for outputting the inspection report to the user;
the pan-tilt of the camera shooting collection layer comprises two servo motors (2) arranged on the upper layer of the front end robot platform, and the two servo motors (2) are combined to form a double-steering engine pan-tilt;
the driving control chip of the chip control layer is an L298n relay (8); the signal receiving chip comprises two inclination sensors (4) and a Bluetooth module (6); the driving information processing chip is an Arduino development board (10); the image information processing chip is a raspberry development board (7); the magnetic induction chip is a Hall sensor (3); the power supply device comprises a 12V battery pack (5) and a raspberry-shaped storage battery (18);
the front end robotic platform further comprises the following components: PCF8591 analog-to-digital converter (9), arduino extension board (11) and magnetic core (13);
the two servo motors (2) are connected to an Arduino development board (10);
the 12V battery pack (5) is electrically connected with the L298n relay (8), the Arduino development board (10) is electrically connected with the L298n relay (8), the Arduino development board (11) is connected with the Arduino development board (10), and the raspberry group development board (7) is electrically connected with the raspberry group storage battery (18);
the two inclination sensors (4) are respectively connected with the Arduino expansion board (11) and used for collecting posture information of the front-end robot;
d0, VCC and GND interfaces of the Hall sensor (3) are sequentially connected with a G17 pin, VCC and GND interfaces of a BCM coding mode of the raspberry group development board (7), and SDA, SCL, VCC and GND ends of the PCF8591 analog-to-digital converter (9) are connected with SDA and SCL functional pins and VCC and GND ports of the raspberry group development board (7); an AIN0 port on the PCF8591 analog-to-digital converter (9) is connected with an A0 port of the Hall sensor (3);
the camera (1) is connected with a USB interface of the raspberry development board (7) through a data line, and the raspberry development board (7) is in contact with a back-end PC through a WIFI module arranged on the raspberry development board and then transmits data;
the Bluetooth module (6) is connected with the Arduino expansion board (11);
the power device comprises two 370 motors (14) arranged on a bottom plate (15); the moving crawler (16) is arranged on two sides of the bottom plate (15), and the two 370 motors (14) are respectively in transmission connection with the moving crawler (16); a magnetic core (13) is attached to a wheel disc, connected with the crawler belt, of one 370 motor (14), and the 370 motor (14) is respectively connected with the L298n relay (8) through a driving wire; the Hall sensor (3) is in magnetic induction connection with the magnetic core (13);
the pipe path detecting device includes: an ultrasonic ranging sensor (12) and an ultrasonic servo motor (17);
VCC, GND, ECHO and TRIG ports of the ultrasonic ranging sensor (12) are connected to the three pins at position No. 7 and to pin No. 8 of the Arduino expansion board (11), and the signal line, VCC and GND pins of the ultrasonic servo motor (17) are respectively connected with the signal S, VCC and GND pins corresponding to pin No. 9 of the Arduino expansion board;
the method is characterized in that: the method comprises the following steps:
step A, starting a front-end robot and accessing a cloud system;
a1, a power supply of a front-end robot is turned on, a raspberry storage battery is turned on, the front-end robot is connected with a specified WIFI, a detection video service is automatically started, and a data sending end program is called to send configuration information;
in the step, the sending of the configuration information comprises sending of the own IP and the port number to the appointed email box and sending of the configuration information of the Bluetooth module (6);
step A2, an administrator opens the software of the data receiving end, completes the software configuration of the data receiving end according to the relevant configuration information received in the step A1 and generates different authority accounts; in this step, the receiving end software configuration comprises a bluetooth configuration;
step A3, an operator logs in data receiving end software by using the corresponding account generated by the administrator in the step A2, sets motion voltage related parameters according to the environment, and then selects a motion control mode;
in the step, the motion control mode comprises automatic inspection and manual control detection;
step B, starting the inspection operation
B1, an operator logs in data receiving end software to perform motion assistance operation on the front-end robot; an observer uses related account login software to perform real-time manual auxiliary monitoring;
step B2, front-end data acquisition
The front-end robot starts to run continuously after being started, and the camera collects the inner wall images of each position of the pipeline in the running process and transmits real-time video stream data to the PC end;
in the step, a wheel disc of a 370 motor (14) rotates for a circle, the Hall sensor (3) and the magnetic core (13) generate magnetic induction once, and an accumulator automatically accumulates and transmits the magnetic induction to a PC (personal computer) end;
step B3, back-end data analysis
B, the back-end PC receives the information in the step B2 and starts to analyze frame by frame;
image preprocessing: taking out each frame of picture of video stream data in real time, and then sequentially carrying out contrast enhancement, graying processing and Gaussian blur processing;
edge detection: performing edge detection on the preprocessed image, wherein the detection method is Canny edge detection with thresholds of 75 and 255; the detected edges are the suspected cracks in the image;
morphological operation: performing closed operation on the image subjected to edge detection, namely expanding and corroding, wherein fine cracks are merged in the step;
judging and storing: traversing the connected domains of the image subjected to the morphological operation and calculating their sizes; when a size exceeds the threshold set at the PC end, marking the crack position on the image and storing it to a specified path, storing the time node information, and calculating and storing the position information according to the accumulator's feedback data and the diameter of the wheel disc connected to the 370 motor spindle;
step C, generating a patrol report
Step C1, after the inspection task is finished, software can automatically generate a pipeline inner wall crack and crack detection report according to preset parameters, and original inspection video data are stored;
and C2, manually auditing the inspection report by an operator and an observer, adding remark information, and packaging the inspection report and the original operation video data to be sent to an administrator account after the inspection report and the original operation video data are finished.
2. The method for detecting defects in the inner wall of a pipe using a robotic system as claimed in claim 1, wherein:
the step B further comprises the following steps:
step B20, adjusting the operation posture
In the normal driving process of the front-end robot, the two inclination sensors (4) continuously acquire data, the chip control layer judges from this feedback whether the robot has deviated from its original center-of-gravity horizontal line, and if a deviation is judged, one of the 370 motors is controlled through the L298n relay to compensate until the robot is aligned with the original center-of-gravity horizontal line.
3. The method for detecting defects in the inner wall of a pipe using a robotic system as claimed in claim 1, wherein:
the step B further comprises the following steps:
step B21, bending adjustment
In the normal running process of the robot, the ultrasonic ranging sensor (12) scans the front in real time to detect the distance, if the detected distance is smaller than a preset value, the ultrasonic servo motor (17) operates the ultrasonic ranging sensor (12) to reverse and scan again until the direction of which the distance exceeds the preset value is found, and after a new advancing direction is determined, 2 370 motors (14) are controlled to work and turn through an L298n relay (8).

Priority Applications (1)

Application Number: CN202111235057.6A
Priority Date: 2021-10-22
Filing Date: 2021-10-22
Title: Robot system and method suitable for detecting defects of inner wall of pipeline

Publications (2)

Publication Number: CN113915449A (en), published 2022-01-11
Publication Number: CN113915449B, published 2023-01-17

Family

ID=79242555

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant