CN216078838U - Running gear of robot for detecting defects of inner wall of pipeline - Google Patents


Info

Publication number
CN216078838U
CN216078838U (application CN202122553526.0U)
Authority
CN
China
Prior art keywords
robot
pipeline
chip
arduino
motors
Prior art date
Legal status
Expired - Fee Related
Application number
CN202122553526.0U
Other languages
Chinese (zh)
Inventor
艾列富
陈春生
赵明康
陈少川
张�浩
Current Assignee
Anqing Normal University
Original Assignee
Anqing Normal University
Priority date
Filing date
Publication date
Application filed by Anqing Normal University filed Critical Anqing Normal University
Priority to CN202122553526.0U
Application granted
Publication of CN216078838U

Landscapes

  • Manipulator (AREA)

Abstract

The utility model discloses a walking device for a robot that detects defects on the inner wall of a pipeline. The device comprises two 370 motors mounted on a bottom plate and moving tracks arranged on both sides of the bottom plate, the two 370 motors each being in transmission connection with one of the tracks. It further comprises a signal receiving chip for collecting posture information of the robot; a drive control chip for sending control signals to the two 370 motors; a drive information processing chip for receiving data from the signal receiving chip and sending signals to the drive control chip; and a power supply device for powering the walking device. The utility model can monitor the posture of the robot in real time while it walks and issue adjustment signals promptly to maintain a normal walking posture.

Description

Running gear of robot for detecting defects of inner wall of pipeline
Technical Field
The utility model relates to the field of pipeline inner wall defect detection, in particular to a robot system suitable for detecting cracks and fissures of a pipeline inner wall.
Background
Pipelines are among the most efficient, economical, and safe means of transporting oil and gas. Worldwide, oil and gas pipelines have been built and operated for more than a century. The pipeline industry plays a crucial role in national economies, employment, and energy supply, particularly in the Asia-Pacific, Europe, and North America. On the other hand, every oil or gas leak, or even casualty, caused by a pipeline failure raises public doubts about the impact of pipelines on the environment, ecology, climate, and community safety, and the debate over the sustainability of oil and gas pipeline development has never ceased.
At present, pipeline survey devices on the market fall into two categories: handheld telescopic probes and robot-assisted detection. A handheld telescopic probe mainly consists of a miniature camera and a telescopic rod. In use, an operator must hold the device at the pipe opening while the probe rod extends into the pipeline, and inspection is performed by eye at the other end. This is not only somewhat dangerous, but the detectable range is severely limited by the length of the telescopic rod, and the method consumes considerable manpower.
In robot-assisted detection, images of the pipeline interior are captured by a high-definition camera and shown to inspection personnel. During detection, the robot enters the pipeline through its opening, films and records the inner wall, and transmits a real-time picture to a back end, where inspectors visually watch the video of the pipeline interior and make a judgment before maintenance. The disadvantages of this method are: first, the identification of cracks and fissures depends too heavily on the subjective judgment of technicians, which hinders large-scale adoption; second, visually patrolling the video is strenuous work, and fatigue after long working hours increases the miss rate; in particular, inspection efficiency drops further when the pipeline is long and the detection range is large.
SUMMARY OF THE UTILITY MODEL
The technical problem the utility model aims to solve is to provide a walking device for a pipeline inner-wall defect detection robot that can monitor the robot's posture in real time while it walks and issue adjustment signals promptly to maintain a normal walking posture.
In order to solve this technical problem, the walking device of the robot for detecting defects of the inner wall of the pipeline comprises two 370 motors arranged on a bottom plate and moving tracks arranged on both sides of the bottom plate, the two 370 motors each being in transmission connection with one of the moving tracks;
it further comprises a signal receiving chip for collecting posture information of the robot;
a drive control chip for sending control signals to the two 370 motors;
a drive information processing chip for receiving data from the signal receiving chip and sending signals to the drive control chip;
and a power supply device for powering the walking device;
the signal receiving chip comprises two inclination sensors; the driving control chip is an L298n relay; the driving information processing chip is an Arduino development board; the power supply device comprises a 12V battery pack; still include with the Arduino expansion board that Arduino development board is connected.
A magnetic core is attached to the wheel disc through which one of the 370 motors connects to its track; a Hall sensor is also included, coupled to the magnetic core by magnetic induction and used to count the rotations of the magnetic core.
The system also comprises an ultrasonic ranging sensor and an ultrasonic servo motor, used to detect obstacles in the walking path. The VCC, GND, ECHO, and Trig ports of the ultrasonic ranging sensor are connected to the pins at positions 7 and 8 of the Arduino expansion board, and the signal line and the VCC and GND lines of the ultrasonic servo motor are connected respectively to the signal (S), VCC, and GND pins at position 9 of the Arduino expansion board.
The advantages of the utility model are as follows. A. With tracked walking and dual-motor control, and with the inclination sensors cooperating with the control system, the robot is prevented from tilting, so its posture is self-maintained. B. Through the cooperation of the magnetic core and the Hall sensor, rotations of the track wheel disc can be counted, providing a reference for estimating the walking distance. C. The device carries an ultrasonic ranging sensor and an ultrasonic servo motor and, in cooperation with the control system and the moving parts, can negotiate bends.
Drawings
FIG. 1 is a system architecture diagram of the present invention;
FIG. 2 is a top view of the front end robot of the present invention;
FIG. 3 is a diagram of a middle level architecture of the front end robot of the present invention;
FIG. 4 is a bottom level structural view of the front end robot of the present invention;
FIG. 5 is an overall workflow diagram of the present invention;
FIG. 6 is an analysis flow diagram of the back end PC platform of the present invention;
FIG. 7 is a flowchart of front end robot travel control according to the present invention;
FIG. 8 is a flow chart of front end robot over-bending control in the present invention;
FIG. 9 is a flowchart of the posture maintenance of the front-end robot in the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings.
As can be seen from fig. 1, the present specification discloses a robot system suitable for pipeline inner-wall defect detection, comprising a front-end robot platform and a back-end PC platform in communication connection. The utility model specifically relates to the walking device of the robot in this system.
In terms of function, the front-end robot platform comprises a camera acquisition layer, a motion control layer, and a chip control layer. The camera acquisition layer comprises a camera mounted on a pan-tilt; the motion control layer comprises the two side tracks and the power unit; the chip control layer comprises the drive control chip, the signal receiving chip, the drive information processing chip, the image information processing chip, and the magnetic induction chip; a pipeline path detection device and a power supply device are also included. The signal receiving chip passes the received information to the drive information processing chip, which sends signals to the drive control chip, which in turn controls the motion control layer to move the front-end robot platform as a whole.
The specific parts of the front-end robot platform are: camera 1, two servo motors 2, Hall sensor 3, two inclination sensors 4, 12V battery pack 5, Bluetooth module 6, Raspberry Pi development board 7, L298n relay 8, PCF8591 analog-to-digital converter 9, Arduino development board 10, Arduino expansion board 11, ultrasonic ranging sensor 12, magnetic core 13, two 370 motors 14, bottom plate 15, moving tracks 16, ultrasonic servo motor 17, and Raspberry Pi battery 18.
The drive control chip is the L298n relay 8; the signal receiving chip comprises the Bluetooth module 6; the drive information processing chip is the Arduino development board 10; the image information processing chip is the Raspberry Pi development board 7; the magnetic induction chip is the Hall sensor 3; the pipeline path detection device comprises the ultrasonic ranging sensor 12; the power supply device comprises the 12V battery pack 5 and the Raspberry Pi battery 18.
The camera acquisition layer is mainly responsible for controlling the camera 1, via the pan-tilt, to acquire data and transmit it to the middle layer. As shown in figs. 2 and 3, a pan-tilt is disposed at the upper center of the front-end robot, with the camera 1 mounted on it. In this embodiment, two servo motors 2 form a dual-servo pan-tilt whose axes rotate in the horizontal and vertical planes respectively; the two servo motors 2 are connected to the Arduino development board 10, and the pan-tilt can perform 360° panoramic scans.
The chip control layer is mainly responsible for controlling each part, and for receiving the data collected by the camera 1 and transmitting it to the back-end PC platform.
As shown in fig. 3, on the central axis of the middle layer of the front-end robot, the 12V battery pack 5, Raspberry Pi battery 18, L298n relay 8, PCF8591 analog-to-digital converter 9, ultrasonic ranging sensor 12, and ultrasonic servo motor 17 are arranged in order from back to front. The Bluetooth module 6 and the Arduino expansion board 11 are arranged on the left of the central axis, with the Arduino development board 10 at the front end of the Arduino expansion board 11. The Raspberry Pi development board 7 sits in the middle of the right side of the central axis, with the Hall sensor 3 at its rear end. The two inclination sensors 4 are arranged on either side of the central axis.
The 12V battery pack 5 is electrically connected to the L298n relay 8, and the Arduino development board 10 is electrically connected to the L298n relay 8 to obtain power; the Arduino expansion board 11 draws power from the Arduino development board 10, and the Raspberry Pi development board 7 is electrically connected to the Raspberry Pi battery 18. The specific connections are as follows:
the positive and negative poles of the 12V battery pack 5 are connected to the 12V port and the GND port of the L298n relay 8, respectively, to energize the L298n relay 8. The Arduino development board 10 is powered on with two wires, one connecting the +5V and GND of the L298n relay 8 and one connecting the VCC and GND ports of the Arduino development board 11. Two wires are led out from ENA and ENB of the relay 8 of L298n respectively to connect interfaces No. 5 and No. 6 of the Arduino expansion board 11 for controlling PWM output value. Four further conductors were connected from IN1, IN2, IN3, IN4 of the L298n relay 8 to the ports of a2, A3, a4, a5, respectively, of the Arduino expansion board 11. Meanwhile, the raspberry group development board 7 is connected with the raspberry group storage battery 18 through a Micro USB interface, so that the raspberry group development board 7 is powered on.
The two inclination sensors 4 are each connected to the Arduino expansion board 11 and are used to collect posture information of the front-end robot. The specific connections are as follows: the OUT, VCC, and GND ports of the two side-mounted inclination sensors 4 are connected in order to the three pins at positions 3 and 4 of the Arduino expansion board 11.
The D0, VCC, and GND interfaces of the Hall sensor 3 are connected in order to the G17 pin (BCM numbering), VCC, and GND interfaces of the Raspberry Pi development board 7, and the SDA, SCL, VCC, and GND terminals of the PCF8591 analog-to-digital converter 9 are connected to the SDA, SCL, VCC, and GND ports of the Raspberry Pi development board 7. The AIN0 port of the PCF8591 analog-to-digital converter 9 is connected to the A0 port of the Hall sensor 3; the converter is placed in front of the L298n relay 8 and fixed above the middle-layer structure.
The VCC, GND, ECHO, and TRIG ports of the ultrasonic ranging sensor 12 are connected to the pins at positions 7 and 8 of the Arduino expansion board 11. The signal, VCC, and GND lines of the two servo motors 2 and of the ultrasonic servo motor 17 are connected respectively to the signal (S), VCC, and GND pins at positions 9, 10, and 11 of the Arduino expansion board. The ultrasonic ranging sensor 12 acquires information about obstacles and bends ahead and provides guidance for cornering; the ultrasonic servo motor 17 turns the ultrasonic ranging sensor 12 to change its heading.
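A TRIG/ECHO ultrasonic sensor like the one wired above reports distance as the width of the echo pulse: distance is half the round-trip time multiplied by the speed of sound. A minimal sketch of that conversion (assuming an HC-SR04-style sensor; function names and the 30 cm threshold are illustrative):

```python
# Convert an ECHO pulse width to distance, HC-SR04 style. Hardware timing
# is not modeled; the pulse width is taken as an input in seconds.
SPEED_OF_SOUND_CM_S = 34300.0  # ~343 m/s in air at 20 °C

def echo_to_distance_cm(echo_pulse_s: float) -> float:
    """One-way distance in cm: half the round trip at the speed of sound."""
    return echo_pulse_s * SPEED_OF_SOUND_CM_S / 2.0

def is_obstacle(echo_pulse_s: float, threshold_cm: float = 30.0) -> bool:
    """True if the measured distance falls below the preset value."""
    return echo_to_distance_cm(echo_pulse_s) < threshold_cm
```

For example, a 1 ms echo corresponds to 17.15 cm, which would count as an obstacle under the illustrative 30 cm preset.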
The camera 1 of the upper structure is connected by a data line to a USB interface of the Raspberry Pi development board 7 in the middle layer. A PCF data acquisition program on the Raspberry Pi development board 7 collects images through the camera, and the Raspberry Pi development board 7 communicates with the back-end PC through its onboard WIFI module to transmit the data.
The motion control layer is mainly responsible for walking; in addition, the magnetic core 13 on the bottom layer cooperates with the Hall sensor 3 in the middle layer to realize simulated acquisition of the front-end robot's position information.
As shown in fig. 4, the bottom layer of the front-end robot comprises the two 370 motors 14 and the bottom plate 15, with the moving tracks 16 arranged on both sides of the bottom plate 15 and each 370 motor 14 in transmission connection with one track 16. The magnetic core 13 is attached to the wheel disc through which one 370 motor 14 connects to its track; the four drive wires led out of the 370 motors 14 are connected respectively to the OUT1, OUT2, OUT3, and OUT4 pins of the L298n relay 8, completing the drive control connections of the bottom layer. The Hall sensor 3 of the middle-layer structure is positioned so that it is magnetically coupled to the magnetic core 13 on the bottom layer.
The basic working principle is as follows: the two 370 motors drive the front-end robot to walk on the two side tracks 16; the two servo motors 2 rotate the dual-servo pan-tilt, sweeping the camera 1; the resulting video signal is collected and converted by the PCF data acquisition program on the Raspberry Pi development board 7 and transmitted to the back-end PC platform over WIFI, where frame-by-frame analysis is completed and potential cracks and fissures are identified.
The chip control layer is provided with an accumulator: each full rotation of a 370 motor 14 produces one induction event between the magnetic core 13 and the Hall sensor 3, and the accumulator increments automatically. By combining the induction count, the track wheel-disc diameter, and the direction of motion, the current position of the front-end robot in the pipeline can be estimated. The back-end PC finally associates this position information with the video signal, so that it can not only judge pipeline defects from the video but also estimate where each video signal was collected.
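The position estimate above reduces to simple arithmetic: one induction event per wheel-disc rotation means the distance covered is the pulse count times the disc circumference. A minimal sketch (the `direction` sign convention is an illustrative assumption; the patent only states that count, disc diameter, and direction are combined):

```python
import math

def position_cm(pulse_count: int, wheel_disc_diameter_cm: float,
                direction: int = 1) -> float:
    """Estimate the robot's position along the pipe.

    One Hall/magnetic-core induction event = one full wheel-disc rotation,
    so distance = pulse_count x (pi x diameter). `direction` is +1 for
    forward and -1 for reverse.
    """
    circumference = math.pi * wheel_disc_diameter_cm
    return direction * pulse_count * circumference
```

With a 5 cm disc, ten pulses put the robot roughly 157 cm from its start.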
Repeated detection: the Bluetooth module 6 is connected to the Arduino expansion board 11 and is used to receive signals. The specific connections are: the Bluetooth module 6 is connected to the VCC, GND, RXD, and TXD pins of the Bluetooth area of the Arduino expansion board 11. In practical use, an operator can issue intervention commands through the Bluetooth module 6. In the utility model, the defect detection modes comprise automatic inspection and manually controlled detection. Automatic inspection is fast and highly automated; if a detected section is in doubt, a command can be issued through the Bluetooth module 6 to switch to manual mode for local repeated detection.
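The mode switching this describes is a two-state machine toggled by Bluetooth commands. A minimal sketch (the command strings are illustrative assumptions; the patent only states that the operator can switch between automatic inspection and manual control):

```python
# Two-state controller toggled by commands received over Bluetooth.
class InspectionController:
    def __init__(self):
        self.mode = "auto"  # automatic inspection is the default mode

    def on_bluetooth_command(self, cmd: str) -> str:
        """Switch mode on a recognized command; ignore anything else."""
        if cmd == "manual":   # operator doubts a section: re-inspect by hand
            self.mode = "manual"
        elif cmd == "auto":   # resume automatic inspection
            self.mode = "auto"
        return self.mode
```

In the real system the command would arrive on the Arduino's RXD line; here it is just a function argument.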
Posture adjustment: the detection object of the utility model is a pipeline, whose bottom is an upward-curving arc, so the front-end robot tends to tilt while walking. The posture change is detected by the two inclination sensors 4, and a correction command is sent to the 370 motors 14 through the control system, so that the correct posture is maintained at all times.
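The correction decision can be sketched as a comparison of the two side-mounted inclination readings: when their difference leaves a small dead band, the lower side's motor is boosted. A minimal sketch (dead-band width, sign convention, and return values are illustrative assumptions, not from the patent):

```python
def posture_correction(left_tilt_deg: float, right_tilt_deg: float,
                       dead_band_deg: float = 2.0) -> str:
    """Decide which track motor to boost to level the robot.

    The two inclination sensors sit on either side of the chassis; inside
    the dead band the robot is considered on its original centre-of-gravity
    horizontal line and no correction is issued.
    """
    diff = left_tilt_deg - right_tilt_deg
    if abs(diff) <= dead_band_deg:
        return "hold"
    return "boost_left" if diff < 0 else "boost_right"
```

The returned token would be translated into PWM commands for the corresponding 370 motor through the drive control chip.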
Cornering and obstacle avoidance: while walking, the ultrasonic servo motor 17 turns the ultrasonic ranging sensor 12, which continuously acquires information about obstacles and bends ahead; correction commands are sent to the 370 motors 14 through the control system to avoid obstacles and negotiate bends.
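The cornering search can be sketched as: if the current heading is clear, keep it; otherwise sweep the sensor and pick the direction with the greatest clearance above the preset value. A minimal sketch (the angle-keyed scan dictionary and sweep protocol are illustrative assumptions):

```python
def choose_heading(scan: dict, preset_cm: float):
    """Pick a new forward direction from an ultrasonic sweep.

    `scan` maps servo angle (degrees, 0 = straight ahead) to measured
    distance (cm). Returns 0 if the current heading is clear, the clearest
    angle above the preset otherwise, or None if every direction is blocked.
    """
    if scan.get(0, 0.0) >= preset_cm:
        return 0
    angle, dist = max(scan.items(), key=lambda kv: kv[1])
    return angle if dist >= preset_cm else None
```

Once an angle is returned, the control system would command the two 370 motors through the L298n relay to turn toward it.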
Multiple options exist for the control system and chips. As one alternative — Arduino development board: manufacturer: Shanghai Lang Electronic Technology Co., Ltd.; model: Arduino Uno R3. Raspberry Pi development board: manufacturer: Sony UK; model: 4B. Arduino expansion board: manufacturer: Taiwan Intelligent Sensing Technology Co., Ltd.; model: Arduino Sensor Shield V5.0. Hall sensor: manufacturer: Shenzhen Tegat Technologies; model: A3144E.
As can be seen from fig. 5, 6 and 7, the working flow of the present invention is specifically as follows:
step A, starting a front-end robot and accessing a cloud-end system;
and A1, turning on a power supply of the front-end robot, turning on a raspberry type storage battery, enabling the front-end robot to be connected with the appointed WIFI, automatically starting detection video service, and calling a data sending end program to send configuration information. In this step, the sending the configuration information includes sending the own IP and port number to the designated email box, and also includes sending the configuration information of the bluetooth module 6.
Step A2: the administrator opens the data receiving software, completes its configuration according to the information received in step A1, and generates accounts with different permissions. In this step, the receiving software configuration includes the Bluetooth configuration.
Step A3: the operator logs into the data receiving software with the account generated by the administrator in step A2, sets the motion-voltage parameters according to the environment, and then selects a motion control mode.
In this step, the motion control mode includes automatic inspection and manual control detection.
Step B, starting the inspection operation
Step B1: an operator logs into the data receiving software to assist the motion of the front-end robot, and an observer logs in with the corresponding account to perform real-time manual auxiliary monitoring.
The operator and the observer can intervene manually to deal with emergencies during the automatic inspection operation.
Step B2, front-end data collection
The front-end robot starts to run continuously after being started, and the camera collects the inner wall images of all positions of the pipeline in the running process and transmits real-time video stream data to the PC end.
In this step, each full rotation of the wheel disc of the 370 motor 14 produces one magnetic induction event between the Hall sensor 3 and the magnetic core 13, and the accumulator increments automatically and transmits the count to the PC.
Step B3, backend data analysis
As can be seen in fig. 6, the back-end PC receives the information of step B2 and starts to perform frame-by-frame analysis.
Image preprocessing: each frame of the video stream is taken out in real time and then subjected, in order, to contrast enhancement, graying, and Gaussian blur, to facilitate subsequent crack detection.
Edge detection: edge detection is performed on the preprocessed image using Canny edge detection, with thresholds of 75 and 255 respectively. The detected edges are the suspected cracks in the image.
Morphological operation: a closing operation (dilation followed by erosion) is applied to the edge-detected image; this step merges fine cracks.
Judging and storing: the connected components of the morphologically processed image are traversed and their sizes computed. When a size exceeds the threshold set at the PC, the crack position in the image is marked and saved to a designated path, the time node is stored, and the position is calculated and stored from the accumulator feedback and the diameter of the wheel disc on the 370 motor spindle. Each captured key picture is stored under the robot position and time, with the file-name format: year-month-day + hour:minute:second + position. For example, "2021-04-07 10:25:09 13.24cm" means a suspected defect was detected at that moment, at a judged distance of 13.24 cm from the start position.
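The analysis steps above map directly onto standard OpenCV calls. A minimal sketch of one frame of the pipeline plus the file-name rule (the `min_area` threshold and 3×3 kernel are illustrative assumptions; graying is done before histogram equalization since `equalizeHist` requires a single-channel image, whereas the text lists contrast enhancement first; OpenCV is imported lazily so the file-name helper stays portable):

```python
import datetime

def key_frame_name(t: "datetime.datetime", position_cm: float) -> str:
    """Build the storage file name described in the text, e.g.
    '2021-04-07 10:25:09 13.24cm'."""
    return f"{t:%Y-%m-%d %H:%M:%S} {position_cm:.2f}cm"

def detect_cracks(frame, min_area: float = 500.0):
    """One frame of back-end analysis: preprocess, Canny (75/255),
    morphological closing, then connected components above a size
    threshold. Returns (x, y, w, h) boxes of suspected cracks."""
    import cv2
    import numpy as np
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)            # graying
    gray = cv2.equalizeHist(gray)                             # contrast enhancement
    blur = cv2.GaussianBlur(gray, (5, 5), 0)                  # Gaussian blur
    edges = cv2.Canny(blur, 75, 255)                          # thresholds from the text
    kernel = np.ones((3, 3), np.uint8)
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)  # merge fine cracks
    n, _, stats, _ = cv2.connectedComponentsWithStats(closed)
    # label 0 is the background; keep components above the size threshold
    return [tuple(int(v) for v in stats[i][:4]) for i in range(1, n)
            if stats[i][cv2.CC_STAT_AREA] > min_area]
```

A frame whose `detect_cracks` result is non-empty would then be saved under `key_frame_name(capture_time, position_cm(...))`.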
Step C, generating a patrol report
Step C1: after the inspection task is finished, the software automatically generates a pipeline inner-wall crack and fissure detection report according to the preset parameters, and the original inspection video data is archived.
Step C2: the operator and the observer manually review the inspection report and add comments; when finished, the report and the original operation video data are packaged and sent to the administrator account.
Example one
To verify the detection performance of the utility model, simulation experiments were performed.
Test subject: a 20-meter section of transport pipeline with cracks and fissures;
Detection target: cracks and fissures inside the transport pipeline.
Test method: cracks and fissures are arranged inside the transport pipeline; the robot of the system is placed into the pipeline and connected to the system; detection is performed and the detection report is received.
Step A, starting the robot and accessing a cloud system;
step B, starting the inspection operation
Step B1: the operator logs into the data receiving software to assist the motion of the robot; the test account is: test, password: t123456. In this embodiment the observer role is not set, and the observer's work is done by the operator.
Step B2, front-end data collection
The robot starts to continuously run under the control of an operator after being started, the camera collects the inner wall images of all positions of the pipeline in the running process, and real-time video stream data are transmitted to the PC end.
Step C, generating a patrol report
During the inspection of step B, the operator watches the real-time detection video and intervenes manually in time, correcting and refining the automatic crack detection results, including modifying the positions of pipeline fault points and the detection time points. After the inspection task formally ends, the software automatically generates an inspection report from the detection results and the operator's interventions; the report records the marked fault-point pictures in detail, together with the fault time accurate to the second and the robot's pipeline position accurate to the centimeter.
Example two
As shown in fig. 8, in this embodiment, the step B further includes:
step B20, adjusting operation posture
During normal driving of the front-end robot, the two inclination sensors 4 continuously acquire data, and the chip control layer judges from their feedback whether the robot has deviated from its original center-of-gravity horizontal line; if a deviation is judged, one of the 370 motors is controlled through the relay to compensate, until the robot returns to the original center-of-gravity horizontal line.
Test subject: a long, straight 20-meter transport pipeline with a diameter over 50 cm;
Detection target: whether the robot of the pipeline inner-wall defect detection system can keep its balance while driving in the pipeline.
Test method: the driving robot is disturbed with a long rod so that it deviates from its original center-of-gravity horizontal line; cracks and fissures are arranged inside the transport pipeline; the robot of the system is placed into the pipeline and connected to the system; detection is performed and the detection report is received.
Step A, starting the robot and accessing a cloud system;
step B, starting the inspection operation
Step B1: the operator logs into the data receiving software to assist the motion of the robot; the test account is: test, password: t123456. In this embodiment the observer role is not set, and the observer's work is done by the operator.
Step B2, front-end data collection
After start-up, the robot runs continuously under the operator's control; during the run the camera collects images of the inner wall at every position of the pipeline, and real-time video stream data is transmitted to the PC. The time points of the shots and the robot's position in the pipeline are transmitted together with the video stream.
During normal driving, a long, straight, rigid rod is used to disturb the robot so that its center of gravity deviates from the original center-of-gravity horizontal line. One of the two inclination sensors 4 on the robot's two sides senses the tilt and transmits a signal to the L298n relay 8 in the chip control layer, which adjusts the robot's heading so that its center of gravity returns to the original horizontal line.
Step C, generating a patrol report
During the inspection of step B, the operator watches the real-time detection video and intervenes manually in time, correcting and refining the automatic crack detection results, including modifying the positions of pipeline fault points and the detection time points. After the inspection task formally ends, the software automatically generates an inspection report from the detection results and the operator's interventions; the report records the marked fault-point pictures in detail, together with the fault time accurate to the second and the robot's pipeline position accurate to the centimeter.
EXAMPLE III
As shown in fig. 9, in this embodiment, the step B further includes:
step B21, overbending adjustment
During normal driving, the ultrasonic ranging sensor 12 scans ahead in real time to measure distance. If the measured distance is smaller than a preset value, the ultrasonic servo motor 17 turns the ultrasonic ranging sensor 12 to scan again until a direction is found in which the distance exceeds the preset value; once the new forward direction is determined, the two 370 motors are controlled through the L298n relay 8 to turn the robot.
Test subject: a 20-meter section of transport pipeline with a diameter over 50 cm, containing a bend;
Detection target: whether the robot of the pipeline inner-wall defect detection system can negotiate the bend.
Test method of this example: the robot is placed five meters before the bend; cracks and fissures are arranged inside the transport pipeline; the robot of the system is placed into the pipeline and connected to the system; detection is performed and the detection report is received.
Step A, starting the robot and accessing a cloud system;
step B, starting the inspection operation
Step B1: the operator logs into the data receiving software to assist the motion of the robot; the test account is: test, password: t123456. In this embodiment the observer role is not set, and the observer's work is done by the operator.
Step B2, front-end data collection
After start-up, the robot runs continuously under the operator's control; during the run the camera collects images of the inner wall at every position of the pipeline, and real-time video stream data is transmitted to the PC. The time points of the shots and the robot's position in the pipeline are transmitted together with the video stream.
Step B21, bend negotiation test
The robot is placed 5 m before the bend again and set to drive autonomously. Its ultrasonic ranging sensor 12 measures the distance ahead; if the distance to the pipe's inner wall at the bend is below the preset value, the surroundings are scanned for a direction whose distance exceeds the preset value, and once that direction is determined the two 370 motors 14 are driven through the L298n relay 8 to turn the robot.
Step C, generating the inspection report
During the inspection of step B, the operator watches the real-time video feed and intervenes promptly, correcting and refining the automatic crack-detection results, including revising the position of a pipeline fault point and its detection time. After the inspection task formally ends, the software automatically generates an inspection report from the detection results and the operator's interventions; the report records each marked fault-point picture in detail, together with the fault time accurate to the second and the robot's position along the pipeline accurate to the centimetre.
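The centimetre-level position recorded in the report comes from the Hall sensor (3) counting revolutions of the magnetic core (13) on the drive wheel, as recited in claim 2. A minimal odometry sketch follows; the wheel diameter is an illustrative assumption, since the patent does not specify it.

```python
import math

def pipe_position_cm(revolutions: int,
                     wheel_diameter_cm: float = 4.0) -> float:
    """Estimate the distance travelled along the pipe from the number
    of wheel revolutions accumulated by the Hall sensor.

    wheel_diameter_cm is an assumed value for illustration only.
    """
    # Each full revolution advances the robot by one circumference.
    return revolutions * math.pi * wheel_diameter_cm
```

Because each revolution advances the robot by one wheel circumference, the resolution of this estimate is one circumference per count; a magnetic core with several poles would give proportionally finer steps.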
The embodiments of the present utility model include, but are not limited to, those described above; those skilled in the art may make corresponding changes and modifications according to the present utility model without departing from its spirit and substance, and such changes and modifications still fall within the scope of protection of the present utility model.

Claims (3)

1. A running gear of a robot for detecting defects on the inner wall of a pipeline, comprising two 370 motors (14) mounted on a bottom plate (15) and moving tracks (16) arranged on both sides of the bottom plate (15), the two 370 motors (14) being in transmission connection with the moving tracks (16) respectively; characterized in that:
the running gear further comprises a signal receiving chip for collecting attitude information of the robot;
a drive control chip for sending control signals to the two 370 motors (14);
a drive information processing chip for receiving data from the signal receiving chip and sending signals to the drive control chip;
and a power supply device for supplying power to the running gear;
wherein the signal receiving chip comprises two inclination sensors (4); the drive control chip is an L298n relay (8); the drive information processing chip is an Arduino development board (10); the power supply device comprises a 12 V battery pack (5); and an Arduino expansion board (11) connected to the Arduino development board (10) is further provided.
2. The running gear of a robot for detecting defects on the inner wall of a pipeline according to claim 1, characterized in that: a magnetic core (13) is attached to the wheel disc of one 370 motor (14) connected to the track; and a Hall sensor (3) magnetically coupled to the magnetic core (13) is further provided for counting the revolutions of the magnetic core (13).
3. The running gear of a robot for detecting defects on the inner wall of a pipeline according to claim 1, characterized in that: it further comprises an ultrasonic ranging sensor (12) and an ultrasonic servo motor (17) for identifying obstacles on the travel path; the VCC and GND ports of the ultrasonic ranging sensor (12) are connected to the VCC and GND positions of the Arduino expansion board (11) and its ECHO and Trig ports to pins 7 and 8; and the signal line, VCC pin and GND pin of the ultrasonic servo motor (17) are connected respectively to the signal pin S at position 9 and to the VCC and GND positions of the Arduino expansion board (11).
CN202122553526.0U 2021-10-22 2021-10-22 Running gear of robot for detecting defects of inner wall of pipeline Expired - Fee Related CN216078838U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202122553526.0U CN216078838U (en) 2021-10-22 2021-10-22 Running gear of robot for detecting defects of inner wall of pipeline


Publications (1)

Publication Number Publication Date
CN216078838U true CN216078838U (en) 2022-03-18

Family

ID=80640863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202122553526.0U Expired - Fee Related CN216078838U (en) 2021-10-22 2021-10-22 Running gear of robot for detecting defects of inner wall of pipeline

Country Status (1)

Country Link
CN (1) CN216078838U (en)


Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220318