CN113566830B - Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot - Google Patents


Info

Publication number
CN113566830B
CN113566830B (application CN202110815973.0A)
Authority
CN
China
Prior art keywords
wheel
robot
server
connecting rod
foot composite
Prior art date
Legal status
Active
Application number
CN202110815973.0A
Other languages
Chinese (zh)
Other versions
CN113566830A (en
Inventor
魏星 (Wei Xing)
杨长春 (Yang Changchun)
Current Assignee
Changzhou University
Original Assignee
Changzhou University
Priority date
Filing date
Publication date
Application filed by Changzhou University filed Critical Changzhou University
Priority to CN202110815973.0A
Publication of CN113566830A
Application granted
Publication of CN113566830B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D57/00 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, with ground-engaging propulsion means, e.g. walking members
    • B62D57/028 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, with ground-engaging propulsion means having wheels and mechanical legs
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching


Abstract

An outdoor high-precision autonomous navigation device and method for a wheel-foot composite robot, belonging to the field of autonomous navigation of mobile robots. A 3D color camera serves as the scanning device and captures object information. The server comprises a control module and an operation module: the control module receives and forwards instructions and data between the scanning device and the GPS device, and the operation module integrates and computes the data. During autonomous navigation, the wheel-foot composite robot judges and executes tasks based on the object information captured by the 3D color camera, and uses a mechanical arm to grasp objects. According to instructions sent by the server, the motor adjusts the steering and speed of the robot's chassis so that it moves toward the object. While moving, the robot uses a linkage mechanism to switch between walking modes suited to the specific environment.

Description

Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot
Technical Field
The invention belongs to the field of autonomous navigation of mobile robots, and particularly relates to a device and method with which a wheel-foot composite robot performs high-precision, high-reliability autonomous navigation and carries out specific tasks in specific outdoor scenes.
Background
With the ever wider application of autonomous navigation in fields such as the military, aerospace and industry, as well as in daily life, expectations for robots keep rising, and their working environments are shifting from indoors to more complex, unstructured outdoor environments. Mobile robots have great application prospects in disaster relief, security monitoring, industrial manufacturing and similar fields; however, the differences among working environments make it difficult for a robot to perform well in every environment. Since the beginning of the 21st century, advanced mobile robots for different fields have appeared, and the range of motion of mobile robots has expanded comprehensively. To adapt to different outdoor environments, mobile robots are divided into wheeled robots, tracked robots, composite robots and so on.
With the continuous development of mobile robots and the expansion of their application areas, they may work in ever more complex and unknown environments in the future. Mobile robots that rely solely on wheels or on legs cannot fully adapt to the complexity and diversity of these working environments. To meet the gradually rising demands on mobile machines, several hybrid locomotion mechanisms have been developed. Among them, the wheel-foot composite robot fuses the characteristics of wheeled and legged mobile robots: it preserves moving efficiency on flat ground while retaining good obstacle-crossing ability, and is mainly used to transport materials for people in harsh environments during field operations. It still has shortcomings, however. The control system of a complex quadruped walking robot is a nonlinear, multi-input multi-output unstable system with time-varying and intermittent behavior. Limited by the non-real-time nature of Windows systems and by the software development level and experience of laboratory members, the real-time performance of the whole system can be very low: one control period lasts 50-100 ms, the bandwidth is clearly insufficient, the effect is poor, and only low-speed movement of the robot is barely realized. In urban search and rescue (USAR) scenarios, a robust scan-matching method for a lidar system combined with an inertial-sensor-based 3D attitude estimation system allows the robot to localize reliably and build a precise map, but this approach is expensive to build and not widely applicable.
The main significance of the invention is that a 3D color camera is used to acquire identification information about objects, so that the wheel-foot composite robot can grasp objects and navigate autonomously with high precision in outdoor environments. During autonomous movement the robot recognizes object feature information more accurately, keeps a stable state and executes tasks efficiently, at lower cost and with a clear optimization effect.
Disclosure of Invention
The invention provides a device with which the wheel-foot composite robot achieves high-precision, high-reliability autonomous navigation and carries out specific tasks in specific scenes, and designs and optimizes the main-flow path-search algorithm, so that the robot obtains more accurate environmental information, remains more stable during autonomous movement, and shows a clearer optimization effect.
The invention adopts the following technical scheme to solve the technical problems:
the outdoor high-precision autonomous navigation device of the wheel-foot composite robot comprises a 3D color camera, a motor, a server, a pressure sensor, GPS equipment, a wheel-foot composite robot body and a running mechanism.
The 3D color camera serves as the scanning device and captures object information. The server comprises a control module and an operation module: the control module receives and forwards instructions and data between the scanning device and the GPS device, and the operation module integrates and computes the data. During autonomous navigation, the wheel-foot composite robot judges and executes tasks based on the object information captured by the 3D color camera, and uses a mechanical arm to grasp objects. According to instructions sent by the server, the motor adjusts the steering and speed of the robot's chassis so that it moves toward the object. While moving, the robot uses a linkage mechanism to switch between walking modes suited to the specific environment.
The travelling mechanism of the wheel-foot composite robot comprises a crank 1, a connecting-rod driving motor 2, a frame 3, a rotary joint 4, a connecting rod 5, a crawling grounding part 6, a rocker 7, a motorized drive wheel 8, a folding rod 9 and a telescopic cylinder 10. The connecting-rod driving motor 2 is mounted on the frame 3 and rotationally connected to one end of the crank 1; the other end of the crank 1 is rotationally connected to one end of the connecting rod 5, whose other end is rotationally connected to the rotary joint 4; the rotary joint 4 is rotationally connected to the crawling grounding part 6 and to one end of the rocker 7, whose other end is rotationally connected to the frame 3. Rotation of the connecting-rod driving motor 2 drives the rotary joint 4, which moves the connecting rod 5 and the rocker 7 and thus moves the crawling grounding part 6 up and down. Where the frame 3 joins the rocker 7, the upper end of a support rod is rotationally connected; the lower end of the support rod carries the motorized drive wheel 8, which is connected to the folding rod 9; a telescopic cylinder 10 mounted on the folding rod 9 extends and retracts to drive the support rod and the folding rod 9 up and down as a whole.
Further, the crawling grounding part 6 is a flat plate lying in the same horizontal plane as the motorized drive wheel 8. In the linkage formed by the crank 1, the connecting rod 5 and the rocker 7, the length of the longest member must be less than the sum of the lengths of the remaining members.
Further, when the robot encounters a gully or cracked terrain, the 3D color camera scans the width of the crack in the road and transmits it to the server, which compares it with the wheel diameter; according to the result, the server sends a control instruction to the connecting-rod driving motor 2, which drives the crank 1, the connecting rod 5 and the rocker 7 to rise, so that the crawling grounding part 6 moves upward. Meanwhile, the telescopic cylinder 10 relaxes its piston to apply a downward force on the folding rod 9 and lower the motorized drive wheel 8; by contracting the piston, the cylinder moves the folding rod 9 upward and lifts the motorized drive wheel 8, so that the crawling grounding part 6 can cross the obstacle.
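As a sketch of the decision just described, the crack width scanned by the 3D color camera can be compared with the wheel diameter to pick a travel mode. This is a minimal illustration; the function name, the 0.5 safety factor and the mode labels are assumptions, not taken from the patent:

```python
def select_walking_mode(crack_width_m: float, wheel_diameter_m: float) -> str:
    """Choose the travel mode for a scanned road crack.

    If the crack is narrow relative to the wheel, the wheeled mode can
    roll across it; otherwise the linkage lifts the wheels and the
    crawling (legged) mode steps over the obstacle.
    """
    if crack_width_m < 0.5 * wheel_diameter_m:  # assumed safety margin
        return "wheeled"    # keep the motorized drive wheels down
    return "crawling"       # raise the wheels, use the crank-rocker legs


# A 5 cm crack with 30 cm wheels is rolled over; a 40 cm gully is
# stepped over by the linkage instead.
print(select_walking_mode(0.05, 0.30))
print(select_walking_mode(0.40, 0.30))
```

The threshold would in practice come from the server's comparison of the scanned width parameter against the stored wheel geometry.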
The outdoor high-precision autonomous navigation method of the wheel-foot composite robot comprises the following steps:
step 1, a wheel-foot composite robot receives a target place data set X [ n ] through GPS equipment]And stored in a server according to the target site data set X [ n ]]And (5) establishing a map. Where n is a one-dimensional array. With the initial position of the robot as the origin of coordinates O, i.e., [0,0 ]]Establishing a three-dimensional rectangular coordinate system, and forming a set of coordinates of each target point to be reached by the robot, wherein the coordinates of each target point are [ x ] m ,y m ,z m ],m∈N * . Where m represents the number of the different target sites.
Step 2, compute the distance matrix E: the operation module computes the cost F_i the robot pays between each pair of adjacent sites and uses the F_i, i ∈ N*, as the column entries of E, with the set of site coordinates obtained in step 1 indexing the rows.
Step 3, according to the robot's current position obtained from the GPS device, the operation module randomly generates a route from the current position to the target site; the route is composed of sites [x_m, y_m, z_m], m ∈ N*, drawn from the coordinate set of step 1, and is denoted initial_path.
Step 4, the operation module computes the total cost value F of initial_path, F = Σ_i F_i, where i ∈ initial_path.
Step 5, set a fitness function f = ε^(-1), with ε > 0 constant, and fuse it with the total cost value F:
F_opt = F·exp(f) (1-1)
The optimization cost obtained by the fusion in formula (1-1) and its corresponding route are then stored in the server.
A guiding factor ε for the robot's path planning is set; it depends on the damage degree ρ_l of the road, the mass M of the carried materials and the demand degree Q of the specific disaster area for those materials. The mass M is computed from the pressure-sensor reading U.
the demand level Q of the specific disaster area for the materials carried by the robot is T, and the materials can generate different rescue effects for different disaster area groups i Rescue ability P can be represented, namely: the demand level Q of the supplies can be expressed as:
P=Q(T i ) (1-3)
The road damage degree ρ_l is obtained from the damage grade of the highway asphalt pavement, which is divided into four classes:
Ⅰ is the crack class, mainly comprising alligator cracks, block cracks, longitudinal cracks and transverse cracks;
Ⅱ is the loosening class, mainly comprising potholes and raveling;
Ⅲ is the deformation class, mainly comprising subsidence, rutting, and waves and shoving;
Ⅳ is the other class, mainly comprising bleeding, patching, frost heave and frost boiling.
The damage degrees ρ_l, l ∈ [Ⅰ, Ⅳ], corresponding to these four grades of asphalt-pavement damage increase with the grade, namely: ρ_Ⅰ < ρ_Ⅱ < ρ_Ⅲ < ρ_Ⅳ.
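The grade-to-degree mapping can be held in a small lookup table. The numeric values below are illustrative placeholders only, since the text fixes nothing beyond the ordering ρ_Ⅰ < ρ_Ⅱ < ρ_Ⅲ < ρ_Ⅳ:

```python
# Hypothetical damage-degree values: the patent only requires that rho
# increase with the damage grade; the concrete numbers are placeholders.
ROAD_DAMAGE_DEGREE = {
    "I":   0.1,  # crack class: alligator, block, longitudinal, transverse cracks
    "II":  0.3,  # loosening class: potholes, raveling
    "III": 0.6,  # deformation class: subsidence, rutting, waves and shoving
    "IV":  0.9,  # other class: bleeding, patching, frost heave, frost boiling
}

grades = ["I", "II", "III", "IV"]
values = [ROAD_DAMAGE_DEGREE[g] for g in grades]
# The monotonicity the patent requires:
assert values == sorted(values)
```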
Step 6, the operation module in the server judges whether the total cost value F_opt paid by the robot after route optimization falls within the ideal range, namely whether F_opt < F. If so, iteration stops and the route is generated; otherwise the process returns to step 3 until the cost converges into the ideal range.
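Steps 3 to 6 can be condensed into a short iteration sketch. This is hedged: the random-route generation, the fused cost F_opt = F·exp(f) and the acceptance test follow the text above, but the cost matrix and the guide factor ε are simplified stand-ins for the patent's fuller definitions, and the function names are assumptions:

```python
import math
import random


def path_cost(path, cost):
    """Total cost F of a route: the sum of per-leg costs (step 4)."""
    return sum(cost[a][b] for a, b in zip(path, path[1:]))


def optimize_route(sites, cost, epsilon, max_iter=1000, seed=0):
    """Randomly generate candidate routes (step 3) and keep a candidate
    whenever its fused cost F_opt = F * exp(f), with f = 1/epsilon
    (step 5, formula (1-1)), beats the incumbent cost (step 6).

    epsilon stands in for the guide factor built from road damage,
    payload mass and demand; here it is just a positive number.
    """
    rng = random.Random(seed)
    f = 1.0 / epsilon
    best_path = list(sites)
    best_cost = path_cost(best_path, cost)
    for _ in range(max_iter):
        candidate = list(sites)
        rng.shuffle(candidate)           # random route through all sites
        F = path_cost(candidate, cost)
        if F * math.exp(f) < best_cost:  # fused cost within the ideal range
            best_path, best_cost = candidate, F
    return best_path, best_cost
```

A larger ε makes f smaller and the acceptance test looser, which matches the guide factor's role of steering the search on badly damaged roads.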
Further, during outdoor autonomous navigation of the wheel-foot composite robot, the mechanical arm grasps objects whose features must be identified; the contour and size data of the object are acquired by the 3D color camera and transmitted to the server, which stores and visualizes the camera information. The specific steps are:
and step 1.1, acquiring the outline, the size and the depth information of an object to be grabbed by the mechanical arm by the wheel-foot compound robot by using the 3D color camera, and uploading the information to a server.
Step 1.2, the processing module obtains the obstacle's characteristic parameters on the RVIZ visualization platform and passes them to the operation module; the operation module computes the distance l between the robot's position and the obstacle and sends it to the server, whose control module then sends a control instruction to the motor.
Step 1.3, after receiving the control instruction from the control module, the motor adjusts the linkage mechanism.
Further, the 3D color camera shoots a 3D image of the object and transmits the image information to the processing module, which converts the 3D image into a 2D image and extracts texture features of the object image using OpenCV, as follows:
step 2.1, reading an original picture: firstly, the picture file on the disk needs to be read into a memory and converted into data which can be processed by OpenCV.
Step 2.2, convert the color space of the picture from BGR to HSV:
R' = R/255, G' = G/255, B' = B/255 (2-1)
C_max = max(R', G', B'), C_min = min(R', G', B') (2-2)
ΔC = C_max - C_min (2-3)
where R, G and B are the red, green and blue channels, and R', G' and B' are the absolute color values, i.e. the R, G, B ranges rescaled to the interval 0-1.
According to the value of ΔC, the calculation of H (hue) splits into the four cases of formulas (2-4) to (2-7):
H = 0, if ΔC = 0 (2-4)
H = 60° × ((G' - B')/ΔC mod 6), if C_max = R' (2-5)
H = 60° × ((B' - R')/ΔC + 2), if C_max = G' (2-6)
H = 60° × ((R' - G')/ΔC + 4), if C_max = B' (2-7)
The calculation of S (saturation) splits into the two cases of formulas (2-8) and (2-9):
S = 0, if C_max = 0 (2-8)
S = ΔC/C_max, if C_max ≠ 0 (2-9)
The V (value) parameter represents the brightness of the color, ranges from 0 to 1, and V = C_max = max(R', G', B').
Step 2.3, obtain the binarized picture by executing the following steps:
Step 2.3.1: first, the processing module in the server randomly generates an HSV threshold h_1.
Step 2.3.2: next, using h_1 as the separation condition, pixels whose gray level exceeds the threshold are set to 255 and pixels below it are set to 0.
Step 2.3.3: finally, the threshold is adjusted by observing in real time how the object picture displays after binarization segmentation. If the effect is not distinct, i.e. part of the object still overlaps the surrounding colors, the processing module repeats steps 2.3.1-2.3.2; if the effect is distinct, i.e. the object shows a clear color boundary with its surroundings, the server stops generating random HSV thresholds h_1.
Step 2.4, the processing module applies an erosion operation to the binarized object picture, merging the connected irregular (concave-convex) regions of the object into one complete image.
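Steps 2.3.2 and 2.4 can be sketched with plain NumPy. This is an illustrative stand-in for the OpenCV calls the patent implies (`cv2.threshold`, `cv2.erode`); the 3×3 neighbourhood size is an assumption:

```python
import numpy as np


def binarize(gray, threshold):
    """Step 2.3.2: pixels above the threshold become 255, the rest 0."""
    return np.where(gray > threshold, 255, 0).astype(np.uint8)


def erode(binary, k=3):
    """Step 2.4 (erosion): a pixel stays 255 only if every pixel in its
    k x k neighbourhood is 255, which trims ragged object boundaries."""
    pad = k // 2
    padded = np.pad(binary, pad, mode="constant")
    out = np.zeros_like(binary)
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            if padded[i:i + k, j:j + k].min() == 255:
                out[i, j] = 255
    return out
```

The threshold search loop of step 2.3 would call `binarize` with each randomly generated h_1 until the object separates cleanly from the background.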
The invention has the beneficial effects that:
1. After the 3D color camera scans objects in the surrounding environment, the captured images are processed efficiently in the server with OpenCV, so that object contours, types and so on are recognized more accurately in actual operation. Compared with traditional object-recognition methods, this not only saves a large amount of server resources but also increases the robot's efficiency in searching for materials in rescue scenarios.
2. The leg structure of the robot is modified so that wheel and foot are integrated in each leg; with this structure the robot can flexibly switch its walking mode according to the complex environment and walk with as few obstructions as possible, speeding up its own operation.
3. By improving the simulated annealing algorithm and optimizing the objective function with several factors taken into account, the robot quickly finds the optimal path to the target site: it reaches the destination relatively quickly with little interference from other paths, which protects the integrity of the transported materials and minimizes equipment loss or disaster aggravation caused by external factors or overlong transport times.
Drawings
Fig. 1 is a functional block diagram of the present invention.
Fig. 2 is a schematic diagram of a 3D color camera structure according to the present invention.
Fig. 3 is a schematic diagram of the travelling mechanism of the wheel-foot composite robot.
Fig. 4 is a schematic diagram of object recognition in the present invention.
FIG. 5 is a flow chart of a simulated annealing algorithm modified in the present invention.
Fig. 6 is a schematic structural view of a wheel-foot composite robot.
In the figure: 1 crank; 2 connecting-rod driving motor; 3 frame; 4 rotary joint; 5 connecting rod; 6 crawling grounding part; 7 rocker; 8 motorized drive wheel; 9 folding rod; 10 telescopic cylinder.
Detailed Description
The outdoor high-precision autonomous navigation device of the wheel-foot composite robot comprises a 3D color camera, a motor, a server, a wheel-foot composite robot body and a travelling mechanism.
The 3D color camera serves as the scanning device and identifies object information; the motor, as the driving device, adjusts the steering and speed of the chassis according to instructions sent by the server; while moving, the wheel-foot composite robot uses the linkage mechanism to switch between walking modes suited to the specific environment.
The travelling mechanism of the wheel-foot composite robot comprises: a crank 1, a connecting-rod driving motor 2, a frame 3, a rotary joint 4, a connecting rod 5, a crawling grounding part 6, a rocker 7, a motorized drive wheel 8, a folding rod 9 and a telescopic cylinder 10. The connecting-rod driving motor 2 is mounted on the frame 3 and rotationally connected to one end of the crank 1; the other end of the crank 1 is rotationally connected to one end of the connecting rod 5, whose other end is rotationally connected to the rotary joint 4; the rotary joint 4 is rotationally connected to the crawling grounding part 6 and to one end of the rocker 7, whose other end is rotationally connected to the frame 3. Rotation of the connecting-rod driving motor 2 drives the rotary joint 4, which moves the connecting rod 5 and the rocker 7 and thus moves the crawling grounding part 6 up and down. Where the frame 3 joins the rocker 7, the upper end of a support rod is rotationally connected; the lower end of the support rod carries the motorized drive wheel 8, which is connected to the folding rod 9; a telescopic cylinder 10 mounted on the folding rod 9 extends and retracts to drive the support rod and the folding rod 9 up and down as a whole.
Further, the crawling grounding part 6 is a flat plate lying in the same horizontal plane as the motorized drive wheel 8. In the linkage formed by the crank 1, the connecting rod 5 and the rocker 7, the length of the longest member must be less than the sum of the lengths of the remaining members.
Further, when the robot encounters a gully or cracked terrain, the 3D color camera scans the width of the crack in the road and transmits it to the server, which compares it with the wheel diameter; according to the result, the server sends a control instruction to the connecting-rod driving motor 2, which drives the crank 1, the connecting rod 5 and the rocker 7 to rise, so that the crawling grounding part 6 moves upward. Meanwhile, the telescopic cylinder 10 relaxes its piston to apply a downward force on the folding rod 9 and lower the motorized drive wheel 8; by contracting the piston, the cylinder moves the folding rod 9 upward and lifts the motorized drive wheel 8, so that the crawling grounding part 6 can cross the obstacle.
The outdoor high-precision autonomous navigation method of the wheel-foot composite robot comprises the following steps of:
step 1, importing a target place data set X [ n ] into a server carried by the wheel-foot composite robot and establishing a map, wherein n is a one-dimensional array and represents coordinates of each target place to be reached by the robot.
Step 2, taking the robot's initial position as the coordinate origin O, i.e. [0,0,0], a three-dimensional rectangular coordinate system is established in which each target site has coordinates [x_m, y_m, z_m], m ∈ N*, where m numbers the different target sites.
Step 3, compute the distance matrix E: the processing module computes the cost F_i required for the robot to reach each target site, with each F_i entering the elements e_{i,m} of E, i, m ∈ N*, and the set of target-site coordinates from step 1 indexing the rows of E. That is, each element e_{i,m} of E is:
[[x_m, y_m, z_m], F_i] (1)
In formula (1), the cost F_i mainly comprises:
(1) the total distance the robot travels to the target site in the unstructured environment and the time spent moving;
(2) the electrical energy the robot consumes for its own motion and the energy lost to friction between interacting mechanical parts.
These factors accumulate continuously as the robot moves to the target site and produce the cost F = Σ_i F_i.
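A minimal construction of the distance matrix E from the target coordinates, using straight-line distance as a stand-in for the full cost F_i (which in the text above also accumulates travel time, energy consumption and friction losses):

```python
import math


def build_cost_matrix(sites):
    """sites: list of [x, y, z] target coordinates (steps 1-2).

    Returns E with e[i][m] approximating the cost of travelling
    between site i and site m; here Euclidean distance only.
    """
    n = len(sites)
    E = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for m in range(n):
            E[i][m] = math.dist(sites[i], sites[m])
    return E
```

A fuller model would add the time and energy terms of items (1) and (2) to each entry.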
Step 4, the operation module randomly generates, according to the site coordinates acquired by the 3D color camera, an initial path that leads directly from the robot's current position to the target site.
Step 5, the operation module calculates the cost F the robot must pay to traverse the initial path.
Step 6, set the fitness function f = ε^(-1) and let F_opt = F·exp(f); the operation module computes the optimized cost value F_opt.
Step 7, the operation module judges whether the total cost F_opt consumed during the robot's movement falls within the ideal range, namely whether F_opt < F. If so, iteration stops and the route is generated; otherwise steps 4, 5 and 6 are repeated and optimization iterates until the cost converges into the ideal range.
In step 6, f = ε^(-1), where ε is the guiding factor for the robot's path planning, given by formula (2): ε depends on the damage degree ρ_l of the road, the mass M of the carried materials and the demand degree Q of the specific disaster area for the materials. The mass M is computed from the pressure-sensor data U, and ρ_l is obtained from the damage grade of the highway asphalt pavement. Because the materials produce different rescue effects for different disaster-area groups, the material type T_i expresses the rescue capability P, with P = Q(T_i).
The ρ_l is determined from the damage grade of the highway asphalt pavement, which falls into four main classes:
Ⅰ is the crack class, mainly comprising alligator cracks, block cracks, longitudinal cracks and transverse cracks;
Ⅱ is the loosening class, mainly comprising potholes and raveling;
Ⅲ is the deformation class, mainly comprising subsidence, rutting, and waves and shoving;
Ⅳ is the other class, mainly comprising bleeding, patching, frost heave and frost boiling.
The damage degrees ρ_l, l ∈ [Ⅰ, Ⅳ], corresponding to these four grades differ and increase as the damage grade rises.
Further, during outdoor autonomous navigation of the wheel-foot composite robot, the mechanical arm grasps objects whose features must be identified; the contour and size data of the object are acquired by the 3D color camera and transmitted to the server, which stores the camera information. The specific steps are:
and 1.1, acquiring road condition information around the wheel-foot composite robot by using a 3D color camera and transmitting the road condition information to a server.
Step 1.2, the processing module visualizes the obstacle's characteristics on the RVIZ visualization platform; from the depth information in those characteristics the operation module computes the distance l between the robot's position and the obstacle, passes it to the control module, and the control module then sends a control instruction to the motor.
Step 1.3, after receiving the control instruction from the control module, the motor adjusts the linkage mechanism.
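The distance l of step 1.2 can be estimated from the camera's depth information, for example as the nearest valid depth reading inside the obstacle's pixel mask. This is a hedged sketch; the function name and the zero-means-invalid convention are assumptions:

```python
import numpy as np


def obstacle_distance(depth_map, mask):
    """Estimate the distance l from the robot to an obstacle as the
    nearest valid depth reading (in metres) inside the obstacle's
    pixel mask; zero depth values are treated as invalid readings."""
    vals = depth_map[mask & (depth_map > 0)]
    return float(vals.min()) if vals.size else float("inf")
```

The mask itself would come from the segmentation pipeline of steps 2.1 to 2.4 below.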
Further, the 3D color camera shoots a 3D image of the object and transmits the image information to the server; a processing module in the server extracts texture features of the object image acquired by the 3D color camera using OpenCV, as follows:
step 2.1, reading an original picture: the object picture file stored in the server is input into the processing module, and the processing module converts the object picture file into processable data by using OpenCV.
Step 2.2, convert the color space of the picture from BGR to HSV:
R' = R/255, G' = G/255, B' = B/255 (2-1)
C_max = max(R', G', B'), C_min = min(R', G', B') (2-2)
ΔC = C_max - C_min (2-3)
In formulas (2-1), (2-2) and (2-3), R, G and B are the red, green and blue channels; R', G' and B' are the absolute color values, i.e. the R, G, B ranges rescaled to the interval 0-1.
The calculation of H (hue) falls into the following four cases according to the value of ΔC, given by formulas (2-4), (2-5), (2-6) and (2-7) respectively:

H = 0, if ΔC = 0  (2-4)

H = 60° × (((G' - B')/ΔC) mod 6), if C_max = R'  (2-5)

H = 60° × ((B' - R')/ΔC + 2), if C_max = G'  (2-6)

H = 60° × ((R' - G')/ΔC + 4), if C_max = B'  (2-7)
The calculation of S (saturation) falls into the following two cases according to the value of C_max, given by formulas (2-8) and (2-9):

S = 0, if C_max = 0  (2-8)

S = ΔC / C_max, if C_max ≠ 0  (2-9)
the V (brightness) parameter represents the brightness of the color, ranging from 0 to 1, and v=max (B, G, R)
Step 2.3, acquiring a binarized picture, and specifically executing the following steps:
Step 2.3.1: first, the processing module in the server randomly generates an HSV threshold h₁.

Step 2.3.2: next, using h₁ as the separation condition, pixels with gray level greater than the threshold are set to 255 and pixels with gray level smaller than the threshold are set to 0.
Step 2.3.3: finally, the threshold is adjusted by observing, in real time, the display effect of the object picture after binarization segmentation. If the effect is not obvious, namely: part of the object's area still overlaps with the surrounding colors, the processing module repeats steps 2.3.1-2.3.2; if the effect is obvious, namely: the object has a distinct color boundary with the surrounding area, the server stops randomly generating the HSV threshold h₁.
Step 2.4, the processing module performs an erosion operation on the binarized object picture, namely: merging the connected irregular (concave-convex) regions in the object picture into one complete image.

Claims (6)

1. The outdoor high-precision autonomous navigation device of the wheel-foot composite robot is characterized by comprising a 3D color camera, a motor, a server, a pressure sensor, GPS equipment, a wheel-foot composite robot body and a travelling mechanism;
the 3D color camera is used as scanning equipment for capturing object information; the server comprises a control module and an operation module: the control module is used for receiving and transmitting instructions and data between the scanning equipment and the GPS equipment, and the operation module is used for integrating and calculating the data; in the autonomous navigation process, the wheel-foot composite robot judges and executes tasks through the object information captured by the 3D color camera; the wheel-foot composite robot grasps objects by means of the mechanical arm; according to the instructions sent by the server, the motor adjusts the steering and speed of the driving wheels of the wheel-foot composite robot's chassis so that the robot moves towards the object; during movement, the wheel-foot composite robot uses the connecting rod mechanism to switch between different walking modes for specific environments;
the travelling mechanism of the wheel-foot composite robot comprises a crank (1), a connecting rod driving motor (2), a frame (3), a rotary joint (4), a connecting rod (5), a crawling grounding part (6), a rocker (7), a motorized driving wheel (8), a folding rod (9) and a telescopic cylinder (10); the connecting rod driving motor (2) is arranged on the frame (3) and is rotationally connected with one end of the crank (1); the other end of the crank (1) is rotationally connected with one end of the connecting rod (5), the other end of the connecting rod (5) is rotationally connected with the rotary joint (4), the rotary joint (4) is rotationally connected with the crawling grounding part (6) and one end of the rocker (7), and the other end of the rocker (7) is rotationally connected with the frame (3); rotation of the connecting rod driving motor (2) drives the rotary joint (4) to move, thereby driving the connecting rod (5) and the rocker (7) and making the crawling grounding part (6) move up and down; at the point where the frame (3) is connected with the rocker (7), the upper end of a support rod is rotationally connected to the frame; the lower end of the support rod carries the motorized driving wheel (8), the motorized driving wheel (8) is connected with the folding rod (9), the telescopic cylinder (10) is arranged on the folding rod (9), and the extension and retraction of the telescopic cylinder (10) drives the support rod and the folding rod (9) to move up and down as a whole.
2. The outdoor high-precision autonomous navigation device of the wheel-foot composite robot according to claim 1, wherein the crawling grounding part (6) is a flat plate lying in the same horizontal plane as the motorized driving wheel (8); and the length of the longest rod in the connecting rod mechanism formed by the crank (1), the connecting rod (5) and the rocker (7) must be less than the sum of the lengths of the remaining links.
3. The outdoor high-precision autonomous navigation device of the wheel-foot composite robot according to claim 1, wherein when the robot encounters gully or crack terrain, the 3D color camera scans the width parameter of the crack in the road and transmits it to the server, which calculates and compares it with the wheel diameter; according to the result, the server transmits a control instruction to the connecting rod driving motor (2), and the connecting rod driving motor (2) drives the crank (1), the connecting rod (5) and the rocker (7) to rise, so that the crawling grounding part (6) moves upwards; meanwhile, the piston in the telescopic cylinder (10) is extended to apply a downward force to the folding rod (9), so that the motorized driving wheel (8) is lowered; conversely, retracting the piston in the telescopic cylinder (10) moves the folding rod (9) upwards, lifting the motorized driving wheel (8), and the crawling grounding part (6) is used to cross the obstacle.
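The width comparison in claim 3 reduces to a simple mode decision; the helper below is a hypothetical illustration of that branch (the names and the threshold rule are assumptions, not claim text):

```python
def choose_mode(crack_width_m, wheel_diameter_m):
    """Pick the locomotion mode from the scanned crack width (claim 3,
    simplified): if the crack could trap the driving wheel, raise the
    wheels and cross on the crawling grounding part instead."""
    return "crawl" if crack_width_m >= wheel_diameter_m else "wheel"
```

A 0.30 m crack against a 0.25 m wheel would select the crawling gait; a 0.05 m crack would stay on wheels.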
4. An outdoor high-precision autonomous navigation method of the wheel-foot composite robot, characterized in that it is realized by the outdoor high-precision autonomous navigation device of the wheel-foot composite robot according to any one of claims 1-3 and plans an optimal path by applying an improved simulated annealing algorithm, comprising the following steps:
step 1, the wheel-foot composite robot receives a target site data set X[n] through the GPS equipment and stores it in the server, and a map is established according to X[n], where n is a one-dimensional array; with the robot's initial position as the coordinate origin O, i.e. [0, 0, 0], a three-dimensional rectangular coordinate system is established, and the coordinates [x_m, y_m, z_m], m ∈ N*, of every target site the robot must reach form a set, where m indexes the different target sites;
step 2, calculating the distance matrix E: the operation module calculates the cost F_i to be paid by the robot between each pair of adjacent sites and takes F_i as a column vector of the distance matrix E, i ∈ N*; the set of site coordinates obtained in step 1 serves as the row vector;
step 3, according to the robot's current position and the target site information acquired by the GPS equipment, the operation module randomly generates a route from the current position to the target site; the route is composed of several sites [x_m, y_m, z_m], m ∈ N*, from the site coordinate set of step 1, and is denoted initial_path;
step 4, the operation module calculates the total cost value F of initial_path, namely F = Σ F_i, where i ∈ initial_path;
step 5, setting a fitness function f = ε − 1, where ε is a constant with 0 < ε < 1 so that f < 0, and processing f together with the total cost value F:

F_opt = F · exp(f)  (1-1)

since exp(f) < 1, a smaller optimized cost is obtained after the fusion of formula (1-1), and the optimized cost F_opt and its corresponding route are stored to the server;
a guiding factor is set for the robot's path planning, depending on the road damage degree ρ_l, the mass M of the carried supplies, and the demand degree Q of the specific disaster area for those supplies; the supply mass M is calculated using the pressure sensor data U;
the demand degree Q of a specific disaster area for the supplies carried by the robot: the supplies produce different rescue effects T_i for different disaster-area groups, so the rescue capability P can be expressed through the demand degree Q as:

P = Q(T_i)  (1-3)
the road damage degree ρ_l is obtained from the highway asphalt pavement damage grade, which is divided into four classes:

Ⅰ is the crack class, mainly comprising alligator cracks, block cracks, longitudinal cracks and transverse cracks;

Ⅱ is the loosening class, mainly comprising potholes and loosening;

Ⅲ is the deformation class, mainly comprising subsidence, rutting and wave-like shoving;

Ⅳ is the other class, mainly comprising bleeding, patching, frost heave and mud pumping;

the damage degrees ρ_l, l ∈ [Ⅰ, Ⅳ], corresponding to these four highway asphalt pavement damage grades show an increasing trend, namely: ρ_Ⅳ > ρ_Ⅲ > ρ_Ⅱ > ρ_Ⅰ;
Step 6, calculating a total cost value F paid by the robot after optimizing the travel of the route by an operation module in the server Excellent (excellent) Whether to fall within the ideal range, namely: whether or not F is present Excellent (excellent) < F; if yes, stopping iteration and generating a route; otherwise, the process of the step 3 is repeated until the cost is converged within an ideal range.
5. The outdoor high-precision autonomous navigation method of the wheel-foot composite robot according to claim 4, wherein the wheel-foot composite robot uses a mechanical arm to grasp the characteristics of an object to be identified in the outdoor autonomous navigation process, and the contour and size data of the object are acquired through a 3D color camera and then transmitted to a server, and the server stores and visualizes the information of the 3D color camera, and the method comprises the following specific steps:
step 1.1, a 3D color camera acquires the outline, the size and the depth information of an object to be grabbed by a mechanical arm and uploads the information to a server;
step 1.2, the processing module acquires the characteristic parameters of the obstacle using the RVIZ visualization platform and passes them to the operation module; the operation module calculates the distance l between the robot's position and the obstacle and sends it to the server, and the server's control module sends a control instruction to the motor;
step 1.3, the motor adjusts the connecting rod mechanism after obtaining the control instruction transmitted by the control module.
6. The outdoor high-precision autonomous navigation method of the wheel-foot composite robot according to claim 4, wherein the 3D color camera shoots a 3D image of the object, the image information is transmitted to the processing module, the processing module converts the 3D image into a 2D image, and the OpenCV is adopted to extract texture features of the object image, specifically comprising the following steps:
step 2.1, reading an original picture: firstly, reading a picture file on a disk to a memory and converting the picture file into OpenCV processable data;
step 2.2, converting the color space of the picture from BGR to HSV:

R' = R/255, G' = G/255, B' = B/255  (2-1)

C_max = max(R', G', B'), C_min = min(R', G', B')  (2-2)

ΔC = C_max - C_min  (2-3)

wherein R, G and B denote the red, green and blue channels respectively; R', G' and B' are the corresponding absolute color values, namely: the R, G, B color ranges rescaled into the interval 0-1;
the calculation of H falls into the following four cases according to the value of ΔC, given by formulas (2-4), (2-5), (2-6) and (2-7) respectively:

H = 0, if ΔC = 0  (2-4)

H = 60° × (((G' - B')/ΔC) mod 6), if C_max = R'  (2-5)

H = 60° × ((B' - R')/ΔC + 2), if C_max = G'  (2-6)

H = 60° × ((R' - G')/ΔC + 4), if C_max = B'  (2-7)
the calculation of S falls into the following two cases according to the value of C_max, given by formulas (2-8) and (2-9):

S = 0, if C_max = 0  (2-8)

S = ΔC / C_max, if C_max ≠ 0  (2-9)
the V parameter represents the brightness of the color, ranging from 0 to 1, with V = C_max = max(R', G', B');
Step 2.3, acquiring a binarized picture, and specifically executing the following steps:
step 2.3.1: first, the processing module in the server randomly generates an HSV threshold h₁;

step 2.3.2: next, using the threshold h₁ as the separation condition, pixels with gray level greater than the threshold are set to 255 and pixels with gray level smaller than the threshold are set to 0;
step 2.3.3: finally, the threshold is adjusted by observing, in real time, the display effect of the object picture after binarization segmentation; if the effect is not obvious, namely: part of the object's area still overlaps with the surrounding colors, the processing module repeats steps 2.3.1-2.3.2; if the effect is obvious, namely: the object has a distinct color boundary with the surrounding area, the server stops randomly generating the HSV threshold h₁;
step 2.4, the processing module performs an erosion operation on the binarized object picture, namely: merging the connected irregular regions in the object picture into one complete image.
CN202110815973.0A 2021-07-20 2021-07-20 Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot Active CN113566830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110815973.0A CN113566830B (en) 2021-07-20 2021-07-20 Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot


Publications (2)

Publication Number Publication Date
CN113566830A CN113566830A (en) 2021-10-29
CN113566830B true CN113566830B (en) 2023-09-26

Family

ID=78165568

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110815973.0A Active CN113566830B (en) 2021-07-20 2021-07-20 Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot

Country Status (1)

Country Link
CN (1) CN113566830B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4408982C1 (en) * 1994-03-16 1995-05-18 Deutsche Forsch Luft Raumfahrt Autonomous navigation system for mobile robot or manipulator
CA2950791A1 (en) * 2013-08-19 2015-02-26 State Grid Corporation Of China Binocular visual navigation system and method based on power robot
CN111055281A (en) * 2019-12-19 2020-04-24 杭州电子科技大学 ROS-based autonomous mobile grabbing system and method
CN111634345A (en) * 2020-05-29 2020-09-08 广西科技大学 High-adaptability walking mechanism of wheel-foot type mobile robot
CN112389571A (en) * 2019-07-31 2021-02-23 沈阳工业大学 Wheel-foot robot based on deformable auxiliary balance system and differential driving


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A robust mobile robot indoor positioning system based on Wi-Fi; W Cui et al.; International Journal of Advanced Robotic Systems; full text *
Gait planning method for wheel-leg composite obstacle crossing of a robot on obstacle terrain; Zhang Shijun; Xing Yan; Hu Yong; Navigation and Control (06); full text *


Similar Documents

Publication Publication Date Title
Wang et al. Vision-based robotic system for on-site construction and demolition waste sorting and recycling
CN109099901B (en) Full-automatic road roller positioning method based on multi-source data fusion
CN108647646B (en) Low-beam radar-based short obstacle optimized detection method and device
CN111360780A (en) Garbage picking robot based on visual semantic SLAM
CN115100622A (en) Method for detecting travelable area and automatically avoiding obstacle of unmanned transportation equipment in deep limited space
Yoneda et al. Urban road localization by using multiple layer map matching and line segment matching
Borges et al. A Survey on Terrain Traversability Analysis for Autonomous Ground Vehicles: Methods, Sensors, and Challenges.
Silver et al. Experimental analysis of overhead data processing to support long range navigation
US20220101534A1 (en) Sidewalk edge finder device, system and method
CN111721279A (en) Tail end path navigation method suitable for power transmission inspection work
CN112819766A (en) Bridge defect overhauling method, device, system and storage medium
CN210757693U (en) Small-size map scanning robot in electric power place
Wei et al. Damage inspection for road markings based on images with hierarchical semantic segmentation strategy and dynamic homography estimation
Mayuku et al. A self-supervised near-to-far approach for terrain-adaptive off-road autonomous driving
CN113566830B (en) Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot
CN113671522B (en) Dynamic environment laser SLAM method based on semantic constraint
Hong et al. An intelligent world model for autonomous off-road driving
Wallace Robot road following by adaptive color classification and shape tracking
Hu et al. The use of unmanned ground vehicles and unmanned aerial vehicles in the civil infrastructure sector: Applications, robotic platforms, sensors, and algorithms
Bellone et al. Pavement distress detection and avoidance for intelligent vehicles
CN114659513A (en) Point cloud map construction and maintenance method for unstructured road
CN111595859A (en) Bridge and culvert damage detection method and damage detection and management system
Huang et al. Image‐based path planning for outdoor mobile robots
Sinha et al. An approach towards automated navigation of vehicles using overhead cameras
Schaupp et al. MOZARD: Multi-Modal Localization for Autonomous Vehicles in Urban Outdoor Environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant