CN113566830A - Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot - Google Patents


Info

Publication number
CN113566830A
CN113566830A (application CN202110815973.0A; granted as CN113566830B)
Authority
CN
China
Prior art keywords
wheel
robot
server
motor
foot composite
Prior art date
Legal status
Granted
Application number
CN202110815973.0A
Other languages
Chinese (zh)
Other versions
CN113566830B (en)
Inventor
魏星
杨长春
Current Assignee
Changzhou University
Original Assignee
Changzhou University
Priority date
Filing date
Publication date
Application filed by Changzhou University
Priority to CN202110815973.0A
Publication of CN113566830A
Application granted
Publication of CN113566830B
Status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D57/00 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members
    • B62D57/028 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members, having wheels and mechanical legs
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

An outdoor high-precision autonomous navigation device and method for a wheel-foot composite robot, belonging to the field of mobile-robot autonomous navigation. A 3D color camera serves as the scanning device and captures object information. The server comprises a control module and an operation module: the control module exchanges instructions and data with the scanning device and the GPS device, and the operation module performs integrated calculation on the data. During autonomous navigation, the wheel-foot composite robot makes judgments and executes tasks based on the object information captured by the 3D color camera, and grasps objects with a mechanical arm. According to instructions from the server, the motor adjusts the steering wheel of the robot chassis and drives the robot toward the object at a set speed. While moving, the robot uses a linkage mechanism to switch between walking modes according to the specific environment.

Description

Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot
Technical Field
The invention belongs to the field of mobile-robot autonomous navigation, and in particular relates to a device and method enabling a wheel-foot composite robot to perform high-precision, high-reliability autonomous navigation and accomplish specific tasks in specific outdoor scenes.
Background
With autonomous navigation increasingly applied in military, aerospace, industrial and everyday settings, expectations of robots keep rising, and their working environments are shifting from indoors to more complex, unstructured outdoor environments. Mobile robots have great application prospects in disaster relief, security monitoring, industrial manufacturing and similar fields; however, the variety of working environments makes it far from easy for a robot to play its "role" everywhere. Since the start of the 21st century, advanced mobile robots for different fields have appeared, and their range of motion has expanded comprehensively. To adapt to different outdoor environments, mobile robots are classified by locomotion mode: wheeled robots, tracked robots, composite robots, and so on.
As mobile robots develop and their application range expands, they will increasingly work in complex, unknown environments, and purely wheeled or purely legged robots can no longer fully accommodate that complexity and diversity. To meet the gradually increasing performance requirements, various hybrid locomotion mechanisms have been developed. Among them, the wheel-foot composite robot combines the characteristics of wheeled and legged robots: it preserves moving efficiency on flat ground while offering good obstacle-crossing ability, and is mainly used to transport materials for people in harsh field conditions. It still has shortcomings, however. The control system of a complex quadruped walking robot is a nonlinear, multi-input multi-output, unstable system with time-varying and intermittent behavior. Limited by the non-real-time nature of the Windows operating system and by the software experience of laboratory members, the real-time performance of such a system is very low: a control period can last 50-100 ms, the bandwidth is clearly insufficient, and only slow robot motion is barely achieved. In urban search and rescue (USAR) scenes, combining a robust scan-matching method for a lidar system with an inertial-sensor-based 3D attitude estimation system lets a robot localize reliably and build an accurate map, but the construction cost is high and the approach is not widely applicable.
The main significance of the invention is that a 3D color camera acquires the information of recognized objects, so that the wheel-foot composite robot can grasp objects and navigate autonomously with high precision in outdoor environments. The robot recognizes object feature information more accurately, remains stable while moving autonomously, and executes tasks efficiently, at low cost and with an obvious optimization effect.
Disclosure of Invention
The invention provides a device that enables the wheel-foot composite robot to perform high-precision, high-reliability autonomous navigation and accomplish specific tasks in specific scenes, and designs and optimizes a mainstream path-search algorithm, so that the robot acquires environment information more accurately, stays more stable while moving autonomously, and achieves a more obvious optimization effect.
In order to solve the technical problem, the invention adopts the following technical scheme:
The outdoor high-precision autonomous navigation device of the wheel-foot composite robot comprises a 3D color camera, a motor, a server, a pressure sensor, GPS equipment, a wheel-foot composite robot body and a walking mechanism.
The 3D color camera serves as the scanning device and captures object information. The server comprises a control module and an operation module: the control module exchanges instructions and data with the scanning device and the GPS device, and the operation module performs integrated calculation on the data. During autonomous navigation, the wheel-foot composite robot makes judgments and executes tasks based on the object information captured by the 3D color camera, and grasps objects with a mechanical arm. According to instructions from the server, the motor adjusts the steering wheel of the robot chassis and drives the robot toward the object at a set speed. While moving, the robot uses a linkage mechanism to switch between walking modes according to the specific environment.
The walking mechanism of the wheel-foot composite robot comprises a crank 1, a connecting-rod driving motor 2, a frame 3, a rotating joint 4, a connecting rod 5, a crawling grounding part 6, a rocker 7, a motor-driven wheel 8, a folding rod 9 and a telescopic cylinder 10. The connecting-rod driving motor 2 is mounted on the frame 3 and rotationally connected to one end of the crank 1; the other end of the crank 1 is rotationally connected to one end of the connecting rod 5, whose other end is rotationally connected to the rotating joint 4; the rotating joint 4 is rotationally connected to the crawling grounding part 6 and to one end of the rocker 7, whose other end is rotationally connected to the frame 3. When the connecting-rod driving motor 2 rotates, it drives the rotating joint 4, and thereby the connecting rod 5 and the rocker 7, so that the crawling grounding part 6 moves up and down. Where the rocker 7 joins the frame 3, the upper end of a support rod is rotationally connected; the lower end of the support rod carries the motor-driven wheel 8, which is linked to the folding rod 9; the telescopic cylinder 10 is mounted on the folding rod 9, and its extension and retraction drive the support rod and the folding rod 9 to move up and down as a whole.
Further, the crawling grounding part 6 is a flat plate level with the motor-driven wheel 8. In the linkage formed by the crank 1, connecting rod 5 and rocker 7, the longest member must be shorter than the sum of the lengths of the remaining members.
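The rod-length condition above can be checked with a short routine. The following Python sketch is illustrative only; the function name and the sample lengths are assumptions, not taken from the patent:

```python
def link_lengths_valid(crank, link, rocker):
    """Check the condition stated above for the crank(1)-link(5)-rocker(7)
    linkage: the longest member must be shorter than the sum of the
    lengths of the remaining members."""
    lengths = [crank, link, rocker]
    longest = max(lengths)
    return longest < sum(lengths) - longest

# Example with assumed lengths in millimetres:
print(link_lengths_valid(40, 120, 90))   # True: 120 < 40 + 90
print(link_lengths_valid(40, 200, 90))   # False: 200 is not < 40 + 90
```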
Further, when the robot encounters a gully or cracked terrain, the 3D color camera scans the width of the crack in the road and transmits it to the server, which compares it with the wheel diameter. Based on the result, the server sends a control command to the connecting-rod driving motor 2, which drives the crank 1, connecting rod 5 and rocker 7 to rise so that the crawling grounding part 6 moves upward. Meanwhile, relaxing the piston in the telescopic cylinder 10 applies a downward force to the folding rod 9 and lowers the motor-driven wheel 8; retracting the piston moves the folding rod 9 upward and lifts the motor-driven wheel 8, so that the obstacle is crossed on the crawling grounding part 6.
An outdoor high-precision autonomous navigation method for the wheel-foot composite robot plans the optimal path by applying an improved simulated annealing algorithm and mainly comprises the following steps:
Step 1: the wheel-foot composite robot receives a target-location data set X[n] through the GPS equipment and stores it in the server, and a map is established from X[n], where n is a one-dimensional array. With the robot's initial position as the coordinate origin O, i.e. [0, 0, 0], a three-dimensional rectangular coordinate system is established, and the coordinates of the target locations the robot must reach form a set, each location having coordinates [x_m, y_m, z_m], m ∈ N*, where m indexes the different target locations.
Step 2: calculate the distance matrix E. The operation module calculates the cost F_i, i ∈ N*, that the robot must pay between each pair of adjacent locations, takes F_i as the column vector of the distance matrix E, and takes the set of location coordinates acquired in Step 1 as the row vector.
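As an illustration of Steps 1-2, the matrix E can be assembled from the target-location coordinates. In this sketch the per-leg cost F_i is stood in for by the Euclidean distance; the patent's F_i also folds in travel-time and energy terms, which are omitted here:

```python
import numpy as np

def cost_matrix(points):
    """Build the matrix E of Step 2: entry E[i, j] is the cost of moving
    between target points i and j.  Cost is illustrated as Euclidean
    distance only (the patent's F_i also includes time and energy loss)."""
    pts = np.asarray(points, dtype=float)        # shape (m, 3): [x, y, z]
    diff = pts[:, None, :] - pts[None, :, :]     # pairwise displacements
    return np.linalg.norm(diff, axis=-1)         # shape (m, m)

targets = [[0, 0, 0], [3, 4, 0], [3, 4, 12]]     # origin O plus two sites
E = cost_matrix(targets)
print(E[0, 1])   # 5.0 (a 3-4-5 right triangle)
print(E[1, 2])   # 12.0
```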
Step 3: according to the robot's current position and the target-location information acquired by the GPS equipment, the operation module randomly generates a route from the current position to the target location; the route is composed of locations [x_m, y_m, z_m], m ∈ N*, from the coordinate set of Step 1 and is denoted initial_path.
Step 4: the operation module calculates the total cost value F of initial_path as

F = Σ_{i ∈ initial_path} F_i
Step 5: set the fitness function f = ε^(-1), where F > 0 always holds, and process the total cost value F with it:

F_opt = F · exp(f)    (1-1)

The fusion in formula (1-1) yields a smaller optimization cost; the optimization cost value and its corresponding route are then stored in the server.
A guidance index ε is set for the robot's path planning:

[equation image (1-2): ε as a function of the road damage degree ρ_l, the mass M of the carried materials, and the demand degree Q of the specific disaster area for those materials]

The material mass M is obtained from the pressure-sensor data U:

[equation image: M computed from the pressure-sensor data U]
Since the demand degree Q of a specific disaster area for the materials the robot carries produces different rescue effects for different disaster-area groups, the material type T_i can express the rescue capability P; that is, the demand degree Q of the materials can be expressed as:

P = Q(T_i)    (1-3)
The road damage degree ρ_l is obtained from the highway asphalt-pavement damage grade, which is divided into four classes:
I: crack class, mainly comprising cracks, block cracks, longitudinal cracks and transverse cracks;
II: loosening class, mainly comprising potholes and loosening;
III: deformation class, mainly comprising subsidence, ruts and waves/upheavals;
IV: other classes, mainly comprising oil bleeding, patching, frost heave and mud pumping.
The damage degrees ρ_l, l ∈ [I, IV], corresponding to the four asphalt-pavement damage grades increase with the grade: ρ_IV > ρ_III > ρ_II > ρ_I.
Step 6: the operation module in the server judges whether the total cost value F_opt paid by the robot along the optimized route falls within the desired range, i.e. whether F_opt < F holds. If so, iteration stops and the route is generated; otherwise the process from Step 3 is repeated until the cost converges into the ideal range.
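Steps 3-6 describe an iterate-and-keep-the-best loop. The sketch below illustrates it under stated assumptions: the guidance index ε is taken as a constant (its exact formula, depending on road damage, material mass and demand, is not recoverable from the text), candidate routes are plain random shuffles, and the acceptance rule is simply "keep the smaller fused cost", without the temperature-based acceptance of a full simulated annealing algorithm:

```python
import math, random

def route_cost(route, E):
    """Total cost F of a route: sum of per-leg costs F_i taken from the
    distance matrix E (Step 4)."""
    return sum(E[a][b] for a, b in zip(route, route[1:]))

def plan_route(E, start, targets, epsilon=2.0, iters=2000, seed=0):
    """Sketch of Steps 3-6: repeatedly draw a random visiting order of the
    target sites, score it with F_opt = F * exp(f), f = 1/epsilon, and
    keep the best route found.  epsilon is an assumed constant standing in
    for the guidance index of formula (1-2)."""
    rng = random.Random(seed)
    f = 1.0 / epsilon
    best_route, best_cost = None, math.inf
    for _ in range(iters):
        order = targets[:]
        rng.shuffle(order)                  # Step 3: random candidate route
        F = route_cost([start] + order, E)  # Step 4: total cost F
        F_opt = F * math.exp(f)             # Step 5: fused optimisation cost
        if F_opt < best_cost:               # Step 6: keep if it improves
            best_route, best_cost = [start] + order, F_opt
    return best_route, best_cost

E = [[0, 2, 9],
     [2, 0, 3],
     [9, 3, 0]]
route, cost = plan_route(E, 0, [1, 2], iters=50)
print(route)   # [0, 1, 2]: visiting site 1 first is cheaper (2 + 3 < 9 + 3)
```

A fuller version would replace the constant ε with the guidance index of formula (1-2) and add an annealing temperature so that worse routes are occasionally accepted early in the search.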
Further, during outdoor autonomous navigation the wheel-foot composite robot grasps objects with its mechanical arm and identifies their features: the 3D color camera acquires the object's contour and size data and transmits them to the server, and the server stores and visualizes the camera information. The specific steps are as follows:
Step 1.1: the 3D color camera acquires the contour, size and depth information of the object the wheel-foot composite robot is to grasp with the mechanical arm and uploads them to the server.
Step 1.2: the processing module acquires the obstacle's characteristic parameters with the RVIZ visualization platform and sends them to the operation module; the operation module calculates the distance l between the robot's position and the obstacle and returns it to the server, whose control module then sends a control instruction to the motor.
Step 1.3: after receiving the control instruction transmitted by the control module, the motor adjusts the linkage mechanism.
Furthermore, the 3D color camera shoots a 3D image of the object and transmits the image information to the processing module; the processing module converts the 3D image into a 2D image and extracts the texture features of the object image with OpenCV, specifically as follows:
Step 2.1: read the original picture. The picture file on disk is first read into memory and converted into data that OpenCV can process.
Step 2.2: convert the color space of the picture from BGR to HSV:

R' = R/255, G' = G/255, B' = B/255    (2-1)

Cmax = max(R', G', B'), Cmin = min(R', G', B')    (2-2)

ΔC = Cmax − Cmin    (2-3)

where R, G, B are the red, green and blue channels, and R', G', B' are the absolute color values, i.e. the R, G, B ranges rescaled into the interval 0-1.
The calculation of H splits into the following four cases according to ΔC and Cmax, given by formulas (2-4), (2-5), (2-6) and (2-7):

H = 0, when ΔC = 0    (2-4)

H = 60° × ((G' − B')/ΔC mod 6), when Cmax = R'    (2-5)

H = 60° × ((B' − R')/ΔC + 2), when Cmax = G'    (2-6)

H = 60° × ((R' − G')/ΔC + 4), when Cmax = B'    (2-7)
The calculation of S splits into the following two cases, given by formulas (2-8) and (2-9):

S = 0, when Cmax = 0    (2-8)

S = ΔC/Cmax, when Cmax ≠ 0    (2-9)
The V (value) parameter indicates the brightness of the color, ranging from 0 to 1, with V = Cmax = max(R', G', B').
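Formulas (2-1) to (2-9) are the standard BGR-to-HSV conversion and can be transcribed directly. The function below is a sketch of that transcription; the function name is illustrative:

```python
def bgr_to_hsv(b, g, r):
    """Transcription of formulas (2-1)-(2-9): normalise B, G, R to [0, 1],
    then compute hue H (degrees), saturation S and value V."""
    b_, g_, r_ = b / 255.0, g / 255.0, r / 255.0   # (2-1): absolute colour
    cmax, cmin = max(r_, g_, b_), min(r_, g_, b_)  # (2-2)
    dc = cmax - cmin                               # (2-3)
    if dc == 0:
        h = 0.0                                    # (2-4)
    elif cmax == r_:
        h = 60 * (((g_ - b_) / dc) % 6)            # (2-5)
    elif cmax == g_:
        h = 60 * ((b_ - r_) / dc + 2)              # (2-6)
    else:
        h = 60 * ((r_ - g_) / dc + 4)              # (2-7)
    s = 0.0 if cmax == 0 else dc / cmax            # (2-8), (2-9)
    v = cmax                                       # V = max(R', G', B')
    return h, s, v

print(bgr_to_hsv(0, 0, 255))   # pure red   -> (0.0, 1.0, 1.0)
print(bgr_to_hsv(0, 255, 0))   # pure green -> (120.0, 1.0, 1.0)
```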
Step 2.3, acquiring a binary image, specifically executing according to the following steps:
Step 2.3.1: a processing module in the server first randomly generates an HSV threshold h1.
Step 2.3.2: secondly, use the threshold h1As the partition conditions, the pixel gradation greater than the threshold value is 255, and the pixel gradation less than the threshold value is 0.
Step 2.3.3: and finally, adjusting the threshold value by observing the display effect of the object picture after the binarization segmentation in real time. If the effect is not obvious, namely: if the object still has a part of the area overlapped with the color of the surrounding environment, the processing module will repeatedly execute the step 2.3.1-2.3.2; if the effect is obvious: namely: if the object has a distinct color boundary with the surrounding area, the server stops randomly generating the threshold h for HSV1
Step 2.4: the processing module applies an erosion operation to the binarized object picture, i.e. merges the connected irregular (concave and convex) regions of the picture into a complete image.
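Steps 2.3 and 2.4 amount to a threshold followed by an erosion. In OpenCV these would be cv2.threshold (or cv2.inRange) and cv2.erode; the NumPy stand-in below shows the two operations on a toy 5×5 image, where the array values and the kernel size are illustrative:

```python
import numpy as np

def binarize(channel, h1):
    """Step 2.3.2: pixels above the threshold h1 become 255, others 0."""
    return np.where(channel > h1, 255, 0).astype(np.uint8)

def erode(binary, k=3):
    """Step 2.4, a minimal stand-in for cv2.erode: a pixel stays 255 only
    if every pixel in its k x k neighbourhood is 255, trimming the ragged
    fringe of the segmented region."""
    pad = k // 2
    padded = np.pad(binary, pad, constant_values=0)
    out = np.zeros_like(binary)
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = 255 if (padded[i:i + k, j:j + k] == 255).all() else 0
    return out

img = np.full((5, 5), 10)
img[1:4, 1:4] = 200                  # a bright 3x3 patch on a dark field
b = binarize(img, 100)
print(int(b.sum()))                  # 2295: nine pixels at 255
print(int(erode(b).sum()))           # 255: only the centre pixel survives
```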
The invention has the beneficial effects that:
1. After the 3D color camera scans objects in the surrounding environment, the captured images are processed effectively in the server with OpenCV, so that in actual operation the object's contour, type and so on are identified more accurately. Compared with traditional object-recognition methods, this saves a large amount of server resources and, especially in rescue scenes, increases the efficiency with which the robot searches for materials.
2. By redesigning the robot's leg structure, the invention integrates wheel and foot in the leg; with this structure the robot flexibly switches walking modes for complex environments, walks with as few obstructions as possible, and speeds up its own operating efficiency.
3. By improving the simulated annealing algorithm and weighing multiple factors in the objective function, the invention lets the robot quickly find the optimal path to the target location: the robot reaches its destination relatively quickly while suffering little interference from other routes along the way, so the completeness of the transported materials is assured, and disaster aggravation caused by equipment loss or over-long transport time under external influences is reduced to the greatest extent.
Drawings
Fig. 1 is a schematic block diagram of the present invention.
Fig. 2 is a schematic diagram of a 3D color camera according to the present invention.
Fig. 3 is a schematic view of the walking mechanism of the wheel-foot composite robot.
Fig. 4 is a schematic view of object recognition in the present invention.
Fig. 5 is a flow chart of an improved simulated annealing algorithm of the present invention.
Fig. 6 is a structural schematic diagram of the wheel-foot composite robot.
In the figures: 1 crank, 2 connecting-rod driving motor, 3 frame, 4 rotating joint, 5 connecting rod, 6 crawling grounding part, 7 rocker, 8 motor-driven wheel, 9 folding rod, 10 telescopic cylinder.
Detailed Description
The outdoor high-precision autonomous navigation device of the wheel-foot composite robot comprises a 3D color camera, a motor, a server, a wheel-foot composite robot body and a walking mechanism.
The 3D color camera serves as the scanning device and identifies object information; the motor serves as the driving device and, according to instructions from the server, adjusts the steering wheel and the speed at which the chassis moves; while moving, the wheel-foot composite robot uses the linkage mechanism to switch between walking modes according to the specific environment.
The walking mechanism of the wheel-foot composite robot comprises a crank 1, a connecting-rod driving motor 2, a frame 3, a rotating joint 4, a connecting rod 5, a crawling grounding part 6, a rocker 7, a motor-driven wheel 8, a folding rod 9 and a telescopic cylinder 10. The connecting-rod driving motor 2 is mounted on the frame 3 and rotationally connected to one end of the crank 1; the other end of the crank 1 is rotationally connected to one end of the connecting rod 5, whose other end is rotationally connected to the rotating joint 4; the rotating joint 4 is rotationally connected to the crawling grounding part 6 and to one end of the rocker 7, whose other end is rotationally connected to the frame 3. When the connecting-rod driving motor 2 rotates, it drives the rotating joint 4, and thereby the connecting rod 5 and the rocker 7, so that the crawling grounding part 6 moves up and down. Where the rocker 7 joins the frame 3, the upper end of a support rod is rotationally connected; the lower end of the support rod carries the motor-driven wheel 8, which is linked to the folding rod 9; the telescopic cylinder 10 is mounted on the folding rod 9, and its extension and retraction drive the support rod and the folding rod 9 to move up and down as a whole.
Further, the crawling grounding part 6 is a flat plate level with the motor-driven wheel 8. In the linkage formed by the crank 1, connecting rod 5 and rocker 7, the longest member must be shorter than the sum of the lengths of the remaining members.
Further, when the robot encounters a gully or cracked terrain, the 3D color camera scans the width of the crack in the road and transmits it to the server, which compares it with the wheel diameter. Based on the result, the server sends a control command to the connecting-rod driving motor 2, which drives the crank 1, connecting rod 5 and rocker 7 to rise so that the crawling grounding part 6 moves upward. Meanwhile, relaxing the piston in the telescopic cylinder 10 applies a downward force to the folding rod 9 and lowers the motor-driven wheel 8; retracting the piston moves the folding rod 9 upward and lifts the motor-driven wheel 8, so that the obstacle is crossed on the crawling grounding part 6.
The outdoor high-precision autonomous navigation method of the wheel-foot composite robot comprises the following steps:
Step 1: a target-location data set X[n] is imported into the server carried by the wheel-foot composite robot and a map is established, where n is a one-dimensional array representing the coordinates of the target locations the robot must reach.
Step 2: with the robot's initial position as the coordinate origin O, i.e. [0, 0, 0], a three-dimensional rectangular coordinate system is established; each target location has coordinates [x_m, y_m, z_m], m ∈ N*, where m indexes the different target locations.
Step 3: calculate the distance matrix E. The processing module calculates the cost F_i the robot must pay to reach each target location; F_i forms the elements e_{i,m} of the distance matrix E, with i, m ∈ N*, and the set of target-location coordinates acquired in Step 1 serves as the row index of E. Each element e_{i,m} of E is thus represented as:

[[x_m, y_m, z_m], F_i]    (1)
In formula (1), the cost F_i mainly comprises the following parts:
first, the total journey of the robot moving toward the target location in the unstructured environment, and the time spent moving;
second, the electric energy consumed by the robot's motion and the energy lost to friction between interacting mechanical parts.
The above factors accumulate continuously as the robot moves toward the target location, producing the cost F:

[equation image: accumulation of the above journey, time and energy terms into the cost F_i]
And 4, randomly generating an initial route initial _ path which can directly reach the target position from the current position of the robot by the operation module according to the coordinates of each position acquired by the 3D color camera.
Step 5: the operation module calculates the cost F the robot must pay along initial_path.
Step 6: set the fitness function f = ε^(-1) and let F_opt = F · exp(f); the operation module computes the optimized cost value F_opt.
Step 7: the operation module judges whether the total cost F_opt consumed by the robot while moving falls within the desired range, i.e. whether F_opt < F holds. If so, iteration stops and the route is generated; otherwise steps 4, 5 and 6 are repeated and optimization iterates until the cost converges into the ideal range.
Wherein f ═ epsilon in step 5-1In (e), ∈ is a guidance index for path planning by the robot, and is represented by equation (2):
Figure BDA0003170121060000102
In formula (2), ε depends on the road damage degree ρ_l, the mass M of the carried materials, and the demand degree Q of the specific disaster area for those materials. The material mass M is calculated from the pressure-sensor data U:

[equation image (3): M computed from the pressure-sensor data U]
The road damage degree ρ_l is obtained from the highway asphalt-pavement damage grade. The demand degree Q of a specific disaster area for the materials the robot carries produces different rescue effects for different disaster-area groups, so the material type T_i can express the rescue capability P; the relation between the two is P = Q(T_i).
The asphalt-pavement damage grades comprise four main classes:
I: crack class, mainly comprising cracks, block cracks, longitudinal cracks and transverse cracks;
II: loosening class, mainly comprising potholes and loosening;
III: deformation class, mainly comprising subsidence, ruts and waves/upheavals;
IV: other classes, mainly comprising oil bleeding, patching, frost heave and mud pumping.
The damage degrees ρ_l, l ∈ [I, IV], corresponding to the four asphalt-pavement damage grades differ, and the road damage degree increases with the damage grade.
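The grade-to-degree mapping can be encoded as a simple lookup. The patent fixes only the ordering ρ_I < ρ_II < ρ_III < ρ_IV, so the numeric values below are assumptions for illustration:

```python
# Illustrative damage-degree lookup: the patent fixes only the ordering
# rho_I < rho_II < rho_III < rho_IV; these numeric values are assumed.
DAMAGE_DEGREE = {"I": 0.1, "II": 0.3, "III": 0.6, "IV": 0.9}

def worst_damage(classes):
    """Return the damage degree of the worst pavement class observed on a road."""
    return max(DAMAGE_DEGREE[c] for c in classes)

print(worst_damage(["I", "III"]))   # 0.6
```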
Further, during outdoor autonomous navigation the wheel-foot composite robot grasps objects with its mechanical arm and identifies their features: the 3D color camera acquires the object's contour and size data and transmits them to the server, and the server stores the camera information. The specific steps are as follows:
step 1.1, the wheel-foot composite robot acquires road condition information around the wheel-foot composite robot by using a 3D color camera and transmits the road condition information to a server.
Step 1.2: the processing module visualizes the obstacle's features with the RVIZ visualization platform; the operation module calculates the distance l between the robot's position and the obstacle from the depth information among those features and transmits it to the control module, which then sends a control instruction to the motor.
Step 1.3: after receiving the control instruction transmitted by the control module, the motor adjusts the linkage mechanism.
Further, the 3D color camera shoots a 3D image of the object and transmits the image information to the server; the processing module in the server extracts the texture features of the object image acquired by the 3D color camera with OpenCV, specifically as follows:
step 2.1, reading an original picture: firstly, an object picture file stored in a server is input to a processing module, and the processing module converts the object picture file into processable data by utilizing OpenCV.
Step 2.2, converting the color space of the picture: converting the BGR color into the HSV color:
R' = R/255, G' = G/255, B' = B/255 (2-1)
Cmax=max(R',G',B'),Cmin=min(R',G',B') (2-2)
ΔC=Cmax-Cmin (2-3)
In formulas (2-1), (2-2) and (2-3), R, G and B denote the red, green and blue channels; R', G', B' represent the normalized color information, i.e. the R, G, B color ranges rescaled to the interval 0-1.
The calculation of H (hue) is divided into the following four cases, depending on ΔC and on which channel attains Cmax; they are given by formulas (2-4), (2-5), (2-6) and (2-7):
H = 0, ΔC = 0 (2-4)
H = 60° × (((G' − B')/ΔC) mod 6), Cmax = R' (2-5)
H = 60° × ((B' − R')/ΔC + 2), Cmax = G' (2-6)
H = 60° × ((R' − G')/ΔC + 4), Cmax = B' (2-7)
The calculation of S (saturation) is divided into the following two cases according to the value of ΔC, given by formulas (2-8) and (2-9):
S = 0, Cmax = 0 (2-8)
S = ΔC / Cmax, Cmax ≠ 0 (2-9)
the V (brightness) parameter indicates the brightness of the color, ranging from 0 to 1, and V ═ max (B, G, R)
Step 2.3, acquiring a binary image, specifically executing according to the following steps:
Step 2.3.1: firstly, a processing module in the server randomly generates an HSV threshold h1.
Step 2.3.2: next, using h1 as the partition condition, pixels whose gray level exceeds the threshold are set to 255 and those below it to 0.
Step 2.3.3: finally, the threshold is adjusted by observing the display effect of the binarized object picture in real time. If the segmentation is not distinct, i.e. part of the object still blends with the color of the surrounding environment, the processing module repeats steps 2.3.1-2.3.2; if the segmentation is distinct, i.e. the object has a clear color boundary with the surrounding area, the server stops randomly generating the HSV threshold h1.
Step 2.4, the processing module performs an erosion operation on the binarized object picture, i.e. merges connected irregular regions (concavities and protrusions) in the object picture into a complete image.
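Steps 2.3–2.4 correspond to OpenCV's `cv2.threshold` and `cv2.erode`; the following pure-NumPy sketch mimics both operations for illustration (the function names are assumptions, and erosion is shown as a minimum filter over a square neighborhood, which is what `cv2.erode` computes with an all-ones kernel):

```python
import numpy as np

def binarize(gray, h1):
    """Step 2.3.2: gray levels above the threshold h1 become 255, others 0."""
    return np.where(gray > h1, 255, 0).astype(np.uint8)

def erode(binary, k=3):
    """Minimum filter over a k-by-k neighborhood (edge-padded), a
    stand-in for cv2.erode with an all-ones kernel."""
    pad = k // 2
    padded = np.pad(binary, pad, mode="edge")
    out = np.empty_like(binary)
    rows, cols = binary.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = padded[i:i + k, j:j + k].min()
    return out
```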

Claims (6)

1. The outdoor high-precision autonomous navigation device of the wheel-foot composite robot is characterized by comprising a 3D color camera, a motor, a server, a pressure sensor, GPS equipment, a wheel-foot composite robot body and a walking mechanism;
the 3D color camera is used as scanning equipment and used for capturing object information; the server comprises a control module and an operation module: the control module is used for receiving and transmitting instructions and data with the scanning equipment and the GPS equipment, and the operation module is used for carrying out integrated calculation on the data; the wheel-foot composite robot judges and executes tasks through the object information captured by the 3D color camera in the autonomous navigation process; the wheel-foot composite robot grabs an object by using a mechanical arm; the motor adjusts a steering wheel of the wheel-foot composite robot chassis and moves toward the object at the speed specified by an instruction sent by the server; the wheel-foot composite robot switches between different walking modes according to the specific environment by using a connecting rod mechanism in the moving process;
the walking mechanism of the wheel-foot composite robot comprises a crank (1), a connecting rod driving motor (2), a frame (3), a rotating joint (4), a connecting rod (5), a crawling grounding part (6), a rocker (7), a motor-driven driving wheel (8), a folding rod (9) and a telescopic cylinder (10); the connecting rod driving motor (2) is arranged on the frame (3) and is rotationally connected with one end of the crank (1); the other end of the crank (1) is rotationally connected with one end of the connecting rod (5), the other end of the connecting rod (5) is rotationally connected with the rotating joint (4), the rotating joint (4) is rotationally connected with the crawling grounding part (6) and one end of the rocker (7), and the other end of the rocker (7) is rotationally connected with the frame (3); rotation of the connecting rod driving motor (2) drives the rotating joint (4) to move, thereby driving the connecting rod (5) and the rocker (7) and moving the crawling grounding part (6) up and down; the upper end of a support rod is rotationally connected to the frame (3) where the rocker (7) is attached, the motor-driven driving wheel (8) is arranged at the lower end of the support rod, the driving wheel (8) is connected with the folding rod (9), the telescopic cylinder (10) is arranged on the folding rod (9), and the extension and retraction of the telescopic cylinder (10) drives the support rod and the folding rod (9) to move up and down as a whole.
2. The outdoor high-precision autonomous navigation device of the wheel-foot composite robot according to claim 1, characterized in that the crawling grounding part (6) is a flat plate lying in the same horizontal plane as the motor-driven driving wheel (8); in the link mechanism formed by the crank (1), the connecting rod (5) and the rocker (7), the length of the longest rod member must be less than the sum of the lengths of the remaining link members.
3. The outdoor high-precision autonomous navigation device of the wheel-foot composite robot according to claim 1, wherein when the robot encounters gully or crack terrain, the 3D color camera scans the width parameter of the crack in the road and transmits it to the server, which performs the calculation and compares the result with the wheel diameter; the server sends a control command to the connecting rod driving motor (2) according to the calculation result, and the connecting rod driving motor (2) drives the crank (1), the connecting rod (5) and the rocker (7) to rise, so that the crawling grounding part (6) moves upward; meanwhile, expanding the piston inside the telescopic cylinder (10) applies a downward force to the folding rod (9), lowering the motor-driven driving wheel (8); retracting the piston inside the telescopic cylinder (10) moves the folding rod (9) upward to lift the motor-driven driving wheel (8), and the crawling grounding part (6) is used for crossing obstacles.
4. The outdoor high-precision autonomous navigation method of the wheel-foot composite robot, realized by the outdoor high-precision autonomous navigation device of the wheel-foot composite robot according to any one of claims 1 to 3, wherein an optimal path is planned by applying an improved simulated annealing algorithm, the method comprising the following steps:
step 1, the wheel-foot composite robot receives a target-location data set X[n] through the GPS equipment and stores it in the server, and a map is established based on X[n], where n is a one-dimensional array; taking the robot's initial position as the coordinate origin O, i.e. [0, 0, 0], a three-dimensional rectangular coordinate system is established, and the coordinates of every target location the robot must reach form a set, each target location having coordinates [x_m, y_m, z_m], m ∈ N*, where m indexes the different target locations;
step 2, calculating the distance matrix E: the operation module computes the cost F_i the robot must pay between each pair of adjacent locations and takes F_i, i ∈ N*, as the column vectors of the distance matrix E, with the set of location coordinates acquired in step 1 as the row vectors;
step 3, according to the robot's current position and the target-location information acquired by the GPS equipment, the operation module randomly generates a route from the current position to the target location; the route is composed of several locations [x_m, y_m, z_m], m ∈ N*, from the coordinate set of step 1, and is denoted initial_path;
step 4, the operation module calculates the total cost value F of initial_path:
F = Σ F_i
wherein i ∈ initial_path;
step 5, setting the fitness function f = ε − 1, with F > 0 always holding, and processing the total cost value F:
F_opt = F · exp(f) (1-1)
A smaller optimized cost is obtained after fusion in the manner shown in formula (1-1), and the optimized cost value and its corresponding route are then stored on the server;
setting a guiding factor for the robot to carry out path planning:
[Guiding-factor formula — the original equation image is not reproduced here; per the text it depends on ρ_l, M and Q.]
the guiding factor depends on the road damage degree ρ_l, the mass M of the materials carried, and the demand degree Q of the specific disaster area for the materials; the material weight M is calculated from the pressure sensor data U:
[Formula relating the material weight M to the pressure sensor data U — the original equation image is not reproduced here.]
the materials carried by the robot produce different rescue effects for different disaster-area groups, so the material type T_i can express the rescue ability P; that is, the demand degree Q for the materials can be expressed as:
P = Q(T_i) (1-3)
the road damage degree ρ_l can be obtained from the highway asphalt pavement damage grade, which is divided into four classes:
Ⅰ is the crack class, mainly comprising cracks, block cracks, longitudinal cracks and transverse cracks;
Ⅱ is the loosening class, mainly comprising potholes and loosening;
Ⅲ is the deformation class, mainly comprising subsidence, ruts and corrugation;
Ⅳ is the other class, mainly comprising oil bleeding, repairs, frost heave and mud pumping;
the damage degrees ρ_l, l ∈ [Ⅰ, Ⅳ], corresponding to the four damage grades of highway asphalt pavement show an increasing trend: ρ_Ⅳ > ρ_Ⅲ > ρ_Ⅱ > ρ_Ⅰ.
Step 6, the operation module in the server checks whether the optimized total cost value F_opt paid by the robot along the optimized route falls within the desired range, i.e. whether F_opt < F; if so, iteration stops and the route is generated; otherwise, the process of step 3 is repeated until the cost converges within the ideal range.
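Steps 3–6 of the claimed method can be sketched as a random-restart search in which each candidate route's cost is scaled by exp(f) per formula (1-1). This is a simplified stand-in for the improved simulated annealing algorithm (no temperature schedule), and all parameter names are illustrative assumptions:

```python
import math
import random

def route_cost(route, dist):
    """Step 4: total cost F of a route as the sum of adjacent-leg costs F_i."""
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

def plan_route(n_sites, dist, epsilon=0.5, max_iter=500, seed=1):
    """Steps 3-6 (sketch): draw random routes, scale each cost by exp(f)
    with f = epsilon - 1, and keep a route only when its scaled cost
    F_opt improves on the best seen so far."""
    f = epsilon - 1.0                      # f in (-1, 0), so exp(f) < 1
    rng = random.Random(seed)
    sites = list(range(n_sites))
    best = sites[:]
    rng.shuffle(best)                      # step 3: random initial_path
    best_cost = route_cost(best, dist) * math.exp(f)   # formula (1-1)
    for _ in range(max_iter):              # step 6: iterate until converged
        cand = sites[:]
        rng.shuffle(cand)
        cost = route_cost(cand, dist) * math.exp(f)
        if cost < best_cost:               # accept only improving routes
            best, best_cost = cand, cost
    return best, best_cost
```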
5. The outdoor high-precision autonomous navigation method of the wheel-foot composite robot according to claim 4, wherein during the outdoor autonomous navigation of the wheel-foot composite robot the mechanical arm is used to grab the object to be recognized, the object features are acquired by the 3D color camera and transmitted to the server, and the server stores and visualizes the 3D color camera information, with the following specific steps:
step 1.1, the 3D color camera acquires the contour, size and depth information of the object grabbed by the wheel-foot composite robot with its mechanical arm and uploads the information to the server;
step 1.2, the processing module acquires the characteristic parameters of the obstacle using the RVIZ visualization platform and sends them to the operation module; the operation module calculates the distance l between the robot's position and the obstacle and returns it to the server, and the control module of the server sends a control instruction to the motor;
step 1.3, the motor adjusts the link mechanism after receiving the control instruction from the control module.
6. The outdoor high-precision autonomous navigation method of the wheel-foot composite robot according to claim 4, characterized in that the 3D color camera shoots a 3D image of an object and transmits the image information to the processing module; the processing module converts the 3D image into a 2D image and extracts the texture features of the object image using OpenCV, specifically comprising the following steps:
step 2.1, reading the original picture: firstly, the picture file is read from disk into memory and converted into data that OpenCV (an open-source computer vision library) can process;
step 2.2, converting the color space of the picture: converting the BGR color into the HSV color:
R' = R/255, G' = G/255, B' = B/255 (2-1)
Cmax=max(R',G',B'),Cmin=min(R',G',B') (2-2)
ΔC=Cmax-Cmin (2-3)
wherein R, G and B denote the red, green and blue channels; R', G', B' represent the normalized color information, i.e. the R, G, B color ranges rescaled to the interval 0-1;
the calculation of H is divided into the following four cases, depending on ΔC and on which channel attains Cmax; they are given by formulas (2-4), (2-5), (2-6) and (2-7):
H = 0, ΔC = 0 (2-4)
H = 60° × (((G' − B')/ΔC) mod 6), Cmax = R' (2-5)
H = 60° × ((B' − R')/ΔC + 2), Cmax = G' (2-6)
H = 60° × ((R' − G')/ΔC + 4), Cmax = B' (2-7)
the calculation of S is divided into the following two cases according to the value of ΔC, given by formulas (2-8) and (2-9):
S = 0, Cmax = 0 (2-8)
S = ΔC / Cmax, Cmax ≠ 0 (2-9)
the V parameter indicates the brightness of the color, ranging from 0 to 1, with V = max(R', G', B') = Cmax;
Step 2.3, acquiring a binary image, specifically executing according to the following steps:
step 2.3.1: firstly, a processing module in the server randomly generates an HSV threshold h1;
step 2.3.2: secondly, using the threshold h1 as the partition condition, pixels whose gray level exceeds the threshold are set to 255 and those below it to 0;
step 2.3.3: finally, the threshold is adjusted by observing the display effect of the binarized object picture in real time; if the segmentation is not distinct, i.e. part of the object still blends with the color of the surrounding environment, the processing module repeats steps 2.3.1-2.3.2; if the segmentation is distinct, i.e. the object has a clear color boundary with the surrounding area, the server stops randomly generating the HSV threshold h1;
step 2.4, the processing module performs an erosion operation on the binarized object picture, i.e. merges connected irregular regions (concavities and protrusions) in the object picture into a complete image.
CN202110815973.0A 2021-07-20 2021-07-20 Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot Active CN113566830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110815973.0A CN113566830B (en) 2021-07-20 2021-07-20 Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot

Publications (2)

Publication Number Publication Date
CN113566830A true CN113566830A (en) 2021-10-29
CN113566830B CN113566830B (en) 2023-09-26

Family

ID=78165568

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110815973.0A Active CN113566830B (en) 2021-07-20 2021-07-20 Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot

Country Status (1)

Country Link
CN (1) CN113566830B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4408982C1 (en) * 1994-03-16 1995-05-18 Deutsche Forsch Luft Raumfahrt Autonomous navigation system for mobile robot or manipulator
CA2950791A1 (en) * 2013-08-19 2015-02-26 State Grid Corporation Of China Binocular visual navigation system and method based on power robot
CN111055281A (en) * 2019-12-19 2020-04-24 杭州电子科技大学 ROS-based autonomous mobile grabbing system and method
CN111634345A (en) * 2020-05-29 2020-09-08 广西科技大学 High-adaptability walking mechanism of wheel-foot type mobile robot
CN112389571A (en) * 2019-07-31 2021-02-23 沈阳工业大学 Wheel-foot robot based on deformable auxiliary balance system and differential driving

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
W. Cui et al., "A robust mobile robot indoor positioning system based on Wi-Fi", International Journal of Advanced Robotic Systems *
Zhang Shijun; Xing Yan; Hu Yong: "Wheel-foot composite obstacle-crossing gait planning method for robots on obstacle terrain", Navigation and Control, no. 06 *

Also Published As

Publication number Publication date
CN113566830B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN108647646B (en) Low-beam radar-based short obstacle optimized detection method and device
Saska et al. Documentation of dark areas of large historical buildings by a formation of unmanned aerial vehicles using model predictive control
CN111360780A (en) Garbage picking robot based on visual semantic SLAM
Yoneda et al. Urban road localization by using multiple layer map matching and line segment matching
CN107917710A (en) A kind of positioning in real time of the interior based on single line laser and three-dimensional map construction method
CN112819943B (en) Active vision SLAM system based on panoramic camera
US10196104B1 (en) Terrain Evaluation for robot locomotion
EP3931657B1 (en) System and method for surface feature detection and traversal
CN111721279A (en) Tail end path navigation method suitable for power transmission inspection work
CN112405490A (en) Flexible assembly robot with autonomous navigation and positioning functions
CN113096190A (en) Omnidirectional mobile robot navigation method based on visual map building
CN112819766A (en) Bridge defect overhauling method, device, system and storage medium
Bartoszyk et al. Terrain-aware motion planning for a walking robot
CN113671522B (en) Dynamic environment laser SLAM method based on semantic constraint
CN114353799A (en) Indoor rapid global positioning method for unmanned platform carrying multi-line laser radar
CN113566830A (en) Outdoor high-precision autonomous navigation device and method for wheel-foot composite robot
Ali et al. Development of an autonomous robotics platform for road marks painting using laser simulator and sensor fusion technique
CN111401337A (en) Lane following exploration mapping method, storage medium and robot
CN112731918B (en) Ground unmanned platform autonomous following system based on deep learning detection tracking
Wallace Robot road following by adaptive color classification and shape tracking
CN114967722A (en) Method for automatically crossing step obstacle of rocker arm type motorized platform
Bellone et al. Pavement distress detection and avoidance for intelligent vehicles
Wang et al. Obstacle detection and obstacle-surmounting planning for a wheel-legged robot based on Lidar
CN118023799B (en) Autonomous welding operation and planning method of humanoid welding robot based on camera
CN213999460U (en) Flexible assembly robot with autonomous navigation and positioning functions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant