CN117553788A - Robot obstacle avoidance navigation method and robot - Google Patents

Robot obstacle avoidance navigation method and robot

Info

Publication number
CN117553788A
CN117553788A
Authority
CN
China
Prior art keywords
robot
distance
coordinates
delta
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311261740.6A
Other languages
Chinese (zh)
Inventor
陆文涛
么晶明
闵桂元
饶毅
孙宁
张宗江
刘国亮
王炳南
张帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CITIC HIC Kaicheng Intelligence Equipment Co Ltd
Original Assignee
CITIC HIC Kaicheng Intelligence Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CITIC HIC Kaicheng Intelligence Equipment Co Ltd filed Critical CITIC HIC Kaicheng Intelligence Equipment Co Ltd
Priority to CN202311261740.6A priority Critical patent/CN117553788A/en
Publication of CN117553788A publication Critical patent/CN117553788A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/43Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a robot obstacle avoidance navigation method and a robot. The method comprises the following steps: S1, acquiring the initial position coordinates of the robot and the coordinates of the navigation point; S2, generating a straight-line route according to the coordinates and arranging a plurality of filling points on the route at intervals; S3, controlling the robot to move along the filling points in sequence while continuously judging the distance between the robot and the obstacle during movement; S4, setting a safety distance between the robot and the obstacle; S5, when the robot comes within the safety distance, adding a new filling point and moving from the robot's current position to the newly formed filling point. The robot is provided with a vehicle-mounted base station (RTK receiver), a depth camera, a controller, an embedded controller, a single-chip microcomputer and a power supply system on a mobile chassis. By combining RTK positioning technology with this method, high-precision robot navigation and obstacle avoidance can be achieved.

Description

Robot obstacle avoidance navigation method and robot
Technical Field
The invention relates to the technical field of equipment navigation and positioning, in particular to a robot obstacle avoidance navigation method and a robot.
Background
With the increasing complexity of modern rescue work, situations in which personal safety is threatened frequently arise during rescue operations, and whether to allow rescuers to venture into such situations is a persistently difficult decision. Across the country, rescue and relief problems of many forms, such as forest fires, debris flows, floods and earthquakes, are faced every year. Traffic and transportation during rescue are crucial to safeguarding people's lives and property. To improve the efficiency and safety of rescue work, the invention provides a robot obstacle avoidance navigation method and a robot. RTK positioning is a real-time kinematic positioning technology based on carrier-phase observations; it can provide the three-dimensional position of a station in a specified coordinate system in real time with centimeter-level accuracy, but it cannot by itself navigate around obstacles in the field environment. The invention addresses the above problems.
Disclosure of Invention
To solve the above problems, the present application provides a robot obstacle avoidance navigation method, which comprises the following steps:
S1, acquiring the initial position coordinates of the robot and the coordinates of the navigation point;
S2, generating a straight-line route according to the coordinates and arranging a plurality of filling points on the route at intervals;
S3, controlling the robot to move along the filling points in sequence, and continuously judging the distance between the robot and the obstacle during movement;
S4, setting a safety distance between the robot and the obstacle;
S5, when the robot comes within the safety distance, adding a new filling point, and moving from the robot's current position to the newly formed filling point;
wherein the newly formed filling point is obtained as follows:
next_waypoint_x = x + forward_distance * cos(steering_angle);
next_waypoint_y = y + forward_distance * sin(steering_angle);
wherein x and y are the coordinates of the robot's current position, steering_angle is the steering angle, forward_distance is the distance the robot advances per second, and next_waypoint_x and next_waypoint_y are the coordinates of the newly added filling point;
steering_angle = atan2(distance, safety_distance);
wherein safety_distance is the safety distance, and distance is the distance between the robot and the obstacle;
S6, repeating steps S3 to S5 until the navigation point is reached.
Wherein preferably the robot carries a binocular camera;
the method for judging the distance between the robot and the obstacle is as follows: two front-view images are obtained through the binocular camera; corner points in the two images are detected with the FAST algorithm; descriptors of the corner points are computed through the ORB algorithm; the corner points in the two images are matched according to the descriptors; the disparity between the pixels of corresponding corner points in the two images is calculated; and the distance between the robot and the obstacle is calculated from the disparity, the camera parameters and the baseline length.
Preferably, the method further comprises step S5-1: after the newly formed filling point is reached, a perpendicular is dropped from the coordinates of the newly formed filling point to the straight-line path to obtain an intersection point, and the robot is controlled to move to the filling point next after the intersection point on the straight-line path.
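As a minimal sketch of step S5-1 (the function name and argument layout are assumptions, not part of the patent), the intersection point can be found by projecting the new filling point perpendicularly onto the straight-line path:

```python
import math

def rejoin_index(px, py, x0, y0, delta_x, delta_y, lp):
    """Project the newly formed filling point (px, py) perpendicularly
    onto the straight-line path from (x0, y0) along (delta_x, delta_y),
    and return the index of the next filling point past the foot of
    the perpendicular (the 'intersection point' of step S5-1)."""
    length = math.hypot(delta_x, delta_y)
    # Arc length of the perpendicular foot along the path (dot product
    # of the point offset with the unit direction of the path).
    t = ((px - x0) * delta_x + (py - y0) * delta_y) / length
    # Filling points are spaced lp apart; take the next one after t.
    return math.floor(t / lp) + 1
```

For a path from (0, 0) to (10, 0) with spacing 2, a detour point at (5, 1) projects to arc length 5, so the robot rejoins at the third filling point.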
Wherein preferably the coordinates of the filling points are obtained according to the following method:
P1, set the coordinates of the navigation point as (x_n, y_n) and the coordinates of the initial position as (x_0, y_0), and calculate the relative coordinates delta_x and delta_y:
relative x coordinate: delta_x = x_n - x_0;
relative y coordinate: delta_y = y_n - y_0;
P2, calculate the length between the navigation point and the initial position: L = sqrt(delta_x^2 + delta_y^2);
P3, set the interval between filling points to Lp; the number of filling points is N = L/Lp;
P4, the coordinates of the M-th filling point are delta_x_fill = (delta_x/N)*M and delta_y_fill = (delta_y/N)*M.
A robot controlled by the obstacle avoidance navigation method is provided with a vehicle-mounted base station (RTK receiver), a controller, an embedded controller, a single-chip microcomputer and a power supply system.
The beneficial effects of the invention are as follows. Compared with traditional single-sensor navigation methods, the method achieves more accurate robot positioning and therefore a more accurate navigation path. A depth camera is adopted to acquire more comprehensive and accurate environmental information; by fusing and processing these data, environmental features such as obstacles and roads can be better identified, enabling efficient obstacle avoidance navigation. An optimal path is calculated from the real-time environment data and the target position, achieving more intelligent path planning and navigation. Meanwhile, when the robot cannot reach the designated position, the method recalculates the optimal path, providing more flexible navigation control.
Detailed Description
The following is a clear and complete description of the technical solution in the embodiments of the present invention. Numerous specific details are set forth to provide a thorough understanding of the invention; however, the invention may be practiced in ways other than those described herein, and persons skilled in the art will readily appreciate that the invention is not limited to the specific embodiments disclosed below.
The robot comprises a vehicle-mounted base station (RTK receiver), a depth camera, a controller, an embedded controller, a single-chip microcomputer, a mobile chassis and a power supply system. The depth camera acquires depth image data of the environment in real time and extracts the position and shape of obstacles through an image processing algorithm. The vehicle-mounted base station (RTK receiver) acquires the accurate position and attitude of the navigation equipment in real time, providing high-precision positioning data through RTK technology. The embedded controller is connected with the single-chip microcomputer, which in turn is connected with the controller and the four-wheel-drive crawler-type mobile chassis; route planning and obstacle avoidance decisions are made by the navigation algorithm from the data provided by the depth camera and the vehicle-mounted base station (RTK receiver), realizing autonomous navigation and obstacle avoidance of the navigation equipment. The controller drives the mobile chassis according to the instructions of the embedded controller: after receiving an instruction, it adjusts the speed, direction and attitude of the chassis and sends the corresponding driving signals to start walking; when the embedded controller issues a stop instruction, the controller immediately cuts off the driving signals to stop the motion of the mobile chassis.
The embedded controller is the brain of the robot and is responsible for the robot's motion control;
during navigation, the depth camera module acquires depth image data of the environment in real time and transmits it to the embedded controller. The embedded controller analyzes the depth images with computer vision algorithms, identifying and locating obstacles on the navigation path. It then adjusts the navigation path in real time according to the output of the image processing module, avoiding detected obstacles and ensuring the safety and stability of navigation.
The power supply module provides working power for the vehicle-mounted base station (RTK receiver), the depth camera, the controller, the embedded controller, the single-chip microcomputer and the mobile chassis.
Preferably, the depth camera is a binocular camera: using a binocular vision system, it captures the left and right views of a scene simultaneously and achieves three-dimensional perception of objects by computing disparity and depth information.
Preferably, the vehicle-mounted base station of the invention combines global navigation satellite system (GNSS) signals with real-time kinematic (RTK) technology; the robot can acquire its accurate position and attitude in real time, achieving accurate navigation in open areas such as construction sites, farmland and fields, with positioning accuracy at the centimeter level.
The robot obstacle avoidance navigation method of the robot comprises the following steps:
S1, acquiring the initial position coordinates of the robot and the coordinates of the navigation point;
S2, generating a straight-line route according to the coordinates and arranging a plurality of filling points on the route at intervals;
the coordinates of the filling points are obtained according to the following method:
P1, set the coordinates of the navigation point as (x_n, y_n) and the coordinates of the initial position as (x_0, y_0), and calculate the relative coordinates delta_x and delta_y:
relative x coordinate: delta_x = x_n - x_0;
relative y coordinate: delta_y = y_n - y_0;
P2, calculate the length between the navigation point and the initial position: L = sqrt(delta_x^2 + delta_y^2);
P3, set the interval between filling points to Lp; the number of filling points is N = L/Lp;
P4, the coordinates of the M-th filling point are delta_x_fill = (delta_x/N)*M and delta_y_fill = (delta_y/N)*M;
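The filling-point construction of steps P1 to P4 can be sketched as follows; the function name is an assumption, and the relative offsets are shifted back to absolute coordinates for clarity:

```python
import math

def fill_points(x0, y0, xn, yn, lp):
    """Generate evenly spaced filling points on the straight line from
    the initial position (x0, y0) to the navigation point (xn, yn),
    spaced lp apart (steps P1-P4)."""
    delta_x = xn - x0                              # P1: relative x coordinate
    delta_y = yn - y0                              # P1: relative y coordinate
    length = math.sqrt(delta_x**2 + delta_y**2)    # P2: straight-line length
    n = max(1, int(length / lp))                   # P3: number of filling points
    # P4: the M-th filling point as an offset (delta/N)*M from the start,
    # expressed here in absolute coordinates.
    return [(x0 + (delta_x / n) * m, y0 + (delta_y / n) * m)
            for m in range(1, n + 1)]
```

For example, from (0, 0) to (10, 0) with spacing 2 this yields five points ending at the navigation point itself.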
S3, controlling the robot to move along the filling points in sequence, and continuously judging the distance between the robot and the obstacle during movement;
specifically, the method for judging the distance between the robot and the obstacle is as follows:
S3-1, obtaining the two front-view images through the binocular camera;
S3-2, detecting corner points in the two images with the FAST algorithm: a pixel is first selected as the center pixel, 16 pixel points are then taken on the circle of radius 3 around the center pixel, and the gray-value difference between each of the 16 pixel points and the center pixel is calculated.
S3-3, if there are 12 consecutive circle pixels whose gray values differ from that of the center pixel by more than the threshold of 25, the center pixel is considered a corner point.
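Steps S3-2 and S3-3 together form the FAST segment test. A minimal sketch, assuming a grayscale image stored as a list of rows and the standard 16-pixel Bresenham circle of radius 3:

```python
# Offsets of the 16 pixels on a radius-3 Bresenham circle around a candidate.
CIRCLE16 = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
            (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast_corner(img, x, y, threshold=25, arc=12):
    """Return True if the pixel at (x, y) passes the FAST segment test:
    at least `arc` contiguous circle pixels are all brighter or all
    darker than the center by more than `threshold`."""
    center = img[y][x]
    # Classify each circle pixel: +1 brighter, -1 darker, 0 similar.
    signs = []
    for dx, dy in CIRCLE16:
        p = img[y + dy][x + dx]
        if p > center + threshold:
            signs.append(1)
        elif p < center - threshold:
            signs.append(-1)
        else:
            signs.append(0)
    # Look for `arc` contiguous equal non-zero signs; doubling the list
    # also catches arcs that wrap around the circle.
    doubled = signs + signs
    run, prev = 0, 0
    for s in doubled:
        if s != 0 and s == prev:
            run += 1
        else:
            run, prev = (1 if s != 0 else 0), s
        if run >= arc:
            return True
    return False
```

This is only the detection criterion; a practical detector would add non-maximum suppression over the corner response.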
S3-4, matching the corner points in the two images through the ORB algorithm and calculating the disparity from the matched corner points and their positions in the images. Specifically, the descriptor of each corner point is computed with the BRIEF algorithm; whether corner points in the two images match is judged by the Hamming distance between their descriptors; the disparity between the pixels of corresponding corner points in the two images is calculated; and the distance to the obstacle is calculated from the disparity, the camera parameters and the baseline length;
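The final distance computation of step S3-4 follows the standard pinhole stereo relation Z = f*B/d. A minimal sketch, where the focal length in pixels and the baseline in meters are assumed camera parameters (the patent's "polar line length" is read here as the stereo baseline):

```python
def stereo_distance(x_left, x_right, focal_px, baseline_m):
    """Distance from the camera to a matched corner point, using the
    standard pinhole stereo relation Z = f * B / d, where d is the
    disparity in pixels, f the focal length in pixels, and B the
    baseline between the two cameras in meters."""
    disparity = x_left - x_right   # pixel disparity between the two images
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front")
    return focal_px * baseline_m / disparity
```

With a 700-pixel focal length, a 0.1 m baseline and a 20-pixel disparity, the obstacle lies 3.5 m away.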
S4, setting a safety distance between the robot and the obstacle, for example one meter;
S5, when the robot comes within the safety distance, adding a new filling point and moving from the robot's current position to the newly formed filling point; after the newly formed filling point is reached, a perpendicular is dropped from its coordinates to the straight-line path to obtain an intersection point, and the robot is controlled to move to the filling point next after the intersection point on the straight-line path;
wherein the newly formed filling point is obtained as follows:
next_waypoint_x = x + forward_distance * cos(steering_angle);
next_waypoint_y = y + forward_distance * sin(steering_angle);
wherein x and y are the coordinates of the robot's current position, steering_angle is the steering angle, forward_distance is the distance the robot advances per second, and next_waypoint_x and next_waypoint_y are the coordinates of the newly added filling point;
steering_angle = atan2(distance, safety_distance);
wherein safety_distance is the safety distance, and distance is the distance between the robot and the obstacle;
S6, repeating steps S3 to S5 until the navigation point is reached.
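The detour waypoint of step S5 can be sketched directly from the formulas above (the function name is an assumption):

```python
import math

def next_waypoint(x, y, distance, safety_distance, forward_distance):
    """Compute the detour filling point of step S5.  The steering angle
    is atan2(distance, safety_distance): the closer the obstacle
    (smaller `distance`), the smaller the angle and hence the sharper
    the deviation of the new waypoint from the current heading."""
    steering_angle = math.atan2(distance, safety_distance)
    nx = x + forward_distance * math.cos(steering_angle)
    ny = y + forward_distance * math.sin(steering_angle)
    return nx, ny
```

For example, when the obstacle distance equals the safety distance the steering angle is 45 degrees, so the new filling point lies diagonally off the current position.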
It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.

Claims (7)

1. The robot obstacle avoidance navigation method is characterized by comprising the following steps of:
s1, acquiring initial position coordinates of a robot and coordinates of navigation points;
s2, generating a straight line route according to the coordinates and arranging a plurality of filling points on the route at intervals;
s3, controlling the robot to sequentially move along the filling points, and continuously judging the distance between the robot and the obstacle in the moving process;
s4, setting a safety distance between the robot and the obstacle;
s5, when the robot moves to a safe distance, adding a new filling point, and moving from the current position of the robot to the newly formed filling point;
wherein the newly formed filling points are obtained as follows;
next_waypoint_x=x+forward_distance*cos(steering_angle);
next_waypoint_y=y+forward_distance*sin(steering_angle);
wherein x and y are coordinates of the current position of the robot, steering_angle is a steering angle, forward_distance is a forward distance of the robot per second, and next_waypoint_x/y is coordinates of a newly added filling point;
steering_angle=atan2(distance,safety_distance);
wherein, the safety_distance is a safe distance, and the distance is a distance between the robot and the obstacle;
s6, repeating the steps S3 to S5 until reaching the navigation point.
2. The method for robot obstacle avoidance navigation of claim 1 wherein,
the robot is provided with a binocular camera;
the method for judging the distance between the robot and the obstacle comprises the steps of respectively obtaining two images in front through a binocular camera, calculating angular points in the two images according to a Fast algorithm, calculating descriptors of the angular points through an ORB algorithm, matching the angular points in the two images, calculating parallax between pixels of corresponding angular points in the two images according to the descriptors, and calculating the distance between the robot and the obstacle according to the parallax, camera parameters and polar line length.
3. A robot obstacle avoidance navigation method as set forth in claim 1 or 2, wherein,
and S5-1, after the newly formed filling point is reached, connecting the coordinates of the newly formed filling point with the straight line path vertically to obtain an intersection point, and controlling the robot to move to the next filling point of the intersection point on the straight line path.
4. A robot obstacle avoidance navigation method as set forth in claim 3, wherein,
the coordinates of the filling points are obtained according to the following method:
P1, setting the coordinates of the navigation point as (x_n, y_n) and the coordinates of the initial position as (x_0, y_0), and calculating the relative coordinates delta_x and delta_y:
relative x coordinate: delta_x = x_n - x_0;
relative y coordinate: delta_y = y_n - y_0;
P2, calculating the length between the navigation point and the initial position: L = sqrt(delta_x^2 + delta_y^2);
P3, setting the interval between filling points to Lp, the number of filling points being N = L/Lp;
P4, the coordinates of the M-th filling point being delta_x_fill = (delta_x/N)*M and delta_y_fill = (delta_y/N)*M.
5. A robot controlled by a robot obstacle avoidance navigation method as claimed in any one of claims 1 to 4.
6. The robot of claim 5, wherein the robot is equipped with a vehicle-mounted base station, a controller, an embedded controller, a single-chip microcomputer, and a power supply system.
7. The robot of claim 6, wherein said in-vehicle base station is an RTK receiver.
CN202311261740.6A 2023-09-27 2023-09-27 Robot obstacle avoidance navigation method and robot Pending CN117553788A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311261740.6A CN117553788A (en) 2023-09-27 2023-09-27 Robot obstacle avoidance navigation method and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311261740.6A CN117553788A (en) 2023-09-27 2023-09-27 Robot obstacle avoidance navigation method and robot

Publications (1)

Publication Number Publication Date
CN117553788A true CN117553788A (en) 2024-02-13

Family

ID=89809911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311261740.6A Pending CN117553788A (en) 2023-09-27 2023-09-27 Robot obstacle avoidance navigation method and robot

Country Status (1)

Country Link
CN (1) CN117553788A (en)

Similar Documents

Publication Publication Date Title
CN106909145B (en) Real-time obstacle sensing and avoiding system and method for unmanned channel survey vessel
CN112518739B (en) Track-mounted chassis robot reconnaissance intelligent autonomous navigation method
CN107608346A (en) Ship intelligent barrier avoiding method and system based on Artificial Potential Field
CN108345005A (en) The real-time continuous autonomous positioning orientation system and navigation locating method of tunnelling machine
CN109901580A (en) A kind of unmanned plane cooperates with unmanned ground robot follows diameter obstacle avoidance system and its method
US11726501B2 (en) System and method for perceptive navigation of automated vehicles
CN107167139A (en) A kind of Intelligent Mobile Robot vision positioning air navigation aid and system
CN106444780A (en) Robot autonomous navigation method and system based on vision positioning algorithm
CN113189977B (en) Intelligent navigation path planning system and method for robot
RU2662913C2 (en) Method of robot localization in localization plane
CN111982114B (en) Rescue robot for estimating three-dimensional pose by adopting IMU data fusion
CN103926933A (en) Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle
CN103869824A (en) Biological antenna model-based multi-robot underwater target searching method and device
Pradeep et al. A wearable system for the visually impaired
CN106569225A (en) Range-finding sensor based real-time obstacle avoidance method of driveless car
EP3799618B1 (en) Method of navigating a vehicle and system thereof
Yamashita et al. Pedestrian navigation system for visually impaired people using HoloLens and RFID
CN115416047B (en) Blind assisting system and method based on multi-sensor four-foot robot
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
CN108544491A (en) A kind of moving robot obstacle avoiding method considering distance and two factor of direction
CN117553788A (en) Robot obstacle avoidance navigation method and robot
KR20210051030A (en) Apparatus and method for correcting bias in sensor
Sajeed et al. Vehicle lane departure estimation on urban roads using GIS information
CN105241466A (en) Three-dimensional navigation method and device for aircraft
Landa-Hernández et al. Cognitive guidance system for the blind

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination