CN112363491B - Robot turning control method and device

Robot turning control method and device

Info

Publication number: CN112363491B (application number CN201910670300.3A)
Authority: CN (China)
Prior art keywords: point, turning, robot, straight, route
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN112363491A
Inventor: 吴珺
Current assignee: Hangzhou Ezviz Software Co Ltd
Original assignee: Hangzhou Ezviz Software Co Ltd
Application filed by Hangzhou Ezviz Software Co Ltd
Events: priority to CN201910670300.3A; publication of CN112363491A; application granted; publication of CN112363491B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0217: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a robot turning control method, comprising the following steps: when an obstacle is detected on the first straight course, determining the distance between the current position and the obstacle; and determining, according to that distance and the course spacing, a turning route composed of arcs connecting a first turning point on the first straight course to a second turning point on the second straight course. The derivative of the first turning point on the first straight course is the same as on the turning route, and the derivative of the second turning point on the second straight course is the same as on the turning route. Because the camera's main viewing direction coincides with the robot's heading whenever the robot travels in a straight line, the robot's heading on the straight track between the two arc tracks also coincides with the camera's main direction during the U-turn, so part of the turning track still falls within the visible area. And because each of the four points on the turning route has the same derivative on both tracks it joins, the robot never needs to stop and rotate in place at any of them, so travel is smooth.

Description

Robot turning control method and device
Technical Field
The application relates to the field of robots, in particular to a robot turning control method and a robot turning control device.
Background
To clean the floor efficiently and reduce missed areas, current sweeping robots generally turn between lanes along an arc-shaped track. When turning around, the sweeping robot first stops and rotates 90 degrees in place, then follows the arc-shaped track until it reaches the next adjacent straight course, completing the U-turn with a single in-place rotation.
However, this arc-track U-turn still requires the robot to stop and rotate in place, so travel is jerky and operating efficiency is low.
Disclosure of Invention
In view of the above, the application provides a robot turning control method and device to solve the problems that the existing turning mode causes the robot to stall during travel and lowers operating efficiency.
According to a first aspect of an embodiment of the present application, there is provided a robot U-turn control method, including:
when the robot detects an obstacle on the first straight course, determining the distance S1 between the current position and the obstacle;
determining a turning route for turning from the first straight course to the second straight course according to the distance S1 and the course spacing S2 between the first and second straight courses;
wherein the turning route is composed of arcs connecting a first turning point on the first straight course to a second turning point on the second straight course, the derivative of the first turning point on the first straight course being the same as on the turning route, and the derivative of the second turning point on the second straight course being the same as on the turning route.
According to a second aspect of the embodiment of the present application, there is provided a robot u-turn control method, the method including:
The robot travels along the first straight course and, upon reaching the first turning point on the turning route determined by the method of the first aspect above, travels along the turning route to turn from the first straight course to the second straight course.
According to a third aspect of an embodiment of the present application, there is provided a robot turn-around control device, the device including:
a detection module, configured to detect an obstacle on the first straight course;
a processor, configured to determine, when an obstacle is detected on the first straight course, the distance S1 between the current position and the obstacle, and to determine the turning route from the first straight course to the second straight course according to the distance S1 and the course spacing S2 between the first and second straight courses; the turning route is composed of arcs connecting a first turning point on the first straight course to a second turning point on the second straight course, the derivative of the first turning point on the first straight course being the same as on the turning route, and the derivative of the second turning point on the second straight course being the same as on the turning route.
According to a fourth aspect of an embodiment of the present application, there is provided a robot turn-around control device, the device including:
a power-driving module, configured to control the robot to travel along the first straight course and, upon reaching the first turning point on the turning route determined by the method of the first aspect, to control the robot to travel along the turning route to turn from the first straight course to the second straight course.
By applying the embodiment of the application, the turning route is composed of a first arc track, a linear track and a second arc track. Because the main direction of the camera's visible range coincides with the robot's heading while it travels in a straight line, the robot's heading on the linear track between the two arc tracks coincides with the camera's main direction, and only the headings on the two arc tracks deviate from it; part of the turning track therefore still falls within the robot's visible area, reducing the probability of colliding with objects in blind zones. And because each of the four points on the turning route has the same derivative on both tracks it joins, the robot does not need to stop and rotate in place at any point, travel is smooth, and operating efficiency is high.
Drawings
FIG. 1A is a schematic view of the visual range of a camera according to an exemplary embodiment of the present application;
FIG. 1B is a schematic diagram of a turning route of a robot according to an exemplary embodiment of the present application;
FIG. 2A is a flow chart of an embodiment of a method for controlling robot turn around according to an exemplary embodiment of the present application;
FIG. 2B is a schematic illustration of a robot turn path according to the embodiment of FIG. 2A;
FIG. 2C is a schematic diagram of another robot turn route according to an exemplary embodiment of the present application;
FIG. 3A is a flowchart illustrating another robot turn around control method according to an exemplary embodiment of the present application;
FIG. 3B is a schematic diagram of a robot travel path according to an exemplary embodiment of the present application;
FIG. 3C is a schematic diagram of a robot travel route update according to the embodiment of FIG. 3A;
FIG. 3D is a schematic view of a track smoothing of an updated driving route according to the embodiment of FIG. 3C;
FIG. 4A is a flowchart illustrating yet another robot turn around control method according to an exemplary embodiment of the present application;
FIG. 4B is a schematic view of a currently visible area of a robot according to the embodiment of FIG. 4A;
FIG. 4C is a schematic view of the route optimization of the present application when a first turning point is blocked according to the embodiment shown in FIG. 4A;
FIG. 4D is a schematic view of the route optimization according to the present application when a first intermediate point is blocked, according to the embodiment shown in FIG. 4A;
FIG. 4E is a schematic view of route optimization when a second intermediate point is occluded according to the embodiment of FIG. 4A;
FIG. 4F is a schematic view of the route optimization of the present application when a second turning point is blocked according to the embodiment shown in FIG. 4A;
FIG. 5 is a block diagram of an embodiment of a robot U-turn control device according to an exemplary embodiment of the present application;
FIG. 6 is a block diagram of another robot U-turn control device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may be referred to as first information, without departing from the scope of the application. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
Take a sweeping robot equipped with a camera as an example. Unlike other environment-sensing sensors (such as lidar, infrared sensors and ultrasonic sensors), the camera serves as the robot's "eye". Because the camera's visible range is limited (the visible range is determined by the camera's field of view and visible distance), there is usually a blind area, and objects in the blind area are invisible to the sweeping robot at the current moment. As shown in fig. 1A, the shaded region is the visible range of the camera, i.e., the visible range of the robot; all other areas are blind areas, whose angle and depth are determined by the camera's field of view and its mounting position on the sweeping robot. As can be seen from fig. 1A, the camera has a main direction, and the farther from this main direction, the smaller the robot's visible range; while the robot travels in a straight line, the main direction of the visible range is the robot's heading.
To avoid the low travel efficiency caused by repeatedly stopping and steering in place during a single U-turn, as shown in fig. 1B, when the sweeping robot reaches a turning point it rotates 90 degrees in place and then follows an arc track to the next adjacent straight course, so the U-turn is completed with one in-place rotation. However, this arc-track U-turn still requires the robot to stop and steer in place, so travel is jerky and operating efficiency is low. Moreover, the direction of the whole turning track always deviates from the main direction and falls into the robot's blind area, so the probability of the sweeping robot colliding with an object in the blind area is high.
The bow-shaped (boustrophedon) sweep used by the sweeping robot refers to the following pattern: the robot sweeps along a straight course; at a turning point it rotates 90 degrees in place, travels sideways by one lane width, and rotates 90 degrees again so that its heading is opposite to that of the original straight course; it then continues to the next turning point.
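As a rough illustration (not part of the patent), the lane layout of such a bow-shaped sweep can be sketched as alternating straight courses spaced by the course distance; the function name and parameters below are our own:

```python
def boustrophedon_lanes(num_lanes, lane_spacing, lane_length):
    """Generate (start, end) points for a bow-shaped sweep.

    Lane i runs vertically at x = i * lane_spacing; the travel
    direction alternates so the end of one lane is adjacent to
    the start of the next (hypothetical helper for illustration).
    """
    lanes = []
    for i in range(num_lanes):
        x = i * lane_spacing
        if i % 2 == 0:                      # even lanes go "up"
            lanes.append(((x, 0.0), (x, lane_length)))
        else:                               # odd lanes go "down"
            lanes.append(((x, lane_length), (x, 0.0)))
    return lanes
```

Each transition between consecutive lanes is where the U-turn discussed in this application takes place.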
To solve the above problems and avoid the robot pausing to rotate in place during a U-turn, the application improves the robot turning control method.
Fig. 2A is a flowchart of an embodiment of a robot U-turn control method according to an exemplary embodiment of the present application, illustrating how a turning route is planned before the robot travels. As shown in fig. 2A, the robot U-turn control method includes the following steps:
step 201: when the robot detects that an obstacle exists on the first straight navigation line, the distance S1 between the current position and the obstacle is determined.
In the present application, if there is no obstacle ahead of the robot, there is generally no need to plan a turning route; the robot continues along the current straight course, and only when an obstacle is detected on that course is a turning route planned for traveling from the current straight course to the next straight course.
Based on this, the robot builds an environment map from the data collected by its various sensors. The map covers both the area the robot has already traversed and the area it has not, and is marked with the robot's current position, the first straight course the robot is on, and obstacles detected by the obstacle sensor. If the robot detects an obstacle on the first straight course, it cannot travel all the way along that course, so the distance S1 between the current position and the obstacle needs to be determined in order to plan the turning route.
The obstacle may be any object that impedes the progress of the robot, such as a wall, a table, a person, etc. The sensors provided on the robot may include a mileage sensor, a code wheel sensor, an obstacle sensor, an image sensor, and the like. The obstacle sensor may be a laser sensor, a radar sensor, or the like.
Step 202: determine the turning route for turning from the first straight course to the second straight course according to the distance S1 and the course spacing S2 between the first and second straight courses.
The course spacing S2 between the first and second straight courses may be preset; that is, the distance between every two adjacent straight courses is S2.
In the turning route shown in fig. 1B, when the robot reaches the turning point along the straight course, it must stop and rotate 90 degrees in place before continuing along the arc track. The reason is that the derivative of the turning point on the straight course differs from its derivative on the arc track, so the robot's motion control module cannot transition smoothly onto the arc track while driving; the robot has to be stopped and rotated 90 degrees before it can move onto the arc track.
Based on this analysis, the turning route determined by the application is composed of arcs connecting the first turning point on the first straight course to the second turning point on the second straight course, where the derivative of the first turning point on the first straight course is the same as on the turning route and the derivative of the second turning point on the second straight course is the same as on the turning route, so that the robot transitions smoothly from the first straight course onto the turning route and then from the turning route onto the second straight course.
In this embodiment, when the robot detects an obstacle on the first straight course, the turning route for turning from the first straight course to the second straight course is determined according to the distance S1 between the robot and the obstacle and the course spacing S2 between the two courses.
In an embodiment, the turning route may include a first arc track between the first turning point and the first intermediate point, a linear track between the first intermediate point and the second intermediate point, and a second arc track between the second intermediate point and the second turning point.
The first turning point has the same derivative on the first straight course and on the first arc track; the first intermediate point has the same derivative on the first arc track and on the linear track; the second intermediate point has the same derivative on the linear track and on the second arc track; and the second turning point has the same derivative on the second arc track and on the second straight course. Since each point has the same derivative on the two tracks it joins, the robot can always transition smoothly onto the next track when it reaches any of these points during the turn, without stalling; travel is smooth and operating efficiency is high.
In the application, the distance between the first intermediate point and the second intermediate point is a preset distance S3; the midpoint between them lies on the center line between the first and second straight courses, and the linear track is perpendicular to both straight courses. Based on this, the positions of the first and second intermediate points can be determined from the distance S1, the course spacing S2 and the preset distance S3.
The length S3 of the linear track between the first and second intermediate points has the value range: 0 ≤ S3 < S2.
In addition, the longitudinal distance between the first turning point and the first intermediate point and that between the second turning point and the second intermediate point are both a preset distance S4, so the positions of the first and second turning points can be determined from the distance S1, the course spacing S2 and the preset distance S4.
The value range of the longitudinal distance S4 is: S4 > 0.
The following describes in detail the course of a turn in one exemplary scenario:
As shown in fig. 2B, establish a coordinate system with the robot's center as the origin, the horizontal direction as the x-axis and the vertical direction as the y-axis, and assume the robot is a circle of radius r. Point A is the first turning point, point B the first intermediate point, point C the second intermediate point, and point D the second turning point. When the robot detects an obstacle on the first straight course, the distance between the obstacle and the robot's current position is S1. To keep the robot from hitting the obstacle on the turning route, the linear track BC must stay at least r away from the obstacle. From the distance S1, the course spacing S2 and the preset distance S3 we therefore obtain B = (0.5(S2-S3), (S1-r)) and C = (0.5(S2+S3), (S1-r)); and from S1, S2 and the preset distance S4 we obtain A = (0, (S1-r-S4)) and D = (S2, (S1-r-S4)).
From this, the equation of the first straight course is:
x = 0
where 0 ≤ y ≤ (S1-r-S4)
The equation of the second straight course is:
x = S2
where 0 ≤ y ≤ (S1-r-S4)
The equation of the linear track BC is:
y = S1-r
where 0.5(S2-S3) ≤ x ≤ 0.5(S2+S3)
As can be seen from the above relations, the derivative (dy/dx) at point A on the first straight course is ±∞, the derivatives at points B and C on the linear track BC are 0, and the derivative at point D on the second straight course is ±∞. Because the first arc track runs between point A and point B, the derivative at point A on the first arc track must also be ±∞ and the derivative at point B on it must also be 0, so the first arc track is a quarter of an ellipse. It follows that the center of the first arc track is O1 (0.5(S2-S3), (S1-r-S4)), and by the same reasoning the center of the second arc track is O2 (0.5(S2+S3), (S1-r-S4)).
Further, the equation of the first arc track is:
(x - 0.5(S2-S3))^2 / (0.5(S2-S3))^2 + (y - (S1-r-S4))^2 / S4^2 = 1
where 0 ≤ x ≤ 0.5(S2-S3) and (S1-r-S4) ≤ y ≤ (S1-r)
The equation of the second arc track is:
(x - 0.5(S2+S3))^2 / (0.5(S2-S3))^2 + (y - (S1-r-S4))^2 / S4^2 = 1
where 0.5(S2+S3) ≤ x ≤ S2 and (S1-r-S4) ≤ y ≤ (S1-r)
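Under the coordinate conventions and relations above, the waypoints and the three-segment turning route can be sketched as follows. This is a minimal Python illustration; the function names, the parametric sampling of the quarter-ellipses, and the sample count n are our own choices, not from the patent:

```python
import math

def turn_waypoints(s1, s2, s3, s4, r):
    """Waypoints A, B, C, D of the turning route (origin at the
    robot's center, x to the right, y along the travel direction)."""
    a = (0.0, s1 - r - s4)                 # first turning point on course 1
    b = (0.5 * (s2 - s3), s1 - r)          # first intermediate point
    c = (0.5 * (s2 + s3), s1 - r)          # second intermediate point
    d = (s2, s1 - r - s4)                  # second turning point on course 2
    return a, b, c, d

def turn_route(s1, s2, s3, s4, r, n=50):
    """Sample the A->B->C->D route: two quarter-ellipses with
    semi-axes 0.5*(s2-s3) (horizontal) and s4 (vertical), joined
    by the linear track BC."""
    ax = 0.5 * (s2 - s3)                   # horizontal semi-axis
    cy = s1 - r - s4                       # y of both ellipse centers O1, O2
    pts = []
    # first arc: centered at O1 = (ax, cy), swept from angle pi (A) to pi/2 (B)
    for i in range(n + 1):
        t = math.pi - (math.pi / 2) * i / n
        pts.append((ax + ax * math.cos(t), cy + s4 * math.sin(t)))
    # linear track BC ends at C
    pts.append((0.5 * (s2 + s3), s1 - r))
    # second arc: centered at O2 = (0.5*(s2+s3), cy), swept from pi/2 (C) to 0 (D)
    for i in range(1, n + 1):
        t = (math.pi / 2) * (1 - i / n)
        pts.append((0.5 * (s2 + s3) + ax * math.cos(t), cy + s4 * math.sin(t)))
    return pts
```

Because the first arc is the quarter-ellipse swept from angle π (point A, vertical tangent) to π/2 (point B, horizontal tangent), the tangent directions at A, B, C and D match the adjoining tracks, which is exactly the equal-derivative condition the patent relies on.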
It should be noted that when the length S3 of the linear track between the first and second intermediate points is 0, the two intermediate points merge into a single point; as shown in fig. 2C, point B and point C coincide, and since the derivatives of that point on the first and second arc tracks are the same, the two arc tracks can also be merged into a single arc track.
Compared with the turning route shown in fig. 1B, although the application likewise has an arc track between the first and second turning points when S3 is 0, here the derivatives of each turning point on the two tracks it joins are the same, whereas in fig. 1B they differ at the first turning point.
Therefore, unlike the prior art, even when the two arc tracks merge into one segment, the application needs no pause for in-place steering to turn around.
In this embodiment, the turning route is composed of the first arc track, the linear track and the second arc track. Because the main direction of the camera's visible range coincides with the robot's heading while it travels in a straight line, the heading on the linear track between the two arc tracks coincides with the camera's main direction, and only the headings on the two arc tracks deviate from it; part of the turning track therefore still falls within the robot's visible area, reducing the probability of colliding with objects in the blind area. And because each of the four points on the turning route has the same derivative on both tracks it joins, the robot does not need to stop and rotate in place at any point, so travel is smooth and operating efficiency is high.
Fig. 3A is a flowchart of another embodiment of a robot U-turn control method according to an exemplary embodiment of the present application, taking as an example how the robot travels along a turning route determined in any of the embodiments of fig. 2A to 2C. As shown in fig. 3A, the robot U-turn control method further includes the following steps:
Step 301: the robot travels along a first straight course to a first turning point.
Step 302: travel along the turning route from the first turning point to the second straight course.
The following describes in detail the scenario in which the turning route includes the first arc track, the linear track and the second arc track:
As shown in the route schematic in fig. 3B, the arrowed line indicates the robot's direction of travel: the first arc track runs between point A and point B, the linear track between point B and point C, and the second arc track between point C and point D.
When the robot reaches point A on the first straight course, it travels from A to B along the first arc track, from B to C along the linear track, and from C to D along the second arc track onto the second straight course, and so on. Because points A, B, C and D each have the same derivative on the two tracks they join, the robot transitions smoothly onto the next track at each of these points without stopping to steer in place.
During the turn, because the main direction of the camera's visible range coincides with the robot's heading while it travels in a straight line, the robot's heading on the linear track between the two arc tracks coincides with the camera's main direction and only the headings on the two arc tracks deviate from it, so part of the turning track still falls within the robot's visible area, reducing the probability of colliding with objects in the blind area. And because each of the four points on the turning route has the same derivative on both tracks it joins, the robot does not need to stop and rotate in place at any point; travel is smooth and operating efficiency is high.
It is noted that the robot travels along the first straight course and follows the turning route upon reaching the first turning point only when it detects no obstacle affecting the turning route while traveling.
In an embodiment, while the robot is traveling, the obstacles around it are obtained and each obstacle's contour is inflated by a preset size to obtain a target contour. If the target contour intersects the robot's travel route at two points, the travel route is updated according to those two intersection points, so that the robot does not collide with the obstacle while traveling.
The preset size can be set according to actual requirements; for example, when the robot is circular, the preset size can be set to the robot's radius. If the target contour intersects the travel route at two points, the robot is likely to collide with the obstacle if it keeps to the current route, so the route must be updated according to the two intersection points.
Illustratively, the obstacle sensor continuously detects surrounding obstacles while the robot is driving and marks them in the environment map, so the robot can obtain the obstacles around it from the constructed map. If an obstacle is large, the sensor detects it at different positions in different directions, and the edge of the area formed by these positions is the obstacle's contour.
The two intersection points of the target contour and the travel route fall into three cases: first, both intersection points lie on the first or the second straight course; second, both lie on the turning route; third, one lies on the turning route and the other on the first or the second straight course.
In an embodiment, to update the travel route according to the two intersection points, the portion of the route between them may be replaced with the arc of the target contour delimited by the two points.
As shown in fig. 3C, after the outline of the obstacle near the robot is expanded, the resulting contour intersects the robot's turning route A-B-C-D at points M and N, so the portion of the route between M and N can be replaced with the arc of the target contour between M and N.
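The M-to-N replacement can be sketched as sampling the arc of the target contour between the two intersection points. A disc-shaped target contour is assumed and the shorter arc is taken; the function name is illustrative:

```python
import math

def detour_arc(center, r, m, n, samples=10):
    """Replace the route between intersection points M and N with the shorter
    arc of the target contour (disc of radius r around `center`), as in the
    M -> arc -> N detour of fig. 3C. M and N are assumed to lie on the contour."""
    am = math.atan2(m[1] - center[1], m[0] - center[0])
    an = math.atan2(n[1] - center[1], n[0] - center[0])
    # Signed angular sweep from M to N, wrapped into (-pi, pi]: shorter arc.
    d = (an - am + math.pi) % (2 * math.pi) - math.pi
    return [(center[0] + r * math.cos(am + d * i / samples),
             center[1] + r * math.sin(am + d * i / samples))
            for i in range(samples + 1)]
```

Every sampled point sits on the target contour, so the detour keeps the robot's center exactly one inflation radius away from the obstacle outline.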
It should be noted that, to smooth the travel route and improve the stability and efficiency of the robot's movement, a trajectory-smoothing method may be applied to the updated route, i.e., the non-smooth inflection points are smoothed so that the robot runs more steadily and efficiently.
Since the robot's minimum turning radius can reach zero, its kinematic constraints need not be considered during trajectory smoothing. A non-smooth inflection point is a point on the travel route where the track is discontinuous or the derivative does not exist. In the present application, the intersection points with the travel route are such non-smooth inflection points.
Those skilled in the art will appreciate that trajectory smoothing can be implemented with the related art; the present application does not limit the particular smoothing method.
As shown in fig. 3C, the turning track becomes: the first arc track between point A and point M, the contour track formed by the expanded obstacle between point M and point N, and the second arc track between point N and point D. The track is discontinuous at points M and N, and the angle between the incoming and outgoing tracks there is relatively large, so the robot would have to slow almost to a stop at M and N to turn, greatly reducing its running efficiency. The turning track A-M-N-D is therefore smoothed with a trajectory-smoothing method; as shown in fig. 3D, it becomes A-M'-N'-D after smoothing.
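The patent leaves the trajectory-smoothing method open; as one possible example (chosen here for illustration, not prescribed by the source), Chaikin corner cutting rounds the non-smooth inflection points of a polyline such as A-M-N-D while keeping the endpoints fixed:

```python
def chaikin_smooth(path, iterations=2):
    """One possible trajectory-smoothing method (Chaikin corner cutting,
    used here only as an example). Each pass replaces every segment of the
    polyline with two points at 1/4 and 3/4 of its length, which cuts off
    sharp corners; the start and end points are kept fixed."""
    for _ in range(iterations):
        new = [path[0]]                                   # keep start point
        for (x1, y1), (x2, y2) in zip(path, path[1:]):
            new.append((0.75 * x1 + 0.25 * x2, 0.75 * y1 + 0.25 * y2))
            new.append((0.25 * x1 + 0.75 * x2, 0.25 * y1 + 0.75 * y2))
        new.append(path[-1])                              # keep end point
        path = new
    return path
```

After one pass a right-angle corner such as the one at M is replaced by two nearby points, so the robot no longer has to decelerate to a near stop there.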
This completes the flow shown in fig. 3A. Through it, the robot never needs to stop and turn in place during the turning process, so its running efficiency is high.
Fig. 4A is a flowchart of another embodiment of a robot u-turn control method according to an exemplary embodiment of the present application. Building on the embodiments shown in figs. 2A to 3A, this embodiment describes how the turning route is optimized when an obstacle affecting it is detected while the robot travels. In this embodiment, the turning route includes a first arc track between a first turning point and a first intermediate point, a first straight-line track between the first intermediate point and a second intermediate point, and a second arc track between the second intermediate point and a second turning point.
As shown in fig. 4A, the robot u-turn control method further includes the following steps:
Step 401: if the first turning point, the first intermediate point, the second intermediate point and the second turning point are within the visual range of the camera mounted on the robot, determine the robot's visible area from the camera's visual range and the positions of the obstacles contained in it.
While the robot travels along the first straight course, it checks in real time whether the first turning point, the first intermediate point, the second intermediate point and the second turning point are contained in the camera's visual range, and determines its visible area once they are. The robot may obtain the obstacles contained in the visual range from the constructed environment map.
As shown in fig. 4B, when the camera's visual range contains an obstacle, the region behind the obstacle is blocked and becomes invisible to the robot; the unblocked part of the visual range is the robot's current visible area.
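The blocked-region test can be sketched as a line-of-sight check: a point is in the visible area if the sight line from the robot to it does not cross the obstacle. The obstacle is modeled here as a disc, field-of-view limits are omitted for brevity, and the function name is illustrative:

```python
import math

def point_visible(robot, target, obstacle_center, obstacle_r):
    """Sketch of the visibility test of fig. 4B: `target` lies in the robot's
    visible area if the sight-line segment robot->target does not pass
    through the obstacle disc (field-of-view limits omitted)."""
    rx, ry = robot
    tx, ty = target
    dx, dy = tx - rx, ty - ry
    seg2 = dx * dx + dy * dy
    if seg2 == 0:
        return True
    # Parameter of the point on the sight line closest to the obstacle center,
    # clamped to the segment.
    t = max(0.0, min(1.0, ((obstacle_center[0] - rx) * dx +
                           (obstacle_center[1] - ry) * dy) / seg2))
    cx, cy = rx + t * dx, ry + t * dy
    dist = math.hypot(cx - obstacle_center[0], cy - obstacle_center[1])
    return dist > obstacle_r
```

A point directly behind the obstacle fails the test (it is in the blind zone), while a point to the side of the obstacle passes it.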
Step 402: determine whether any of the first turning point, the first intermediate point, the second intermediate point and the second turning point lies outside the visible area; if so, execute step 403, otherwise end the current flow.
Step 403: optimize the position of at least one of the first turning point, the first intermediate point, the second intermediate point and the second turning point so that the points outside the visible area fall into the robot's visible area before the robot reaches them.
In an embodiment, one target point may be selected from the four points for position optimization according to which points lie outside the visible area.
The following describes the optimization process of the turning route in four cases:
First case: the first turning point is outside the visible region
The first turning point is selected as the target point, and one way to optimize it is to update its position by increasing the distance between the first turning point and the robot's current position, the added distance being smaller than the preset distance S4.
When point A (i.e., the first turning point) lies behind the obstacle, the robot never sees it while traveling along the first straight course. Point A can then be moved up by increasing its distance from the robot's current position, so that it falls within the robot's visible area, as far as possible, before the robot rounds the obstacle and reaches it; at most it is moved up to the intersection of the first straight course and the first straight-line track, i.e., point A' in fig. 4C. The upward distance can also be determined from the visible area newly gained as the robot moves around the obstacle.
Second case: the first intermediate point is outside the visible region
The first way: select the first turning point as the target point for position optimization, i.e., update its position by decreasing the distance between the first turning point and the robot's current position.
The second way: select the first intermediate point as the target point, i.e., update its position by increasing or decreasing the length S3 of the first straight-line track.
As shown in fig. 4D, in the first way the distance between point A and the robot's current position is reduced by moving A down, so that point B falls into the visible area as early as possible before the robot reaches it; the downward distance can be preset from practical experience. In the second way, point B is moved left or right out of the blind zone so that it falls within the visible area as far as possible; in fig. 4D, B is moved to B' and falls within the visible area.
Third case: the second intermediate point is outside the visible region
The first way: select the first intermediate point as the target point for position optimization, i.e., update its position by increasing the length S3 of the first straight-line track;
The second way: select the second intermediate point as the target point, i.e., update its position by increasing or decreasing the length S3 of the first straight-line track.
In the first way, as shown in fig. 4E, point B is moved left to B', so that point C can fall into the visible area as early as possible once the robot turns onto the BC straight track, before it reaches point C. In the second way, point C is moved left out of the blind zone so that it falls within the visible area as far as possible.
Fourth case: the second turning point is outside the visible region
The first way: select the first intermediate point as the target point for position optimization, i.e., update its position by increasing the length S3 of the first straight-line track;
The second way: select the second intermediate point as the target point, i.e., update its position by decreasing the length of the first straight-line track;
The third way: select the second turning point as the target point, i.e., update its position by increasing the longitudinal distance between the second turning point and the robot's current position, the added distance being smaller than the preset distance S4.
As shown in fig. 4F, in the first way point B is moved left so that point D falls into the visible area as far as possible before the robot reaches it; in the second way point C is moved left to C', as shown in the figure, so that D falls into the visible area as soon as possible; in the third way point D is moved up and out of the blind zone.
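All four cases amount to shifting a point along some direction until it leaves the blind zone, capped by a preset limit. A minimal sketch of that step, assuming a disc-shaped obstacle; the names are illustrative, and `max_shift` stands in for a cap such as the preset distance S4:

```python
import math

def optimize_point(point, direction, robot, obst_c, obst_r, step=0.1, max_shift=1.0):
    """Shift a turning-route point along `direction` in small steps until its
    sight line from the robot clears the obstacle disc, stopping once the
    accumulated shift reaches `max_shift`. Returns the adjusted point."""
    def visible(p):
        # Same line-of-sight test as above: segment robot->p vs. obstacle disc.
        dx, dy = p[0] - robot[0], p[1] - robot[1]
        seg2 = dx * dx + dy * dy
        t = max(0.0, min(1.0, ((obst_c[0] - robot[0]) * dx +
                               (obst_c[1] - robot[1]) * dy) / seg2))
        cx, cy = robot[0] + t * dx, robot[1] + t * dy
        return math.hypot(cx - obst_c[0], cy - obst_c[1]) > obst_r
    shift = 0.0
    x, y = point
    while not visible((x, y)) and shift < max_shift:
        x += direction[0] * step
        y += direction[1] * step
        shift += step
    return (x, y)
```

For example, a second turning point hidden straight behind the obstacle is nudged upward (the third way of the fourth case) until the sight line clears the obstacle.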
It should be noted that the above four cases are only examples; the present application does not exclude other optimization methods. When several of the four cases occur in combination, the four points can be selected and optimized according to the same principle.
In the above optimization process, once the position of any of the four points on the turning route changes, its derivatives on the two tracks to which it belongs may differ, making it a non-smooth inflection point; the robot would then stall when reaching it, reducing running efficiency. Therefore, after the optimization update, the non-smooth inflection points can be smoothed with a trajectory-smoothing method.
This completes the flow shown in fig. 4A. Through it, obstacle information is fused into the optimization of the turning route: the robot's visible area is obtained from the obstacle positions and the camera's visual range, and if any turning point is blocked by the obstacle and does not fall within the visible area, the position of any of the four points can be optimized to reduce the possibility of collision with the obstacle as much as possible.
Fig. 5 is a block diagram of an embodiment of a robot u-turn control device according to an exemplary embodiment of the present application, the robot u-turn control device including:
a detection module 510, configured to detect an obstacle on a first straight course;
a processor 520, configured to determine the distance S1 between the current position and an obstacle when the obstacle is detected on the first straight course, and to determine a turning route from the first straight course to the second straight course according to the distance S1 and the route distance S2 between the first straight course and the second straight course. The turning route consists of an arc formed between a first turning point on the first straight course and a second turning point on the second straight course; the first turning point has the same derivative on the first straight course as on the turning route, and the second turning point has the same derivative on the second straight course as on the turning route.
In an alternative implementation, the turning route includes: a first arcuate path between the first turning point and a first intermediate point, a first straight path between the first intermediate point and a second intermediate point, and a second arcuate path between the second intermediate point and the second turning point; the first turning point has the same derivative on the first straight-going route and the first arc track, the first intermediate point has the same derivative on the first arc track and the first straight-going route, the second intermediate point has the same derivative on the first straight-going route and the second arc track, and the second turning point has the same derivative on the second straight-going route and the second arc track.
In an optional implementation, the distance between the first intermediate point and the second intermediate point is a preset distance S3; the positions of the first intermediate point and the second intermediate point are determined according to the distance S1, the route distance S2 and the preset distance S3; the center point between the first intermediate point and the second intermediate point is located on the center line between the first straight course and the second straight course, and the first straight-line track is perpendicular to both straight courses.
In an alternative implementation manner, the longitudinal distance between the first turning point and the first middle point and the longitudinal distance between the second turning point and the second middle point are both preset distances S4; the positions of the first turning point and the second turning point are determined according to the distance S1, the route distance S2 and the preset distance S4.
Fig. 6 is a block diagram of another embodiment of a robot u-turn control device according to an exemplary embodiment of the present application, and based on the device structure shown in fig. 5, the robot u-turn control device further includes:
The power driving module 530 is configured to control the robot to travel along the first straight course, and when reaching the first turning point on the turning course determined by the apparatus described in fig. 5, control the robot to travel along the turning course to turn from the first straight course to the second straight course.
In an alternative implementation, the robot travels along the first straight course and travels along the turning course when reaching the first turning point, which is performed when the robot does not detect an obstacle affecting the turning course during the traveling.
In an alternative implementation, the turning route includes: a first arcuate path between the first turning point and a first intermediate point, a first linear path between the first intermediate point and a second intermediate point, and a second arcuate path between the second intermediate point and a second turning point; the device further comprises (not shown in fig. 6):
The path optimization module is used for determining the visible area of the robot according to the visible range of the camera and the position of the obstacle contained in the visible range if the first turning point, the first middle point, the second middle point and the second turning point are in the visible range of the camera arranged on the robot when the robot detects the obstacle influencing the turning route in the driving process; judging whether points which are outside the visible area exist in the first turning point, the first middle point, the second middle point and the second turning point or not; if so, performing position optimization on at least one point among the first turning point, the first middle point, the second middle point and the second turning point so that points outside the visible area fall into the visible area of the robot before the robot arrives.
In an alternative implementation, the apparatus further comprises (not shown in fig. 6):
The obstacle avoidance module is used to obtain the obstacles around the robot while it travels and expand each obstacle's outline by a preset size to obtain a target contour; if the target contour intersects the robot's travel route at two points, the travel route is updated according to the two intersection points.
In an optional implementation manner, the obstacle avoidance module is specifically configured to replace a travel route of the robot between the two intersection points with an arc formed by the two intersection points on the target contour in a process of updating the travel route of the robot according to the two intersection points.
The implementation process of the functions and roles of each unit in the above device is specifically shown in the implementation process of the corresponding steps in the above method, and will not be described herein again.
Since the device embodiments essentially correspond to the method embodiments, reference is made to the description of the method embodiments for the relevant points. The device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purposes of the present application. Those of ordinary skill in the art can understand and implement the present application without undue burden.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
The foregoing description of the preferred embodiments of the application is not intended to be limiting, but rather to enable any modification, equivalent replacement, improvement or the like to be made within the spirit and principles of the application.

Claims (10)

1. The robot turning control method is characterized by comprising the following steps:
When the robot detects that an obstacle exists on the first straight navigation line, determining a distance S1 between the current position and the obstacle;
Constructing an environment map by using data acquired by a sensor arranged on the robot, wherein the environment map comprises an area where the robot has walked and an area where the robot has not walked, and the environment map is marked with the current position of the robot, a first straight-going route where the robot is located and an obstacle detected by the data of an obstacle sensor;
Determining a turning route from the first straight-going route to the second straight-going route according to the distance S1 and the route distance S2 between the first straight-going route and the second straight-going route;
The turning route consists of an arc formed by a first turning point on a first straight navigation line and a second turning point on a second straight navigation line, the derivative of the first turning point on the first straight navigation line is the same as that of the turning route, and the derivative of the second turning point on the second straight navigation line is the same as that of the turning route; the turning route includes: a first arcuate path between the first turning point and a first intermediate point, a first linear path between the first intermediate point and a second intermediate point, and a second arcuate path between the second intermediate point and a second turning point;
When the robot detects an obstacle affecting a turning route in the driving process, if the first turning point, the first middle point, the second middle point and the second turning point are in the visible range of a camera arranged on the robot, determining the visible area of the robot according to the visible range of the camera and the position of the obstacle contained in the visible range; the obstacle included in the visual range is acquired from the environment map;
Judging whether points which are outside the visible area exist in the first turning point, the first middle point, the second middle point and the second turning point or not;
If so, performing position optimization on at least one point among the first turning point, the first middle point, the second middle point and the second turning point so that points outside the visible area fall into the visible area of the robot before the robot arrives.
2. The method of claim 1, wherein the first turning point has the same derivative on a first straight course as the first arcuate path, the first intermediate point has the same derivative on the first arcuate path as the first straight course, the second intermediate point has the same derivative on the first straight path as the second arcuate path, and the second turning point has the same derivative on the second straight course as the second arcuate path.
3. The method according to claim 1, wherein the distance between the first intermediate point and the second intermediate point is a preset distance S3;
the positions of the first intermediate point and the second intermediate point are determined according to the distance S1, the route distance S2 and the preset distance S3;
The central point between the first intermediate point and the second intermediate point is located on a central line between the first straight course and the second straight course, and the first straight track is perpendicular to the first straight course and the second straight course.
4. A method according to any one of claims 1 to 3, wherein the longitudinal distance between the first turning point and the first intermediate point and the longitudinal distance between the second turning point and the second intermediate point are both preset distances S4;
the positions of the first turning point and the second turning point are determined according to the distance S1, the route distance S2 and the preset distance S4.
5. The robot turning control method is characterized by comprising the following steps:
The robot traveling along a first straight course and traveling along the turning course to turn from the first straight course to a second straight course when reaching a first turning point on the turning course determined by the method as claimed in any one of claims 1 to 4; the turning route includes: a first arcuate path between the first turning point and a first intermediate point, a first linear path between the first intermediate point and a second intermediate point, and a second arcuate path between the second intermediate point and a second turning point;
When the robot detects an obstacle affecting a turning route in the driving process, if the first turning point, the first middle point, the second middle point and the second turning point are in the visible range of a camera arranged on the robot, determining the visible area of the robot according to the visible range of the camera and the position of the obstacle contained in the visible range;
Judging whether points which are outside the visible area exist in the first turning point, the first middle point, the second middle point and the second turning point or not; the obstacle included in the visual range is obtained from the environment map constructed by the method;
If so, performing position optimization on at least one point among the first turning point, the first middle point, the second middle point and the second turning point so that points outside the visible area fall into the visible area of the robot before the robot arrives.
6. The method of claim 5, wherein the robot traveling along the first straight course and traveling along the turn course when the first turn point is reached is performed when the robot does not detect an obstacle affecting the turn course during traveling.
7. The method of claim 5, wherein the method further comprises:
In the running process of the robot, obtaining obstacles around the robot, and expanding the outline of the obstacle according to a preset size to obtain a target outline;
And if the target profile and the running route of the robot have two intersection points, updating the running route of the robot according to the two intersection points.
8. The method of claim 7, wherein updating the travel route of the robot based on the two intersections comprises:
And replacing the running route of the robot between the two intersection points with an arc-shaped section formed by the two intersection points on the target profile.
9. A robot turn around control device, the device comprising:
the detection module is used for detecting an obstacle on the first straight navigation line and an obstacle affecting the turning route;
The processor is used for determining a distance S1 between the current position and the obstacle when the obstacle is detected to exist on the first straight navigation line; constructing an environment map by using data acquired by a sensor arranged on the robot, wherein the environment map comprises an area where the robot has walked and an area where the robot has not walked, and the environment map is marked with the current position of the robot, a first straight-going route where the robot is located and an obstacle detected by the data of an obstacle sensor; determining a turning route from the first straight-going route to the second straight-going route according to the distance S1 and the route distance S2 between the first straight-going route and the second straight-going route; the turning route consists of an arc formed by a first turning point on a first straight navigation line and a second turning point on a second straight navigation line, the derivative of the first turning point on the first straight navigation line is the same as that of the turning route, and the derivative of the second turning point on the second straight navigation line is the same as that of the turning route; the turning route includes: a first arcuate path between the first turning point and a first intermediate point, a first linear path between the first intermediate point and a second intermediate point, and a second arcuate path between the second intermediate point and a second turning point;
When the robot detects an obstacle affecting a turning route in the driving process, if the first turning point, the first middle point, the second middle point and the second turning point are in the visible range of a camera arranged on the robot, determining the visible area of the robot according to the visible range of the camera and the position of the obstacle contained in the visible range; the obstacle included in the visual range is acquired from the environment map;
Judging whether points which are outside the visible area exist in the first turning point, the first middle point, the second middle point and the second turning point or not;
If so, performing position optimization on at least one point among the first turning point, the first middle point, the second middle point and the second turning point so that points outside the visible area fall into the visible area of the robot before the robot arrives.
10. A robot turn around control device, the device comprising:
A power drive module for controlling the robot to travel along a first straight course and, when reaching a first turning point on a turning route determined by the method of any one of claims 1 to 4, controlling the robot to travel along the turning route to turn from the first straight course to a second straight course; the turning route includes: a first arcuate path between the first turning point and a first intermediate point, a first linear path between the first intermediate point and a second intermediate point, and a second arcuate path between the second intermediate point and a second turning point;
When the robot detects an obstacle affecting a turning route in the driving process, if the first turning point, the first middle point, the second middle point and the second turning point are in the visible range of a camera arranged on the robot, determining the visible area of the robot according to the visible range of the camera and the position of the obstacle contained in the visible range;
Judging whether points which are outside the visible area exist in the first turning point, the first middle point, the second middle point and the second turning point or not; the obstacle included in the visual range is obtained from the environment map constructed by the method;
If so, performing position optimization on at least one point among the first turning point, the first middle point, the second middle point and the second turning point so that points outside the visible area fall into the visible area of the robot before the robot arrives.
CN201910670300.3A 2019-07-24 2019-07-24 Robot turning control method and device Active CN112363491B (en)
Publications (2)

Publication Number Publication Date
CN112363491A CN112363491A (en) 2021-02-12
CN112363491B true CN112363491B (en) 2024-04-26

Family

ID=74516304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910670300.3A Active CN112363491B (en) 2019-07-24 2019-07-24 Robot turning control method and device

Country Status (1)

Country Link
CN (1) CN112363491B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115480559A (en) * 2021-05-31 2022-12-16 苏州宝时得电动工具有限公司 Self-moving device, obstacle avoidance control method and storage medium
CN115494832A (en) * 2021-06-17 2022-12-20 苏州宝时得电动工具有限公司 Self-moving device, obstacle avoidance control method and storage medium
CN113268065B (en) * 2021-07-19 2021-09-24 山东华力机电有限公司 AGV self-adaptive turning obstacle avoidance method, device and equipment based on artificial intelligence

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009031884A (en) * 2007-07-25 2009-02-12 Toyota Motor Corp Autonomous mobile body, map information creation method in autonomous mobile body and moving route specification method in autonomous mobile body
JP2013239035A (en) * 2012-05-15 2013-11-28 Toyota Motor Corp Route planning method for movable body
KR20140042346A (en) * 2012-09-28 2014-04-07 주식회사 두시텍 Self-alignment driving system
CN106909150A (en) * 2017-01-22 2017-06-30 无锡卡尔曼导航技术有限公司 For the unpiloted avoidance of agricultural machinery, turn around path planning and its control method
CN107368079A (en) * 2017-08-31 2017-11-21 珠海市微半导体有限公司 Robot cleans the planing method and chip in path
CN107511824A (en) * 2017-08-31 2017-12-26 珠海市微半导体有限公司 The control method and chip that a kind of robot turns around
US9886035B1 (en) * 2015-08-17 2018-02-06 X Development Llc Ground plane detection to verify depth sensor status for robot navigation
KR20180044486A (en) * 2016-10-21 2018-05-03 네이버 주식회사 Robot for generating 3d indoor map using autonomous driving and method for controlling the robot
WO2019025919A1 (en) * 2017-08-04 2019-02-07 Ideaforge Technology Pvt. Ltd. Uav system emergency path planning on communication failure

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9230442B2 (en) * 2013-07-31 2016-01-05 Elwha Llc Systems and methods for adaptive vehicle sensing systems

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009031884A (en) * 2007-07-25 2009-02-12 Toyota Motor Corp Autonomous mobile body, map information creation method in autonomous mobile body and moving route specification method in autonomous mobile body
JP2013239035A (en) * 2012-05-15 2013-11-28 Toyota Motor Corp Route planning method for movable body
KR20140042346A (en) * 2012-09-28 2014-04-07 주식회사 두시텍 Self-alignment driving system
US9886035B1 (en) * 2015-08-17 2018-02-06 X Development Llc Ground plane detection to verify depth sensor status for robot navigation
KR20180044486A (en) * 2016-10-21 2018-05-03 네이버 주식회사 Robot for generating 3d indoor map using autonomous driving and method for controlling the robot
CN106909150A (en) * 2017-01-22 2017-06-30 无锡卡尔曼导航技术有限公司 For the unpiloted avoidance of agricultural machinery, turn around path planning and its control method
WO2019025919A1 (en) * 2017-08-04 2019-02-07 Ideaforge Technology Pvt. Ltd. Uav system emergency path planning on communication failure
CN107368079A (en) * 2017-08-31 2017-11-21 珠海市微半导体有限公司 Robot cleans the planing method and chip in path
CN107511824A (en) * 2017-08-31 2017-12-26 珠海市微半导体有限公司 The control method and chip that a kind of robot turns around

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on path planning and path tracking of intelligent agricultural machinery based on multi-source information fusion; 林乙蘅; China Master's Theses Full-text Database, Agricultural Science and Technology (No. 5); pp. 1-71 of the main text *
王建波. Research on tractor operation path planning and auxiliary navigation ***. China Master's Theses Full-text Database, Agricultural Science and Technology. 2017, (No. 5), pp. 1-86 of the main text. *

Also Published As

Publication number Publication date
CN112363491A (en) 2021-02-12

Similar Documents

Publication Publication Date Title
CN112363491B (en) Robot turning control method and device
US11072073B2 (en) System for communicating and using traffic analysis in a space with moving obstacles
JP4682973B2 (en) Travel route creation method, autonomous mobile body, and autonomous mobile body control system
KR102192530B1 (en) Automated guided vehicle, system with a computer and an automated guided vehicle, method for planning a virtual track and method for operating an automated guided vehicle
Kümmerle et al. A navigation system for robots operating in crowded urban environments
KR20190019897A (en) Robot path planning system, method, robot and medium
KR101196374B1 (en) Path planning system for mobile robot
Trulls et al. Autonomous navigation for mobile service robots in urban pedestrian environments
CN108544490B (en) Obstacle avoidance method for unmanned intelligent robot road
CN107121142A (en) The topological map creation method and air navigation aid of mobile robot
JP2018514879A (en) Floor processing device and navigation method thereof, and group of floor processing devices and overall navigation method thereof
JP5666327B2 (en) Unmanned moving body and control method of unmanned moving body
EP3864484A2 (en) Autonomous map traversal with waypoint matching
CN112363494A (en) Method and device for planning advancing path of robot and storage medium
JP4670807B2 (en) Travel route creation method, autonomous mobile body, and autonomous mobile body control system
CN113110497B (en) Edge obstacle detouring path selection method based on navigation path, chip and robot
JPWO2019031168A1 (en) MOBILE BODY AND METHOD FOR CONTROLLING MOBILE BODY
US20220101534A1 (en) Sidewalk edge finder device, system and method
CN110389587A (en) A kind of robot path planning's new method of target point dynamic change
CN112394725A (en) Predictive and reactive view-based planning for autonomous driving
CN108363395A (en) A kind of method of AGV automatic obstacle avoidings
CN109085840A (en) A kind of automobile navigation control system and control method based on binocular vision
WO2023126680A1 (en) Systems and methods for analyzing and resolving image blockages
TWI532619B (en) Dual Image Obstacle Avoidance Path Planning Navigation Control Method
CN108363391B (en) Robot and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant