CN114326712A - Method for generating navigation path of robot, device, and storage medium - Google Patents

Method for generating navigation path of robot, device, and storage medium

Info

Publication number
CN114326712A
Authority
CN
China
Prior art keywords
robot, area, intersection, navigation path, point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111478561.9A
Other languages
Chinese (zh)
Inventor
夏俊超
梁康华
杨永森
刘喜龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunjing Intelligence Technology Dongguan Co Ltd
Yunjing Intelligent Shenzhen Co Ltd
Original Assignee
Yunjing Intelligence Technology Dongguan Co Ltd
Yunjing Intelligent Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunjing Intelligence Technology Dongguan Co Ltd and Yunjing Intelligent Shenzhen Co Ltd
Priority to CN202111478561.9A
Publication of CN114326712A
Legal status: Pending



Abstract

The invention relates to the technical field of robots and discloses a method for generating a navigation path of a robot, a robot, a navigation path generation device for a robot, and a computer storage medium. The method determines a first area containing a target obstacle within the trapped area where the robot is located, the target obstacle being detected by a target detection device mounted on the robot; detects an intersection area between the first area and a second area through which the robot has already traveled; and, when the intersection area is detected, generates a navigation path for the robot to escape from the trapped area according to the intersection area. The invention solves the technical problem that a robot cannot smoothly escape from a narrow-passage space when a radar identifies the edges of that space as obstacles, which makes the robot's movement very unsmooth.

Description

Method for generating navigation path of robot, device, and storage medium
Technical Field
The present invention relates to the field of robotics, and in particular, to a method and apparatus for generating a navigation path of a robot, and a computer storage medium.
Background
With the continuous improvement of living standards and technology, more and more households use robots to provide services, and in particular use cleaning robots in place of people to clean the house, which both relieves people's workload and improves cleaning efficiency.
However, when a cleaning robot cleans a house along its edges and enters a narrow-passage space with little vertical clearance (such as the space beneath sofas, beds, or large appliances), the mounting positions of its environment-sensing sensors become a problem: a radar mounted relatively high among those sensors typically identifies the entire peripheral edge of the narrow-passage space as an obstacle. Consequently, when the cleaning robot needs to come out of the narrow-passage space, it is falsely trapped there and cannot escape.
In summary, when the radar identifies the edges of a narrow-passage space as obstacles, a conventional robot cannot smoothly escape from that space, and its movement is therefore obstructed.
Disclosure of Invention
The main object of the present invention is to provide a method for generating a navigation path of a robot, a robot, a navigation path generation device for a robot, and a computer storage medium, aiming to solve the technical problem that an existing robot cannot smoothly escape from a narrow-passage space when a radar identifies the edges of that space as obstacles, which makes the robot's movement very unsmooth.
In order to achieve the above object, the present invention provides a navigation path generating method for a robot, including:
determining a first area containing a target obstacle within a trapped area where the robot is located, wherein the target obstacle is detected by a target detection device mounted on the robot;
detecting an intersection region between the first region and a second region through which the robot travels;
and when the intersection area is detected, generating a navigation path for the robot to escape from the trapped area according to the intersection area.
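As a rough illustration, the three steps above can be sketched on an occupancy grid. The set-based cell representation and the function name `plan_escape` below are illustrative assumptions, not the patent's actual implementation:

```python
def plan_escape(trapped_area, lidar_obstacles, traveled_cells):
    """Sketch of the three claimed steps on sets of (x, y) grid cells."""
    # Step 1: the first area is the part of the trapped area that the
    # high-mounted target detection device (e.g. a lidar) flags as obstacles.
    first_area = trapped_area & lidar_obstacles
    # Step 2: the intersection area is where the first area overlaps the
    # second area the robot has already traveled through.
    intersection = first_area & traveled_cells
    # Step 3: a navigation path can be routed through the intersection area
    # only when such an area is actually detected.
    return intersection if intersection else None
```

If no intersection exists, the method falls back to the behavior described in the later optional steps (edge-following the target obstacle).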
Optionally, the step of generating a navigation path for the robot to escape from the trapped region according to the intersection region includes:
detecting whether other obstacles exist in the intersection area, wherein the other obstacles are detected by an environment-sensing sensor of the robot other than the target detection device;
when no other obstacle exists in the intersection area, ignoring the target obstacle in the intersection area and generating a navigation path for the robot to escape from the trapped area through the intersection area;
when other obstacles exist in the intersection area, marking those obstacles as obstacles to be detoured around and generating the navigation path, so that when the robot travels to the intersection area along the navigation path, it detours along the edges of those obstacles to escape from the trapped area.
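The branch just described (pass straight through when only lidar obstacles occupy the intersection area, otherwise edge-follow the other obstacles) might be sketched as follows; the dictionary return format and all names are hypothetical:

```python
def route_through_intersection(intersection, other_obstacles):
    """intersection: cells of the intersection area;
    other_obstacles: cells reported by sensors other than the target
    detection device (e.g. bump or cliff sensors)."""
    blocking = intersection & other_obstacles
    if not blocking:
        # No other obstacle: ignore the lidar obstacles in the intersection
        # area and route the escape path straight through it.
        return {"mode": "ignore_lidar", "detour_around": set()}
    # Other obstacles present: mark them as obstacles to detour around and
    # follow their edges when the robot reaches the intersection area.
    return {"mode": "edge_follow", "detour_around": blocking}
```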
Optionally, after the step of ignoring the target obstacle in the intersection area and generating a navigation path for the robot to escape from the trapped area directly through the intersection area, the method further includes:
walking according to the navigation path, ignoring the target obstacle in the intersection area, to attempt to escape from the trapped area;
ceasing to ignore the target obstacle and ceasing the walking when the robot successfully escapes from the trapped area;
and outputting a preset trapped prompt when the robot fails to escape from the trapped area.
Optionally, the method further comprises:
when no intersection area is detected, walking along the edge of the target obstacle, and ignoring the target obstacle whenever a preset collision sensor does not detect it, until the robot successfully escapes from the trapped area.
Optionally, after the step of generating a navigation path for the robot to escape from the trapped region according to the intersection region, the method further comprises:
and adjusting the navigation path to obtain a new navigation path, so that the robot can walk according to the new navigation path to escape from the trapped area.
Optionally, the step of adjusting the navigation path to obtain a new navigation path includes:
determining a first intersection point where the navigation path crosses the side of the intersection area closer to the robot;
determining an obstacle point in the intersection region according to the first intersection point;
and adjusting a preset distance path including the first intersection point in the navigation path according to the first intersection point and the obstacle point to obtain a new navigation path.
Optionally, the step of determining an obstacle point in the intersection region according to the first intersection point includes:
determining a path direction of the navigation path;
and searching on both sides of the path direction within the intersection area to find the obstacle point farthest from the first intersection point in the intersection area.
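One plausible reading of this search — scanning laterally on both sides of the path direction for the intersection-area obstacle cell with the largest offset — is sketched below; the lateral-offset interpretation and all names are assumptions:

```python
def farthest_obstacle_point(first_ip, path_dir, intersection_obstacles):
    """first_ip: first intersection point (x, y);
    path_dir: unit direction vector of the navigation path;
    intersection_obstacles: obstacle cells inside the intersection area."""
    # Perpendicular of the path direction: "the two sides" to search.
    px, py = -path_dir[1], path_dir[0]

    def lateral_offset(cell):
        # Signed lateral distance of the cell from the path through first_ip.
        return (cell[0] - first_ip[0]) * px + (cell[1] - first_ip[1]) * py

    # Pick the obstacle cell farthest (laterally) from the first intersection point.
    return max(intersection_obstacles, key=lambda c: abs(lateral_offset(c)))
```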
Optionally, the step of adjusting, according to the first intersection point and the obstacle point, a preset distance path including the first intersection point in the navigation path to obtain a new navigation path includes:
determining reference points on the line connecting the first intersection point and the obstacle point at a first spacing distance, and expanding each reference point in a preset target direction by a corresponding first expansion distance according to the direction from the first intersection point to the second intersection point to obtain expansion points, wherein the first expansion distance decreases successively in the direction from the first intersection point to the obstacle point; or,
determining each reference point on the preset distance path according to a second distance, and sequentially expanding each reference point in a preset first direction by a corresponding second expansion distance in an area between the preset distance path and the obstacle point to obtain each expansion point, wherein the second expansion distance is sequentially decreased in a direction from the first intersection point to a boundary point of the preset distance path;
and sequentially connecting the extension points to form an improved path, and replacing the preset distance path with the improved path to obtain a new navigation path.
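The first variant above (reference points on the line from the first intersection point to the obstacle point, each pushed outward by an expansion distance that decreases toward the obstacle point) might look like the sketch below; the linear decrease and all parameter names are assumptions, since the claim only requires that the distance decrease successively:

```python
import math

def expand_reference_points(first_ip, obstacle_pt, spacing, max_offset, normal):
    """Return expansion points along the segment first_ip -> obstacle_pt.
    Each reference point is offset along the unit vector `normal` (the
    preset target direction) by a distance that shrinks to zero at the
    obstacle point."""
    dx, dy = obstacle_pt[0] - first_ip[0], obstacle_pt[1] - first_ip[1]
    n = max(1, int(math.hypot(dx, dy) // spacing))  # number of segments
    points = []
    for i in range(n + 1):
        t = i / n                       # 0 at first_ip, 1 at obstacle_pt
        base = (first_ip[0] + t * dx, first_ip[1] + t * dy)
        offset = max_offset * (1 - t)   # decreases toward the obstacle point
        points.append((base[0] + offset * normal[0],
                       base[1] + offset * normal[1]))
    return points
```

Connecting the returned points in order yields the "improved path" that replaces the preset distance path.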
Optionally, the method further comprises:
and if other obstacles exist in the area for expanding the expansion point, or if other obstacles exist between the currently expanded expansion point and the previous expansion point, stopping expanding the subsequent expansion point.
Further, to achieve the above object, the present invention also provides a robot comprising a memory, a processor, and a navigation path generation program stored in the memory and executable on the processor, wherein the navigation path generation program, when executed by the processor, implements the steps of the navigation path generation method of the robot described above.
In order to achieve the above object, the present invention also provides a navigation path generating apparatus for a robot, comprising:
a determining module, configured to determine a first area in which target obstacles are distributed within the trapped area where the robot is located, wherein the target obstacles are detected by a radar mounted on the robot;
the detection module is used for detecting an intersection area between the first area and a second area where the robot walks;
and the path generation module is used for generating a navigation path for the robot to escape from the trapped region according to the intersection region when the intersection region is detected.
The steps of the navigation path generation method described above are implemented when the functional modules of the navigation path generation device of the present invention run.
In order to achieve the above object, the present invention also provides a computer storage medium having a navigation path generation program for a robot stored thereon, the navigation path generation program for a robot implementing the steps of the navigation path generation method for a robot as described above when executed by a processor.
Further, to achieve the above object, the present invention also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the navigation path generation method of the robot as described above.
The present invention provides a method for generating a navigation path of a robot, a robot, a navigation path generation device for a robot, a computer storage medium, and a computer program product. A first area containing a target obstacle is determined within the trapped area where the robot is located, the target obstacle being detected by a target detection device mounted on the robot; an intersection area between the first area and a second area through which the robot has already traveled is detected; and when the intersection area is detected, a navigation path for the robot to escape from the trapped area is generated according to the intersection area.
In this way, when the robot is in a narrow-passage space whose edges its radar sensor identifies as obstacles, the robot first determines, within its current trapped area (the narrow-passage space), a first area containing the target obstacles detected and identified by the radar sensor; it then detects an intersection area where the first area overlaps a second area the robot traveled through before the current moment; and finally, if such an intersection area is detected, it generates a navigation path out of the narrow-passage space directly according to that intersection area, so that the robot can walk through the intersection area along the navigation path and escape.
The invention thus enables the robot, when it determines that it is trapped because of obstacles detected by its radar sensor, to find the intersection area between the area containing the target obstacles and the area it has already traveled, and to generate a navigation path that ignores the target obstacles within that intersection area. Using the intersection area as a breakthrough, the robot can escape from its current trapped area, so that even when the radar identifies the edges of a narrow-passage space as obstacles, the robot smoothly escapes and its movement remains unobstructed.
Drawings
FIG. 1 is a schematic diagram of the device structure of a robot hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of the method for generating a navigation path of a robot according to the present invention;
FIG. 3 is a schematic view of an application scenario of an embodiment of the method for generating a navigation path of a robot according to the present invention;
FIG. 4 is a schematic view of another application scenario of an embodiment of the method;
FIG. 5 is a schematic view of another application scenario of an embodiment of the method;
FIG. 6 is a schematic view of another application scenario of an embodiment of the method;
FIG. 7 is a functional block diagram of an embodiment of the navigation path generation device of a robot according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment of a robot according to an embodiment of the present invention. The robot of the embodiment of the invention can be a cleaning robot, such as a sweeping robot, a mopping robot, a sweeping and mopping integrated robot, other cleaning robots and the like.
As shown in fig. 1, the robot may include: a processor 1001, such as a CPU, a communication bus 1002, sensors 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable communication between these components. The sensors 1003 may be used to construct a map of the robot's working environment, to locate and navigate the robot, and to detect obstacles and other information in the working environment so as to control the robot's behavior. The network interface 1004 may optionally include a standard wired interface or a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the robot configuration shown in fig. 1 does not constitute a limitation of the robot, and may include more or fewer components than shown, or some components in combination, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, and a navigation path generation program of the robot.
In the robot shown in fig. 1, the network interface 1004 is mainly used to connect to a base station, a charging stand, or other equipment used with the robot for data communication. For the user's convenience, a base station may be used together with the robot: the base station can charge the robot, so that, for example, a cleaning robot whose battery level falls below a threshold during cleaning automatically moves to the base station to charge. For a cleaning robot, the base station may also clean a mop (e.g., a mop cloth); the cleaning robot can move onto the base station so that a cleaning mechanism on the base station automatically cleans its mop.
The processor 1001 may be configured to invoke a navigation path generation program of the robot stored in the memory 1005 and perform the following operations:
determining a first area containing a target obstacle within a trapped area where the robot is located, wherein the target obstacle is detected by a target detection device mounted on the robot;
detecting an intersection region between the first region and a second region through which the robot travels;
and when the intersection area is detected, generating a navigation path for the robot to escape from the trapped area according to the intersection area.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and also perform the following operations:
detecting whether other obstacles exist in the intersection area, wherein the other obstacles are detected by an environment-sensing sensor of the robot other than the target detection device;
when no other obstacle exists in the intersection area, ignoring the target obstacle in the intersection area and generating a navigation path for the robot to escape from the trapped area through the intersection area;
when other obstacles exist in the intersection area, marking those obstacles as obstacles to be detoured around and generating the navigation path, so that when the robot travels to the intersection area along the navigation path, it detours along the edges of those obstacles to escape from the trapped area.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and after executing the step of generating the navigation path for the robot to escape from the trapped region directly through the intersection region by ignoring the target obstacle existing in the intersection region, further execute the following operations:
walking according to the navigation path, ignoring the target obstacle in the intersection area, to attempt to escape from the trapped area;
ceasing to ignore the target obstacle and ceasing the walking when the robot successfully escapes from the trapped area;
and outputting a preset trapped prompt when the robot fails to escape from the trapped area.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and also perform the following operations:
when no intersection area is detected, walking along the edge of the target obstacle, and ignoring the target obstacle whenever a preset collision sensor does not detect it, until the robot successfully escapes from the trapped area.
Further, the processor 1001 may call a navigation path generation program of the robot stored in the memory 1005, and after executing the step of generating a navigation path for the robot to escape from the trapped region according to the intersection region, further execute the following operations:
and adjusting the navigation path to obtain a new navigation path, so that the robot can walk according to the new navigation path to escape from the trapped area.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and also perform the following operations:
determining a first intersection point where the navigation path crosses the side of the intersection area closer to the robot;
determining an obstacle point in the intersection region according to the first intersection point;
and adjusting a preset distance path including the first intersection point in the navigation path according to the first intersection point and the obstacle point to obtain a new navigation path.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and also perform the following operations:
determining a path direction of the navigation path;
and searching on both sides of the path direction within the intersection area to find the obstacle point farthest from the first intersection point in the intersection area.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and also perform the following operations:
determining reference points on the line connecting the first intersection point and the obstacle point at a first spacing distance, and expanding each reference point in a preset target direction by a corresponding first expansion distance according to the direction from the first intersection point to the second intersection point to obtain expansion points, wherein the first expansion distance decreases successively in the direction from the first intersection point to the obstacle point; or,
determining each reference point on the preset distance path according to a second distance, and sequentially expanding each reference point in a preset first direction by a corresponding second expansion distance in an area between the preset distance path and the obstacle point to obtain each expansion point, wherein the second expansion distance is sequentially decreased in a direction from the first intersection point to a boundary point of the preset distance path;
and sequentially connecting the extension points to form an improved path, and replacing the preset distance path with the improved path to obtain a new navigation path.
Further, the processor 1001 may call the navigation path generation program of the robot stored in the memory 1005, and also perform the following operations:
and if other obstacles exist in the area for expanding the expansion point, or if other obstacles exist between the currently expanded expansion point and the previous expansion point, stopping expanding the subsequent expansion point.
Based on the above hardware structure, embodiments of the navigation path generation method of the robot are provided. It should be noted that when a current cleaning robot cleans a house along its edges and enters a narrow-passage space with little vertical clearance (for example, the space beneath sofas, beds, or large appliances), the mounting positions of its environment-sensing sensors become a problem: a target detection device (for example, a laser radar) mounted relatively high among those sensors typically identifies the entire peripheral edge of the narrow-passage space as an obstacle. Consequently, when the cleaning robot needs to come out of the narrow-passage space, it is falsely trapped there and cannot escape, because the target detection device has identified all of the space's peripheral edges as obstacles.
In summary, when the target detection device identifies the edges of a narrow-passage space as obstacles, the robot cannot smoothly escape from that space, and its movement is therefore obstructed.
In view of the above, the present invention provides a navigation path generation method for a robot. Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of a method for generating a navigation path of a robot according to the present invention, in which the method for generating a navigation path of a robot is applied to the robot, and the method for generating a navigation path of a robot according to the present invention includes:
Step S10: determining a first area containing a target obstacle within the trapped area where the robot is located, wherein the target obstacle is detected by a target detection device mounted on the robot;
it should be noted that, in the present embodiment, the target detection device of the robot configuration includes, but is not limited to, a laser radar sensor, and the laser radar sensor may be disposed on the upper surface of the robot main body in a protruding manner, so that, relative to other sensors disposed on the robot, the detection range of the laser radar sensor is higher than that of the other sensors, and thus a higher obstacle can be detected. When the robot is in a narrow-passage space with a small vertical distance, even if various sensors used for sensing the environment on the robot do not detect and sense the obstacles, the laser radars with relatively high positions arranged in the various sensors can identify the peripheral edge positions of the narrow-passage space as the obstacles. Therefore, the robot can firstly walk along the obstacles detected and identified by the laser radar in a way of walking along the edge under the mode of normally executing house cleaning, and the laser radar identifies the peripheral edge positions of the narrow-passage space as the obstacles, so that the robot cannot escape from the narrow-passage space (namely, walks to an area outside the narrow-passage space) according to the mode of normally executing house cleaning, and then the robot determines that the obstacles identified based on the laser radar currently cause trapping and marks the narrow-passage space as the trapped area where the robot is located currently.
When the robot determines that the robot is located in a trapped area, firstly, a first area of a target obstacle detected and identified in the trapped area by a laser radar sensor with a relatively high position is determined to be distributed in the trapped area.
Specifically, referring to the application scenario shown in fig. 3, when the robot enters a narrow-passage space with little vertical clearance, such as the space beneath a bed or sofa, in its normal house-cleaning mode, only the four corner points or parts of the four sides of that space contain real obstacles. However, among the environment-sensing sensors on the robot body, the relatively high-mounted lidar sensor may, because of its mounting position, detect and identify the entire periphery of the bed or sofa as an obstacle. The robot therefore confirms that it is trapped after edge-following these obstacles, which form a closed area (the obstacles detected by the lidar are distributed around the periphery of the bed bottom or sofa bottom), and marks the bed bottom or sofa bottom where it currently is as the trapped area.
After confirming the trapped area, the robot immediately switches from the normal house-cleaning mode to an escape mode, and determines the distribution area of the target obstacles detected and identified by the lidar sensor (the area enclosed by the solid line in the drawing) within the trapped area, i.e., the bed bottom or sofa bottom where it currently is, as the first area.
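The trapped-area confirmation described above amounts to checking that no route out of the lidar-enclosed region exists. A minimal flood-fill sketch of that check follows; the grid-cell representation and function names are assumptions:

```python
from collections import deque

def is_trapped(start, free_cells, outside_cells):
    """Breadth-first flood fill over free grid cells: the robot is trapped
    when no cell outside the narrow-passage space is reachable."""
    frontier, seen = deque([start]), {start}
    while frontier:
        x, y = frontier.popleft()
        if (x, y) in outside_cells:
            return False                # an exit is reachable: not trapped
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in free_cells and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True                         # every reachable cell explored, no exit
```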
Step S20: detecting an intersection area between the first area and a second area through which the robot has traveled;
after determining the first area distributed with the target obstacles, the robot further compares a pre-recorded second area where the robot has walked with the first area, so as to determine an intersection area between the first area and the second area, where the first area and the second area are overlapped.
It should be noted that, in this embodiment, the robot may record a path area that has been traveled in the previous environment during the whole process of starting operation, for example, the robot may specifically adopt a queue storage structure to record a path area that has been traveled when performing normal cleaning in the house area or escaping from a trapped area. It should be understood that, based on different design requirements of practical applications, in different possible embodiments, the way in which the robot records the path area or the specific recorded path, etc. may be different, and the navigation path generating method of the robot of the present invention is not limited to the specific content of the recorded way or path.
Specifically, referring again to the application scenario shown in fig. 3, after the robot confirms that it is trapped and determines the distribution area of the obstacles detected and identified by the target detection device within the trapped area (the bed bottom or sofa bottom where it currently is) as the first area, it reads the pre-recorded second area traveled in the normal house-cleaning mode (shown as the gray-filled wide solid-line area), and combines and splices the first area and the second area according to their actual positions in the environment, so as to determine the intersection area where the two overlap (the area framed by the dotted line in the drawing).
In this embodiment, the robot may specifically superimpose the grid map of the first area and the recorded grid map of the second area in a manner of superimposing the grid maps, so as to combine and splice the first area and the second area according to the actual position distribution. It should be understood that, based on different design requirements of practical applications, in different possible embodiments, the robot may also adopt other possible ways to combine and splice the first area and the second area, and the navigation path generating method of the robot of the present invention is not limited to the specific way in which the robot combines and splices the first area and the second area.
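The grid-map superposition can be illustrated with plain 0/1 occupancy grids; the list-of-rows representation below is an illustrative assumption:

```python
def intersection_mask(first_grid, second_grid):
    """Overlay two equally sized occupancy grids (rows of 0/1 values):
    a cell belongs to the intersection area only when it is marked both
    in the first area (lidar obstacles) and in the second area (traveled)."""
    return [[int(a and b) for a, b in zip(row1, row2)]
            for row1, row2 in zip(first_grid, second_grid)]
```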
Step S30, when the intersection area is detected, generating a navigation path for the robot to escape from the trapped area according to the intersection area.
The robot compares the first area, in which the target obstacle is distributed, with the pre-recorded second area that it has already traveled, so as to determine whether an intersection area exists where the two overlap. If such an intersection area does exist, the robot generates, based on the intersection area, a navigation path for escaping from the trapped area that passes through the intersection area.
Specifically, for example, referring to the application scenario shown in fig. 4, the robot combines and splices the first area in which radar obstacles are distributed with the second area that it has previously traveled, according to the actual position distribution in the environment, and thereby determines that an intersection area with overlapping positions exists between the two. This means that the robot previously walked from the area outside the currently trapped area (the bed bottom or sofa bottom) into the trapped area through the intersection area while normally cleaning the house, so it can naturally walk back out of the trapped area through the same intersection area. The robot therefore directly ignores the radar obstacles distributed in the intersection area and uses a path segment passing through the intersection area to generate a navigation path (the generated path illustrated in the figure) for escaping from the trapped area to the outside. The robot can then smoothly walk along this navigation path to an area outside the trapped area (such as the area where the target point is located).
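As a minimal sketch of "generate a path that passes through the intersection area", the planner below treats obstacle cells as passable wherever they fall inside the intersection area, then runs a breadth-first search from the robot's cell to the target cell. The grid layout, cell coordinates, and the `plan_escape_path` helper are hypothetical; the patent does not prescribe a particular search algorithm.

```python
from collections import deque

import numpy as np

def plan_escape_path(obstacles, intersection, start, goal):
    """Breadth-first search on a grid; obstacle cells that lie inside the
    intersection area are ignored (treated as free), so the planned path
    may pass straight through the overlap."""
    passable = ~obstacles | intersection
    rows, cols = obstacles.shape
    prev, seen = {}, {start}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct start -> goal
            path = [cell]
            while cell != start:
                cell = prev[cell]
                path.append(cell)
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and passable[nr, nc] and nxt not in seen:
                seen.add(nxt)
                prev[nxt] = cell
                queue.append(nxt)
    return None                               # still trapped

# A wall of radar obstacles along column 2; one wall cell overlaps the
# previously traveled area, so the robot may pass through that cell.
obstacles = np.zeros((5, 5), dtype=bool)
obstacles[:, 2] = True
intersection = np.zeros((5, 5), dtype=bool)
intersection[2, 2] = True
path = plan_escape_path(obstacles, intersection, start=(2, 0), goal=(2, 4))
print(path)  # [(2, 0), (2, 1), (2, 2), (2, 3), (2, 4)]
```

The only opening in the wall is the overlap cell, so the shortest route runs straight through it, mirroring how the robot reuses its earlier entry route as the escape route.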
Further, in a possible embodiment, the method for generating a navigation path of a robot according to the present invention may further include:
step S40, when the intersection area is not detected, performing a side-by-side detour for the target obstacle, and when a preset collision sensor does not detect the target obstacle, ignoring the target obstacle until the target obstacle successfully escapes from the trapped area.
It should be noted that, in this embodiment, the preset collision sensor is one of many sensors disposed in the robot, and the collision sensor is used to detect whether the robot encounters a collision with an actual obstacle during a real-time walking process.
The robot compares the first area, in which target obstacles are distributed, with the pre-recorded second area that it has already traveled, so as to determine whether an overlapping intersection area exists. If no intersection area actually exists between the first area and the second area, the robot directly performs an edgewise detour around the target obstacles distributed in the first area. When the preset collision sensor does not detect the target obstacles, the robot determines that they do not actually exist, and can then walk while ignoring them until it successfully escapes from the trapped area.
Specifically, for example, when the robot combines and splices the first area in which radar obstacles are distributed with the second area that it has previously traveled, according to the actual position distribution in the environment, it may determine that no intersection area with overlapping positions exists between the two. This happens, for instance, when the robot is placed directly into the trapped area (the bed bottom or sofa bottom) before starting operation. In that case, the robot takes the nearest target radar obstacle among those distributed in the first area as a breakthrough point and performs an edgewise detour around it. During the detour, the collision sensor detects whether the robot actually collides with the target radar obstacle or any other radar obstacle. When the collision sensor detects no collision, the robot completely bypasses the target radar obstacle and can successfully and smoothly escape from the trapped area (the bed bottom or sofa bottom) to the area outside it.
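The decision logic of this fallback can be sketched as: the radar obstacle is confirmed real only if the bump sensor ever fires during the edgewise detour; otherwise it is judged phantom and ignored. The `classify_radar_obstacle` helper and its boolean readings are illustrative assumptions.

```python
def classify_radar_obstacle(bump_readings):
    """During an edgewise detour around a radar-detected obstacle, the
    obstacle counts as real only if the collision sensor ever triggers;
    a detour completed with no contact marks the obstacle as phantom."""
    return "real" if any(bump_readings) else "phantom"

# No contact over the whole detour: ignore the obstacle and walk through.
print(classify_radar_obstacle([False, False, False]))  # phantom
# Contact midway through the detour: the obstacle physically exists.
print(classify_radar_obstacle([False, True]))          # real
```

A real controller would interleave this check with the detour motion, aborting the "ignore" decision the moment a bump is registered.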
Further, in this embodiment, according to the navigation path generation method of the robot of the present invention, after the robot successfully escapes from the trapped area, it resumes the normal house-cleaning working mode, and no longer ignores the radar obstacles detected and identified by the radar sensor.
The embodiment of the invention provides a navigation path generation method of a robot. When the robot determines that it is currently located in a trapped area, it determines the first area within the trapped area in which radar obstacles, detected and identified by a radar sensor mounted at a relatively high position, are distributed. The robot then compares this first area with the pre-recorded second area that it has already traveled, so as to determine whether an overlapping intersection area exists between the two. If such an intersection area does exist, the robot generates, based on the intersection area, a navigation path for escaping from the trapped area that passes through the intersection area. If no intersection area exists, the robot directly performs an edgewise detour around the radar obstacles distributed in the first area; when the preset collision sensor does not detect the radar obstacles, the robot determines that they do not actually exist and walks while ignoring them, until it successfully escapes from the trapped area.
The invention thus achieves the following: when the robot determines that it is trapped based on radar obstacles detected by the radar sensor, it determines the intersection area between the area in which the radar obstacles are distributed and the area it has already traveled, and then generates a navigation path while ignoring the obstacles in that intersection area. The intersection area serves as a breakthrough for escaping from the trapped area, so that the robot can smoothly escape from a narrow-passage space and walk smoothly even when the radar identifies the edge of the narrow-passage space as an obstacle.
Further, based on the first embodiment, a second embodiment of the navigation path generation method of the robot according to the present invention is proposed, and a main difference between this embodiment and the first embodiment is that, in this embodiment, the step of "generating the navigation path for the robot to escape from the trapped region according to the intersection region" in the step S30 may include:
step S301, detecting whether other obstacles exist in the intersection area, wherein the other obstacles are detected based on an environment perception sensor configured by the robot and other than the target detection device;
when the robot compares a first area distributed with a target obstacle with a pre-recorded second area where the robot has already traveled, so as to determine that an intersection area actually exists between the first area and the second area, the robot further detects whether an environment sensing sensor is distributed in the intersection area, which is configured based on the robot and is other than a target detection device detecting the target obstacle, and other detected obstacles, specifically, the environment sensing sensor may detect an obstacle lower than a detection range of the target detection device, where the environment sensing sensor may be at least one of: a binocular sensor, a line laser sensor, a vision sensor, a collision sensor, an ultrasonic sensor, etc., without limitation.
Specifically, for example, as shown in the application scenarios in fig. 3 or fig. 4, when the robot determines that an intersection area with overlapping positions (the area enclosed by the dashed-line frame in the figure) exists between the first area (the area in which radar obstacles are distributed) and the second area (the area already traveled by the robot in the figure), the robot may further detect, through an infrared sensor or a collision sensor among all sensors currently configured for sensing the environment, whether an obstacle actually exists in the intersection area.
It should be noted that, in this embodiment, based on the influence of the respective installation positions of other sensors (such as the above-mentioned infrared sensor or collision sensor) configured in the robot besides the radar, when the robot further detects whether an obstacle actually exists in the intersection area, the robot may call the other sensors to perform the detection and identification process after the robot travels near the intersection area based on the current position.
Step S302, when detecting that no other obstacles exist in the intersection region, ignoring the target obstacles existing in the intersection region, and generating a navigation path for the robot to escape from the trapped region through the intersection region;
when the robot further detects that no other obstacles are distributed in the intersection area, the robot can directly ignore the target obstacles distributed in the intersection area so as to use the path passing through the intersection area to generate a navigation path which is currently used by the robot to escape from the trapped area.
Specifically, for example, referring to the application scenario shown in fig. 4, when the robot further detects, through an infrared sensor or a collision sensor among all sensors currently configured for sensing the environment, that no obstacle actually exists in the intersection area, the robot directly ignores the radar obstacles distributed in the intersection area and uses a path passing through the intersection area to generate a navigation path (the generated path illustrated in the figure) for escaping from the trapped area, i.e., the bed bottom or sofa bottom, to the outside area.
Further, in a possible embodiment, after the step S302, the method for generating a navigation path of a robot according to the present invention may further include:
step A, according to the navigation path, ignoring the target obstacle existing in the intersection area, and executing walking to try to escape from the trapped area;
after generating a navigation path for escaping from the currently trapped area, the robot starts to attempt to escape from the trapped area by walking according to the navigation path in a manner of ignoring the target obstacles distributed in the intersection area.
Specifically, for example, referring to the application scenario shown in fig. 4, after the robot generates the navigation path, it may ignore the radar obstacles detected by the radar sensor distributed in the intersection area and walk according to the navigation path, so as to attempt to escape from the currently trapped area (the bed bottom or sofa bottom) and walk smoothly to an area outside it (e.g., the area where the illustrated target point is located). The robot may also mark the target obstacle on an obstacle map, so that when the robot passes through the same area next time, it can confirm through the obstacle map that the target obstacle exists at that position.
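The obstacle-map marking mentioned above can be sketched as a simple persistent lookup keyed by grid cell; the `ObstacleMap` class and its mark labels are hypothetical names, not the patent's data structure.

```python
class ObstacleMap:
    """Persistent map of obstacle marks, so that a later pass through the
    same area can recognize a previously seen obstacle immediately."""

    def __init__(self):
        self._marks = {}

    def mark(self, cell, label):
        # Record what kind of obstacle was confirmed at this grid cell.
        self._marks[cell] = label

    def lookup(self, cell):
        # Return the recorded label, or None if nothing was marked here.
        return self._marks.get(cell)

omap = ObstacleMap()
omap.mark((3, 7), "target_obstacle")   # remember the obstacle's cell
print(omap.lookup((3, 7)))             # target_obstacle
print(omap.lookup((0, 0)))             # None: nothing recorded here
```

On the next pass through the same region, a `lookup` hit lets the robot skip re-detection and reuse the earlier judgment about the obstacle.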
Step B, when the robot successfully escapes from the trapped area, stopping ignoring the target obstacle while walking;
the robot attempts to escape from the trapped area by ignoring the target obstacle in the intersection area, and after the robot smoothly walks to a target point outside the trapped area, the robot determines that the robot has successfully escaped from the trapped area currently, so that the robot does not continue to ignore the new target obstacle detected and identified by the target detection device, and resumes the working mode of normally cleaning the house to continue to perform the walking process.
And C, outputting a preset trapped prompt when the robot does not successfully escape from the trapped area.
It should be noted that, in this embodiment, the preset trapped prompt is a sound prompt issued directly through a configured speaker, or a message prompt pushed to the user via a notification server through a communication module, notifying the user that the robot is currently trapped and needs human assistance.
The robot attempts to escape from the trapped area by ignoring the target obstacles distributed in the intersection area; however, even though the other sensors did not detect an actually existing obstacle blocking the intersection area, the robot may still fail to walk out of the trapped area. In that case, the robot determines that the current escape attempt has failed, and notifies the user that it is currently trapped and needs human assistance, by a voice prompt issued directly through a configured speaker, or by a message prompt sent to the user via a notification server through a communication module.
Step S303, when it is detected that the other obstacle exists in the intersection area, marking the other obstacle as a detourable obstacle and generating the navigation path, so that when the robot travels to the intersection area according to the navigation path, it detours along the edge of the other obstacle to escape from the trapped area.
When the robot further detects that other obstacles, detected by the environment sensing sensor configured on the robot other than the target detection device, are distributed in the intersection area, the robot still ignores the target obstacles detected by the target detection device distributed in the intersection area. It marks the other obstacles as detourable obstacles and generates a navigation path for escaping from the trapped area using a path passing through the intersection area, while performing obstacle avoidance for the detourable obstacles.
Specifically, for example, the robot may further detect and identify, through an infrared sensor or a collision sensor among all sensors currently configured for sensing the environment, that an obstacle actually exists in the intersection area. Since the obstacle has only a small blocking influence on the robot, a detour can be performed around it. The robot therefore marks the obstacle as a detourable obstacle, still ignores the radar obstacles distributed in the intersection area, and uses a path passing through the intersection area to generate a navigation path for escaping from the trapped area, i.e., the bed bottom or sofa bottom, to the outside area.
In this embodiment, when the robot compares the first area in which the target obstacle is distributed with the second area it has previously traveled, and thereby determines that an overlapping intersection area exists, it further detects whether other obstacles, detected by an environment sensing sensor other than the radar, are distributed in the intersection area. If no such other obstacles are distributed in the intersection area, the robot directly ignores the target obstacles detected by the target detection device and uses a path passing through the intersection area to generate a navigation path for escaping from the trapped area. If other obstacles are distributed in the intersection area, the robot still ignores the target obstacles, marks the other obstacles as detourable obstacles, and generates the navigation path using a path passing through the intersection area while performing obstacle avoidance for the other obstacles.
Therefore, the navigation path generation method of the robot achieves the following: when the robot determines that it is trapped based on radar obstacles detected by the radar sensor, it chooses whether to ignore the radar obstacles in the intersection area between the area in which the radar obstacles are distributed and the area it has already traveled, based on whether actual obstacles exist there and whether they can be bypassed, so that the robot can efficiently escape from a narrow-passage space and walk smoothly even when the radar identifies the edge of the narrow-passage space as an obstacle.
In addition, in this embodiment, after generating the navigation path for escaping from the currently trapped area, the robot immediately walks according to the navigation path while ignoring the radar obstacles distributed in the intersection area, so as to attempt to escape from the trapped area. After the robot smoothly walks to a target point outside the trapped area, it determines that it has successfully escaped; it then stops ignoring the radar obstacles detected and identified by the laser radar sensor, and resumes the normal house-cleaning working mode to continue walking. If, however, the robot fails to walk out of the trapped area even though the other sensors did not detect an actually existing obstacle blocking the intersection area, it determines that the current escape attempt has failed, and notifies the user that it is currently trapped and needs human assistance, by a voice prompt issued directly through a configured speaker, or by a message prompt sent to the user terminal device via a notification server through a communication module.
Therefore, the robot can immediately resume the normal house-cleaning working mode after smoothly escaping from the trapped area, instead of continuing to ignore obstacles identified by the radar sensor, which ensures the cleaning efficiency and working safety of the robot during normal house cleaning. Moreover, the robot outputs the trapped prompt for user assistance only when it cannot successfully escape from the trapped area, rather than outputting the prompt as soon as it determines it is trapped based on the radar obstacles without even attempting to escape; this improves the intelligence of the robot.
Further, based on the first embodiment and the second embodiment, a third embodiment of the navigation path generation method of the robot according to the present invention is proposed, and the main difference between the present embodiment and the first embodiment is that, after the step of "generating the navigation path for the robot to escape from the trapped region based on the intersection region" in the step S30, the navigation path generation method of the robot according to the present invention may further include:
step S50, adjusting the navigation path to obtain a new navigation path, so that the robot can walk according to the new navigation path to escape from the trapped area.
After the robot generates the navigation path for escaping from the currently trapped area, and before it walks along that path to attempt the escape, the robot may also adjust the currently generated navigation path as needed, so as to generate a new navigation path and walk according to it to escape from the trapped area.
Further, in a possible embodiment, in the step S50, the step of "adjusting the navigation path to obtain a new navigation path" may include:
step S501, determining a first intersection point of the navigation path and the intersection area, which is close to one side of the robot;
after the robot generates the navigation path escaping from the current trapped region, the side, closest to the current position of the robot, of the intersection region is further determined, and a first intersection point between the navigation path and the side, close to the current position of the robot, of the intersection region is determined.
Specifically, for example, referring to the application scenario shown in fig. 5, the robot first determines, within the first area in which the radar obstacles detected and identified by the radar sensor are distributed, the innermost ring of obstacles enclosing the side on which the robot is currently located. The robot then marks the intersection point A, between the innermost-layer obstacle in the intersection area (between the first area and the second area) and the generated navigation path for escaping from the trapped area, as the first intersection point.
Step S502, determining an obstacle point in the intersection region according to the first intersection point;
after determining a first intersection point between the navigation path and one side of the intersection area close to the current position of the robot, the robot further determines an obstacle point between the intersection area and a first area where radar obstacles identified by radar detection are distributed according to the first intersection point.
Further, in a possible embodiment, the step S502 may include:
step S5021, determining the path direction of the navigation path;
step S5022, searching is conducted on the two sides of the path direction in the intersection area, and an obstacle point which is farthest away from the first intersection point in the intersection area is obtained.
When the robot determines, according to the first intersection point, the obstacle point between the intersection area and the first area in which the radar obstacles are distributed, it determines the path direction of the currently generated navigation path, and then searches on both sides of the path direction within the intersection area, so as to find the obstacle point at the boundary between the intersection area and the first area that is farthest from the first intersection point.
Specifically, for example, referring to the application scenario shown in fig. 5, after determining the first intersection point A between the innermost obstacle in the intersection area and the generated navigation path for escaping from the trapped area, the robot performs an obstacle search on both sides of the path direction of the navigation path, where the path direction points toward the first area in which the radar obstacles are distributed, and determines the obstacle point B, i.e., the point farthest from the first intersection point at which an obstacle exists at the boundary between the intersection area and the first area.
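Picking the obstacle point B can be sketched as a farthest-point search over the boundary cells shared by the intersection area and the first area; `math.dist` gives the Euclidean distance, and the candidate coordinates below are illustrative.

```python
import math

def farthest_boundary_obstacle(first_intersection, boundary_obstacles):
    """Return the boundary obstacle farthest (in Euclidean distance) from
    the first intersection point A; this plays the role of obstacle point B."""
    return max(boundary_obstacles, key=lambda p: math.dist(first_intersection, p))

A = (0.0, 0.0)
candidates = [(1.0, 0.5), (2.0, -0.5), (0.5, 0.2)]   # boundary obstacle cells
print(farthest_boundary_obstacle(A, candidates))     # (2.0, -0.5)
```

In the patent's setting, `candidates` would be gathered by scanning both sides of the path direction inside the intersection area, as step S5022 describes.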
Step S503, adjusting a preset distance path including the first intersection point in the navigation path according to the first intersection point and the obstacle point, to obtain a new navigation path.
After determining the first intersection point and the obstacle point, the robot connects them to obtain a connecting line, and then adjusts, with the connecting line as a reference, the preset distance path containing the first intersection point in the generated navigation path, so as to obtain a new, adjusted navigation path.
It should be noted that, in this embodiment, the preset distance path is a path with a preset distance in the generated navigation paths, and the preset distance may be specifically defined based on design requirements of practical applications.
Further, in a possible embodiment, the step S503 may include:
step S5031, determining each reference point on a connection line between the first intersection point and the obstacle point according to a preset first separation distance, and extending each reference point by a corresponding first extension distance in a preset target direction to obtain each extension point, wherein the first extension distance is sequentially decreased in a direction from the first intersection point to the obstacle point;
it should be noted that, in this embodiment, the preset first separation distance and the first extension distance may be set based on design requirements of practical applications, where the first extension distance corresponds to each reference point, and each first extension distance is sequentially decreased in a direction from the first intersection point to the obstacle point as a distance between the reference point and the first intersection point increases. In addition, the preset target direction is a direction perpendicular to a line connecting the first intersection point and the obstacle point.
After the robot determines the first intersection point and the obstacle point and obtains the connecting line between them, it takes a reference point on the connecting line at every preset first separation distance, starting from the first intersection point and continuing until the obstacle point. Then, proceeding in the direction from the first intersection point to the obstacle point, the robot extends each reference point, perpendicular to the connecting line, by the first extension distance corresponding to that reference point, thereby obtaining the extension points one by one.
Specifically, for example, referring to the application scenario shown in fig. 5, after determining the first intersection point A and the obstacle point B to obtain the connecting line AB, a reference point is taken on the line AB at every preset first separation distance a. The robot then extends each reference point outward by its first extension distance along the direction perpendicular to the line AB to obtain the extension points; as the distance between the current reference point and the first intersection point A increases, the first extension distance decreases sequentially in the direction from the first intersection point to the obstacle point.
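This first extension scheme can be sketched as follows: reference points are sampled along AB every `spacing` units and pushed perpendicular to AB by an offset that shrinks from `max_extend` at A down to zero at B. The linear decay and the helper names are assumptions; the patent only requires that the extension distance decrease from A to B.

```python
import math

def extension_points(A, B, spacing, max_extend):
    """Sample reference points along segment AB every `spacing` units and
    push each one perpendicular to AB by an offset that decays linearly
    from `max_extend` at A down to zero at B."""
    ax, ay = A
    bx, by = B
    length = math.hypot(bx - ax, by - ay)
    ux, uy = (bx - ax) / length, (by - ay) / length   # unit vector A -> B
    px, py = -uy, ux                                  # perpendicular (target direction)
    points = []
    d = 0.0
    while d <= length:
        offset = max_extend * (1 - d / length)        # shrinks toward B
        rx, ry = ax + ux * d, ay + uy * d             # reference point on AB
        points.append((rx + px * offset, ry + py * offset))
        d += spacing
    return points

pts = extension_points(A=(0.0, 0.0), B=(4.0, 0.0), spacing=1.0, max_extend=2.0)
print(pts)  # [(0.0, 2.0), (1.0, 1.5), (2.0, 1.0), (3.0, 0.5), (4.0, 0.0)]
```

Connecting these points in order yields a curve that bulges away from AB near A and rejoins the obstacle point at B, which is the shape the improved path in fig. 5 suggests.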
Step S5032, determining each reference point on the preset distance path according to a preset second separation distance, and sequentially extending each reference point in a preset first direction by a corresponding second extension distance, in the area between the preset distance path and the obstacle point, to obtain each extension point, wherein the second extension distance decreases sequentially in the direction from the first intersection point to a boundary point of the preset distance path;
it should be noted that, in this embodiment, similarly, the preset second separation distance and the second extension distance may be set based on design requirements of practical applications, where the second extension distance corresponds to each reference point, and each second extension distance is sequentially decreased in a direction from the first intersection point to a boundary point of the preset distance path as the distance between the reference point and the first intersection point increases (that is, the second extension distance is sequentially increased in a direction from the first intersection point to an obstacle point as the distance between the current reference point and the first intersection point decreases, and when the reference point and the first intersection point coincide, the second extension distance is increased to a maximum and is equal to the distance between the first intersection point and the obstacle point). In addition, the preset first direction is a direction pointing to one side of the preset distance path deviated from the obstacle point.
In this embodiment, in addition to generating the extension points by extending from the connecting line between the first intersection point and the obstacle point, the robot may instead determine the reference points directly on the preset distance path containing the first intersection point in the currently generated navigation path and extend them to obtain the extension points. That is, on the preset distance path, starting from the first boundary point located inside the trapped area and ending at the second boundary point located outside the trapped area, the robot takes a reference point at every preset second separation distance. Then, proceeding from the point where the preset distance path meets the connecting line (between the first intersection point and the obstacle point) toward the first boundary point and the second boundary point respectively, the robot extends each reference point, parallel to the connecting line, by the second extension distance corresponding to that reference point, thereby obtaining the extension points one by one.
Specifically, for example, referring again to the application scenario shown in fig. 5, after the robot determines the first intersection point A and the obstacle point B to obtain the connecting line AB, it takes a reference point at every preset second separation distance b on the preset distance path containing the first intersection point A in the currently generated navigation path, starting from the first boundary point of the preset distance path inside the trapped area and ending at the second boundary point outside the trapped area. The robot then extends each reference point outward by its second extension distance along the direction parallel to the line AB to obtain the extension points; as the distance between the current reference point and the first intersection point A decreases, the second extension distance increases sequentially in the direction from the first intersection point to the obstacle point.
Further, in a possible embodiment, the method for generating a navigation path of a robot of the present invention may further include:
and if another obstacle exists in the area into which an expansion point is being expanded, or if another obstacle exists between the currently expanded expansion point and the previous expansion point, stopping the expansion of subsequent expansion points.
It should be noted that, in this embodiment, when the robot determines reference points on the connection line AB, a reference point is no longer taken once the distance between the reference point and the first intersection point A exceeds the preset maximum distance, once the position of the reference point falls within the distribution area of the radar obstacles in the first area, and/or once the connection line between the expansion points corresponding to the reference point falls within the distribution area of the radar obstacles in the first area.
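The stopping conditions above can be sketched as a small loop. The callback names `expand_one`, `blocked`, and `segment_blocked` are hypothetical placeholders for the robot's obstacle checks, which the embodiment does not name:

```python
def expand_until_blocked(references, expand_one, blocked, segment_blocked):
    """Expand reference points in order, stopping as soon as either the
    area swept by the current expansion contains another obstacle, or an
    obstacle lies between the newly expanded point and the previous
    expansion point. All three callbacks are caller-supplied."""
    expansion_points = []
    for ref in references:
        candidate = expand_one(ref)
        if blocked(ref, candidate):
            break  # another obstacle in the expansion area
        if expansion_points and segment_blocked(expansion_points[-1], candidate):
            break  # obstacle between current and previous expansion point
        expansion_points.append(candidate)
    return expansion_points
```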
Step S5033, sequentially connecting the expansion points to form an improved path, and replacing the preset distance path containing the first intersection point in the navigation path with the improved path to obtain a new navigation path.
After expanding each reference point outwards to obtain the expansion points, the robot connects the expansion points in sequence to form an improved path that replaces the preset distance path containing the first intersection point in the navigation path; after this replacement, the robot obtains a new navigation path.
Specifically, referring to the application scenarios shown in fig. 5 and fig. 6, after the robot expands each reference point on the connection line AB outwards to obtain the expansion points, it connects the expansion points in sequence to form an improved path that replaces the preset distance path containing the first intersection point A in the navigation path; after the replacement, the robot obtains the new navigation path shown in fig. 6.
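The replacement step can be sketched as a list splice. Index-based splicing (`start_idx`, `end_idx`) is an assumption made for illustration; the embodiment only specifies that the segment containing the first intersection point is replaced by the improved path:

```python
def splice_improved_path(nav_path, start_idx, end_idx, expansion_points):
    """Replace the preset-distance segment nav_path[start_idx:end_idx+1]
    (the stretch containing the first intersection point) with the
    improved path formed by connecting the expansion points in order,
    yielding the new navigation path."""
    return nav_path[:start_idx] + list(expansion_points) + nav_path[end_idx + 1:]
```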
In this embodiment, after the robot generates the navigation path for escaping from the current trapped area, and before it walks along that path to attempt the escape, the robot may further adjust the currently generated navigation path as needed, producing a new navigation path along which it walks to escape from the trapped area. Thus, on the basis of ensuring that the robot smoothly escapes from the trapped area along the navigation path, adjustment of the navigation path is also realized, improving the flexibility with which the robot generates a navigation path to escape smoothly from the trapped area.
Furthermore, the invention also provides a navigation path generation device of the robot. Referring to fig. 7, fig. 7 is a functional module schematic diagram of an embodiment of a navigation path generating device of a robot according to the present invention. As shown in fig. 7, the navigation path generating apparatus of the robot of the present invention includes:
a determining module 10, configured to determine, in a trapped area where the robot is located, a first area where target obstacles are distributed, where the target obstacles are detected based on a radar configured on the robot;
a detection module 20, configured to detect an intersection region between the first region and a second region where the robot has walked;
and a path generating module 30, configured to generate a navigation path for the robot to escape from the trapped region according to the intersection region when the intersection region is detected.
Further, the path generating module 30 includes:
a detection unit, configured to detect whether another obstacle exists in the intersection area, where the other obstacle is detected based on an environment perception sensor, other than the target detection device, configured on the robot;
a first path generating unit, configured to, when it is detected that the intersection area does not have the other obstacle, ignore the target obstacle existing in the intersection area, and generate a navigation path for the robot to escape from the trapped area through the intersection area;
and a second path generating unit, configured to, when the other obstacle is detected in the intersection area, mark the other obstacle as a detour obstacle and generate the navigation path, so that when the robot travels to the intersection area according to the navigation path, it detours along the edge of the other obstacle to escape from the trapped area.
Further, the path generating module 30 further includes:
the escape execution unit is used for ignoring the target obstacle existing in the intersection region according to the navigation path and executing walking to try to escape from the trapped region;
an abandoning unit, configured to stop the behavior of ignoring the target obstacle and walking when the robot successfully escapes from the trapped area;
and the prompt output unit is used for outputting a preset trapped prompt when the robot does not successfully escape from the trapped area.
Further, the navigation path generating apparatus of the robot according to the present invention further includes:
and an edge escape module, configured to, when the intersection area is not detected, detour along the edge of the target obstacle, and, when a preset collision sensor does not detect the target obstacle, ignore the target obstacle until the robot successfully escapes from the trapped area.
Further, the navigation path generating apparatus of the robot according to the present invention further includes:
and the path adjusting module is used for adjusting the navigation path to obtain a new navigation path so that the robot can walk according to the new navigation path to escape from the trapped area.
Further, the path adjustment module includes:
a first determining unit, configured to determine a first intersection point between the navigation path and the side of the intersection area close to the robot;
a second determination unit configured to determine an obstacle point in the intersection region according to the first intersection point;
and the adjusting unit is used for adjusting a preset distance path containing the first intersection point in the navigation path according to the first intersection point and the obstacle point to obtain a new navigation path.
Further, the second determination unit includes:
a direction determining subunit, configured to determine a path direction of the navigation path;
and the searching subunit is configured to search the intersection region to two sides of the path direction to obtain an obstacle point farthest from the first intersection point in the intersection region.
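The searching subunit's behavior can be sketched as follows. The function name `farthest_obstacle_point` and the Euclidean distance metric are illustrative assumptions; the patent only specifies searching the intersection region on both sides of the path direction for the obstacle point farthest from the first intersection point:

```python
import math

def farthest_obstacle_point(a, obstacle_points):
    """Scan the obstacle points found in the intersection region (on
    either side of the path direction) and return the one farthest
    from the first intersection point `a`."""
    return max(obstacle_points,
               key=lambda p: math.hypot(p[0] - a[0], p[1] - a[1]))
```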
Further, the adjusting unit further includes:
a first expansion subunit, configured to determine reference points at a first interval distance on the connection line between the first intersection point and the obstacle point, and expand each reference point by its corresponding first expansion distance in a preset target direction according to the direction from the first intersection point to the second intersection point, to obtain the expansion points, where the first expansion distance decreases sequentially in the direction from the first intersection point to the obstacle point;
a second expansion subunit, configured to determine reference points on the preset distance path at a second interval distance, and sequentially expand each reference point by its corresponding second expansion distance in a preset first direction, in the area between the preset distance path and the obstacle point, to obtain the expansion points, where the second expansion distance decreases sequentially in the direction from the first intersection point to a boundary point of the preset distance path;
and a replacing unit, configured to connect the expansion points in sequence to form an improved path and replace the preset distance path containing the first intersection point in the navigation path with the improved path, to obtain a new navigation path.
Further, the expansion subunit is further configured to stop expanding subsequent expansion points if another obstacle exists in the area into which an expansion point is being expanded, or if another obstacle exists between the currently expanded expansion point and the previous expansion point.
The function implementation of each module in the navigation path generation device of the robot corresponds to each step in the navigation path generation method embodiment of the robot, and the function and implementation process are not described in detail herein.
The present invention also provides a computer storage medium having a navigation path generation program for a robot stored thereon, which when executed by a processor implements the steps of the navigation path generation method for a robot according to any one of the above embodiments.
The specific embodiment of the computer storage medium of the present invention is substantially the same as the embodiments of the navigation path generating method of the robot, and will not be described herein again.
The invention also provides a computer program product comprising a computer program which, when being executed by a processor, realizes the steps of the navigation path generation method of the robot according to any one of the above embodiments.
The specific embodiment of the computer program product of the present invention is substantially the same as the embodiments of the navigation path generating method of the robot, and will not be described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on this understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for causing a robot to perform the methods according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (12)

1. A method for generating a navigation path of a robot, the method comprising:
determining a first area with a target obstacle in a trapped area where the robot is located, wherein the target obstacle is detected by a target detection device configured on the basis of the robot;
detecting an intersection region between the first region and a second region through which the robot travels;
and when the intersection area is detected, generating a navigation path for the robot to escape from the trapped area according to the intersection area.
2. The method of generating a navigation path for a robot according to claim 1, wherein the step of generating a navigation path for the robot to escape from the trapped region based on the intersection region comprises:
detecting whether other obstacles exist in the intersection area, wherein the other obstacles are detected based on an environment perception sensor which is configured by the robot and is not the target detection device;
when the other obstacles do not exist in the intersection area, ignoring the target obstacles existing in the intersection area, and generating a navigation path for the robot to escape from the trapped area through the intersection area;
when the other obstacle exists in the intersection area, marking the other obstacle as a detour obstacle, and generating the navigation path, so that when the robot travels to the intersection area according to the navigation path, the robot detours along the edge of the other obstacle to escape from the trapped area.
3. The method of generating a navigation path for a robot according to claim 2, wherein after the step of generating a navigation path for the robot to escape from the trapped region directly through the intersection region by ignoring the target obstacle existing in the intersection region, the method further comprises:
according to the navigation path, ignoring the target obstacle existing in the intersection region, executing walking to try to escape from the trapped region;
stopping the ignoring of the target obstacle and the walking when the robot successfully escapes from the trapped area;
and outputting a preset trapped prompt when the robot does not successfully escape from the trapped area.
4. The navigation path generation method of a robot according to claim 1, further comprising:
when the intersection area is not detected, detouring along the edge of the target obstacle, and when a preset collision sensor does not detect the target obstacle, ignoring the target obstacle until the robot successfully escapes from the trapped area.
5. The method of generating a navigation path for a robot according to claim 1, wherein after the step of generating a navigation path for the robot to escape from the trapped region from the intersection region, the method further comprises:
and adjusting the navigation path to obtain a new navigation path, so that the robot can walk according to the new navigation path to escape from the trapped area.
6. The method for generating a navigation path of a robot according to claim 5, wherein the step of adjusting the navigation path to obtain a new navigation path comprises:
determining a first intersection point between the navigation path and a side of the intersection area close to the robot;
determining an obstacle point in the intersection region according to the first intersection point;
and adjusting a preset distance path including the first intersection point in the navigation path according to the first intersection point and the obstacle point to obtain a new navigation path.
7. The method of generating a navigation path for a robot according to claim 6, wherein the step of determining the obstacle point in the intersection region from the first intersection point includes:
determining a path direction of the navigation path;
and searching to the two sides of the path direction in the intersection area to obtain an obstacle point which is farthest away from the first intersection point in the intersection area.
8. The method for generating a navigation path of a robot according to claim 6 or 7, wherein the step of adjusting a preset distance path including the first intersection point in the navigation path according to the first intersection point and the obstacle point to obtain a new navigation path includes:
determining reference points on a connection line between the first intersection point and the obstacle point at a first interval distance, and expanding each reference point by a corresponding first expansion distance in a preset target direction according to the direction from the first intersection point to the second intersection point to obtain expansion points, wherein the first expansion distance decreases sequentially in the direction from the first intersection point to the obstacle point; or,
determining each reference point on the preset distance path according to a second distance, and sequentially expanding each reference point in a preset first direction by a corresponding second expansion distance in an area between the preset distance path and the obstacle point to obtain each expansion point, wherein the second expansion distance is sequentially decreased in a direction from the first intersection point to a boundary point of the preset distance path;
and sequentially connecting the expansion points to form an improved path, and replacing the preset distance path with the improved path to obtain a new navigation path.
9. The navigation path generation method of a robot according to claim 8, further comprising:
and if another obstacle exists in the area into which an expansion point is expanded, or if another obstacle exists between the currently expanded expansion point and the previous expansion point, stopping the expansion of subsequent expansion points.
10. A robot, characterized in that the robot comprises: a memory, a processor and a navigation path generation program of a robot stored on the memory and operable on the processor, the navigation path generation program of a robot implementing the steps of the navigation path generation method of a robot according to any one of claims 1 to 9 when executed by the processor.
11. A navigation path generation apparatus of a robot, the navigation path generation apparatus of the robot comprising:
a determining module, configured to determine, in a trapped area where the robot is located, a first area where target obstacles are distributed, where the target obstacles are detected based on a radar configured on the robot;
the detection module is used for detecting an intersection area between the first area and a second area where the robot walks;
and the path generation module is used for generating a navigation path for the robot to escape from the trapped region according to the intersection region when the intersection region is detected.
12. A computer storage medium, characterized in that the computer storage medium has stored thereon a navigation path generation program of a robot, which when executed by a processor, implements the steps of the navigation path generation method of a robot according to any one of claims 1 to 9.
CN202111478561.9A 2021-12-06 2021-12-06 Method for generating navigation path of robot, device, and storage medium Pending CN114326712A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111478561.9A CN114326712A (en) 2021-12-06 2021-12-06 Method for generating navigation path of robot, device, and storage medium


Publications (1)

Publication Number Publication Date
CN114326712A true CN114326712A (en) 2022-04-12

Family

ID=81049463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111478561.9A Pending CN114326712A (en) 2021-12-06 2021-12-06 Method for generating navigation path of robot, device, and storage medium

Country Status (1)

Country Link
CN (1) CN114326712A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101359229A (en) * 2008-08-18 2009-02-04 浙江大学 Barrier-avoiding method for mobile robot based on moving estimation of barrier
CN103823466A (en) * 2013-05-23 2014-05-28 电子科技大学 Path planning method for mobile robot in dynamic environment
CN107041718A (en) * 2016-02-05 2017-08-15 北京小米移动软件有限公司 Clean robot and its control method
JP2017204061A (en) * 2016-05-10 2017-11-16 ヤンマー株式会社 Autonomous travel route generation system
KR20170129571A (en) * 2016-05-17 2017-11-27 엘지전자 주식회사 Robot cleaner
CN109496288A (en) * 2017-07-13 2019-03-19 北京嘀嘀无限科技发展有限公司 System and method for determining track
CN110464262A (en) * 2019-07-30 2019-11-19 广东宝乐机器人股份有限公司 The method of getting rid of poverty of sweeping robot
CN111904346A (en) * 2020-07-09 2020-11-10 深圳拓邦股份有限公司 Method and device for getting rid of difficulties of sweeping robot, computer equipment and storage medium
CN113110497A (en) * 2021-05-08 2021-07-13 珠海市一微半导体有限公司 Navigation path-based edge obstacle-detouring path selection method, chip and robot
CN113303705A (en) * 2021-06-18 2021-08-27 云鲸智能(深圳)有限公司 Control method and device for cleaning robot, cleaning robot and storage medium
CN113359754A (en) * 2021-06-25 2021-09-07 深圳市海柔创新科技有限公司 Obstacle avoidance method, obstacle avoidance device, electronic device, and storage medium


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
YUNFEI MA, 2018 5TH INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 3 January 2019 (2019-01-03), pages 1 - 6 *
WANG YANTAO: "Research on Dirt Localization in the Field of View and Cleaning Planning for Mobile Robots", China Master's Theses Full-text Database - Information Science and Technology, 15 March 2018 (2018-03-15), pages 1 - 85 *
CHEN TIANDE: "Collision-free route planning based on collision prediction", 《***工程理论与实践》, vol. 40, no. 4, 25 April 2020 (2020-04-25), pages 1057 - 1068 *

Similar Documents

Publication Publication Date Title
KR101999033B1 (en) Method for building a map of probability of one of absence and presence of obstacles for an autonomous robot
WO2019128933A1 (en) Map construction and navigation method, and device and system
JP4675811B2 (en) Position detection device, autonomous mobile device, position detection method, and position detection program
US10948907B2 (en) Self-driving mobile robots using human-robot interactions
CN110575099B (en) Fixed-point cleaning method, floor sweeping robot and storage medium
CN112739244A (en) Mobile robot cleaning system
KR101883473B1 (en) Apparatus and method for constructing map of mobile robot
CN110968083B (en) Method for constructing grid map, method, device and medium for avoiding obstacles
WO2019190395A1 (en) Method and system for returning a displaced autonomous mobile robot to its navigational path
KR20140031316A (en) Tracking and following of moving objects by a mobile robot
KR100962593B1 (en) Method and apparatus for area based control of vacuum cleaner, and recording medium thereof
JP2008084135A (en) Movement control method, mobile robot and movement control program
JP2006350776A (en) Traveling object route generating device
KR20180038879A (en) Robot for airport and method thereof
CN110989630A (en) Self-moving robot control method, device, self-moving robot and storage medium
US20200376676A1 (en) Method of localization using multi sensor and robot implementing same
WO2014048597A1 (en) Autonomous mobile robot and method for operating the same
KR20180031153A (en) Airport robot, and method for operating server connected thereto
KR20180040907A (en) Airport robot
CN112015187A (en) Semantic map construction method and system for intelligent mobile robot
CN114326712A (en) Method for generating navigation path of robot, device, and storage medium
JP2009178782A (en) Mobile object, and apparatus and method for creating environmental map
JP5448429B2 (en) Moving body detection device and moving body detection method
EP4209849A1 (en) Telepresence robots having cognitive navigation capabilit
Luo et al. Autonomous mobile robot navigation and localization based on floor plan map information and sensory fusion approach

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination