CN112783147A - Trajectory planning method and device, robot and storage medium - Google Patents

Trajectory planning method and device, robot and storage medium

Info

Publication number
CN112783147A
Authority
CN
China
Prior art keywords
moving
self
robot
determining
vibration amplitude
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911097094.8A
Other languages
Chinese (zh)
Inventor
吕成器
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN201911097094.8A priority Critical patent/CN112783147A/en
Publication of CN112783147A publication Critical patent/CN112783147A/en
Pending legal-status Critical Current

Classifications

    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G06V10/267 Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V10/40 Extraction of image or video features
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manipulator (AREA)

Abstract

The embodiments of the present application provide a trajectory planning method and device, a robot, and a storage medium. In some embodiments of the present application, multiple frames of images are acquired while a self-moving robot is moving; the vibration amplitude of the self-moving robot during movement is determined according to the pixel coordinates of corresponding feature points in the frames; the path direction of the self-moving robot is determined according to the vibration amplitude; and the movement trajectory of the self-moving robot is planned according to the path direction. A movement trajectory with a small vibration amplitude can thus be planned for the self-moving robot. By detecting the vibration amplitude in each moving direction within the ground medium area, the movement trajectory on which the ground texture affects the self-moving robot least can be planned, effectively improving the robot's working effect on different ground textures.

Description

Trajectory planning method and device, robot and storage medium
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a trajectory planning method, a trajectory planning device, a robot and a storage medium.
Background
With the continuous development of artificial intelligence technology, various intelligent robots increasingly enter the lives of people, such as logistics robots, floor sweeping robots, welcoming robots and the like.
Taking a sweeping robot as an example, it needs to be able to clean a variety of surfaces, such as carpet, wood flooring, and ceramic tile. In practice the cleaning effect is not the same for every surface; for example, when a sweeping robot cleans a carpet, the pile on the carpet's surface degrades the cleaning effect.
Disclosure of Invention
Aspects of the application provide a trajectory planning method, a trajectory planning device, a robot, and a storage medium, so as to plan a movement trajectory along which the vibration amplitude of a self-moving robot remains small during movement.
The embodiment of the application provides a trajectory planning method, which is applied to a self-moving robot and comprises the following steps:
acquiring a plurality of frames of images acquired by the self-moving robot during moving;
determining the vibration amplitude of the self-moving robot in different moving directions according to the pixel coordinates of the corresponding feature points in the multi-frame image;
determining the path direction of the self-moving robot according to the vibration amplitude;
and planning the moving track of the self-moving robot according to the path direction.
An embodiment of the application provides a trajectory planning device applied to a self-moving robot, the device comprising:
the acquisition module is used for acquiring multi-frame images acquired by the self-moving robot during moving;
the amplitude determining module is used for determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame images;
the path determining module is used for determining the path direction of the self-moving robot according to the vibration amplitude;
and the track planning module is used for planning the moving track of the self-moving robot according to the path direction.
Embodiments of the present application provide a computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to perform actions comprising:
acquiring a plurality of frames of images acquired by a mobile robot during movement;
determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image;
determining the path direction of the self-moving robot according to the vibration amplitude;
and planning the moving track of the self-moving robot according to the path direction.
An embodiment of the application provides a self-moving robot, comprising: a machine body provided with one or more processors and one or more memories for storing computer programs;
the one or more processors execute the computer programs to:
acquiring a plurality of frames of images acquired by the self-moving robot during moving;
determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image;
determining the path direction of the self-moving robot according to the vibration amplitude;
and planning the moving track of the self-moving robot according to the path direction.
In some embodiments of the present application, a movement trajectory is planned before the self-moving robot begins its moving operation in a ground medium area. Specifically, after arriving in the ground medium area, the self-moving robot travels in a circle of preset radius and continuously collects multiple frames of images during this circular pass. The pixel coordinates of corresponding feature points in these frames are then computed to determine the vibration amplitude in different moving directions. By detecting the vibration amplitude in each moving direction within the ground medium area, the movement trajectory on which the ground texture has the least influence on the self-moving robot's vibration amplitude can be planned, which effectively improves the robot's working effect on different ground textures; for example, the cleaning effect of a sweeping robot can be improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
Fig. 1 is a schematic view of the cleaning track of a conventional sweeping robot provided in an embodiment of the present application;
Fig. 2 is a schematic flowchart of a trajectory planning method according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a method for detecting the texture direction according to an embodiment of the present application;
Fig. 4 is a schematic diagram of determining a path direction according to an embodiment of the present application;
Fig. 5 is a schematic diagram of movement-trajectory planning provided in an embodiment of the present application;
Figs. 6a and 6b are schematic diagrams of movement trajectories planned from different starting positions provided in an embodiment of the present application;
Fig. 7 is a diagram illustrating exemplary movement-trajectory planning based on texture directions according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of a trajectory planning apparatus according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a self-moving robot according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. "A plurality" typically means at least two, but does not exclude the case of at least one.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such article or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in an article or system that comprises that element.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
In this application, the self-moving robot can walk autonomously and perform corresponding service functions, and may also have computing, communication, and networking capabilities. The self-moving robot of the embodiments of the application may be an unmanned aerial vehicle, an unmanned ground vehicle, or the like. Its basic service functions differ according to the application scenario; the self-moving robot may be a sweeping robot, a following robot, a greeting robot, and so on. For example, for a sweeping robot used in homes, office buildings, shopping malls, and similar scenes, the basic service function is to sweep the floor in the scene; for a glass-cleaning robot used in the same kinds of scenes, it is to clean the glass; for a following robot, it is to follow a target object; and for a greeting robot, it is to welcome customers and guide them to their destination.
To make the applicable scenario of the trajectory planning method provided in this application easier to understand, a sweeping robot is taken as an example, and it is assumed that the sweeping robot is currently cleaning a carpet area. Fig. 1 is a schematic view of the cleaning track of a conventional sweeping robot provided in an embodiment of the present application. As Fig. 1 shows, the carpet's weaving direction runs from left to right, as indicated by the solid arrowed lines in the figure. In the prior art, a sweeping robot may sweep back and forth across the carpet along a left-right reciprocating track. The pile fibers of a carpet are woven in a fixed sequence; that is, the weaving direction is constant across the same carpet. When the sweeping robot moves in the same direction as the weave, it runs smoothly; when it moves against the weave, the carpet pile catches against it, so the robot jolts with a large vibration amplitude as it advances, and the jolting seriously degrades its sweeping effect.
The technical solution of this application therefore provides a cleaning method for areas to be cleaned, such as carpets, that have a pronounced texture direction capable of causing the robot to vibrate strongly. Before sweeping a carpet area, the sweeping robot first determines the carpet's weaving direction and then, from that direction, determines a movement trajectory that satisfies its cleaning requirements. When the sweeping robot cleans along this trajectory, the adverse effect of the carpet's weaving direction on the cleaning work is effectively avoided.
Fig. 2 is a schematic flow chart of a trajectory planning method according to an embodiment of the present application. The method comprises the following steps:
201: multiple frames of images acquired by the mobile robot while moving are acquired.
202: and determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image.
203: and determining the path direction of the self-moving robot according to the vibration amplitude.
204: and planning the moving track of the self-moving robot according to the path direction.
When the self-moving robot needs to move over ground with an obvious texture (such as wood flooring) or ground laid with regularly patterned tiles, it needs to plan, according to the actual ground conditions, a movement trajectory along which it can move more smoothly. The trajectory may be planned with the user's assistance, or entirely autonomously by the self-moving robot.
In an application scenario in which the trajectory is planned entirely autonomously, the self-moving robot first detects the ground medium area. If it detects that it is located in a preset ground medium area (for example, a carpet area), it determines the texture direction of that area, then determines the path direction according to the texture direction, and further plans the movement trajectory based on the path direction. If the detection shows that the self-moving robot is not located in a preset ground medium area, neither the texture direction nor the path direction needs to be determined.
Fig. 3 is a schematic diagram of a method for detecting the texture direction according to an embodiment of the present application. For ease of understanding, how the texture direction of a ground medium area is determined is described below with reference to Fig. 3, taking as an example a carpet whose texture direction is its weaving direction; that is, the sweeping robot determines the weaving direction of the carpet.
The sweeping robot is provided with at least one monocular camera, usually mounted at the front end of the robot as shown in Fig. 3, so that the camera has a good field of view and can collect clearer, more accurate multi-frame images. To acquire images while moving, the sweeping robot moves in different directions along a non-linear path, such as a circle or an ellipse. For example, the robot travels in a circle of preset radius (note that this means driving along a circular path, not spinning in place), and during this circular pass the monocular camera continuously collects multiple frames of images in the different moving directions. The circular pass is used so that every direction of the ground medium area is probed and no individual direction is missed; a sketch of this collection step follows.
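By way of illustration, the collection step can be sketched as follows; the robot interface (`robot.start_circular_motion()`, `robot.camera_frame()`, `robot.heading_deg()`) is a hypothetical placeholder, not an API given in this application, and the sample count is an assumption.

```python
# Hypothetical sketch of collecting (frame, heading) pairs during one
# circular pass of preset radius. All robot.* calls are placeholders.
def collect_rotation_samples(robot, n_frames=360):
    """Return a list of (camera_frame, heading_deg) pairs covering one
    circular pass through the ground medium area."""
    samples = []
    robot.start_circular_motion()   # drive a circle; not an in-place spin
    for _ in range(n_frames):
        samples.append((robot.camera_frame(), robot.heading_deg()))
    robot.stop_motion()
    return samples
```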
The sweeping robot analyzes the collected images to determine its vibration amplitude in the different moving directions. The vibration amplitude during movement can be represented in several ways; in the technical solution of this application, it is represented by the difference between the pixel ordinates of feature points across the collected frames.
It is easy to understand that when the sweeping robot's vibration amplitude is large, the pixel ordinates of the same feature points change markedly between the collected frames; when the vibration amplitude is small (for example, there is no vibration), those pixel ordinates change little.
The sweeping robot stores a map of the current environment with a corresponding coordinate system, so during operation it can accurately determine its position and orientation in the environment through its lidar, accurately locate marked obstacles in the map, and use this information to plan movement trajectories and avoid obstacles automatically. During image collection, the acquisition frequency is usually high and several frames are collected consecutively in each moving direction, so the vibration amplitude corresponding to each moving direction is determined by comparing the changes of feature-point coordinates across those frames.
When two adjacent frames are collected by the monocular camera, the images first need simple preprocessing: because the robot's shooting angle changes continuously while it vibrates, the relative positions of feature points near the image boundary show obvious distortion. To eliminate their influence on the coordinate comparison, the collected images can be cropped so that only the middle portion of each image (the part with a consistent viewing angle) is kept as the first image and the second image. Once the first image and the second image are obtained, the vibration amplitude of the sweeping robot can be computed using an optical-flow algorithm, as sketched below.
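As one possible reading of this step, the following sketch crops two adjacent frames to their central region and tracks features between them with OpenCV's pyramidal Lucas-Kanade optical flow; the application does not name a specific optical-flow algorithm, and the crop ratio and feature-detector parameters are assumptions.

```python
import cv2
import numpy as np

def matched_feature_coords(prev_frame, next_frame, crop_ratio=0.6):
    """Crop both frames to their central region (the part with a
    consistent viewing angle), then track features from the first image
    into the second. Returns two (n, 2) arrays of pixel coordinates for
    the features found in both images."""
    h, w = prev_frame.shape[:2]
    dy, dx = int(h * (1 - crop_ratio) / 2), int(w * (1 - crop_ratio) / 2)
    first = cv2.cvtColor(prev_frame[dy:h - dy, dx:w - dx], cv2.COLOR_BGR2GRAY)
    second = cv2.cvtColor(next_frame[dy:h - dy, dx:w - dx], cv2.COLOR_BGR2GRAY)

    pts = cv2.goodFeaturesToTrack(first, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return np.empty((0, 2)), np.empty((0, 2))
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(first, second, pts, None)
    ok = status.ravel() == 1            # keep only successfully tracked points
    return pts.reshape(-1, 2)[ok], nxt.reshape(-1, 2)[ok]
```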
Assume the pixel coordinates of the valid feature points in the first image are Pk = {(x1, y1), (x2, y2), (x3, y3), ..., (xn, yn)}, and those in the second image are Pk-1 = {(x'1, y'1), (x'2, y'2), (x'3, y'3), ..., (x'n, y'n)}.
After the first image and the second image are obtained, and before the vibration amplitude is calculated from the feature-point pixel coordinates, the influence of interfering feature points must be eliminated. In practical applications, the sweeping robot keeps moving forward throughout image collection, whether at a constant or a varying speed. This forward motion causes an obvious difference between the pixel coordinates of some feature points in two adjacent images, and when the forward speed is too high, between the pixel coordinates of all feature points in the first and second images. Therefore, to eliminate the influence of the sweeping robot's own motion on the feature-point coordinate comparison, the following filtering measure can be adopted:
the pixel coordinate values of the characteristic points in the first image and the second image are firstly obtained, then the pixel abscissa of the characteristic point at the specified position in the first image is compared with the pixel abscissa of the characteristic point corresponding to the second image, and the pixel abscissa difference value is obtained. If the difference value of the horizontal coordinates of the pixels is larger than the threshold value, the pixel coordinates of the characteristic point are considered to be obviously influenced by the self motion of the robot, and the characteristic point can not be used for calculating the vibration amplitude. The sweeping robot can compare and screen all the feature points in the first image and the second image once through the pixel abscissa comparison method to obtain qualified feature points. And determining the vibration amplitude according to the qualified characteristic points.
After all feature points in the first and second images have been compared and screened once to obtain the qualified feature points, the vibration amplitude can be calculated from the valid feature points in the two images. Specifically: take the n valid feature points in the first image, with pixel ordinate values y1 to yn, and the n valid feature points in the second image, with pixel ordinate values y'1 to y'n. Compute the absolute difference between y and y' for each of the n points one by one, then take the average of the n absolute differences as the pixel ordinate difference between the first and second images. This pixel ordinate difference represents the relative vibration amplitude between the two adjacent frames, as the sketch below illustrates.
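A minimal sketch of this filtering and averaging step, taking the two matched coordinate arrays from the previous sketch; the abscissa threshold is an assumed tuning value, not one given in this application.

```python
import numpy as np

def vibration_amplitude(pts_first, pts_second, x_threshold=15.0):
    """pts_first, pts_second: (n, 2) arrays of matched pixel coordinates
    from two adjacent frames. Feature points whose horizontal displacement
    exceeds x_threshold are attributed to the robot's forward motion and
    filtered out; the mean absolute vertical displacement of the remaining
    points is returned as the amplitude for this frame pair."""
    dx = np.abs(pts_second[:, 0] - pts_first[:, 0])
    valid = dx <= x_threshold            # keep only qualified feature points
    if not np.any(valid):
        return None                      # no usable features in this pair
    dy = np.abs(pts_second[valid, 1] - pts_first[valid, 1])
    return float(dy.mean())
```

Applied to each adjacent frame pair, this yields the per-pair amplitudes used in the direction search below.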
The maximum amplitude and the minimum amplitude are then determined from the pixel ordinate differences; in general, the larger the pixel ordinate difference, the larger the corresponding vibration amplitude, and the smaller the difference, the smaller the amplitude. The moving direction corresponding to the maximum vibration amplitude is opposite to the carpet's weaving direction, while the moving direction corresponding to the minimum amplitude is the same as the weaving direction. Therefore, in practical applications, the texture direction can be found by searching for either the direction of maximum or the direction of minimum vibration amplitude. For example, when a sweeping robot wants to find the texture direction of a carpet, experiments show that the vibration amplitude is largest when the robot moves against the texture direction, whereas moving in or near the texture direction produces amplitudes that are all small and hard to tell apart; the maximum amplitude is therefore relatively easier to find than the minimum.
As described above, images and the corresponding directions are collected in real time while the self-moving robot travels its circle; however, the image collection frequency of the robot's monocular camera is higher than the direction sampling frequency of its lidar. To find the directions corresponding to the maximum and minimum amplitudes accurately, a list may be established in which, for example, every ten vibration amplitudes share one moving direction: amplitudes V1 to V10 correspond to moving direction D1, amplitudes V11 to V20 to moving direction D2, and so on, giving the correspondence between all vibration amplitudes and moving directions.
After this list is obtained, a sliding-window algorithm may be used to locate the maximum or minimum vibration amplitude more accurately, from which the texture direction (i.e. the weaving direction of the carpet) can be determined. For example, assume each sliding window contains ten amplitude samples; the mean amplitude V is computed over the absolute values of those ten samples, then over the next window, with adjacent windows separated by at least the ten sampling periods of their samples. Each window mean V corresponds to one moving direction D, and comparing the window means over one full circle determines the maximum and minimum amplitudes, as the sketch below shows.
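The sliding-window search can be sketched as below, using the ten-amplitudes-per-heading ratio from the example above; in practice the window size and the camera-to-lidar rate ratio are device-dependent assumptions.

```python
import numpy as np

def extreme_headings(amplitudes, headings, window=10):
    """amplitudes: per-frame-pair vibration amplitudes in collection order;
    headings: one lidar heading (degrees) per window of amplitudes.
    Returns (heading_of_max_mean, heading_of_min_mean) over one circle."""
    means = np.array([np.mean(np.abs(amplitudes[i:i + window]))
                      for i in range(0, len(headings) * window, window)])
    return headings[int(means.argmax())], headings[int(means.argmin())]
```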
To determine the maximum or minimum vibration amplitude accurately, the amplitude is first calculated over the valid feature points of each pair of adjacent frames, and the average over all valid feature points is taken as the vibration amplitude between those two frames. Then, from the amplitudes of the successive frame pairs, the average amplitude over the frames collected in a given moving direction is taken as the vibration amplitude for that direction. Comparing the amplitudes across the moving directions confirms the maximum or minimum amplitude over one full circle.
In practical applications, if several equally large vibration amplitudes are obtained, the middle one of their corresponding directions is selected as the direction of the maximum amplitude. For example, if amplitudes V1, V2, and V3 are equal and correspond to moving directions D1, D2, and D3 respectively, with D3 < D2 < D1, then D2, the middle direction, is selected as the moving direction corresponding to the maximum vibration amplitude. This moving direction D2 is the result determined from one full circle of the robot.
In practical applications, within a single circle, the pose-positioning accuracy of the robot or the computation accuracy of the maximum and minimum vibration amplitudes may make the correspondence between the extreme amplitudes and the moving directions inaccurate. The robot may therefore travel several circles, determining one candidate texture direction per circle (as described above, the moving direction corresponding to the maximum amplitude is opposite to the texture direction, and the direction corresponding to the minimum amplitude is the same as it), and then take the average of the candidate texture directions as the actual texture direction of the floor medium (for example, the carpet's weaving direction). For example, if in the rectangular coordinate system of the current room the candidate direction from the first circle is rotated 30 degrees clockwise from the Y axis, that from the second circle 40 degrees, and that from the third circle 22 degrees, the final texture direction is about 31 degrees clockwise from the Y axis. A sketch of this averaging step follows.
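Averaging the per-circle candidates can be sketched as follows. A plain arithmetic mean breaks down near the 0/360-degree wrap-around, so this sketch uses the circular mean; that choice is an implementation assumption rather than a method stated in this application, and for the example values above it reproduces the roughly 31-degree result.

```python
import numpy as np

def mean_texture_direction(candidates_deg):
    """Circular mean of candidate texture directions, in degrees."""
    rad = np.deg2rad(candidates_deg)
    mean = np.arctan2(np.sin(rad).mean(), np.cos(rad).mean())
    return float(np.rad2deg(mean) % 360.0)

print(mean_texture_direction([30.0, 40.0, 22.0]))  # ~30.7 degrees, i.e. about 31
```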
In practice, after the robot has completed the detection of the texture direction, it further determines the path direction corresponding to the ground medium area. It is easy to see that when the robot works it does not move in a single direction; it typically makes many round trips, where the outbound and return directions are exactly opposite. In Fig. 1, for example, the robot goes from left to right and returns from right to left; since the texture direction is left to right, the robot does not vibrate on the outbound leg but vibrates strongly on the return leg. A direction the same as or opposite to the texture direction therefore cannot be selected as the path direction. To keep the robot minimally affected by the texture direction over many round trips, the direction perpendicular to the texture direction may be selected as the path direction; this ensures the robot does not vibrate strongly in either direction of its reciprocating movement.
For ease of understanding, the determination of the path direction is explained below with reference to Fig. 4, which is a schematic diagram of determining a path direction according to an embodiment of the present application. Assume the carpet's weaving direction is from left to right, so the sweeping robot produces a large vibration amplitude when moving from right to left. To keep the reciprocating movement as unaffected by the weaving direction as possible, a direction perpendicular to it may be selected, for example the upward or downward direction in Fig. 4 (up, down, left, and right here refer to directions in the plan view); either of these path directions satisfies the robot's path-planning requirement. If the upward direction is taken as the path direction, i.e. the robot's outbound direction, then the downward direction is the corresponding return direction. As the figure shows, the robot produces no excessive vibration amplitude due to the texture direction while reciprocating along the illustrated path direction. Naturally, since the robot's starting position relative to the carpet varies, the path direction is not unique, and neither is the movement trajectory determined from it.
After the path direction is determined, the robot's movement trajectory is planned based on it. Fig. 5 is a schematic diagram of movement-trajectory planning provided in an embodiment of the present application. As shown in Fig. 5, with an initial upward path direction, reciprocating the robot yields an end-to-end "bow"-shaped (boustrophedon) movement trajectory. Note that even once the path direction is fixed, the influence of the texture direction should still be eliminated as far as possible when planning the trajectory; that is, the trajectory should extend in the texture direction, for example along legs (i), (ii), (iii), and (iv) in the order numbered in the figure, so that the short connecting moves between successive legs run the same as, or close to, the texture direction. The sweeping robot then cleans the entire carpet area by following this movement trajectory; a sketch of generating such a trajectory follows.
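A minimal sketch of generating such a "bow"-shaped trajectory: sweep legs run along the path direction (perpendicular to the texture direction) and the short connecting moves step along the texture direction, as described above. The region size, leg length, and spacing are illustrative assumptions.

```python
import numpy as np

def bow_trajectory(origin, texture_deg, leg_len, n_legs, spacing):
    """Return an (m, 2) array of way-points. Legs alternate direction
    along the path direction (texture_deg + 90 degrees); successive
    legs are offset by `spacing` along the texture direction, so the
    connecting moves run with the pile rather than against it."""
    t = np.deg2rad(texture_deg)
    texture_dir = np.array([np.cos(t), np.sin(t)])    # along the weave
    path_dir = np.array([-np.sin(t), np.cos(t)])      # perpendicular to it
    pos = np.asarray(origin, dtype=float)
    pts = [pos.copy()]
    for i in range(n_legs):
        pos = pos + (1 if i % 2 == 0 else -1) * leg_len * path_dir
        pts.append(pos.copy())
        if i < n_legs - 1:
            pos = pos + spacing * texture_dir         # step with the weave
            pts.append(pos.copy())
    return np.vstack(pts)

waypoints = bow_trajectory((0.0, 0.0), texture_deg=0.0,
                           leg_len=2.0, n_legs=4, spacing=0.3)
```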
In practical applications, after the sweeping robot finishes cleaning a carpet area, it marks in its stored map the position of the carpet and the path direction or texture direction suited to that carpet, and subsequently plans the movement trajectory in combination with its starting position within the carpet. For example, Figs. 6a and 6b are schematic diagrams of movement trajectories planned from different starting positions provided in an embodiment of the present application: if the robot starts at position A, the trajectory in Fig. 6a is obtained; if it starts at position B, the trajectory in Fig. 6b is obtained; if it starts at position C or D, a relatively large vibration amplitude would be produced, because the moving direction at the turn connections of the trajectory (the semicircular arc connections in Fig. 6b) would oppose the texture direction, so the robot must first move to position A or B and re-plan the trajectory. With this scheme, the robot does not need to rotate another circle to search for the carpet's weaving direction again, which effectively improves the sweeping robot's cleaning efficiency while preserving the cleaning effect on the carpet area.
It should be noted that before the sweeping robot starts working in a new environment, it first determines whether the current room is fully carpeted. If not, it preferentially cleans the non-carpet areas, such as the bare-floor area, and cleans the carpet area after the non-carpet areas are done. The sweeping robot judges whether it has moved into the carpet area according to the carpet area's marked position in its map.
If the sweeping robot has moved into the carpet area, it travels one circle on the carpet according to the method shown in Fig. 3, from which the texture direction of the carpet (i.e. the weaving direction of its pile) can be determined, and the path direction for its sweeping work is then determined based on that texture direction.
If the room is fully carpeted, the sweeping robot likewise first travels one circle on the carpet before sweeping begins, using the method shown in Fig. 3, so that the texture direction of the carpet (i.e. the weaving direction of its pile) can be determined, and the path direction for its sweeping work is then determined based on that texture direction.
As an alternative embodiment, Fig. 7 is a schematic diagram of exemplary trajectory planning based on a texture direction according to an embodiment of the present application. As shown, the texture direction of the carpet area (i.e. the direction of the carpet pile weave) runs along the diagonal of the rectangle. After moving into the carpet area, the sweeping robot makes a full 360-degree circular pass on the carpet along the rotation track shown in the figure, from which it determines that the carpet's texture direction points along the rectangle's diagonal as illustrated. Based on that texture direction, a path direction perpendicular to it is further determined, and the "bow"-shaped movement trajectory shown in the figure can be planned.
Fig. 8 is a schematic structural diagram of a trajectory planning apparatus provided in an embodiment of the present application, which may be applied to a robot. The apparatus includes an acquisition module 81, an amplitude determination module 82, a path determination module 83, and a trajectory planning module 84.
The acquisition module 81 is used for acquiring multiple frames of images collected while the self-moving robot is moving.
The amplitude determination module 82 is used for determining the vibration amplitude of the self-moving robot during movement according to the pixel coordinates of corresponding feature points in the frames.
The path determination module 83 is used for determining the path direction of the self-moving robot according to the vibration amplitude.
The trajectory planning module 84 is used for planning the movement trajectory of the self-moving robot according to the path direction.
Optionally, the acquisition module 81 is configured to acquire the multiple frames of images collected while the self-moving robot is moving if it is determined that the self-moving robot is located in a preset ground medium area.
Optionally, the preset ground medium area comprises a carpet area.
Optionally, the acquisition module 81 is configured to control the self-moving robot to move in different directions, and to acquire the multiple frames of images collected in the different moving directions while the self-moving robot moves.
Optionally, the amplitude determination module 82 is configured to determine the vibration amplitudes of the self-moving robot in different moving directions according to the average of the pixel ordinate differences of corresponding feature points in the frames.
Optionally, the amplitude determination module 82 is configured to determine the pixel abscissa differences of corresponding feature points in the frames, filter out the corresponding feature points whose pixel abscissa difference exceeds a threshold, and determine the vibration amplitudes of the self-moving robot in different moving directions according to the average of the pixel ordinate differences of the remaining corresponding feature points.
Optionally, the path determination module 83 is configured to determine a maximum or minimum vibration amplitude, determine the texture direction of the ground medium according to the maximum or minimum vibration amplitude, and determine the path direction of the self-moving robot according to the texture direction.
Optionally, the path determination module 83 is configured to determine the moving direction corresponding to the minimum vibration amplitude as a candidate texture direction, or the direction opposite to the moving direction corresponding to the maximum vibration amplitude as a candidate texture direction, and to determine the texture direction of the ground medium according to the average direction of the plurality of candidate texture directions.
Optionally, the path determination module 83 is configured to determine the direction perpendicular to the texture direction as the path direction of the self-moving robot.
Fig. 9 is a schematic structural diagram of a self-moving robot according to an exemplary embodiment of the present application. The self-moving robot comprises a machine body, one or more processors 901, and one or more memories 902 storing computer programs. In addition, the self-moving robot may include necessary components such as a power supply component 903.
The one or more processors 901 execute the computer programs to:
acquiring a plurality of frames of images acquired by a mobile robot during movement; determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image; determining the path direction of the self-moving robot according to the vibration amplitude; and planning the moving track of the self-moving robot according to the path direction.
Optionally, the one or more processors 901 are configured to acquire a plurality of frames of images acquired by the self-moving robot while moving if it is determined that the self-moving robot is located in a preset ground medium area.
Optionally, the predetermined area of ground media comprises a carpeted area.
Optionally, one or more processors 901 for controlling the self-moving robot to move in different directions; and acquiring multi-frame images collected in different moving directions in the moving process of the self-moving robot.
Optionally, the one or more processors 901 are configured to determine vibration amplitudes of the self-moving robot in different moving directions according to an average value of pixel ordinate difference values of corresponding feature points in the multiple frames of images.
Optionally, the one or more processors 901 are configured to determine a difference value of pixel abscissas of corresponding feature points in the multiple frames of images; filtering out corresponding characteristic points with the difference value of the horizontal coordinates of the pixels larger than a threshold value; and determining the vibration amplitude of the self-moving robot in different moving directions according to the average value of the pixel longitudinal coordinate difference values of the corresponding feature points left in the multi-frame images.
Optionally, one or more processors 901 for determining a maximum or minimum vibration amplitude; determining the texture direction of the ground medium according to the maximum or minimum vibration amplitude; and determining the path direction of the self-moving robot according to the texture direction.
Optionally, the one or more processors 901 are configured to determine that the moving direction corresponding to the minimum vibration amplitude is a candidate texture direction, or that the direction opposite to the moving direction corresponding to the maximum vibration amplitude is a candidate texture direction; and determine the texture direction of the ground medium according to the average direction of the plurality of candidate texture directions.
Optionally, the one or more processors 901 are configured to determine that the direction perpendicular to the texture direction is the path direction of the self-moving robot.
Corresponding to the above embodiments of the self-moving robot, a computer-readable storage medium may also be provided, which is not described again here.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (12)

1. A trajectory planning method, applied to a self-moving robot, comprising:
acquiring a plurality of frames of images acquired by the self-moving robot during moving;
determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image;
determining the path direction of the self-moving robot according to the vibration amplitude;
and planning the moving track of the self-moving robot according to the path direction.
2. The method of claim 1, wherein acquiring the plurality of frames of images acquired while the self-moving robot is moving comprises:
and if the self-moving robot is determined to be located in a preset ground medium area, acquiring multi-frame images acquired when the self-moving robot moves.
3. The method of claim 2, wherein the predetermined area of ground media comprises a carpeted area.
4. The method of claim 1 or 2, wherein acquiring the plurality of frames of images acquired while the self-moving robot is moving comprises:
controlling the self-moving robot to move along different directions;
and acquiring multi-frame images collected in different moving directions in the moving process of the self-moving robot.
5. The method according to claim 4, wherein the determining the vibration amplitude of the self-moving robot in moving according to the pixel coordinates of the corresponding feature points in the multi-frame images comprises:
and determining the vibration amplitude of the self-moving robot in different moving directions according to the average value of the pixel longitudinal coordinate difference values of the corresponding feature points in the multi-frame images.
6. The method according to claim 5, wherein the determining the vibration amplitude of the self-moving robot in different moving directions according to the average value of the difference values of the pixel ordinates of the corresponding feature points in the multi-frame images comprises:
determining the difference value of the pixel abscissa of the corresponding feature point in the multi-frame image;
filtering out corresponding characteristic points with the difference value of the horizontal coordinates of the pixels larger than a threshold value;
and determining the vibration amplitude of the self-moving robot in different moving directions according to the average value of the pixel longitudinal coordinate difference values of the corresponding feature points remaining in the multi-frame images.
7. The method of claim 1, wherein determining the path direction of the self-moving robot from the vibration amplitude comprises:
determining a maximum or minimum vibration amplitude;
determining the texture direction of the ground medium according to the maximum or minimum vibration amplitude;
and determining the path direction of the self-moving robot according to the texture direction.
8. The method of claim 7, wherein determining the grain direction of the ground medium from the maximum or minimum vibration amplitude comprises:
determining that the moving direction corresponding to the minimum vibration amplitude is a candidate texture direction, or determining that the direction opposite to the moving direction corresponding to the maximum vibration amplitude is a candidate texture direction;
and determining the texture direction of the ground medium according to the average direction of the plurality of candidate texture directions.
9. The method of claim 7, wherein determining the path direction of the self-moving robot from the texture direction comprises:
and determining the direction perpendicular to the texture direction as the path direction of the self-moving robot.
10. A trajectory planning device, applied to a self-moving robot, comprising:
the acquisition module is used for acquiring multi-frame images acquired by the self-moving robot during moving;
the amplitude determining module is used for determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame images;
the path determining module is used for determining the path direction of the self-moving robot according to the vibration amplitude;
and the track planning module is used for planning the moving track of the self-moving robot according to the path direction.
11. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by one or more processors, causes the one or more processors to perform acts comprising:
acquiring a plurality of frames of images acquired by a mobile robot during movement;
determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image;
determining the path direction of the self-moving robot according to the vibration amplitude;
and planning the moving track of the self-moving robot according to the path direction.
12. A self-moving robot, comprising: the machine body is provided with one or more processors and one or more memories for storing computer programs;
the one or more processors execute the computer program to:
acquiring a plurality of frames of images acquired by the self-moving robot during moving;
determining the vibration amplitude of the self-moving robot during moving according to the pixel coordinates of the corresponding feature points in the multi-frame image;
determining the path direction of the self-moving robot according to the vibration amplitude;
and planning the moving track of the self-moving robot according to the path direction.
CN201911097094.8A 2019-11-11 2019-11-11 Trajectory planning method and device, robot and storage medium Pending CN112783147A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911097094.8A CN112783147A (en) 2019-11-11 2019-11-11 Trajectory planning method and device, robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911097094.8A CN112783147A (en) 2019-11-11 2019-11-11 Trajectory planning method and device, robot and storage medium

Publications (1)

Publication Number Publication Date
CN112783147A (en) 2021-05-11

Family

ID=75749867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911097094.8A Pending CN112783147A (en) 2019-11-11 2019-11-11 Trajectory planning method and device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN112783147A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203053428U (en) * 2012-12-06 2013-07-10 紫光股份有限公司 Detection device of azimuth of optical imaging type wheeled mobile robot
CN106325270A (en) * 2016-08-12 2017-01-11 天津大学 Intelligent vehicle navigation system and method based on perception and autonomous calculation positioning navigation
JP2018165889A (en) * 2017-03-28 2018-10-25 カシオ計算機株式会社 Autonomous mobile device, method and program
CN107314773A (en) * 2017-08-18 2017-11-03 广东宝乐机器人股份有限公司 The map creating method of mobile robot and the paths planning method based on the map
WO2019076044A1 (en) * 2017-10-20 2019-04-25 纳恩博(北京)科技有限公司 Mobile robot local motion planning method and apparatus and computer storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113657164A (en) * 2021-07-15 2021-11-16 美智纵横科技有限责任公司 Method and device for calibrating target object, cleaning equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination