CN113409388A - Sweeper pose determination method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN113409388A
CN113409388A
Authority
CN
China
Prior art keywords
pose
sweeper
feature point
matching
current
Prior art date
Legal status
Pending
Application number
CN202110540974.9A
Other languages
Chinese (zh)
Inventor
黄纯
贾盛泽
韩淑婷
Current Assignee
Shenzhen Water World Co Ltd
Original Assignee
Shenzhen Lechun Power Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Lechun Power Robot Co Ltd
Priority to CN202110540974.9A
Publication of CN113409388A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle


Abstract

The invention relates to the field of sweeping robots, and provides a sweeper pose determination method and device, computer equipment, and a storage medium.

Description

Sweeper pose determination method and device, computer equipment and storage medium
Technical Field
The invention relates to the field of sweeping robots, and in particular to a sweeper pose determination method and device, computer equipment, and a storage medium.
Background
Existing sweeping robots based on visual sensors rely mainly on the visual information they acquire to complete positioning. If the robot is lifted off the ground during cleaning and put down in an already-swept area, the sweeper performs feature matching on the acquired image information, determines its pose from the matching result, and then relocalizes after having been moved ("kidnapped"). However, the practical application scene of a sweeper is the home environment, where the camera usually faces upward, so the acquired images contain few feature points and uniform texture. Under these conditions, overly similar feature textures cause a large number of incorrect feature matches, which lowers the accuracy of the determined pose and in turn makes the relocalization result wrong or inaccurate.
Disclosure of Invention
The invention mainly aims to provide a sweeper pose determination method and device, computer equipment, and a storage medium, so as to solve the technical problem of inaccurate pose determination after the sweeper has been moved away from the ground.
To achieve the above object, the invention provides a sweeper pose determination method, which is applied to a sweeper and comprises the following steps:
detecting whether the sweeper is moved away from the ground;
if the sweeper is moved away, acquiring the initial pose at the moment it is moved away, and calculating the relative pose with respect to the initial pose during the removal;
detecting whether the sweeper is placed back on the ground or not;
if the sweeper is put back, calculating a reference pose from the initial pose and the relative pose, and acquiring a current image at the moment the sweeper is put back on the ground;
acquiring a plurality of historical images, and matching the current image with the historical images to obtain a target image with the highest matching degree with the current image;
extracting current feature points in the current image, and determining the reference position of each current feature point on the target image according to the reference pose;
determining a first matching area corresponding to each current feature point on the target image according to the reference position;
matching each current feature point with a corresponding matching feature point in the first matching area, wherein the matching feature point which is successfully matched serves as a target feature point of each current feature point;
and determining the target pose of the sweeper when the sweeper is put back to the ground according to each current feature point and the target feature point corresponding to the current feature point.
Further, after the step of determining the target pose of the sweeper when the sweeper is put back on the ground according to each current feature point and the target feature point corresponding to the current feature point, the method includes the following steps:
and determining whether to enter a relocation program according to the target pose.
Further, the step of determining a first matching region corresponding to each current feature point on the target image according to the reference position includes:
and on the target image, taking the reference position as a circle center and a circular area determined by taking the first length as a radius as the corresponding first matching area of the current feature point.
Further, before the step of determining the target pose of the sweeper when the sweeper is put back on the ground according to each current feature point and the target feature point corresponding to the current feature point, the method comprises the following steps:
calculating the proportion of successfully matched current feature points among all the current feature points;
comparing the proportional value with a preset proportional value;
if the proportion value is smaller than the preset proportion value, determining a second matching area corresponding to each current feature point which is not successfully matched according to the reference position; wherein the second matching region is larger than the first matching region;
and in the second matching area, determining the target feature points of the current feature points which are not successfully matched.
Further, the step of obtaining a plurality of historical images, matching the current image with the historical images, and obtaining a target image with the highest matching degree with the current image includes:
acquiring a plurality of historical images;
calculating the similarity between the current image and the historical image through a DBoW algorithm;
and taking the historical image with the highest similarity as a target image of the current image.
Further, the step of determining the target pose of the sweeper when the sweeper is put back on the ground according to each current feature point and the target feature point corresponding to the current feature point comprises the following steps:
and calculating the target pose by the current feature point and the target feature point through a PnP algorithm.
Further, the step of calculating the relative pose in the moving-off process includes:
and calculating, through IMU integration, the translation and rotation of the sweeper relative to the initial pose during the removal, so as to obtain the relative pose.
The invention also provides a position and posture determining device of the sweeper, which comprises:
the first detection unit is used for detecting whether the sweeper is moved away from the ground or not;
the first calculation unit is used for acquiring the initial pose at the moment the sweeper is moved away if it is moved away, and calculating the relative pose relative to the initial pose during the removal;
the second detection unit is used for detecting whether the sweeper is placed back on the ground or not;
the second calculation unit is used for calculating a reference pose from the initial pose and the relative pose if the sweeper is put back, and acquiring a current image at the moment the sweeper is put back on the ground;
the first matching unit is used for acquiring a plurality of historical images, matching the current image with the historical images and obtaining a target image with the highest matching degree with the current image;
the extracting unit is used for extracting current feature points in the current image and determining the reference position of each current feature point on the target image according to the reference pose;
a first determining unit, configured to determine, according to the reference position, a first matching region corresponding to each current feature point on the target image;
a second matching unit, configured to match each current feature point with a corresponding matching feature point in the first matching region, where the matching feature point that is successfully matched serves as a target feature point of each current feature point;
and the second determining unit is used for determining the target pose of the sweeper when the sweeper is put back on the ground according to each current characteristic point and the corresponding target characteristic point.
The invention also provides computer equipment which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the sweeper pose determination method when executing the computer program.
The invention also provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the sweeper pose determination method of any one of the above.
The invention provides a sweeper pose determination method and device, computer equipment, and a storage medium, wherein the method comprises: detecting whether the sweeper is moved away from the ground; if the sweeper is moved away, acquiring the initial pose at the moment it is moved away, and calculating the relative pose with respect to the initial pose during the removal; detecting whether the sweeper is placed back on the ground; if the sweeper is put back, calculating a reference pose from the initial pose and the relative pose, and acquiring a current image at the moment the sweeper is put back on the ground; acquiring a plurality of historical images, and matching the current image with the historical images to obtain the target image with the highest matching degree; extracting current feature points in the current image, and determining the reference position of each current feature point on the target image according to the reference pose; determining a first matching area corresponding to each current feature point on the target image according to the reference position; matching each current feature point with matching feature points in its first matching area, the successfully matched feature point serving as the target feature point of that current feature point; and determining the target pose of the sweeper when it is put back on the ground according to each current feature point and its corresponding target feature point.
When the sweeper is put back on the ground, a reference pose is determined from the initial pose before removal and the relative pose accumulated during removal. The target image that best matches the current image is then determined, a first matching area is determined in the target image for each current feature point according to the reference pose, and feature matching is carried out only within that area.
Drawings
Fig. 1 is a schematic step diagram of a sweeper pose determination method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a current image and a target image according to an embodiment of the invention;
fig. 3 is a block diagram of a sweeper pose determination apparatus according to an embodiment of the present invention;
fig. 4 is a block diagram schematically illustrating a structure of a computer apparatus according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, an embodiment of the present invention provides a method for determining a pose of a sweeper, which is applied to the sweeper and includes the following steps:
step S1, detecting whether the sweeper is moved away from the ground;
step S2, if the sweeper is moved away, acquiring the initial pose at the moment it is moved away, and calculating the relative pose relative to the initial pose during the removal;
step S3, detecting whether the sweeper is placed back on the ground or not;
step S4, if the sweeper is put back, calculating a reference pose from the initial pose and the relative pose, and acquiring a current image at the moment the sweeper is put back on the ground;
step S5, acquiring a plurality of historical images, and matching the current image with the historical images to obtain a target image with the highest matching degree with the current image;
step S6, extracting current feature points in the current image, and determining the reference position of each current feature point on the target image according to the reference pose;
step S7, determining a first matching area corresponding to each current feature point on the target image according to the reference position;
step S8, matching each current feature point with the corresponding matching feature point in the first matching area, wherein the matching feature point which is successfully matched is used as the target feature point of each current feature point;
and step S9, determining the target pose of the sweeper when the sweeper is put back on the ground according to each current feature point and the target feature point corresponding to the current feature point.
In this embodiment, as described in step S1, an infrared sensor is installed at the traveling wheels of the sweeper, and it can detect whether the sweeper has been moved away from the ground.
As described in step S2, the sweeper is provided with a camera and continuously acquires images while traveling; the acquired images are stored as historical images, and each image corresponds to a pose of the sweeper, that is, a historical pose. In other embodiments, the feature points of the acquired images may also be extracted and stored together. The initial pose (R1, t1) at the moment of removal is determined from the image acquired at that moment, where R denotes a rotation matrix and t a translation vector. The sweeper is also provided with an IMU (inertial measurement unit), which measures the three-axis attitude angles and the acceleration of the sweeper. The IMU readings are integrated during the removal, yielding the accumulated translation and rotation of the sweeper, i.e. the relative pose (R2, t2). The relative pose can thus be understood as the pose change accumulated while the user moves ("kidnaps") the sweeper.
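For illustration, the IMU integration described above can be sketched as follows. This is a simplified planar, bias- and gravity-free version, not the patent's implementation; the function name and the sampled-input layout are assumptions.

```python
import math

def integrate_imu(accels, gyro_z, dt):
    """Planar dead-reckoning sketch: integrate body-frame acceleration
    and yaw rate to get the pose change accumulated while the sweeper
    is off the ground. 2D and bias-free for clarity; a real system
    would integrate the full 3-axis IMU and compensate gravity."""
    theta = 0.0            # heading change relative to the initial pose
    vx = vy = 0.0          # velocity expressed in the initial frame
    x = y = 0.0            # translation relative to the initial pose
    for (ax, ay), wz in zip(accels, gyro_z):
        theta += wz * dt
        # rotate the body-frame acceleration into the initial frame
        c, s = math.cos(theta), math.sin(theta)
        vx += (c * ax - s * ay) * dt
        vy += (s * ax + c * ay) * dt
        x += vx * dt
        y += vy * dt
    return x, y, theta
```

Integrating a constant 1 m/s² forward acceleration for 10 samples at dt = 0.1 s yields 0.55 m of forward translation and no rotation, which matches the double integral of a constant acceleration.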
As described in step S3, it can be detected whether the sweeper is replaced on the floor by the infrared sensor.
As described in step S4, when the sweeper is put back on the ground, the IMU stops integrating, and the initial pose and the relative pose are composed to obtain the reference pose (RN, tN). Image acquisition runs continuously, and the image acquired at the moment the sweeper is put back on the ground is taken as the current image.
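The composition of the initial pose with the relative pose into the reference pose can be sketched in the same planar simplification. The composition convention (relative motion expressed in the initial pose's frame) is an assumption; the patent does not state one.

```python
import math

def compose_pose(x1, y1, th1, dx, dy, dth):
    """Compose the initial pose (x1, y1, th1) with the relative pose
    (dx, dy, dth) measured in the initial pose's frame, yielding the
    reference pose: a 2D stand-in for composing (R1, t1) with (R2, t2)."""
    c, s = math.cos(th1), math.sin(th1)
    xN = x1 + c * dx - s * dy
    yN = y1 + s * dx + c * dy
    thN = th1 + dth
    return xN, yN, thN
```

For example, a sweeper at (1, 0) facing 90 degrees that is carried 1 m "forward" in its own frame ends up at (1, 1) with an unchanged heading.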
As described in step S5, the current image is matched against the previously acquired historical images: the similarity between the current image and each historical image is calculated, and the historical image with the highest similarity is taken as the target image.
Referring to fig. 2, as described in step S6, the current image a has a plurality of current feature points a, and a part of the current feature points a may obtain corresponding reference positions in the target image B according to the reference poses. Specifically, the current feature point has a corresponding position on the current image, the reference pose is used as reference pose information of the current image, and the target image has a corresponding historical pose, so that the relationship between the reference pose and the historical pose can be obtained, and the current feature point can obtain the reference position on the target image according to the relationship between the reference pose and the historical pose. As described in the above steps S7-S8, the first matching region S of the current feature point a is determined according to each reference position, and when performing feature matching, the current feature point a can only be demarked with the matching feature point a' in the corresponding first matching region S, and the current feature point a has a first matching region S which is smaller than the region of the whole target image B, so that some incorrect but similar feature points in the target image B are excluded from the first matching region S to some extent, thereby improving the accuracy of feature matching, avoiding dematching of the current feature point in the whole target image, and improving the matching efficiency.
It should be noted that the reference position obtained on the target image only serves to assist in locking onto the target: it lies near the true position but may deviate from it by some amount. Expanding a matching region (the first matching region) around the reference position therefore absorbs this deviation, helps match the true target feature point, and raises the matching success rate.
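A minimal sketch of matching restricted to the first matching region is given below, assuming binary descriptors compared by Hamming distance. The data layout, the circular-region test, and the distance threshold of 40 are illustrative assumptions.

```python
def match_in_region(ref_pos, radius, candidates, query_desc):
    """Match one current feature point only against candidate feature
    points whose pixel position lies inside the circular first matching
    region around ref_pos. candidates is a list of ((x, y), descriptor)
    pairs; descriptors are bit tuples compared by Hamming distance."""
    best_i, best_d = None, 40          # 40 = assumed max acceptable distance
    rx, ry = ref_pos
    for i, ((x, y), desc) in enumerate(candidates):
        if (x - rx) ** 2 + (y - ry) ** 2 > radius ** 2:
            continue                   # outside the first matching region
        d = sum(a != b for a, b in zip(desc, query_desc))
        if d < best_d:
            best_i, best_d = i, d
    return best_i                      # index of target feature point, or None
```

A distant candidate with an identical descriptor is excluded purely by position, which is exactly how the region suppresses similar-texture mismatches.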
As described in step S9, the target pose of the sweeper is determined from all the successfully matched current feature points and their target feature points; since the target feature points are matched accurately, the resulting target pose is also accurate.
In this embodiment, when the sweeper is put back on the ground, a reference pose is determined from the initial pose before removal and the relative pose accumulated during removal. The target image that best matches the current image is then determined, and a first matching area is determined in the target image for each current feature point according to the reference pose. Feature matching is performed only within these first matching areas, which narrows the matching range on the target image, reduces interference from similar features, and improves the accuracy of feature matching; this in turn improves the accuracy of the pose determined after the sweeper is moved away from the ground and of the subsequent relocalization result.
In an embodiment, after the step S9 of determining the target pose of the sweeper when the sweeper is put back on the ground according to each of the current feature points and the target feature point corresponding to the current feature point, the method includes:
and step S10, determining whether to enter a relocation program according to the target pose.
In this embodiment, the target pose is obtained from the current feature points and the target feature points, and since each target feature point is determined within its first matching region, it is accurate, so the target pose is accurate as well. Relocalization means locating the sweeper within the whole environment. After being moved away, the sweeper may have been placed in an already-cleaned area, and whether it is at the position from which it was moved away is determined according to the target pose. When the sweeper is not in the cleaned area, it can travel directly to the removal position to continue the cleaning task, effectively avoiding repeated cleaning; when it is in the swept area, it continues the cleaning task directly from there.
In an embodiment, the step S7 of determining a first matching region corresponding to each of the current feature points on the target image according to the reference position includes:
step S71, on the target image, a circular area determined by taking the reference position as a center of a circle and the first length as a radius is taken as the first matching area of the corresponding current feature point.
In this embodiment, the circular region formed around the reference position is taken as the first matching region. When the reference position lies near the edge of the target image, the region is clipped by the image boundary and is no longer a full circle; the clipped region is then used as the first matching region. Because the first matching region is small, matching is carried out only inside it, avoiding interference from other similar features. In other embodiments, a rectangular region centered on the reference position may be used instead.
In an embodiment, before the step S9 of determining the target pose of the sweeper when the sweeper is put back on the ground according to each of the current feature points and the target feature point corresponding to the current feature point, the method includes:
step S9A, calculating the proportion of successfully matched current feature points among all the current feature points;
step S9B, comparing the proportional value with a preset proportional value;
step S9C, if the proportion value is smaller than the preset proportion value, determining a second matching area corresponding to each current feature point which is not successfully matched according to the reference position; wherein the second matching region is larger than the first matching region;
step S9D, in the second matching area, determining each target feature point of the current feature points that has not been successfully matched.
In this embodiment, as described in steps S9A-S9C, the current image contains many current feature points, and dividing the number of successfully matched current feature points by the total number gives the proportion value. When this proportion is smaller than the preset value, few current feature points have been matched successfully, which would impair the subsequent solution of the target pose. In that case a second matching region, larger in area than the first matching region, is determined for each unmatched current feature point.
As described in step S9D, feature matching is then performed in the second matching region, which mitigates the low number of successful matches caused by an undersized first matching region. If a current feature point still cannot be matched in the second matching region, no further matching is attempted for it; the target pose is computed from all the successfully matched current feature points and their corresponding target feature points, which improves the accuracy of the computed target pose.
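The decision of steps S9A-S9C reduces to a simple threshold test. The preset proportion value of 0.3 below is an assumed number; the patent does not specify one.

```python
def need_wider_search(num_matched, num_total, min_ratio=0.3):
    """Decide whether unmatched current feature points should be
    retried in the larger second matching region: true when the
    fraction of successful matches falls below the preset proportion."""
    return num_total > 0 and (num_matched / num_total) < min_ratio
```

So 2 matches out of 10 feature points triggers the wider search, while 8 out of 10 does not.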
In an embodiment, the step of comparing the ratio value with a preset ratio value includes:
if the proportional value is smaller than the preset proportional value,
acquiring a previous image and a next image of the target image;
and splicing the previous image, the next image and the target image by an image splicing technology to obtain a new target image, and returning to the step of determining the reference position of each current feature point on the target image according to the reference pose.
In this embodiment, the sweeper acquires images at a fixed frequency, and the images are ordered by time. The time interval between the previous image, the next image, and the target image is very small, so the three images are highly similar yet slightly different. The three images are stitched by an image stitching technique to obtain a new target image, whose pose is taken to be the pose of the original target image. When the proportion value is smaller than the preset proportion value, the reference position of each current feature point is re-determined on the new target image according to the reference pose, the first matching area is determined from it, and feature matching is performed within that area, which raises the success rate of feature matching and thus the accuracy of the pose.
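As a highly simplified stand-in for the stitching step, the three consecutive frames can be concatenated to enlarge the search image. Real stitching would align the frames with a homography before blending; plain row-wise concatenation is only illustrative, and the function name is an assumption.

```python
def widen_target(prev_img, target_img, next_img):
    """Naive sketch of enlarging the target image: concatenate the
    previous frame, the target frame, and the next frame row by row.
    Images are plain nested lists (rows of pixel values)."""
    return [p + t + n for p, t, n in zip(prev_img, target_img, next_img)]
```

The widened image keeps the original height while tripling the width, giving unmatched feature points a larger area to search.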
In an embodiment, the step S5 of acquiring a plurality of history images, matching the current image with the history images, and obtaining a target image with the highest matching degree with the current image includes:
step S51, acquiring a plurality of historical images;
step S52, calculating a similarity between the current image and the history image by a DBoW algorithm;
in step S53, the history image with the highest similarity is used as the target image of the current image.
In this embodiment, the similarity is calculated with the DBoW (bag-of-words) algorithm: the features of each image are quantised into visual words and assembled into a bag-of-words vector, and the similarity between two images is computed from the Hamming distance or cosine distance between their bag-of-words vectors.
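The cosine-distance variant of the score can be sketched as follows. A real DBoW vocabulary quantises binary descriptors with a vocabulary tree and applies tf-idf weighting; plain term counts over integer word ids are used here purely for illustration.

```python
import math
from collections import Counter

def bow_similarity(words_a, words_b):
    """Cosine similarity between bag-of-words histograms of visual-word
    ids: 1.0 for identical word distributions, 0.0 for disjoint ones."""
    a, b = Counter(words_a), Counter(words_b)
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(c * c for c in a.values()))
    nb = math.sqrt(sum(c * c for c in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

The historical image whose word list scores highest against the current image's word list would then be selected as the target image.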
In an embodiment, the step S9 of determining the target pose of the sweeper when the sweeper is put back on the ground according to each of the current feature points and the target feature point corresponding to the current feature point includes:
and step S91, calculating the target pose by the current feature point and the target feature point through a PnP algorithm.
In this embodiment, a PnP (Perspective-n-Point) algorithm estimates the pose of the sweeper from n 3D spatial points and their known projection positions, where n is the number of successfully matched current feature points. Since the target feature points are accurate, the target pose obtained by the PnP algorithm is also accurate.
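The camera model that a PnP solver inverts is the pinhole projection u = K(RX + t): given n such 3D-2D correspondences, the solver recovers (R, t). The sketch below only demonstrates the forward projection; the intrinsic parameters fx, fy, cx, cy are illustrative values, not taken from the patent.

```python
def project(point_3d, R, t, fx, fy, cx, cy):
    """Project a 3D point into pixel coordinates with a pinhole camera:
    rotate and translate into the camera frame, then divide by depth
    and apply the focal lengths and principal point."""
    xc, yc, zc = (sum(R[i][j] * point_3d[j] for j in range(3)) + t[i]
                  for i in range(3))
    return fx * xc / zc + cx, fy * yc / zc + cy

# identity rotation and zero translation: camera frame equals world frame
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
u, v = project((1.0, 0.0, 4.0), I3, (0.0, 0.0, 0.0),
               500.0, 500.0, 320.0, 240.0)
```

A point 1 m to the right at 4 m depth lands 125 px right of the 320 px principal point, i.e. at (445, 240); a PnP solver searches for the (R, t) that makes all matched points reproject this consistently.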
In an embodiment, the step of calculating the relative pose in the moving-off process includes:
and calculating, through IMU integration, the translation and rotation of the sweeper relative to the initial pose during the removal, so as to obtain the relative pose.
In this embodiment, during the removal the IMU performs integral operations, yielding the translation and rotation accumulated during the removal and hence the relative pose.
Referring to fig. 3, an embodiment of the present invention further provides a position and orientation determination device for a sweeper, including:
the first detection unit 10 is used for detecting whether the sweeper is moved away from the ground;
the first calculating unit 20 is configured to, if the sweeper is moved away, acquire the initial pose at the moment it is moved away, and calculate the relative pose with respect to the initial pose during the removal;
a second detecting unit 30 for detecting whether the sweeper is put back on the ground;
the second calculating unit 40 is used for calculating a reference pose from the initial pose and the relative pose if the sweeper is put back, and acquiring a current image at the moment the sweeper is put back on the ground;
the first matching unit 50 is configured to acquire a plurality of historical images, match the current image with the historical images, and obtain a target image with the highest matching degree with the current image;
an extracting unit 60, configured to extract current feature points in the current image, and determine, according to the reference pose, a reference position of each of the current feature points on the target image;
a first determining unit 70, configured to determine, according to the reference position, a first matching area corresponding to each current feature point on the target image;
a second matching unit 80, configured to match each current feature point with a corresponding matching feature point in the first matching region, where the matching feature point that is successfully matched is used as a target feature point of each current feature point;
and a second determining unit 90, configured to determine, according to each current feature point and the target feature point corresponding to the current feature point, a target pose of the sweeper when the sweeper is put back on the ground.
In an embodiment, the sweeper pose determination apparatus further includes:
and the repositioning unit is used for determining whether to enter a repositioning program according to the target pose.
In an embodiment, the first determining unit 70 includes:
and the determining subunit is configured to determine, on the target image, a circular region with the reference position as a center and a first length as a radius, as the first matching region of the corresponding current feature point.
In an embodiment, the sweeper pose determination apparatus further includes:
the third calculating unit is used for calculating the ratio of the successfully matched current feature points to all of the current feature points;
the comparison unit is used for comparing the ratio with a preset ratio;
a third determining unit, configured to determine, according to the reference position, a second matching area corresponding to each current feature point that is not successfully matched, if the ratio is smaller than the preset ratio; wherein the second matching area is larger than the first matching area;
and a fourth determining unit, configured to determine, in the second matching area, target feature points of the current feature points that are not successfully matched.
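The fallback from the first matching area to the larger second matching area can be sketched as a simple threshold rule. All names here are assumptions for illustration, not identifiers from the patent:

```python
def select_search_radius(num_matched, num_total, min_ratio,
                         first_radius, second_radius):
    """Pick the search radius for re-matching unmatched feature points:
    if the fraction of successfully matched points falls below
    min_ratio, fall back to the larger second matching region."""
    ratio = num_matched / num_total if num_total else 0.0
    return second_radius if ratio < min_ratio else first_radius
```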
In one embodiment, the first matching unit 50 includes:
the acquisition subunit is used for acquiring a plurality of historical images;
the first calculating subunit is used for calculating the similarity between the current image and the historical image through a DBoW algorithm;
and the target image subunit is used for taking the historical image with the highest similarity as the target image of the current image.
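DBoW itself is a C++ vocabulary-tree library; as an illustrative stand-in for this similarity step, a plain bag-of-visual-words cosine score over word ids can be sketched as follows (this is not the actual DBoW scoring, which uses weighted tf-idf vectors):

```python
import math
from collections import Counter

def bow_similarity(words_a, words_b):
    """Cosine similarity between two bag-of-visual-words histograms,
    where each image is reduced to a list of visual-word ids."""
    a, b = Counter(words_a), Counter(words_b)
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_match(current_words, history):
    """Index of the historical image most similar to the current one."""
    return max(range(len(history)),
               key=lambda i: bow_similarity(current_words, history[i]))
```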
In an embodiment, the second determining unit 90 includes:
and the second calculating subunit is used for calculating the target pose by the current feature point and the target feature point through a PnP algorithm.
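The PnP step recovers the pose from 2D-3D correspondences. As a minimal sketch (not necessarily the solver the patent uses), a Direct Linear Transform estimate, under the assumptions of normalized image coordinates and noise-free, non-coplanar points, looks like this:

```python
import numpy as np

def pnp_dlt(pts3d, pts2d):
    """Recover camera pose (R, t) from 3D-2D correspondences via the
    Direct Linear Transform. pts2d must be in normalized camera
    coordinates (intrinsics removed); needs >= 6 points in general
    position and noise-free data for this minimal version."""
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        Xh = np.array([X, Y, Z, 1.0])
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
        rows.append(np.concatenate([np.zeros(4), Xh, -v * Xh]))
    _, _, Vt = np.linalg.svd(np.vstack(rows))
    P = Vt[-1].reshape(3, 4)             # null vector = flattened [R|t] up to scale
    M, p4 = P[:, :3], P[:, 3]
    if np.linalg.det(M) < 0:             # resolve the overall sign ambiguity
        M, p4 = -M, -p4
    scale = np.linalg.norm(M[2])         # rows of a rotation have unit norm
    R_approx, t = M / scale, p4 / scale
    U, _, Vt2 = np.linalg.svd(R_approx)  # project back onto SO(3)
    return U @ Vt2, t
```

Production systems would instead use a robust solver (e.g. EPnP or iterative PnP with RANSAC) to handle mismatched feature pairs.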
In one embodiment, the first computing unit 20 includes:
the third calculation subunit is used for calculating, through IMU integration, the position translation amount and the angle change amount of the sweeper relative to the initial pose during the moving-away process, so as to obtain the relative pose.
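The IMU integration performed by the third calculation subunit can be sketched as first-order dead reckoning in the plane. An illustrative Python version, assuming gravity-compensated body-frame accelerations and a yaw-rate gyro sample stream (names and sample layout are assumptions, not from the patent):

```python
import math

def integrate_imu(samples, dt):
    """Dead-reckon a planar pose increment from IMU samples.

    samples: iterable of (ax, ay, wz) tuples: body-frame accelerations
    (m/s^2, gravity already compensated) and yaw rate (rad/s).
    Returns (dx, dy, dtheta) relative to the pose at the first sample.
    """
    x = y = theta = vx = vy = 0.0
    for ax, ay, wz in samples:
        # rotate body-frame acceleration into the initial frame
        c, s = math.cos(theta), math.sin(theta)
        ax_w = c * ax - s * ay
        ay_w = s * ax + c * ay
        # first-order (Euler) update: velocity, then position, then heading
        vx += ax_w * dt
        vy += ay_w * dt
        x += vx * dt
        y += vy * dt
        theta += wz * dt
    return x, y, theta
```

A real implementation would also estimate and subtract gyro/accelerometer biases, since raw double integration drifts quickly.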
In this embodiment, please refer to the above method embodiment for the specific implementation of each unit and sub-unit, which is not described herein again.
Referring to fig. 4, an embodiment of the present invention further provides a computer device, which may be a server; its internal structure may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is used to provide computation and control capabilities. The memory of the computer device includes a nonvolatile storage medium and an internal memory. The nonvolatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the nonvolatile storage medium. The database of the computer device is used for storing authentication data and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by the processor to implement the sweeper pose determination method.
It will be understood by those skilled in the art that the structure shown in fig. 4 is only a block diagram of a portion of the structure associated with the inventive arrangements, and does not constitute a limitation on the computer apparatus to which the inventive arrangements are applied.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a method for determining a pose of a sweeper.
In summary, the sweeper pose determination method and device, the computer device, and the storage medium provided by the present invention involve the following steps: detecting whether the sweeper is moved away from the ground; if it is moved away, acquiring an initial pose at the time of moving away, and calculating a relative pose with respect to the initial pose during the moving-away process; detecting whether the sweeper is put back on the ground; if it is put back, calculating a reference pose according to the initial pose and the relative pose, and acquiring a current image when the sweeper is put back on the ground; acquiring a plurality of historical images, and matching the current image against them to obtain the target image with the highest matching degree; extracting current feature points from the current image, and determining the reference position of each current feature point on the target image according to the reference pose; determining a first matching area corresponding to each current feature point on the target image according to the reference position; matching each current feature point with the matching feature points in its first matching area, the successfully matched matching feature point serving as the target feature point of that current feature point; and determining the target pose of the sweeper when it is put back on the ground according to each current feature point and its corresponding target feature point.
In the sweeper pose determination method provided by the present invention, when the sweeper is put back on the ground, a reference pose is first determined from the initial pose before the move-away and the relative pose accumulated during the move-away. The target image that best matches the current image is then selected, and the reference pose is used to place each current feature point at a reference position on that target image, around which its first matching area is determined. Feature matching is then restricted to this first matching area rather than the whole target image, which narrows the search range for each feature point.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing relevant hardware; the program may be stored on a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to memory, storage, databases, or other media provided herein or used in embodiments of the present invention may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of other like elements in a process, apparatus, article, or method that includes the element.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A sweeper pose determination method is applied to a sweeper and comprises the following steps:
detecting whether the sweeper is moved away from the ground;
if the sweeper is moved away, acquiring an initial pose at the time of moving away, and calculating a relative pose with respect to the initial pose during the moving-away process;
detecting whether the sweeper is placed back on the ground or not;
if the sweeper is put back, calculating a reference pose according to the initial pose and the relative pose, and acquiring a current image when the sweeper is put back on the ground;
acquiring a plurality of historical images, and matching the current image with the historical images to obtain a target image with the highest matching degree with the current image;
extracting current feature points in the current image, and determining the reference position of each current feature point on the target image according to the reference pose;
determining a first matching area corresponding to each current feature point on the target image according to the reference position;
matching each current feature point with a corresponding matching feature point in the first matching area, wherein the matching feature point which is successfully matched serves as a target feature point of each current feature point;
and determining the target pose of the sweeper when the sweeper is put back to the ground according to each current feature point and the target feature point corresponding to the current feature point.
2. The sweeper pose determination method according to claim 1, wherein the step of determining the target pose of the sweeper when replaced on the ground from each of the current feature points and the target feature point corresponding thereto is followed by:
and determining whether to enter a relocation program according to the target pose.
3. The sweeper pose determination method according to claim 1, wherein the step of determining a first matching area corresponding to each of the current feature points on the target image according to the reference position comprises:
on the target image, taking the circular area determined with the reference position as the center and a first length as the radius as the first matching area of the corresponding current feature point.
4. The sweeper pose determination method according to claim 1, wherein the step of determining the target pose of the sweeper when placed back onto the ground from each of the current feature points and the target feature point corresponding thereto is preceded by the step of:
calculating the ratio of the successfully matched current feature points to all of the current feature points;
comparing the ratio with a preset ratio;
if the ratio is smaller than the preset ratio, determining, according to the reference position, a second matching area corresponding to each current feature point that is not successfully matched; wherein the second matching area is larger than the first matching area;
and in the second matching area, determining the target feature points of the current feature points which are not successfully matched.
5. The sweeper pose determining method according to claim 1, wherein the step of obtaining a plurality of historical images, matching the current image with the historical images to obtain a target image with a highest degree of matching with the current image comprises:
acquiring a plurality of historical images;
calculating the similarity between the current image and the historical image through a DBoW algorithm;
and taking the historical image with the highest similarity as a target image of the current image.
6. The sweeper pose determining method according to claim 1, wherein the step of determining the target pose of the sweeper when being replaced on the ground according to each of the current feature points and the target feature point corresponding thereto comprises:
and calculating the target pose by the current feature point and the target feature point through a PnP algorithm.
7. The sweeper pose determination method of claim 1, wherein the step of calculating the relative pose during the moving-away process comprises:
and calculating the position translation amount and the angle transformation amount of the sweeper relative to the initial pose in the moving-away process through IMU integral operation to obtain the relative pose.
8. A sweeper pose determination device, characterized by comprising:
the first detection unit is used for detecting whether the sweeper is moved away from the ground or not;
the first calculation unit is used for acquiring an initial pose at the time of moving away if the sweeper is moved away, and calculating a relative pose with respect to the initial pose during the moving-away process;
the second detection unit is used for detecting whether the sweeper is placed back on the ground or not;
the second calculation unit is used for calculating a reference pose according to the initial pose and the relative pose if the sweeper is put back, and acquiring a current image when the sweeper is put back on the ground;
the first matching unit is used for acquiring a plurality of historical images, matching the current image with the historical images and obtaining a target image with the highest matching degree with the current image;
the extracting unit is used for extracting current feature points in the current image and determining the reference position of each current feature point on the target image according to the reference pose;
a first determining unit, configured to determine, according to the reference position, a first matching region corresponding to each current feature point on the target image;
a second matching unit, configured to match each current feature point with a corresponding matching feature point in the first matching region, where the matching feature point that is successfully matched serves as a target feature point of each current feature point;
and the second determining unit is used for determining the target pose of the sweeper when the sweeper is put back on the ground according to each current characteristic point and the corresponding target characteristic point.
9. A computer device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the processor, when executing the computer program, implements the steps of the sweeper pose determination method of any of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the steps of the sweeper pose determination method of any one of claims 1 to 7.
CN202110540974.9A 2021-05-18 2021-05-18 Sweeper pose determination method and device, computer equipment and storage medium Pending CN113409388A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110540974.9A CN113409388A (en) 2021-05-18 2021-05-18 Sweeper pose determination method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110540974.9A CN113409388A (en) 2021-05-18 2021-05-18 Sweeper pose determination method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113409388A 2021-09-17

Family

ID=77678743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110540974.9A Pending CN113409388A (en) 2021-05-18 2021-05-18 Sweeper pose determination method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113409388A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108648235A (en) * 2018-04-27 2018-10-12 腾讯科技(深圳)有限公司 Method for relocating, device and the storage medium of camera posture tracing process
CN109974693A (en) * 2019-01-31 2019-07-05 中国科学院深圳先进技术研究院 Unmanned plane localization method, device, computer equipment and storage medium
CN111750864A (en) * 2020-06-30 2020-10-09 杭州海康机器人技术有限公司 Repositioning method and device based on visual map
CN112106113A (en) * 2019-09-16 2020-12-18 深圳市大疆创新科技有限公司 Method and device for determining pose information of image in three-dimensional reconstruction


Similar Documents

Publication Publication Date Title
CN111199564B (en) Indoor positioning method and device of intelligent mobile terminal and electronic equipment
CN110174894B (en) Robot and repositioning method thereof
US6795590B1 (en) SAR and FLIR image registration method
US20040062419A1 (en) Landmark, apparatus, and method for effectively determining position of autonomous vehicles
Goncalves et al. A visual front-end for simultaneous localization and mapping
JP5141644B2 (en) Autonomous mobile body, self-position estimation apparatus, and program
CN111445531B (en) Multi-view camera navigation method, device, equipment and storage medium
KR20120048370A (en) Object pose recognition apparatus and method using the same
JP2009190164A (en) Object recognition and method for estimating self-position of robot based on information about surrounding environment including recognized object
JP4709668B2 (en) 3D object recognition system
Lee et al. Vision-based kidnap recovery with SLAM for home cleaning robots
Tomono 3-D localization and mapping using a single camera based on structure-from-motion with automatic baseline selection
Churchill et al. An orientation invariant visual homing algorithm
CN110619497B (en) Address checking method, device, electronic equipment and storage medium
Iocchi et al. Self-localization in the RoboCup environment
CN111239763A (en) Object positioning method and device, storage medium and processor
CN117011457A (en) Three-dimensional drawing construction method and device, electronic equipment and storage medium
KR101235525B1 (en) Homing navigation method of mobile robot based on vision information
CN113409388A (en) Sweeper pose determination method and device, computer equipment and storage medium
CN114463429B (en) Robot, map creation method, positioning method, and medium
Rybski et al. Appearance-based minimalistic metric SLAM
Baligh Jahromi et al. Layout slam with model based loop closure for 3d indoor corridor reconstruction
Wakita et al. Laser variational autoencoder for map construction and self-localization
Mottaghi et al. Place recognition-based fixed-lag smoothing for environments with unreliable GPS
CN115700507B (en) Map updating method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right. Effective date of registration: 20220425. Address after: 518000 1201, South Block, Yuanxing science and technology building, No. 1, songpingshan Road, songpingshan community, Xili street, Nanshan District, Shenzhen, Guangdong. Applicant after: SHENZHEN WATER WORLD Co.,Ltd. Address before: 518000 1201, South Block, Yuanxing science and technology building, No. 1, songpingshan Road, songpingshan community, Xili street, Nanshan District, Shenzhen, Guangdong. Applicant before: Shenzhen Lechun power robot Co.,Ltd.
RJ01 Rejection of invention patent application after publication. Application publication date: 20210917