CN116185046B - Mobile robot positioning method, mobile robot and medium - Google Patents

Mobile robot positioning method, mobile robot and medium

Info

Publication number
CN116185046B
CN116185046B (application CN202310465399.XA)
Authority
CN
China
Prior art keywords
mobile robot
distance
target object
determining
relative position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310465399.XA
Other languages
Chinese (zh)
Other versions
CN116185046A (en)
Inventor
骆怡
张有为
朱建国
康路青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Chenpuhao New Technology Co ltd
Original Assignee
Beijing Chenpuhao New Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chenpuhao New Technology Co ltd filed Critical Beijing Chenpuhao New Technology Co ltd
Priority to CN202310465399.XA
Publication of CN116185046A
Application granted
Publication of CN116185046B
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The application provides a mobile robot positioning method, a mobile robot and a medium, wherein the method comprises the following steps: respectively acquiring distance data acquired by a plurality of single-line sensors and an image to be processed, including a target object, acquired by an image sensor; determining the current relative position of the mobile robot and the target object based on the distance data and the image to be processed; acquiring a preset target relative position of the mobile robot and the target object, and determining a position difference between the current relative position and the target relative position; determining the moving direction and the moving speed of the mobile robot by using a PID control algorithm based on the position difference; and adjusting the current position of the mobile robot based on the moving direction and the moving speed to obtain the target position of the mobile robot. By combining multiple sensors with a PID control algorithm in the positioning process, the accuracy and stability of mobile robot positioning are improved.

Description

Mobile robot positioning method, mobile robot and medium
Technical Field
The application relates to the technical field of intelligent robots, in particular to a mobile robot positioning method, a mobile robot and a medium.
Background
Robot technology is applied ever more widely in modern society, not only in production and manufacturing but also in fields such as medical care, service robots, and the military. In these application scenarios, a robot needs to sense and determine its own position and orientation accurately and reliably in order to complete its intended tasks.
In the prior art, although many robot positioning technologies exist, in a complex environment a robot may suffer interference that degrades its positioning accuracy and reliability, so that it cannot accurately reach a destination or accurately execute a task. Therefore, there is a need for a more accurate robot positioning method that improves the accuracy and reliability of robot positioning in complex environments and thereby improves the working efficiency of the robot.
Disclosure of Invention
The purpose of the application is to provide a mobile robot positioning method, a mobile robot and a medium, which are used for solving the problem that the robot cannot accurately and reliably reach a destination under a complex environment.
In a first aspect, an embodiment of the present application provides a positioning method for a mobile robot, where the method includes:
respectively acquiring distance data acquired by a plurality of single-line sensors and an image to be processed, including a target object, acquired by an image sensor;
determining the current relative position of the mobile robot and the target object based on the distance data and the image to be processed;
acquiring a preset target relative position of the mobile robot and the target object, and determining a position difference between the current relative position and the target relative position;
and adjusting the position of the mobile robot based on the position difference.
In this embodiment, the current relative position of the mobile robot and the target object in different directions is determined by combining the information acquired by multiple sensors, and the position of the mobile robot is then adjusted according to the position difference between the current relative position and the preset target relative position. This adjustment is repeated until the position difference between the relative position of the mobile robot and the target object in each direction and the preset target relative position is smaller than a preset threshold value, at which point the positioning of the mobile robot is complete. The positioning accuracy and stability of the mobile robot are thereby improved.
In one possible embodiment, the distance data includes a first distance between the mobile robot and an obstacle in the environment in which the target object is located, and a second distance between the mobile robot and the target object; the relative position includes a relative angle, an X-axis relative position, and a Y-axis relative position;
The determining, based on the distance data and the image to be processed, a current relative position of the mobile robot and the target object includes:
determining a relative angle of the mobile robot and the target object based on the first distances acquired by the first single-line sensor and the second single-line sensor; the first single-line sensor and the second single-line sensor are symmetrically arranged on two sides of the mobile robot;
determining a Y-axis relative position of the mobile robot and the target object based on the second distance acquired by the third single-line sensor; the third single-line sensor is located between the first single-line sensor and the second single-line sensor;
and determining the X-axis relative position of the mobile robot and the target object based on the image to be processed.
In one possible embodiment, the first distance includes a first reference distance and a second reference distance; the first reference distance is a distance between the mobile robot and a first obstacle on the same side as the first single-line sensor; the second reference distance is a distance between the mobile robot and a second obstacle on the same side as the second single-line sensor;
The determining the relative angle of the mobile robot and the target object based on the first distances acquired by the first single-line sensor and the second single-line sensor comprises the following steps:
determining a distance difference between the first reference distance and the second reference distance based on the first reference distance and the second reference distance;
a relative angle of the mobile robot and the target object is determined based on a distance difference between the first reference distance and the second reference distance.
In one possible embodiment, the determining the relative angle of the mobile robot and the target object based on the distance difference between the first reference distance and the second reference distance includes:
determining the relative angle between the mobile robot and the target object based on a preset correspondence between distance difference values and relative angles, and the distance difference between the first reference distance and the second reference distance; or
determining the relative angle of the mobile robot and the target object based on the distance difference between the first reference distance and the second reference distance and a pre-stored distance between the first single-line sensor and the second single-line sensor.
In one possible embodiment, the determining the Y-axis relative position of the mobile robot and the target object based on the second distance acquired by the third single-line sensor includes:
taking the second distance, in the Y-axis direction, between the mobile robot and the target object, acquired by the third single-line sensor, as the Y-axis relative position between the mobile robot and the target object.
In a possible implementation manner, the determining the X-axis relative position of the mobile robot and the target object based on the image to be processed acquired by the image sensor includes:
acquiring the area occupied by a preset plane of the target object in the image to be processed, and determining a first ratio of the actual area of the preset plane to the area occupied by the preset plane in the image to be processed;
and determining the relative position of the mobile robot and the X axis of the target object based on the first ratio and the coordinate position of the X axis of the target object in the image to be processed.
In one possible embodiment, the adjusting the position of the mobile robot based on the position difference includes:
determining a moving direction and a moving speed of the mobile robot based on the position difference;
and adjusting the current position of the mobile robot based on the moving direction and the moving speed of the mobile robot to obtain the target position of the mobile robot.
In one possible embodiment, the position difference comprises a relative angle difference, an X-axis relative position difference, and a Y-axis relative position difference; the determining the moving direction and the moving speed of the mobile robot based on the position difference includes:
determining a movement angular velocity of the mobile robot based on the relative angle difference;
determining a moving speed of the mobile robot in the X-axis direction based on the X-axis relative position difference value;
determining a moving speed of the mobile robot in the Y-axis direction based on the Y-axis relative position difference value;
the moving direction and the moving speed of the mobile robot are determined based on the moving angular speed, the moving speed in the X-axis direction, and the moving speed in the Y-axis direction.
In a second aspect, embodiments of the present application provide a mobile robot, including:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of positioning a mobile robot as described in any one of the first aspects above.
In a third aspect, embodiments of the present application provide a computer storage medium storing a computer program for causing a computer to execute the positioning method of a mobile robot according to any one of the first aspect.
In a fourth aspect, the present application provides a computer program product comprising a computer program:
the computer program, when executed by a processor, implements a method of positioning a mobile robot as described in any of the first aspects above.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects:
According to the embodiment of the application, distance data acquired by a plurality of single-line sensors and an image to be processed, including the target object, acquired by an image sensor are acquired respectively; the current relative position of the mobile robot and the target object is determined based on the distance data and the image to be processed; a preset target relative position of the mobile robot and the target object is acquired, and a position difference between the current relative position and the target relative position is determined; the moving direction and the moving speed of the mobile robot are determined by using a PID control algorithm based on the position difference; and the current position of the mobile robot is adjusted based on the moving direction and the moving speed to obtain the target position of the mobile robot. The position of the mobile robot is adjusted in real time until the position difference between the relative position of the mobile robot and the target object in each direction and the preset target relative position is smaller than a preset threshold value, at which point the positioning of the mobile robot is finished. Multiple sensors and a PID control algorithm are used in the positioning process, which improves the positioning accuracy and stability of the mobile robot.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
Fig. 1 is an application scenario schematic diagram of a positioning method of a mobile robot according to an embodiment of the present application;
fig. 2 is a flow chart of a positioning method of a mobile robot according to an embodiment of the present application;
fig. 3 is a schematic diagram of a relative position between a mobile robot and a target object according to an embodiment of the present application;
fig. 4 is a schematic diagram of an image to be processed according to an embodiment of the present application;
fig. 5 is a complete flow diagram of a positioning method of a mobile robot according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a mobile robot according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a positioning device of a mobile robot according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure. Embodiments and features of embodiments in this application may be combined with each other arbitrarily without conflict. Also, while a logical order of illustration is depicted in the flowchart, in some cases the steps shown or described may be performed in a different order than presented.
Robot technology is applied ever more widely in modern society, not only in production and manufacturing but also in fields such as medical care, service robots, and the military. In these application scenarios, a robot needs to sense and determine its own position and orientation accurately and reliably in order to complete its intended tasks.
In the prior art, although many robot positioning technologies exist, in complex environments such as hospital hallways and office-building toilets a robot may suffer interference. For example: Global Positioning System (GPS) signals may be disturbed in covered areas such as indoors or in canyons; the precision of an Inertial Navigation System (INS) is affected by motions of the robot such as vibration and rotation, which increase measurement errors; visual recognition can be influenced by factors such as illumination, occlusion, and specular reflection, affecting recognition and tracking of the robot's position and orientation; lidar has high equipment cost and demanding environmental requirements; and ultrasonic ranging accuracy is easily limited by distance and angle, cannot provide three-dimensional information, and also places high demands on the environment.
These limitations affect the positioning accuracy and reliability of the robot, so that it cannot accurately reach a destination or accurately execute a task. Therefore, there is a need for a more accurate robot positioning method that improves the accuracy and reliability of robot positioning in complex environments and thereby improves the working efficiency of the robot.
In view of this, the embodiments of the present application provide a mobile robot positioning method, a mobile robot, and a medium, so as to solve the problem that a robot cannot accurately and reliably reach a destination in a complex environment.
The inventive concept of the embodiments of the present application may be summarized as follows: respectively acquiring distance data acquired by a plurality of single-line sensors and an image to be processed, including a target object, acquired by an image sensor; determining the current relative position of the mobile robot and the target object based on the distance data and the image to be processed; acquiring a preset target relative position of the mobile robot and the target object, and determining a position difference between the current relative position and the target relative position; and adjusting the position of the mobile robot based on the position difference. In this way, the position difference between the current relative position of the mobile robot and the target relative position is calculated in real time from the information acquired by multiple sensors, and the position of the mobile robot is adjusted in real time according to this difference, which improves the accuracy and stability of mobile robot positioning.
As shown in fig. 1, an application scenario diagram of the positioning method of a mobile robot according to an embodiment of the present application includes: a communication network 101, a terminal device 102, and at least one mobile robot (103_1, 103_2, 103_n in the illustration). The communication network 101 connects the terminal device 102 and the at least one mobile robot; the terminal device 102 communicates with the at least one mobile robot through the communication network 101 and may be a computer or a mobile phone, which is not particularly limited herein. A user may use the terminal device 102 to remotely control the operating mode of the at least one mobile robot, for example a working mode or a sleep mode, and may set the operating time, for example to start working at 9 a.m., without particular limitation. The user may also monitor the operation of the mobile robot through the terminal device 102; for example, after the mobile robot finishes cleaning one bathroom, it feeds back to the terminal device 102, and the user may control the mobile robot through the terminal device 102 to continue cleaning another bathroom or to terminate the cleaning operation.
Application fields of the mobile robot positioning method provided by the embodiment of the application include, but are not limited to, intelligent manufacturing fields, medical fields, service robot fields and military fields.
Because existing mobile robots are not positioned accurately enough, and therefore may fail to reach a destination accurately or to execute a task accurately, an embodiment of the present application provides a positioning method for a mobile robot. As shown in fig. 2, the method includes:
s201: distance data acquired by a plurality of single line sensors and an image to be processed including a target object acquired by an image sensor are acquired respectively.
In the embodiment of the application, the mobile robot moves according to a preset movement model, which may be provided by the robot manufacturer. Because the mobile robot needs to move to a target position near a preset target object in order to execute its task, it must continuously determine its moving direction and moving speed during movement and adjust the moving direction in real time to position the target position accurately. Therefore, the distance data acquired by the plurality of single-line sensors and the image to be processed, including the target object, acquired by the image sensor need to be acquired periodically (they may also be acquired in real time), and the position of the mobile robot is judged from the distance data and the image to be processed acquired in each period.
In the embodiment of the application, the mobile robot may first reach a reference position near the target object through the mature robot navigation technology SLAM (Simultaneous Localization and Mapping), and then, starting from the reference position, the positioning method provided by the application continuously adjusts the moving direction and moving speed of the mobile robot until it reaches the target position near the preset target object, at which point the positioning task is completed and the corresponding work task is executed.
For example, if the preset target position is about 1 to 2 cm from the target object, the mobile robot first reaches a reference position about 5 to 10 cm from the target object through SLAM technology, and then the positioning method provided by the application continuously adjusts the moving direction and moving speed of the mobile robot until its distance to the target object decreases from 5 to 10 cm down to 1 to 2 cm, completing the positioning task so that the corresponding work task can be executed.
A multi-line laser sensor is used in the SLAM technology.
S202: based on the distance data and the image to be processed, the current relative position of the mobile robot and the target object is determined.
In one possible embodiment, the distance data includes a first distance between the mobile robot and an obstacle of an environment in which the target object is located, and a second distance between the mobile robot and the target object; the relative positions include a relative angle, an X-axis relative position, and a Y-axis relative position; thus, in the present application, determining the current relative position of the mobile robot and the target object based on the distance data and the image to be processed may be performed as the following steps:
a1: determining a relative angle between the mobile robot and the target object based on the first distances acquired by the first single-line sensor and the second single-line sensor;
a2: determining the Y-axis relative position of the mobile robot and the target object based on the second distance acquired by the third single-line sensor;
a3: based on the image to be processed, the X-axis relative position of the mobile robot and the target object is determined.
The first single-line sensor and the second single-line sensor are symmetrically arranged on two sides of the mobile robot, and the third single-line sensor is located between the first single-line sensor and the second single-line sensor. The order of steps A1 to A3 is not limited in the embodiment of the present application.
In a specific implementation, as shown in fig. 3, the target object is a toilet, and three single-line sensors, namely a first single-line sensor, a second single-line sensor, and a third single-line sensor, are arranged in the mobile robot. When the first distance is acquired in step S201, what needs to be acquired is the distance between the mobile robot and the obstacles in the environment where the target object is located, that is, the distances between the mobile robot and the left and right walls of the bathroom in which the toilet is located, acquired by the first single-line sensor and the second single-line sensor respectively. The relative angle between the mobile robot and the target object can then be calculated from these distances to the left and right walls. When the second distance is acquired in step S201, what needs to be acquired is the distance between the mobile robot and the target object, that is, the distance between the mobile robot and the toilet acquired by the third single-line sensor, from which the Y-axis relative position between the mobile robot and the target object shown in fig. 3 can be obtained. Finally, from the image to be processed acquired by the image sensor, the X-axis relative position of the mobile robot and the target object shown in fig. 3 can be determined.
In one possible embodiment, the first distance in step A1 comprises a first reference distance and a second reference distance; the first reference distance is the distance, acquired by the first single-line sensor, between the mobile robot and a first obstacle on the same side as the first single-line sensor; the second reference distance is the distance, acquired by the second single-line sensor, between the mobile robot and a second obstacle on the same side as the second single-line sensor.
As shown in fig. 3, the first obstacle on the same side as the first single-line sensor is the right wall of the bathroom in which the toilet is located, and the second obstacle on the same side as the second single-line sensor is the left wall of that bathroom. Thus, in the embodiment of the application, the first reference distance is the distance between the mobile robot and the right wall acquired by the first single-line sensor, and the second reference distance is the distance between the mobile robot and the left wall acquired by the second single-line sensor.
Thus, determining the relative angle of the mobile robot to the target object based on the first distances acquired by the first single-line sensor and the second single-line sensor in step A1 may be performed as:
determining a distance difference between the first reference distance and the second reference distance based on the first reference distance and the second reference distance; a relative angle of the mobile robot to the target object is determined based on a distance difference between the first reference distance and the second reference distance.
In one possible embodiment, the determining the relative angle of the mobile robot and the target object based on the distance difference between the first reference distance and the second reference distance may be implemented in either of two ways:
Embodiment one: and determining the relative angle between the mobile robot and the target object based on the corresponding relation between the preset distance difference value and the relative angle and the distance difference value between the first reference distance and the second reference distance.
In a specific implementation, a correspondence between preset distance difference values and relative angles may be set, as shown in table 1. After the first single-line sensor acquires the first reference distance between the mobile robot and the right wall and the second single-line sensor acquires the second reference distance between the mobile robot and the left wall, the second reference distance is subtracted from the first reference distance to obtain the distance difference, and the relative angle between the mobile robot and the target object is then read directly from the correspondence shown in table 1.
[Table 1: preset correspondence between distance difference values and relative angles; the table contents are not reproduced in the source.]
The preset correspondence shown in table 1 may be set according to actual needs, or may be obtained according to multiple experiments in advance, which is not limited in this embodiment of the present application.
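As an illustration only, embodiment one could be realized as a nearest-entry table lookup. The sketch below is a minimal Python example; since the actual contents of table 1 are not reproduced in the source, the table values here are hypothetical placeholders, not values from the patent.

```python
# Hypothetical correspondence table: distance difference (cm) -> relative angle (degrees).
# The real table 1 would be set according to actual needs or obtained from prior experiments.
DIFF_TO_ANGLE = {
    -4.0: -10.0,
    -2.0: -5.0,
    0.0: 0.0,
    2.0: 5.0,
    4.0: 10.0,
}

def lookup_relative_angle(distance_diff: float) -> float:
    """Return the relative angle for the table entry nearest the measured difference."""
    nearest_diff = min(DIFF_TO_ANGLE, key=lambda d: abs(d - distance_diff))
    return DIFF_TO_ANGLE[nearest_diff]
```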
Embodiment two: based on the distance difference between the first reference distance and the second reference distance and the pre-stored distance between the first single-line sensor and the second single-line sensor, the relative angle of the mobile robot and the target object is determined.
In a specific implementation, the distance between the first single-line sensor and the second single-line sensor, which measure the first reference distance and the second reference distance, is fixed; that is, this distance can be determined when the mobile robot is produced and stored in the memory of the mobile robot. In actual use, after the first single-line sensor acquires the first reference distance between the mobile robot and the right wall and the second single-line sensor acquires the second reference distance between the mobile robot and the left wall, the second reference distance is subtracted from the first reference distance to obtain the distance difference, and the relative angle between the mobile robot and the target object is then obtained from this difference and the stored distance between the two sensors using a trigonometric function. A short sketch of this appears below.
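The sketch is a minimal illustration of embodiment two under the assumption that both side sensors fire parallel to the robot's forward axis toward roughly parallel walls; the function and parameter names are not from the patent.

```python
import math

def relative_angle(first_reference_distance: float,
                   second_reference_distance: float,
                   sensor_spacing: float) -> float:
    """Relative angle (radians) of the mobile robot to the target object.

    first_reference_distance:  first single-line sensor to the right wall
    second_reference_distance: second single-line sensor to the left wall
    sensor_spacing: pre-stored distance between the two sensors (>= 0.3 m)
    """
    distance_diff = first_reference_distance - second_reference_distance
    # Trigonometric relation: the tilt of the robot relative to the walls is
    # the arctangent of the distance difference over the sensor spacing.
    return math.atan2(distance_diff, sensor_spacing)
```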
In the embodiment of the application, when the relative angle is determined during positioning, the left-right spacing between the two single-line laser sensors is not less than 30 cm, and the flat wall areas on the left and right sides of the bathroom are at least 5 cm by 10 cm, so the relative angle can be determined by the left and right lasers at this position.
In one possible implementation, the determining the Y-axis relative position of the mobile robot and the target object based on the second distance acquired by the third single-line sensor in step A2 may be performed as: taking the second distance, in the Y-axis direction, between the mobile robot and the target object acquired by the third single-line sensor as the Y-axis relative position between the mobile robot and the target object.
In a specific implementation, the second distance in the Y-axis direction between the mobile robot and the target object is acquired by the third single-line sensor shown in fig. 3, that is, the distance between the mobile robot and the toilet; this distance is then taken directly as the Y-axis relative position between the mobile robot and the target object, as shown in fig. 3.
In a possible implementation manner, the determining the relative position of the mobile robot and the X axis of the target object based on the image to be processed acquired by the image sensor in step A3 in the embodiment of the present application may be performed as: acquiring the area occupied by a preset plane of a target object in an image to be processed, and determining a first ratio of the actual area of the preset plane to the area occupied by the preset plane in the image to be processed; and determining the X-axis relative position of the mobile robot and the target object based on the first ratio and the coordinate position of the X-axis of the target object in the image to be processed.
In the image to be processed shown in fig. 4, the position along the X-axis represents the coordinate position of the target object in the X-axis direction, and the position along the Y-axis represents the distance of the image sensor from the target object along the Y-axis. Since the mobile robot is generally taller than the target object, the distance in the Y-axis direction corresponds to the vertical offset between the image sensor and the target object.
In a specific implementation, when the mobile robot is produced, the parameter information of the image sensor that acquires the image to be processed, namely the field of view of the acquired image, can be obtained. The distance between the target object and the acquisition position of the image sensor in the X-axis direction can then be obtained from the coordinate position of the target object in the acquired image and the field of view shown in fig. 4. The first ratio is obtained by dividing the actual area of the preset plane of the target object by the area that the preset plane occupies in the image to be processed; multiplying the X-axis distance between the target object and the acquisition position of the image sensor by the first ratio then gives the X-axis relative position of the mobile robot and the target object, that is, their distance in the X-axis direction.
The preset plane can be set according to actual needs, for example the top plane of the toilet water tank.
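A compact sketch of this computation follows; the function and parameter names are assumptions. Note that the sketch multiplies the image coordinate by the first ratio exactly as the text describes, although a dimensionally strict conversion from an area ratio to a length scale would use its square root.

```python
def x_axis_relative_position(actual_plane_area: float,
                             plane_area_in_image: float,
                             target_x_in_image: float) -> float:
    """Estimate the X-axis relative position of the robot and the target object."""
    # First ratio: actual area of the preset plane (e.g. the toilet water tank
    # top) divided by the area that plane occupies in the image to be processed.
    first_ratio = actual_plane_area / plane_area_in_image
    # Scale the target's X-axis coordinate in the image by the first ratio to
    # obtain the real-world X-axis offset, following the text as written.
    return target_x_in_image * first_ratio
```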
S203: acquiring a preset target relative position of the mobile robot and the target object, and determining a position difference between the current relative position and the target relative position.
The relative position of the mobile robot includes the relative angle, the X-axis relative position, and the Y-axis relative position determined in step S202, so the position difference here includes a relative angle difference, an X-axis relative position difference, and a Y-axis relative position difference.
In particular, the relative angle A_t, the X-axis relative position B_t, and the Y-axis relative position C_t are calculated according to step S202, so the current relative position of the mobile robot and the target object is determined to be (A_t, B_t, C_t). Given the preset target relative position (A_0, B_0, C_0) of the mobile robot and the target object at the preset target position, acquired by the plurality of single-line laser sensors and the image sensor, it can be determined that:
the relative angle difference is e_A = A_t - A_0;
the X-axis relative position difference is e_B = B_t - B_0;
the Y-axis relative position difference is e_C = C_t - C_0.
The position difference between the current relative position and the target relative position is therefore (e_A, e_B, e_C).
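Computed componentwise, the position difference is a three-way subtraction; a one-function sketch (names assumed):

```python
def position_difference(current: tuple, target: tuple) -> tuple:
    """(A_t, B_t, C_t) - (A_0, B_0, C_0) -> (e_A, e_B, e_C)."""
    a_t, b_t, c_t = current
    a_0, b_0, c_0 = target
    return (a_t - a_0, b_t - b_0, c_t - c_0)
```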
S204: adjusting the position of the mobile robot based on the position difference.
In one possible implementation, adjusting the position of the mobile robot based on the position difference may be performed as: determining a moving direction and a moving speed of the mobile robot based on the position difference, and adjusting the current position of the mobile robot based on the moving direction and the moving speed to obtain the target position of the mobile robot.
Determining the moving direction and the moving speed of the mobile robot based on the position difference may be performed as:
determining a movement angular velocity of the mobile robot based on the relative angle difference value;
determining the moving speed of the mobile robot in the X-axis direction based on the X-axis relative position difference value;
determining the moving speed of the mobile robot in the Y-axis direction based on the Y-axis relative position difference value;
the moving direction and the moving speed of the moving robot are determined based on the moving angular speed, the moving speed in the X-axis direction, and the moving speed in the Y-axis direction.
In a specific implementation, the movement angular velocity of the mobile robot and its moving speeds in the X-axis and Y-axis directions may be determined using a PID control algorithm. The movement angular velocity of the mobile robot can be determined according to formula (1):

V_t = K_p · e_A(t) + K_i · T · Σ_τ e_A(τ) + K_d · (e_A(t) - e_A(t-1)) / T    (1)

where V_t denotes the movement angular velocity for the current relative angle of the mobile robot; e_A denotes the relative angle difference between the current mobile robot and the target object, with the sum running over all control periods up to the current one; K_p, K_i, and K_d denote the proportional, integral, and derivative parameters respectively, which can be configured according to the moving speed of the robot in actual application; and T denotes the current period duration.

The moving speed of the mobile robot in the X-axis direction is determined according to formula (2):

V_t = K_p · e_B(t) + K_i · T · Σ_τ e_B(τ) + K_d · (e_B(t) - e_B(t-1)) / T    (2)

where V_t denotes the moving speed of the mobile robot in the X-axis direction; e_B denotes the X-axis relative position difference between the current mobile robot and the target object; and K_p, K_i, K_d, and T are as defined above.

The moving speed of the mobile robot in the Y-axis direction is determined according to formula (3):

V_t = K_p · e_C(t) + K_i · T · Σ_τ e_C(τ) + K_d · (e_C(t) - e_C(t-1)) / T    (3)

where V_t denotes the moving speed of the mobile robot in the Y-axis direction; e_C denotes the Y-axis relative position difference between the current mobile robot and the target object; and K_p, K_i, K_d, and T are as defined above.
Finally, the movement angular velocity obtained from formula (1), the X-axis moving speed obtained from formula (2), and the Y-axis moving speed obtained from formula (3) are combined by vector summation to obtain the final moving direction and moving speed of the mobile robot.
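The three PID computations share one structure, so they might be implemented as three instances of a single discrete PID channel, as in the sketch below. The class name, the gain values, and the final vector-sum step are illustrative assumptions consistent with formulas (1) to (3), not code from the patent.

```python
import math

class PIDChannel:
    """One discrete PID controller, matching the form of formulas (1)-(3)."""

    def __init__(self, k_p: float, k_i: float, k_d: float, period: float):
        self.k_p, self.k_i, self.k_d = k_p, k_i, k_d
        self.period = period        # control period duration T
        self.integral = 0.0         # accumulated error (integral term)
        self.prev_error = 0.0       # error from the previous period

    def update(self, error: float) -> float:
        self.integral += error * self.period
        derivative = (error - self.prev_error) / self.period
        self.prev_error = error
        return self.k_p * error + self.k_i * self.integral + self.k_d * derivative

# One channel per position-difference component; the gains are placeholder
# values, to be configured according to the robot's moving speed in practice.
angle_pid = PIDChannel(k_p=1.0, k_i=0.1, k_d=0.05, period=0.1)
x_pid = PIDChannel(k_p=1.0, k_i=0.1, k_d=0.05, period=0.1)
y_pid = PIDChannel(k_p=1.0, k_i=0.1, k_d=0.05, period=0.1)

def movement_command(e_a: float, e_b: float, e_c: float):
    """Return (angular velocity, moving speed, moving direction)."""
    omega = angle_pid.update(e_a)      # formula (1)
    v_x = x_pid.update(e_b)            # formula (2)
    v_y = y_pid.update(e_c)            # formula (3)
    speed = math.hypot(v_x, v_y)       # vector sum: magnitude
    direction = math.atan2(v_y, v_x)   # vector sum: heading
    return omega, speed, direction
```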
In this way, distance data and images to be processed are continuously collected by the single-line sensors and the image sensor; from these, the position difference between the relative position of the mobile robot and the target object and the target relative position is continuously calculated; the moving direction and moving speed are continuously determined from this difference; and the position of the mobile robot is adjusted accordingly. When the position difference between the relative position of the mobile robot and the target object and the target relative position is approximately 0, the adjustment stops and the positioning task ends. This improves the positioning precision and reliability of the robot.
The accuracy of robot positioning in the embodiment of the application depends on the accuracy of the single-line laser sensors, the accuracy of the image sensor, the running efficiency of the processor, and the minimum movement speed of the robot; if higher accuracy is needed, better single-line laser sensors and image sensors, a motor with a higher reduction ratio, and the like are required.
In the embodiment of the application, a real-time updating mode is adopted during positioning, so the positioning data of the robot is continuously updated and optimized, allowing it to better adapt to different scenes and requirements.
For easy understanding, a complete flow of the positioning method of the mobile robot according to the embodiment of the present application is described in detail below with reference to fig. 5.
S501: periodically and respectively acquiring distance data acquired by the plurality of single-line sensors and the image to be processed, including the target object, acquired by the image sensor;
s502: determining a current relative angle of the mobile robot and the target object based on a distance difference between the first reference distance and the second reference distance;
the first reference distance is a distance between the mobile robot and a first obstacle on the same side of the first single-wire sensor; wherein the second reference distance is a distance between the mobile robot and a second obstacle on the same side of the second single-wire sensor acquired by the second single-wire sensor.
The first single-wire sensor and the second single-wire sensor are symmetrically arranged on two sides of the mobile robot.
S503: taking the distance, in the Y-axis direction, between the mobile robot and the target object acquired by the third single-line sensor as the current Y-axis relative position of the mobile robot and the target object;
wherein the third single-line sensor is located between the first single-line sensor and the second single-line sensor.
S504: acquiring the area occupied by a preset plane of a target object in an image to be processed, and determining a first ratio of the actual area of the preset plane to the area occupied by the preset plane in the image to be processed; determining the current X-axis relative position of the mobile robot and the target object based on the first ratio and the X-axis coordinate position of the target object in the image to be processed;
s505: acquiring a preset target relative angle, a target Y-axis relative position and a target X-axis relative position of the mobile robot and a target object, and calculating to obtain a relative angle difference value between the current relative angle and the target relative angle, a Y-axis relative position difference value between the current Y-axis relative position and the target Y-axis relative position and an X-axis relative position difference value between the current X-axis relative position and the target X-axis relative position;
s506: determining the moving angular speed of the mobile robot by using a PID control algorithm based on the relative angle difference value; determining the moving speed of the mobile robot in the X-axis direction by utilizing a PID control algorithm based on the X-axis relative position difference value; determining the moving speed of the mobile robot in the Y-axis direction by utilizing a PID control algorithm based on the Y-axis relative position difference value;
S507: determining a moving direction and a moving speed of the moving robot based on the moving angular speed, the moving speed in the X-axis direction, and the moving speed in the Y-axis direction;
s508: and adjusting the current position of the mobile robot based on the moving direction and the moving speed of the mobile robot to obtain the target position of the mobile robot.
Each cycle executes the operations of steps S501-S508 until the relative angle difference, the Y-axis relative position difference, and the X-axis relative position difference calculated in step S505 all approach 0, i.e., the current relative position of the mobile robot is the same as the target relative position, and the positioning of the mobile robot is ended.
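Tying steps S501 to S508 together, the periodic positioning loop might look like the sketch below, reusing the relative_angle and movement_command sketches above. Every robot-facing call (sensor reads, image capture, motion command) is a hypothetical placeholder, not an API from the patent.

```python
THRESHOLD = 1e-3  # preset threshold on each position-difference component

def positioning_loop(robot, target_relative: tuple) -> None:
    a_0, b_0, c_0 = target_relative
    while True:
        # S501: periodically acquire sensor data
        d_first, d_second, d_third = robot.read_single_line_sensors()  # placeholder
        image = robot.capture_image()                                  # placeholder
        # S502-S504: current relative angle, Y-axis and X-axis relative positions
        a_t = relative_angle(d_first, d_second, robot.sensor_spacing)
        c_t = d_third
        b_t = robot.x_position_from_image(image)                       # placeholder
        # S505: position differences against the preset target
        e_a, e_b, e_c = a_t - a_0, b_t - b_0, c_t - c_0
        if max(abs(e_a), abs(e_b), abs(e_c)) < THRESHOLD:
            break  # current relative position matches the target; positioning ends
        # S506-S508: PID outputs drive the position adjustment
        omega, speed, direction = movement_command(e_a, e_b, e_c)
        robot.move(omega, speed, direction)                            # placeholder
```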
The application provides a positioning method of a mobile robot, which periodically acquires the current relative position of the mobile robot, so that the positioning precision of the mobile robot can be improved by shortening each period, and further, the accurate movement of the mobile robot can be realized.
Based on the foregoing, in the embodiment of the application, distance data acquired by a plurality of single-line sensors and an image to be processed, including the target object, acquired by an image sensor are acquired periodically and respectively; the current relative position of the mobile robot and the target object is determined based on the distance data and the image to be processed; a preset target relative position of the mobile robot and the target object is acquired, and the position difference between the current relative position and the target relative position is determined; the moving direction and the moving speed of the mobile robot are determined by using a PID control algorithm based on the position difference; and the current position of the mobile robot is adjusted based on the moving direction and the moving speed to obtain the target position of the mobile robot. The position of the mobile robot is adjusted in real time until the position difference between the relative position of the mobile robot and the target object in each direction and the preset target relative position is smaller than a preset threshold value, completing the positioning. Multiple sensors and a PID control algorithm are used in the positioning process, which improves the positioning accuracy and stability of the mobile robot.
Based on the same inventive concept, the embodiments of the present application also provide a mobile robot, as shown in fig. 6, including: at least one processor 601, a memory 602 communicatively coupled to the at least one processor 601, and a bus 603 connecting the various system components, including the memory 602 and the processor 601; the memory 602 stores instructions executable by the at least one processor 601, where the instructions are executed by the at least one processor 601, so that the at least one processor 601 can execute the positioning method of the mobile robot according to any one of the embodiments provided herein.
Bus 603 may be a peripheral component interconnect (PCI) bus or an extended industry standard architecture (EISA) bus, among others. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in fig. 6, but this does not mean there is only one bus or one type of bus.
The processor 601 may be a central processing unit (CPU), a network processor (NP), a graphics processing unit (GPU), or any combination of CPU, NP, and GPU. It may also be a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
The memory 602 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) and/or cache memory, and may further include Read Only Memory (ROM).
Memory 602 may also include a program/utility having a set (at least one) of program modules including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Based on the same inventive concept, the embodiment of the present application further provides a positioning device 700 of a mobile robot, as shown in fig. 7, the device includes:
an acquiring module 701, configured to acquire distance data acquired by a plurality of single line sensors and an image to be processed including a target object acquired by an image sensor, respectively;
a position determining module 702, configured to determine a current relative position of the mobile robot and the target object based on the distance data and the image to be processed;
a position difference determining module 703, configured to obtain a preset target relative position between the mobile robot and the target object, and determine a position difference between the current relative position and the target relative position;
and an adjustment module 704, configured to adjust the position of the mobile robot based on the position difference.
In one possible embodiment, the distance data includes a first distance between the mobile robot and an obstacle in the environment in which the target object is located, and a second distance between the mobile robot and the target object; the relative position includes a relative angle, an X-axis relative position, and a Y-axis relative position;
the location determining module 702 is specifically configured to:
determining a relative angle of the mobile robot and the target object based on the first distances acquired by the first single-line sensor and the second single-line sensor; the first single-line sensor and the second single-line sensor are symmetrically arranged on two sides of the mobile robot;
determining a Y-axis relative position of the mobile robot and the target object based on the second distance acquired by the third single-line sensor; the third single-line sensor is located between the first single-line sensor and the second single-line sensor;
and determining the X-axis relative position of the mobile robot and the target object based on the image to be processed.
In one possible embodiment, the first distance includes a first reference distance and a second reference distance; the first reference distance is a distance between the mobile robot and a first obstacle on the same side as the first single-line sensor; the second reference distance is a distance between the mobile robot and a second obstacle on the same side as the second single-line sensor;
The location determining module 702 is specifically configured to:
determining a distance difference between the first reference distance and the second reference distance based on the first reference distance and the second reference distance;
a relative angle of the mobile robot and the target object is determined based on a distance difference between the first reference distance and the second reference distance.
In one possible implementation, the location determining module 702 is specifically configured to:
determining the relative angle between the mobile robot and the target object based on a preset correspondence between distance difference values and relative angles, and the distance difference between the first reference distance and the second reference distance; or
determining the relative angle of the mobile robot and the target object based on the distance difference between the first reference distance and the second reference distance and a pre-stored distance between the first single-line sensor and the second single-line sensor.
In one possible implementation, the location determining module 702 is specifically configured to:
and taking the second distance, in the Y-axis direction, between the mobile robot and the target object, acquired by the third single-line sensor as the Y-axis relative position between the mobile robot and the target object.
In one possible implementation, the location determining module 702 is specifically configured to:
acquiring the area occupied by a preset plane of the target object in the image to be processed, and determining a first ratio of the actual area of the preset plane to the area occupied by the preset plane in the image to be processed;
and determining the relative position of the mobile robot and the X axis of the target object based on the first ratio and the coordinate position of the X axis of the target object in the image to be processed.
In one possible implementation, the adjusting module 704 is specifically configured to:
determining a moving direction and a moving speed of the mobile robot based on the position difference;
and adjusting the current position of the mobile robot based on the moving direction and the moving speed of the mobile robot to obtain the target position of the mobile robot.
In one possible embodiment, the position difference comprises a relative angle difference, an X-axis relative position difference, and a Y-axis relative position difference; the adjusting module 704 is specifically configured to:
determining a movement angular velocity of the mobile robot based on the relative angle difference;
determining a moving speed of the mobile robot in the X-axis direction based on the X-axis relative position difference value;
determining a moving speed of the mobile robot in the Y-axis direction based on the Y-axis relative position difference value;
and determining the moving direction and the moving speed of the mobile robot based on the movement angular velocity, the moving speed in the X-axis direction, and the moving speed in the Y-axis direction.
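One plausible realization of this per-axis mapping is a proportional controller with saturation; the gains, limits, and names below are illustrative assumptions rather than values from this application.

from dataclasses import dataclass

@dataclass
class PositionDifference:
    angle: float  # relative angle difference, radians
    x: float      # X-axis relative position difference, meters
    y: float      # Y-axis relative position difference, meters

K_ANGULAR = 1.5  # assumed proportional gain for the angular component
K_LINEAR = 0.8   # assumed proportional gain for the linear components
MAX_W = 0.8      # angular velocity limit, rad/s
MAX_V = 0.3      # linear speed limit, m/s

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

def velocity_command(diff: PositionDifference) -> tuple[float, float, float]:
    # Each component of the position difference drives one velocity component;
    # the sign encodes the moving direction, the magnitude the moving speed.
    w = clamp(K_ANGULAR * diff.angle, MAX_W)  # movement angular velocity
    vx = clamp(K_LINEAR * diff.x, MAX_V)      # moving speed in the X-axis direction
    vy = clamp(K_LINEAR * diff.y, MAX_V)      # moving speed in the Y-axis direction
    return w, vx, vy

Applying this command repeatedly until every component of the position difference falls below a tolerance drives the robot to the preset target relative position.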
In an exemplary embodiment, a computer-readable storage medium is also provided, such as a memory comprising instructions executable by a processor to perform the above-described mobile robot positioning method. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, comprising a computer program which, when executed by a processor, implements any one of the methods of positioning a mobile robot as provided herein.
In an exemplary embodiment, aspects of the mobile robot positioning method provided herein may also be implemented in the form of a program product comprising program code; when the program product runs on a computer device, the program code causes the computer device to carry out the steps of the mobile robot positioning method according to the various exemplary embodiments of the present application described above.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product of the mobile robot positioning method of embodiments of the present application may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on an electronic device. However, the program product of the present application is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device or server. In the case of a remote electronic device, the remote electronic device may be connected to the consumer electronic device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external electronic device (for example, through the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the elements described above may be embodied in one element in accordance with embodiments of the present application. Conversely, the features and functions of one unit described above may be further divided into a plurality of units to be embodied.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
The foregoing detailed description of the embodiments is merely illustrative of the general principles of the present application and should not be taken as limiting the scope of the application in any way. Any other embodiments that a person skilled in the art derives from the present application without inventive effort fall within the scope of the present application.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (7)

1. A method of positioning a mobile robot, the method comprising:
respectively acquiring distance data acquired by a plurality of single-line sensors and an image to be processed including a target object acquired by an image sensor;
determining the current relative position of the mobile robot and the target object based on the distance data and the image to be processed; the distance data comprises a first distance between the mobile robot and an obstacle in the environment where the target object is located and a second distance between the mobile robot and the target object; the first distance includes a first reference distance and a second reference distance; the first reference distance is a distance between the mobile robot and a first obstacle on the same side as the first single-line sensor; the second reference distance is a distance between the mobile robot and a second obstacle on the same side as the second single-line sensor; the first single-line sensor and the second single-line sensor are symmetrically arranged on two sides of the mobile robot; the relative position includes a relative angle, an X-axis relative position, and a Y-axis relative position; the determining, based on the distance data and the image to be processed, the current relative position of the mobile robot and the target object includes: determining a distance difference between the first reference distance and the second reference distance; determining the relative angle between the mobile robot and the target object based on a preset correspondence between distance differences and relative angles and on the distance difference between the first reference distance and the second reference distance, or determining the relative angle of the mobile robot and the target object based on the distance difference between the first reference distance and the second reference distance and a pre-stored distance between the first single-line sensor and the second single-line sensor; determining a Y-axis relative position of the mobile robot and the target object based on the second distance acquired by a third single-line sensor, the third single-line sensor being positioned between the first single-line sensor and the second single-line sensor; and determining the X-axis relative position of the mobile robot and the target object based on the image to be processed;
acquiring a preset target relative position of the mobile robot and the target object, and determining a position difference between the current relative position and the target relative position;
and adjusting the position of the mobile robot based on the position difference.
2. The method of claim 1, wherein the determining the Y-axis relative position of the mobile robot and the target object based on the second distance acquired by the third single-line sensor comprises:
and taking the second distance between the mobile robot and the target object in the Y-axis direction, as acquired by the third single-line sensor, as the Y-axis relative position of the mobile robot and the target object.
3. The method of claim 1, wherein the determining an X-axis relative position of the mobile robot and the target object based on the image to be processed acquired by the image sensor comprises:
acquiring the area occupied by a preset plane of the target object in the image to be processed, and determining a first ratio of the actual area of the preset plane to the area occupied by the preset plane in the image to be processed;
and determining the X-axis relative position of the mobile robot and the target object based on the first ratio and the X-axis coordinate position of the target object in the image to be processed.
4. The method of claim 1, wherein the adjusting the position of the mobile robot based on the position difference comprises:
determining a moving direction and a moving speed of the mobile robot based on the position difference;
and adjusting the current position of the mobile robot based on the moving direction and the moving speed of the mobile robot to obtain the target position of the mobile robot.
5. The method of claim 4, wherein the position difference comprises a relative angle difference, an X-axis relative position difference, and a Y-axis relative position difference; the determining the moving direction and the moving speed of the mobile robot based on the position difference comprises:
determining a movement angular velocity of the mobile robot based on the relative angle difference;
determining a moving speed of the mobile robot in the X-axis direction based on the X-axis relative position difference value;
determining a moving speed of the mobile robot in the Y-axis direction based on the Y-axis relative position difference value;
and determining the moving direction and the moving speed of the mobile robot based on the movement angular velocity, the moving speed in the X-axis direction, and the moving speed in the Y-axis direction.
6. A mobile robot, the mobile robot comprising:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of positioning a mobile robot according to any one of claims 1-5.
7. A computer storage medium, characterized in that the computer storage medium stores a computer program for causing a computer to execute the positioning method of a mobile robot according to any one of claims 1-5.
CN202310465399.XA 2023-04-27 2023-04-27 Mobile robot positioning method, mobile robot and medium Active CN116185046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310465399.XA CN116185046B (en) 2023-04-27 2023-04-27 Mobile robot positioning method, mobile robot and medium

Publications (2)

Publication Number Publication Date
CN116185046A CN116185046A (en) 2023-05-30
CN116185046B CN116185046B (en) 2023-06-30

Family

ID=86434843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310465399.XA Active CN116185046B (en) 2023-04-27 2023-04-27 Mobile robot positioning method, mobile robot and medium

Country Status (1)

Country Link
CN (1) CN116185046B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108388244A (en) * 2018-01-16 2018-08-10 上海交通大学 Mobile-robot system, parking scheme based on artificial landmark and storage medium
CN109798896B (en) * 2019-01-21 2023-01-03 东南大学 Indoor robot positioning and mapping method and device
CN113048978B (en) * 2021-02-01 2023-10-20 苏州澜途科技有限公司 Mobile robot repositioning method and mobile robot
CN113907645A (en) * 2021-09-23 2022-01-11 追觅创新科技(苏州)有限公司 Mobile robot positioning method and device, storage medium and electronic device
CN114353807B (en) * 2022-03-21 2022-08-12 沈阳吕尚科技有限公司 Robot positioning method and positioning device
CN114519739A (en) * 2022-04-21 2022-05-20 深圳史河机器人科技有限公司 Direction positioning method and device based on recognition device and storage medium
CN115014338A (en) * 2022-05-31 2022-09-06 南京理工大学 Mobile robot positioning system and method based on two-dimensional code vision and laser SLAM

Also Published As

Publication number Publication date
CN116185046A (en) 2023-05-30

Similar Documents

Publication Publication Date Title
US10852139B2 (en) Positioning method, positioning device, and robot
CN110673115B (en) Combined calibration method, device, equipment and medium for radar and integrated navigation system
US9386209B2 (en) Method and apparatus for estimating position
US10006772B2 (en) Map production method, mobile robot, and map production system
CN109917788B (en) Control method and device for robot to walk along wall
CN112539749B (en) Robot navigation method, robot, terminal device, and storage medium
CN111182174B (en) Method and device for supplementing light for sweeping robot
CN110942474B (en) Robot target tracking method, device and storage medium
CN107782304A (en) The localization method and device of mobile robot, mobile robot and storage medium
CN111857114A (en) Robot formation moving method, system, equipment and storage medium
CN110850859A (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
WO2023142353A1 (en) Pose prediction method and apparatus
CN116185046B (en) Mobile robot positioning method, mobile robot and medium
CN110653810B (en) Robot distance measuring method and device and terminal equipment
CN113534805B (en) Robot recharging control method, device and storage medium
CN109769206A (en) A kind of indoor positioning fusion method, device, storage medium and terminal device
CN115307641A (en) Robot positioning method, device, robot and storage medium
CN114061573A (en) Ground unmanned vehicle formation positioning device and method
Dien et al. Building Environmental Awareness System for Mobile Robot Operating in Indoor Environment on ROS Platform
CN114115263A (en) Automatic mapping method and device for AGV, mobile robot and medium
CN112562671A (en) Voice control method and device for service robot
CN113110426A (en) Edge detection method, edge detection device, robot and storage medium
KR101339899B1 (en) method for robot self-localization based on smart phone platform
CN113268062B (en) Human body curved surface modeling method, modeling device and modeling system
CN115345939A (en) Camera external parameter calibration method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant