CN112276933A - Control method of mobile robot and mobile robot


Info

Publication number
CN112276933A
CN112276933A (application number CN201910673188.9A)
Authority
CN
China
Prior art keywords
mobile robot
environment
sub
data
area
Prior art date
Legal status
Pending
Application number
CN201910673188.9A
Other languages
Chinese (zh)
Inventor
高苗
郑卓斌
王立磊
Current Assignee
Guangdong Bona Robot Co ltd
Original Assignee
Guangdong Bona Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Bona Robot Co., Ltd.
Priority to CN201910673188.9A
Publication of CN112276933A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 Manipulators not otherwise provided for
    • B25J 11/008 Manipulators for service tasks
    • B25J 11/0085 Cleaning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00 Controls for manipulators
    • B25J 13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a control method of a mobile robot and a mobile robot. The method comprises the following steps: S1: dividing the surrounding environment of the mobile robot into at least two sub-areas, and acquiring the surrounding environment information of the current position of the mobile robot through a combined sensor; S2: generating environment identification parameters, wherein the environment identification parameters comprise environment state data and azimuth data; S3: identifying passable areas around the mobile robot according to the environment identification parameters; S4: controlling the mobile robot to enter any passable area; S5: after the environment state data of the passable area has been traversed, returning to step S1. The embodiment of the invention improves the working efficiency of the mobile robot.

Description

Control method of mobile robot and mobile robot
Technical Field
The embodiment of the invention relates to the technical field of robots, in particular to a control method of a mobile robot and the mobile robot.
Background
With the improvement of living standards, mobile robots such as sweeping robots have become popular for their advantages of saving time, convenience, intelligence and simple operation.
In practical applications, a mobile robot performing a cleaning task requires path planning, that is, selecting a feasible path that avoids collisions with obstacles in the working area. One commonly adopted method controls the mobile robot in a random manner: the robot moves forward, rotates by a random angle after touching an obstacle, moves straight again, and repeats this process. In another common method, the sensors of the mobile robot perceive the surrounding environment in real time, feed information back to the robot whenever an obstacle or a travelable area is detected, and the next movement is then planned from that information. However, a mobile robot controlled in either of these ways works inefficiently, mainly because the path can only be planned from the map signal detected in real time; even when a global map has been created, the mobile robot cannot adapt to changed environment information, which leads it onto wrong routes.
Therefore, improving work efficiency has become a pressing problem, and the key is enabling a running mobile robot to construct a map of its surroundings and execute its work task more quickly by identifying the map information.
Disclosure of Invention
In order to solve the problems in the prior art, embodiments of the present invention provide a mobile robot and a control method thereof.
In a first aspect, an embodiment of the present invention provides a method for controlling a mobile robot, where the method includes:
S1: dividing the surrounding environment of the mobile robot into at least two sub-areas, and acquiring the surrounding environment information of the current position of the mobile robot through a combined sensor;
S2: generating environment identification parameters, wherein the environment identification parameters comprise environment state data and azimuth data;
S3: identifying a passable area around the mobile robot according to the environment identification parameters;
S4: controlling the mobile robot to enter any passable area;
S5: after the environment state data of the passable area has been traversed, the process returns to step S1.
In a second aspect, an embodiment of the present invention further provides a mobile robot. The mobile robot includes a combined sensor configured to collect the surrounding environment information of the current position of the mobile robot, and a controller configured to execute the control method of the mobile robot described above.
The technical scheme disclosed by the embodiment of the invention has the following beneficial effects:
the mobile robot collects the surrounding environment information at its current position through the combined sensor and generates the environment identification parameters; during movement, the mobile robot detects obstacle areas and passable areas in the environment according to the environment identification parameters so that it can enter a passable area, and the information in the environment identification parameters remains valid until the combined sensor starts its next collection, thereby improving the working efficiency of the mobile robot.
Drawings
Fig. 1 is a schematic flowchart of a control method for a mobile robot according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a rectangular coordinate system established with the starting position of the mobile robot as the origin and moving according to an arcuate cleaning strategy according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of determining azimuth data of at least two sub-areas according to an embodiment of the present invention;
FIG. 4 is a flow chart illustrating a process for generating environment state data according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of marking azimuth sub-regions according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a mobile robot according to a second embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of and not restrictive on the broad invention. It should be further noted that, for convenience of description, only some structures, not all structures, relating to the embodiments of the present invention are shown in the drawings.
An embodiment of the invention provides a control method of a mobile robot, aiming at the pressing problem of improving working efficiency, the key to which is enabling a running mobile robot to construct a map of its surroundings and execute its work task more quickly by identifying the map information.
According to the embodiment of the invention, the surrounding environment of the mobile robot is divided into at least two sub-areas, the surrounding environment information at the current position of the mobile robot is collected through the combined sensor, environment identification parameters are generated, the passable areas around the mobile robot are identified according to the environment identification parameters, the mobile robot is controlled to enter any passable area, and after the environment state data of the passable area has been traversed, the method returns to dividing the surrounding environment into at least two sub-areas and collecting the surrounding environment information at the current position. In this way, the mobile robot collects the surrounding environment information at its current position through the combined sensor and generates the environment identification parameters; during movement it detects obstacle areas and passable areas in the environment according to these parameters and can thus enter a passable area, and the information in the environment identification parameters remains valid until the combined sensor starts its next collection, which improves the working efficiency of the mobile robot.
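To make this loop concrete, the following is a minimal Python sketch of steps S1-S5 on a toy grid map; the grid representation, the 2-bit state values and all function names are illustrative assumptions rather than identifiers from this disclosure.

```python
# Toy occupancy map: {(x, y): state}. The four 2-bit states follow the
# first/second/third/fourth values defined later in this embodiment.
UNKNOWN, TO_WORK, WORKED, OBSTACLE = 0b00, 0b01, 0b10, 0b11

# Azimuth data for the 8 sub-areas around the current position
# (X axis points forward, Y axis to the right, as in fig. 2).
NEIGHBORS = {
    "front": (1, 0), "right front": (1, 1), "right": (0, 1), "right rear": (-1, 1),
    "rear": (-1, 0), "left rear": (-1, -1), "left": (0, -1), "left front": (1, -1),
}

def sense(grid, pos):
    """S1 + S2: read the 8 adjacent sub-areas and build the environment state data."""
    x, y = pos
    return {a: grid.get((x + dx, y + dy), OBSTACLE)   # off-map counts as obstacle
            for a, (dx, dy) in NEIGHBORS.items()}

def control_loop(grid, start):
    pos = start
    while True:
        grid[pos] = WORKED                                           # clean here
        states = sense(grid, pos)                                    # S1, S2
        passable = [a for a, s in states.items() if s == TO_WORK]    # S3
        if not passable:
            return pos                       # no passable area left to enter
        dx, dy = NEIGHBORS[passable[0]]      # S4: enter any passable area
        pos = (pos[0] + dx, pos[1] + dy)     # S5: loop back to S1 at the new spot

# Example: a three-cell corridor ahead of the robot.
grid = {(0, 0): TO_WORK, (1, 0): TO_WORK, (2, 0): TO_WORK}
print(control_loop(grid, (0, 0)))  # -> (2, 0)
```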
A control method of a mobile robot and a mobile robot according to an embodiment of the present invention will be described in detail below with reference to the drawings.
Example one
Fig. 1 is a flowchart illustrating a control method for a mobile robot according to an embodiment of the present invention, where the method is applicable to a scenario where the mobile robot performs a cleaning task, and the method can be performed by the mobile robot. The control method of the mobile robot specifically comprises the following steps:
s101, dividing the surrounding environment of the mobile robot into at least two sub-areas, and collecting the surrounding environment information of the current position of the mobile robot through a combined sensor.
Wherein the mobile robot may be, but is not limited to: a sweeper, a scrubber or a vacuum cleaner.
In this embodiment, the combined sensor includes an ultrasonic sensor, a camera, an infrared sensor, or a laser radar.
Wherein, large obstacles such as walls and the like in the surrounding environment of the mobile robot can be detected through the ultrasonic sensor; small and medium-sized obstacles in the surrounding environment of the mobile robot can be detected through the laser radar; the infrared sensor can detect the close-range obstacles in the surrounding environment of the mobile robot; the camera can collect images of the surrounding environment of the mobile robot, and therefore the attribute information of the obstacles in the surrounding environment is determined by analyzing the collected images. In the present embodiment, the attribute information of the obstacle may be an obstacle category. Such as socks, shoes, books, etc., which are not specifically limited herein.
Alternatively, before executing S101, the present embodiment may first establish a rectangular coordinate system with the starting position of the mobile robot as the origin, where the direction in which the mobile robot starts to move is a positive X-axis direction, and the right side perpendicular to the moving direction of the mobile robot is a positive Y-axis direction, and then control the mobile robot to move according to a preset arcuate cleaning strategy, as shown in fig. 2.
In the arcuate cleaning strategy, the mobile robot first moves straight in a given direction until it touches an obstacle or the boundary of the working area, or has moved a preset distance; it then moves straight for a certain distance perpendicular to that direction, moves in the opposite direction, and repeats this process until the working area is covered.
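A compact sketch of such an arcuate pass follows, assuming an idealized obstacle-free rectangular area discretized into cells; a real run would interleave the obstacle and boundary checks described above.

```python
def boustrophedon_path(rows, cols):
    """Visit every cell of a rows x cols area in an arcuate (S-shaped) pattern:
    straight run, one-row perpendicular shift, then a run in the opposite direction."""
    path = []
    for r in range(rows):
        cells = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cells)
    return path

print(boustrophedon_path(3, 4))
# [(0,0) .. (0,3), (1,3) .. (1,0), (2,0) .. (2,3)]
```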
Before the mobile robot performs a cleaning task in a working area, the environment state of that area is unclear. For this reason, the environment state of the working area may be initialized in advance to the unknown state, so that detection of the surrounding environment information starts from the starting position and the movement exit of the mobile robot is determined according to the detection results.
In specific implementation, the surrounding environment of the mobile robot can be divided into at least two sub-areas manually; or divided into at least two sub-areas based on the arrangement of the combined sensor; or divided into at least two sub-areas by other means, which is not specifically limited here.
It should be noted that, in order to ensure that the mobile robot can move from one sub-area to another, the present embodiment may divide the surrounding environment according to the body size of the mobile robot. That is, each of the at least two sub-areas is larger in size than the body of the mobile robot.
Furthermore, the surrounding environment information at the current position of the mobile robot can be collected through the combined sensor.
In this embodiment, the arrangement of the combined sensor can be set according to actual needs. For example, an ultrasonic sensor, an infrared sensor or a laser radar is arranged at each of the four positions of the front, rear, left and right of the body of the mobile robot, and a camera is arranged at the top of the body; alternatively, an ultrasonic sensor, an infrared sensor or a laser radar may be arranged at each of the eight positions of the front, rear, left, right, left front, right front, left rear and right rear of the body, and a camera at the top of the body.
It should be noted that, in this embodiment, the execution sequence of dividing the surrounding environment of the mobile robot into at least two sub-areas and acquiring the surrounding environment information at the current position of the mobile robot through the combined sensor may be that the surrounding environment of the mobile robot is divided into at least two sub-areas first, and then the surrounding environment information at the current position of the mobile robot is acquired through the combined sensor; or, firstly, acquiring the surrounding environment information of the current position of the mobile robot through the combined sensor, and then dividing the surrounding environment of the mobile robot into at least two sub-areas; or, the dividing of the surrounding environment of the mobile robot into at least two sub-areas and the collecting of the surrounding environment information at the current position of the mobile robot by the combined sensor are performed simultaneously, which is not specifically limited herein.
Further, in the embodiment, when the combined sensor is used to collect the ambient environment information at the current position of the mobile robot, the current position of the mobile robot may be determined first, and then the combined sensor is used to collect the ambient environment information at the current position.
Alternatively, the current position of the mobile robot may be determined in the following manner.
As an alternative implementation, the current position of the mobile robot may be determined according to the following equation (1):
x = v·t·cos θ, y = v·t·sin θ (1)
wherein x and y represent the coordinates of the current position h of the mobile robot, v represents the moving speed of the mobile robot, t represents the moving time of the mobile robot from the starting position to point h, and θ represents the moving angle of the mobile robot relative to the X axis at point h, which can be calculated from the recorded angle of each of the mobile robot's moves.
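For illustration, a minimal dead-reckoning sketch of equation (1) follows, assuming the robot moves at a constant speed v and records each straight segment as a (moving time, moving angle) pair; the log format and the speed value are assumptions.

```python
import math

def current_position(segments, v=0.3):
    """Accumulate x = v*t*cos(theta), y = v*t*sin(theta) over straight segments.
    segments: iterable of (t, theta) with t in seconds, theta in radians."""
    x = y = 0.0
    for t, theta in segments:
        x += v * t * math.cos(theta)
        y += v * t * math.sin(theta)
    return x, y

# Two 10 s runs at 0.3 m/s: along +X, then along +Y.
print(current_position([(10.0, 0.0), (10.0, math.pi / 2)]))  # ~(3.0, 3.0)
```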
As another alternative implementation manner, the present embodiment may also read coordinates recorded by an odometer and a gyroscope in the mobile robot, and determine the current position of the mobile robot.
Of course, in addition to the above two ways, the present embodiment may also determine the current position of the mobile robot through other ways, which is not described in detail in this embodiment.
S102, generating environment identification parameters, wherein the environment identification parameters comprise environment state data and azimuth data.
Because the combined sensor is provided with a Micro Control Unit (MCU), when the surrounding environment information of the current position of the mobile robot is collected, the collected surrounding environment information can be analyzed and processed by the MCU in the combined sensor to generate the environment identification parameters.
In specific implementation, the azimuth data of the at least two sub-areas can be determined according to the relative positions of the at least two sub-areas with respect to the current position of the mobile robot.
For example, as shown in fig. 3, if the surrounding environment of the mobile robot is divided into 8 sub-areas, the relative positions of these sub-areas with respect to the current position of the mobile robot are front, right front, right, right rear, rear, left rear, left and left front, and the azimuth data of the at least two sub-areas can be determined accordingly from these relative positions.
In this embodiment, the collected ambient environment information may be mapped to the at least two sub-regions to generate the environment state data.
Wherein the environment state data may comprise at least one of the following states: the unknown state, the to-be-worked state, the working state and the obstacle state.
In this embodiment, the correspondence between the collected surrounding environment information and the at least two sub-areas may be determined first, and the surrounding environment information is then mapped to the divided sub-areas to generate the environment state data.
For example, suppose the collected surrounding environment information indicates an unknown state, a to-be-worked state and an obstacle state, where the unknown state corresponds to the left sub-area, the to-be-worked state corresponds to the front sub-area, and the obstacle state corresponds to the right sub-area; then the unknown state is mapped to the left sub-area, the to-be-worked state to the front sub-area, and the obstacle state to the right sub-area, generating the environment state data of the current position of the mobile robot.
The following describes in detail an implementation process of generating environment state data according to an embodiment of the present invention with reference to fig. 4.
Fig. 4 is a schematic flowchart of generating environment state data according to an embodiment of the present invention. As shown in fig. 4, generating the environment state data according to the embodiment of the present invention specifically includes the following steps:
s401, carrying out data analysis on the collected surrounding environment data, determining whether an obstacle exists, if so, executing S402, otherwise, executing S405.
The obstacle may be a static obstacle, such as a wall, a corner, furniture, etc. of a work area, or a dynamic obstacle, such as a pet, a person, etc.
As an alternative implementation manner, the present embodiment may detect whether an obstacle exists in the surrounding environment at the current position of the mobile robot through an ultrasonic sensor, an infrared sensor, or a laser radar.
S402, if an obstacle exists, determining the distance between the current position of the mobile robot and the obstacle, and determining whether the distance is smaller than a first distance threshold; if so, executing S403, otherwise, executing S404.
The first distance threshold is the distance the mobile robot covers in a single move, and may be set adaptively according to the performance of the mobile robot, for example 0.5 centimeters (cm) or 1 cm; it is not specifically limited here.
Optionally, when it is determined that an obstacle exists in the surrounding environment at the current position of the mobile robot, the distance between the current position and the obstacle is further detected by the ultrasonic sensor, the infrared sensor or the laser radar, and the difference between this distance and the first distance threshold is computed. If the difference is greater than or equal to zero, the obstacle does not influence the next movement of the mobile robot; if the difference is smaller than zero, the mobile robot would touch the obstacle when moving in the current moving direction, which may adversely affect its normal movement, for example by jamming the mobile robot.
For example, if the first distance threshold is 1cm, when the distance between the mobile robot and the obstacle is determined to be 0.5cm, it indicates that the obstacle is close to the mobile robot, and the mobile robot is likely to be trapped.
And S403, if the distance is smaller than the first distance threshold, determining that the surrounding environment information of the current position is in an obstacle state.
And S404, if the distance is greater than or equal to the first distance threshold, determining that the surrounding environment information of the current position is in the to-be-worked state.
When the distance between the mobile robot and the obstacle is determined to be greater than or equal to the first distance threshold, the obstacle does not adversely affect the next movement of the mobile robot, and the surrounding environment information of the current position can be determined to be in the to-be-worked state, so that the mobile robot can move there to carry out the cleaning task.
For example, if the distance threshold is 1cm, when the distance between the mobile robot and the obstacle is 2cm, it indicates that the mobile robot can continue to move in the current moving direction to perform the cleaning task.
S405, if no obstacle exists, determining whether the surrounding environment information at the current position is in the working state; if not, executing S406, otherwise, executing S407.
In actual use, in order to accurately judge whether the at least two sub-areas in the working area have been cleaned, this embodiment marks each sub-area after it has been cleaned so as to record it as being in the working state. The mobile robot can then query which sub-areas of the working area are in the working state, which improves cleaning efficiency.
That is, when it is determined that no obstacle exists in the collected surrounding environment data, this embodiment may determine whether the surrounding environment information is in the working state by querying the marked working states.
And S406, if not, determining that the surrounding environment information at the current position is in the to-be-worked state.
And when the surrounding environment information at the current position of the mobile robot is determined to be in the to-be-worked state, the mobile robot is controlled to move to the sub-area whose surrounding environment information is in the to-be-worked state, and the cleaning task is executed.
And S407, if so, controlling the mobile robot to move once in any direction, and collecting the surrounding environment data at the new position to determine the surrounding environment information there.
When the surrounding environment information at the current position of the mobile robot is determined to be in the working state, the mobile robot has finished cleaning the at least two sub-areas; the mobile robot is then controlled to move in any direction, and the surrounding environment information at the new position is determined. The way of determining the surrounding environment information after the mobile robot moves in any direction is the same as described above and is not repeated here.
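The S401-S407 decision flow can be condensed into a small sketch; the argument layout (a detected obstacle distance or None, plus the queried worked mark) is an assumed stand-in for the real sensor and marking interfaces.

```python
UNKNOWN, TO_WORK, WORKED, OBSTACLE = 0b00, 0b01, 0b10, 0b11

def classify(obstacle_distance_cm, already_worked, first_threshold_cm=1.0):
    """Return the environment state of the surroundings at the current position.

    obstacle_distance_cm: distance to a detected obstacle, or None if none (S401).
    already_worked: whether the queried mark says this area was cleaned (S405).
    """
    if obstacle_distance_cm is not None:                  # S401: obstacle present
        if obstacle_distance_cm < first_threshold_cm:     # S402: too close
            return OBSTACLE                               # S403
        return TO_WORK                                    # S404: safe to proceed
    if not already_worked:                                # S405: no obstacle
        return TO_WORK                                    # S406
    return WORKED  # S407: caller moves once in any direction and re-senses

assert classify(0.5, False) == OBSTACLE   # 0.5 cm < 1 cm threshold (cf. example above)
assert classify(2.0, False) == TO_WORK    # 2 cm >= 1 cm, keep moving
assert classify(None, True) == WORKED
```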
In this embodiment, the environment state data is generated as follows: if the environment state of a sub-area is the unknown state, it is marked with a first value; if it is the to-be-worked state, it is marked with a second value; if it is the working state, it is marked with a third value; and if it is the obstacle state, it is marked with a fourth value. A sub-area whose environment state is the to-be-worked state is set as a passable area. The movement exit of the mobile robot is thus determined according to the environment state data.
Wherein the first value may be represented as 00, the second value may be represented as 01, the third value may be represented as 10 and the fourth value may be represented as 11.
Further, in this embodiment, the environment state data is formed into a first data table, where the first data table includes a front environment data bit, a rear environment data bit, a left environment data bit, a right environment data bit, a left front environment data bit, a right front environment data bit, a left rear environment data bit and a right rear environment data bit, used for storing the environment data of the sub-areas adjacent to the front, rear, left, right, left front, right front, left rear and right rear of the current position of the mobile robot, respectively;
the front environment data bit, the rear environment data bit, the left environment data bit, the right environment data bit, the left front environment data bit, the right front environment data bit, the left rear environment data bit or the right rear environment data bit takes the value of the first value, the second value, the third value or the fourth value.
In this embodiment, the first data table is a binary table.
For example, the first data table may be represented as {(x, y), A[n]}, where (x, y) represents the coordinates of the current position of the mobile robot and A[n] represents the environment state data of the at least two sub-areas.
For example, if the current position of the mobile robot is (x1, y1) and the surrounding environment there is divided into 8 sub-areas located to the front, right front, right, right rear, rear, left rear, left and left front of the current position, with the azimuth sub-areas marked 1, 2, 3, 4, 5, 6, 7, 8 as shown in fig. 5, and the environment states of the 8 sub-areas are respectively the unknown state, the to-be-worked state, the to-be-worked state, the to-be-worked state, the working state, the working state, the obstacle state and the working state, then the first data table is {(x1, y1), 00, 01, 01, 01, 10, 10, 11, 10}.
It should be noted that, in the embodiment of the present invention, the first data table may also be a hexadecimal table or a decimal table, which is not specifically limited here.
In this embodiment, the environment identification parameters of the mobile robot are stored in the first data table, so the data size is small, the hardware requirements on the mobile robot are low, and the processing speed is increased.
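As a sketch of the worked example above, the eight 2-bit states of record {(x1, y1), 00 01 01 01 10 10 11 10} can be packed into a single 16-bit word; the packing order (sub-area 1 in the high bits) and the helper names are assumptions.

```python
UNKNOWN, TO_WORK, WORKED, OBSTACLE = 0b00, 0b01, 0b10, 0b11

def pack_states(states):
    """Pack A[n]: earlier sub-areas land in the higher bits of one word."""
    word = 0
    for s in states:
        word = (word << 2) | s
    return word

def unpack_states(word, n=8):
    return [(word >> 2 * (n - 1 - i)) & 0b11 for i in range(n)]

states = [UNKNOWN, TO_WORK, TO_WORK, TO_WORK, WORKED, WORKED, OBSTACLE, WORKED]
record = ((1, 1), pack_states(states))   # the first data table entry {(x1, y1), A[8]}
assert unpack_states(record[1]) == states
print(format(record[1], "016b"))         # 0001010110101110
```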
S103, identifying passable areas around the mobile robot according to the environment identification parameters.
In this embodiment, the sub-areas whose environment state data in the environment identification parameters is the to-be-worked state are identified as passable areas, and the direction of each passable area is determined according to the azimuth data in the environment identification parameters.
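A one-function sketch of this identification step, assuming the environment identification parameters are held as a mapping from azimuth to state value:

```python
TO_WORK, WORKED, OBSTACLE = 0b01, 0b10, 0b11

def passable_areas(env_params):
    """S103: sub-areas whose state data is the to-be-worked value are passable;
    the returned keys are their directions from the azimuth data."""
    return [azimuth for azimuth, state in env_params.items() if state == TO_WORK]

print(passable_areas({"front": TO_WORK, "left": OBSTACLE, "rear": WORKED}))  # ['front']
```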
And S104, controlling the mobile robot to enter any passable area.
Optionally, after the passable areas around the mobile robot are determined, the mobile robot may be controlled, according to the azimuth data of a passable area, to enter it and perform the cleaning task. The mobile robot also performs the cleaning task while moving from the current position to the passable area.
When the passable areas around the mobile robot are identified according to the environment identification parameters, several sub-areas may be passable. In this case, to facilitate the cleaning task, this embodiment may select, from the passable areas, the one closest to the current position of the mobile robot as a target area, and control the mobile robot to move from the current position to the target area to perform the cleaning task.
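The nearest-target rule can be sketched as follows, assuming each passable area is reduced to the coordinates of its center; the tuple format is an assumption.

```python
import math

def nearest_passable(current, centers):
    """S104: among several passable sub-areas, pick the one whose center is
    closest to the current position (Euclidean distance)."""
    return min(centers, key=lambda p: math.hypot(p[0] - current[0], p[1] - current[1]))

print(nearest_passable((0.0, 0.0), [(2.0, 2.0), (1.0, 0.0), (-3.0, 1.0)]))  # (1.0, 0.0)
```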
And S105, after the environment state data of the passable area has been traversed, returning to step S101.
In this embodiment, after the mobile robot enters the passable area, the combined sensor collects the ambient environment information in the passable area, and determines the moving exit of the mobile robot according to the collected ambient environment information, so as to control the mobile robot to clean the passable area and move from the determined exit to the next passable area.
Before the mobile robot is controlled to move from the determined exit to the next passable area, the method returns to step S101 to determine the surrounding environment information of the mobile robot at that position, and the cleaning task is executed according to the identified passable areas until the cleaning task has been completed for the whole working area.
According to the control method of the mobile robot provided by the embodiment of the invention, the mobile robot collects the surrounding environment information at its current position through the combined sensor and generates the environment identification parameters; during movement, the mobile robot detects obstacle areas and passable areas in the environment according to the environment identification parameters so that it can enter a passable area, and the information in the environment identification parameters remains valid until the combined sensor starts its next collection, thereby improving the working efficiency of the mobile robot.
Example two
Fig. 6 is a schematic structural diagram of a mobile robot according to a second embodiment of the present invention. As shown in fig. 6, the mobile robot 60 includes a combined sensor 610 for collecting the surrounding environment information at the current position of the mobile robot, and a controller 620 configured to perform the control method of the mobile robot according to any one of the embodiments of the present invention. The control method of the mobile robot comprises the following steps:
S1: dividing the surrounding environment of the mobile robot into at least two sub-areas, and acquiring the surrounding environment information of the current position of the mobile robot through a combined sensor;
S2: generating environment identification parameters, wherein the environment identification parameters comprise environment state data and azimuth data;
S3: identifying passable areas around the mobile robot according to the environment identification parameters;
S4: controlling the mobile robot to enter any passable area;
S5: after the environment state data of the passable area has been traversed, the process returns to step S1.
In this embodiment, the mobile robot may be, but is not limited to: a sweeper, a scrubber or a vacuum cleaner.
It should be noted that the foregoing explanation of the embodiment of the control method of the mobile robot is also applicable to the mobile robot in this embodiment, and the implementation principle is similar, and is not described herein again.
According to the mobile robot provided by the embodiment of the invention, the mobile robot collects the surrounding environment information at its current position through the combined sensor and generates the environment identification parameters; during movement, the mobile robot detects obstacle areas and passable areas in the environment according to the environment identification parameters so that it can enter a passable area, and the information in the environment identification parameters remains valid until the combined sensor starts its next collection, thereby improving the working efficiency of the mobile robot.
From the above description of the embodiments, it is obvious for those skilled in the art that the embodiments of the present invention can be implemented by software and necessary general hardware, and certainly can be implemented by hardware, but the former is a better implementation in many cases. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk, or an optical disk of a computer, and includes several instructions for enabling a mobile robot to perform the methods according to the embodiments of the present invention.
It should be noted that, in the foregoing embodiment, each included unit and each included module are only divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
It should be noted that the foregoing is only a preferred embodiment of the present invention and the technical principles applied. Those skilled in the art will appreciate that the embodiments of the present invention are not limited to the specific embodiments described herein, and that various obvious changes, adaptations, and substitutions are possible, without departing from the scope of the embodiments of the present invention. Therefore, although the embodiments of the present invention have been described in more detail through the above embodiments, the embodiments of the present invention are not limited to the above embodiments, and many other equivalent embodiments may be included without departing from the concept of the embodiments of the present invention, and the scope of the embodiments of the present invention is determined by the scope of the appended claims.

Claims (8)

1. A method of controlling a mobile robot, the method comprising:
S1: dividing the surrounding environment of the mobile robot into at least two sub-areas, and acquiring the surrounding environment information of the current position of the mobile robot through a combined sensor;
S2: generating environment identification parameters, wherein the environment identification parameters comprise environment state data and azimuth data;
S3: identifying a passable area around the mobile robot according to the environment identification parameters;
S4: controlling the mobile robot to enter any passable area;
S5: after the environment state data of the passable area has been traversed, the process returns to step S1.
2. The control method of claim 1, wherein generating the environment identification parameter comprises:
determining azimuth data of the at least two sub-areas according to the relative positions of the at least two sub-areas with respect to the current position of the mobile robot;
and mapping the collected surrounding environment information to the at least two sub-areas to generate the environment state data.
3. The control method of claim 2, wherein the environment state data comprises:
if the environment state of a sub-area is the unknown state, marking it as a first value;
if the environment state of a sub-area is the to-be-worked state, marking it as a second value;
if the environment state of a sub-area is the working state, marking it as a third value;
if the environment state of a sub-area is the obstacle state, marking it as a fourth value;
and setting a sub-area whose environment state is the to-be-worked state as a passable area.
4. The control method of claim 3, wherein the environment state data is formed into a first data table, the first data table including a front environment data bit, a rear environment data bit, a left environment data bit, a right environment data bit, a left front environment data bit, a right front environment data bit, a left rear environment data bit and a right rear environment data bit, for storing the environment data of the sub-areas adjacent to the front, rear, left, right, left front, right front, left rear and right rear of the current position of the mobile robot, respectively;
the front environment data bit, the rear environment data bit, the left environment data bit, the right environment data bit, the left front environment data bit, the right front environment data bit, the left rear environment data bit or the right rear environment data bit takes the value of the first value, the second value, the third value or the fourth value.
5. The control method according to claim 4, wherein the first data table is a binary table.
6. The control method of claim 1, wherein the combined sensor comprises an ultrasonic sensor, a camera, an infrared sensor or a laser radar.
7. A mobile robot, comprising a combined sensor for collecting the surrounding environment information at the current position of the mobile robot, and a controller, wherein the controller is configured to perform the control method of the mobile robot according to any one of claims 1 to 6.
8. The mobile robot of claim 7, wherein the mobile robot is a sweeper.
CN201910673188.9A 2019-07-24 2019-07-24 Control method of mobile robot and mobile robot Pending CN112276933A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910673188.9A CN112276933A (en) 2019-07-24 2019-07-24 Control method of mobile robot and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910673188.9A CN112276933A (en) 2019-07-24 2019-07-24 Control method of mobile robot and mobile robot

Publications (1)

Publication Number Publication Date
CN112276933A 2021-01-29

Family

Family ID: 74419554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910673188.9A Pending CN112276933A (en) 2019-07-24 2019-07-24 Control method of mobile robot and mobile robot

Country Status (1)

Country Link
CN (1) CN112276933A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024000672A1 (en) * 2022-06-29 2024-01-04 Hong Kong Applied Science and Technology Research Institute Company Limited Method of Controlling Movement of a Mobile Robot in the Event of a Localization Failure

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968122A (en) * 2012-12-12 2013-03-13 深圳市银星智能科技股份有限公司 Covering method of map self-established by mobile platform in unknown region
CN105911988A (en) * 2016-04-26 2016-08-31 湖南拓视觉信息技术有限公司 Automatic drawing device and method
US20170083022A1 (en) * 2014-04-14 2017-03-23 Ecovacs Robotics Co., Ltd. Obstacle avoidance walking method of self-moving robot
CN107121142A (en) * 2016-12-30 2017-09-01 深圳市杉川机器人有限公司 The topological map creation method and air navigation aid of mobile robot
CN107505939A (en) * 2017-05-13 2017-12-22 大连理工大学 A kind of complete coverage path planning method of mobile robot
CN108120441A (en) * 2016-11-28 2018-06-05 沈阳新松机器人自动化股份有限公司 Complete coverage path planning method and system
US20190129433A1 (en) * 2016-12-29 2019-05-02 Amicro Semiconductor Corporation A path planning method of intelligent robot

Similar Documents

Publication Publication Date Title
EP3907575B1 (en) Dynamic region division and region channel identification method, and cleaning robot
JP7462244B2 (en) Method for planning cleaning area while moving along edge of robot, chip and robot
EP4043988A1 (en) Robot edge treading areal sweep planning method, chip, and robot
JP7159291B2 (en) Path planning method for autonomous mobile robot, autonomous mobile robot, and storage medium
WO2020259274A1 (en) Area identification method, robot, and storage medium
CN111459153B (en) Dynamic region division and region channel identification method and cleaning robot
CN108189039B (en) Moving method and device of mobile robot
CN111164529A (en) Environment information updating device, environment information updating method, and program
CN113741438A (en) Path planning method and device, storage medium, chip and robot
CN111728535B (en) Method and device for generating cleaning path, electronic equipment and storage medium
CN112101378A (en) Robot repositioning method, device and equipment
CN107765694A (en) A kind of method for relocating, device and computer read/write memory medium
CN111714028A (en) Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium
CN111240308A (en) Method and device for detecting repeated obstacle, electronic equipment and readable storage medium
CN111552290B (en) Method for robot to find straight line along wall and cleaning method
CN111367299B (en) Traveling avoidance method, mobile robot and storage medium
CN113503877A (en) Robot partition map establishing method and device and robot
CN113475977A (en) Robot path planning method and device and robot
CN112698657A (en) Sweeping robot path planning method
CN112274063B (en) Robot cleaning method, control device, readable storage medium and robot
CN112276933A (en) Control method of mobile robot and mobile robot
CN113317733B (en) Path planning method and cleaning robot
CN114779777A (en) Sensor control method and device for self-moving robot, medium and robot
KR101970191B1 (en) Apparatus and method for controlling cleaning function and robotic cleaner with the apparatus
CN114098529B (en) Cleaning method for cleaning robot system, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210129