CN111920353A - Cleaning control method, cleaning area dividing method, device, equipment and storage medium

Cleaning control method, cleaning area dividing method, device, equipment and storage medium

Info

Publication number
CN111920353A
Authority
CN
China
Prior art keywords
cleaning
sub
information
sweeping robot
cleaning area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010694770.6A
Other languages
Chinese (zh)
Inventor
陈远
郝勇凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Robozone Technology Co Ltd
Original Assignee
Midea Group Co Ltd
Jiangsu Midea Cleaning Appliances Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Group Co Ltd and Jiangsu Midea Cleaning Appliances Co Ltd
Priority to CN202010694770.6A
Publication of CN111920353A
Legal status: Pending

Classifications

    • A47L 11/4061: Machines for cleaning floors, carpets, furniture, walls or wall coverings; parts or details of such machines; steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
    • A47L 11/24: Floor-sweeping machines, motor-driven
    • G05D 1/0221: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory involving a learning process
    • G05D 1/0223: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory involving speed control of the vehicle
    • G05D 1/0236: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using optical markers or beacons in combination with a laser
    • G05D 1/024: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using obstacle or wall sensors in combination with a laser
    • G05D 1/0246: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using a video camera in combination with image processing means
    • G05D 1/0257: Control of position or course in two dimensions, specially adapted to land vehicles, using a radar
    • G05D 1/0276: Control of position or course in two dimensions, specially adapted to land vehicles, using signals provided by a source external to the vehicle
    • A47L 2201/04: Robotic cleaning machines; automatic control of the travelling movement; automatic obstacle detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a cleaning control method, which comprises: receiving a cleaning instruction, the cleaning instruction carrying an identifier of a sub-cleaning area, where the sub-cleaning area is a sub-area obtained by dividing the cleaning area of a sweeping robot according to environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process; determining, in response to the cleaning instruction, a current starting position of the sweeping robot; determining a target path to the sub-cleaning area according to the starting position; and travelling to the sub-cleaning area along the target path so as to clean the sub-cleaning area.

Description

Cleaning control method, cleaning area dividing method, device, equipment and storage medium
Technical Field
Embodiments of the present application relate to household appliance technology, and in particular, but not exclusively, to a cleaning control method, a cleaning area dividing method, a device, equipment and a storage medium.
Background
With the increasing degree of automation, people place higher demands on the intelligence of sweeping robots (also called cleaning robots).
In the related art, most sweeping robots treat the cleaning area as a single whole, which results in low cleaning efficiency.
Disclosure of Invention
In view of the above, embodiments of the present application provide a cleaning control method, a cleaning area dividing method, a device, equipment, and a storage medium.
In a first aspect, an embodiment of the present application provides a cleaning control method, the method comprising: receiving a cleaning instruction, where the cleaning instruction carries an identifier of a sub-cleaning area, and the sub-cleaning area is a sub-area obtained by dividing the cleaning area of a sweeping robot according to environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process; determining, in response to the cleaning instruction, a current starting position of the sweeping robot; determining a target path to the sub-cleaning area according to the starting position; and travelling to the sub-cleaning area along the target path so as to clean the sub-cleaning area.
In a second aspect, an embodiment of the present application provides a cleaning area dividing method, the method comprising: acquiring, by the sweeping robot, environment map information and pitch angle change information in a cleaning process; and dividing the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
In a third aspect, an embodiment of the present application provides a cleaning control device, comprising: a receiving module, configured to receive a cleaning instruction, where the cleaning instruction carries an identifier of a sub-cleaning area, and the sub-cleaning area is a sub-area obtained by dividing the cleaning area of a sweeping robot according to environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process; a response module, configured to determine, in response to the cleaning instruction, a current starting position of the sweeping robot; a determining module, configured to determine a target path to the sub-cleaning area according to the starting position; and a cleaning module, configured to reach the sub-cleaning area along the target path so as to clean the sub-cleaning area.
In a fourth aspect, an embodiment of the present application provides a cleaning area dividing device, comprising: an acquisition module, configured to acquire, by the sweeping robot, environment map information and pitch angle change information in a cleaning process; and a dividing module, configured to divide the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
In a fifth aspect, an embodiment of the present application provides a sweeping robot, including a memory and a processor, where the memory stores a computer program operable on the processor, and the processor implements the steps in the above cleaning control method or the above cleaning area dividing method when executing the computer program.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps in the cleaning control method or the cleaning area dividing method described above.
In the embodiments of the present application, the cleaning area of the sweeping robot is divided into a plurality of sub-cleaning areas according to environment map information and pitch angle change information of the sweeping robot in the completed cleaning process; the sub-cleaning area to be cleaned is determined from the identifier of the sub-cleaning area carried in the cleaning instruction, and a path to the sub-cleaning area is determined, so that the robot can reach the sub-cleaning area along that path. This improves the cleaning efficiency of the sweeping robot and meets user requirements more intelligently.
Drawings
FIG. 1 is a schematic view of a cleaning zone according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a cleaning control method according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a cleaning area dividing method according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a method for fusing and partitioning multiple sensors of a sweeping robot according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a cleaning control device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a cleaning area dividing device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a hardware entity of a cleaning robot according to an embodiment of the present disclosure.
Detailed Description
Fig. 1 is a schematic view of a cleaning area provided in an embodiment of the present application. As shown in Fig. 1, a sweeping robot 10 may divide a cleaning area 100 according to environment map information and pitch angle change information collected in a completed cleaning process to obtain a plurality of sub-cleaning areas, where the sub-cleaning areas may include: kitchen 101, study 102, dining room 103, hallway 104, living room 105, balcony 106, children's room 107, guest bathroom 108, guest bedroom 109, master bathroom 110, and master bedroom 111.
The sweeping robot 10 receives a cleaning instruction from a user, and the cleaning instruction carries the identifier of sub-cleaning area 102 (i.e. the study). If the sweeping robot 10 determines that its current position is somewhere in sub-cleaning area 105 (the living room), the sweeping robot 10 determines a target path 11 to sub-cleaning area 102 according to that position, reaches sub-cleaning area 102 along the target path 11, and then cleans sub-cleaning area 102 along the cleaning path 12 corresponding to sub-cleaning area 102.
The technical solution of the present application is further elaborated below with reference to the drawings and the embodiments.
An embodiment of the present application provides a cleaning control method. The functions of the method may be implemented by a processor in the sweeping robot calling program code, and the program code may be stored in a computer storage medium.
Fig. 2 is a schematic flow chart of an implementation of a cleaning control method of a cleaning robot provided in an embodiment of the present application, and as shown in fig. 2, the method includes:
step 202: receiving a cleaning instruction; the cleaning instruction carries an identifier of a sub-cleaning area;
the sub-cleaning area is a sub-area obtained by dividing the cleaning area of the sweeping robot according to environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process.
Referring to Fig. 1, the cleaning area of the sweeping robot may be the cleaning area 100, and the sub-cleaning area may be one of the sub-cleaning areas 101 to 111, such as the kitchen or the living room. The cleaning instruction may be a voice instruction or a gesture instruction issued by the user, or text, an image, a voice instruction or the like entered by the user through the input interface of an application on a user terminal. The identifier of the sub-cleaning area may be any one of a number, an area code, a name or the like corresponding to the sub-cleaning area.
Assume the cleaning instruction is a voice instruction "please clean room No. 1" issued by the user; the identifier of the sub-cleaning area carried in the cleaning instruction is then "room No. 1", and since the sub-cleaning area corresponding to room No. 1 is sub-cleaning area 101, i.e. the kitchen, sub-cleaning area 101 is cleaned. As another example, if the cleaning instruction is a voice instruction "please clean the children's room", the identifier carried in the instruction is "children's room", and since the corresponding sub-cleaning area is sub-cleaning area 107, sub-cleaning area 107 is cleaned. In yet another example, if the cleaning instruction is a text instruction "clean area 111" issued through the user terminal, the sub-cleaning area is identified by the area number "111", and therefore sub-cleaning area 111 is cleaned.
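As a purely illustrative sketch (not part of the patent text), the mapping from the identifier carried in a cleaning instruction to a sub-cleaning area can be thought of as a lookup table. The area numbers below follow Fig. 1; the alias strings and the function name are hypothetical examples.

```python
# Hypothetical sketch: resolving the identifier carried in a cleaning
# instruction to a sub-cleaning-area number from Fig. 1.
SUB_AREAS = {
    "kitchen": 101, "study": 102, "dining room": 103, "hallway": 104,
    "living room": 105, "balcony": 106, "children's room": 107,
    "guest bathroom": 108, "guest bedroom": 109,
    "master bathroom": 110, "master bedroom": 111,
}
# User-defined aliases (see the naming step later in the description).
ALIASES = {"room no. 1": 101, "child room": 107, "111": 111}

def resolve_sub_area(identifier: str) -> int:
    """Return the sub-cleaning-area number for a voice/text identifier."""
    key = identifier.strip().lower()
    if key in SUB_AREAS:
        return SUB_AREAS[key]
    if key in ALIASES:
        return ALIASES[key]
    raise ValueError(f"unknown sub-cleaning area: {identifier!r}")

print(resolve_sub_area("child room"))  # -> 107
```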
Step 204: responding to the cleaning instruction, and determining a current starting position of the sweeping robot;
referring to fig. 1, the starting position of the sweeping robot 10 may be a certain position of the living room; the start position may be represented by coordinate information of the sweeping robot.
Step 206: determining a target path to the sub-cleaning area according to the starting position;
the target path can be a path from the initial position to the target position of the sub-cleaning area, which is determined by the processor of the cleaning robot according to the constructed environment map information and the initial position of the cleaning robot by adopting a path planning algorithm; the target position may be a starting cleaning position of a sub-cleaning area that is default by a processor of the cleaning robot, or may be a starting cleaning position of a sub-cleaning area that is preset by a user, and in a case that the starting position is a certain position of a living room and the sub-cleaning area is a study room, the target position may be a position where a door of the study room is located, and correspondingly, referring to fig. 1, the target path may be a path 11 from the starting position to the door of the study room.
The path planning algorithm may be a shortest path algorithm for enabling the sweeping robot to reach a sub-sweeping area with a shortest path, and the shortest path algorithm includes dijkstra algorithm, an interpolation method, bellman-ford algorithm, a shortest path fast algorithm, and the like.
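For illustration only, the following is a minimal sketch of Dijkstra's algorithm on a 2-D occupancy grid, one of the shortest path algorithms named above. The grid encoding, function name and example map are assumptions, not the patent's implementation.

```python
# Minimal sketch (not from the patent): Dijkstra's algorithm on a 2-D
# occupancy grid, as one way to plan the target path from the starting
# position to the entrance of a sub-cleaning area.
import heapq

def dijkstra_grid(grid, start, goal):
    """grid: 2-D list of 0 (free) / 1 (obstacle); start, goal: (row, col)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, cell = heapq.heappop(heap)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(heap, (nd, (nr, nc)))
    if goal not in prev and goal != start:
        return None  # the goal cell is unreachable from the start
    # Reconstruct the path from goal back to start.
    path, cell = [], goal
    while cell != start:
        path.append(cell)
        cell = prev[cell]
    path.append(start)
    return path[::-1]

grid = [[0, 0, 0, 1],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(dijkstra_grid(grid, (0, 0), (2, 3)))
```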
Step 208: and reaching the sub-cleaning area according to the target path so as to clean the sub-cleaning area.
A cleaning path is needed when cleaning the sub-cleaning area; the cleaning path enables the sweeping robot to avoid obstacles and complete the cleaning of the sub-cleaning area. The cleaning path may be a cleaning path preset by the processor of the sweeping robot for the sub-cleaning area, or a cleaning path selected in advance by the user, according to personal preference or need, from a plurality of candidate cleaning paths for the sub-cleaning area provided by the processor. The cleaning path may be zigzag, circular or another shape, or a combination of several shapes; as shown in Fig. 1, the cleaning path may be the path 12.
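As a hedged illustration of the zigzag cleaning path mentioned above, the sketch below generates a boustrophedon visiting order over the free cells of a rectangular sub-cleaning area; the cell representation and function name are assumptions.

```python
# Hypothetical sketch: generating a simple zigzag (boustrophedon) coverage
# path over the free cells of a rectangular sub-cleaning area.
def zigzag_coverage(rows, cols, is_free):
    """is_free(r, c) -> bool; returns the visiting order of free cells."""
    path = []
    for r in range(rows):
        cols_in_row = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cols_in_row:
            if is_free(r, c):
                path.append((r, c))
    return path

# Example: a 3 x 4 area with one obstacle cell at (1, 1).
print(zigzag_coverage(3, 4, lambda r, c: (r, c) != (1, 1)))
```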
In the embodiments of the present application, the cleaning area of the sweeping robot is divided into a plurality of sub-cleaning areas according to the environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process, so that the cleaning area can be divided more accurately, efficiently and conveniently; the sub-cleaning area to be cleaned and the path for reaching it are determined from the identifier of the sub-cleaning area carried in the cleaning instruction, so that the robot can reach the sub-cleaning area along that path. This improves the cleaning efficiency of the sweeping robot and meets user requirements more intelligently.
Fig. 3 is a cleaning area dividing method provided in an embodiment of the present application, where the method may include steps 302 and 304:
step 302: the sweeping robot acquires environment map information and pitch angle change information in a one-time sweeping process;
the environment map information is related information of an environment map constructed by the sweeping robot according to the position and the environment where the sweeping robot is located; the pitch angle change information is the information that the pitch angle of robot of sweeping the floor generated along with time change, the pitch angle can be used for describing the gesture of robot of sweeping the floor, the pitch angle can be the angle of robot of sweeping the floor relative to the XOY plane "every single move" of inertial coordinate system, be on a parallel with the axis of robot fuselage of sweeping the floor promptly and the contained angle of vector and the ground in the directional robot of sweeping the floor the place ahead.
Step 304: and dividing the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
It should be noted that there may be toys left by children or other common obstacles on the floor of a house. In addition, for heat preservation, sound insulation and similar reasons, a threshold is usually installed at the bottom of each room door. When the sweeping robot crosses a common obstacle or a threshold, its attitude changes, so whether the sweeping robot is crossing an obstacle can be judged from the pitch angle change information; further, the current position of the sweeping robot can be judged from the environment map information. Since a threshold lies where a door is, and a door is a division point between different rooms or areas in the real environment, whether the sweeping robot is at a threshold position or at an area division point can be determined from the environment map information and the pitch angle change information together, and the cleaning area can be divided according to the position or area of the threshold, so that the area division matches the real environment more closely.
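The sketch below illustrates one possible way to detect such a short pitch event from timestamped gyroscope samples; the tilt and duration thresholds are assumed values, not figures from the patent.

```python
# Minimal sketch (assumed thresholds, not from the patent): detecting a
# short tilt event from timestamped gyroscope pitch samples, of the kind
# produced by driving over a low obstacle such as a door threshold.
from dataclasses import dataclass

@dataclass
class PitchSample:
    t: float        # timestamp in seconds
    pitch: float    # pitch angle in degrees

def detect_crossing(samples, min_tilt_deg=3.0, max_duration_s=2.0):
    """Return (t_start, t_end) of the first interval in which the pitch
    deviates from level by more than min_tilt_deg for at most
    max_duration_s; return None if no such event is found."""
    start = None
    for s in samples:
        tilted = abs(s.pitch) > min_tilt_deg
        if tilted and start is None:
            start = s.t
        elif not tilted and start is not None:
            if s.t - start <= max_duration_s:
                return (start, s.t)
            start = None
    return None

samples = [PitchSample(0.0, 0.2), PitchSample(0.2, 4.1),
           PitchSample(0.4, -3.8), PitchSample(0.6, 0.1)]
print(detect_crossing(samples))  # -> (0.2, 0.6)
```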
In the embodiment of the application, the cleaning area of the sweeping robot is divided into a plurality of sub-cleaning areas according to the acquired environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process, so that the cleaning area can be divided more accurately, more efficiently and conveniently.
In some embodiments, when the sweeping robot encounters a relatively tall obstacle it generally chooses to go around it, and when it encounters a relatively low obstacle it generally chooses to drive over it.
In some embodiments, because no threshold is installed under the doors of some houses, the cleaning area cannot be divided according to pitch angle changes; in that case the cleaning area of the sweeping robot can be divided according to the environment map information of the sweeping robot.
An embodiment of the present application further provides a method for dividing a cleaning area, where the method may include steps 402 to 406:
step 402: the sweeping robot acquires constructed environment map information; the environment map information is constructed according to the environment information and the position information which are collected in the completed one-time cleaning process;
the position information can be position coordinate information of the sweeping robot generated by positioning the sweeping robot in real time; the environment information can be position coordinate information of each scanning point in an environment generated by the sweeping robot scanning the surrounding environment in real time, and environment map information consistent with the actual environment can be constructed according to the position information and the environment information; the environment map information may be constructed according to an SLAM (simultaneous localization and mapping) algorithm, and the SLAM problem may be described as: the sweeping robot moves from an unknown position in an unknown environment, self-positioning is carried out according to position estimation and a map in the moving process, and meanwhile, an incremental map is built on the basis of self-positioning, so that the autonomous positioning and navigation of the sweeping robot are realized.
Step 404: acquiring pitch angle change information in the completed one-time sweeping process;
step 406: and dividing the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
In the embodiments of the present application, the environment map information is constructed according to the position information of the sweeping robot and the environment information of the environment where the sweeping robot is located, so that more reliable environment map information can be constructed more efficiently and more accurately.
An embodiment of the present application further provides a method for dividing a cleaning area, where the method may include steps 502 to 506:
step 502: the sweeping robot acquires environment map information and pitch angle change information in a one-time sweeping process;
step 504: determining the position information of the divided marker bodies according to the environment map information and the pitch angle change information acquired in the completed at least one cleaning process;
the dividing sign body may be an aisle, a door, etc., and referring to fig. 1, the dividing sign body may be a door between the living room 105 and the balcony 106 as shown by a sector, or an aisle between the living room 105 and the corridor 104 as shown by a dotted line in the figure, etc.
It should be noted that, assuming the division marker is a door, the coordinates of the scan points in the environment map acquired by the sweeping robot scanning its surroundings are {(x1, y1), (x2, y2), (x3, y3), (x4, y4), ..., (xn, yn)}. The sweeping robot keeps scanning and accumulates the scan-point coordinates into a two-dimensional grid map, in which each coordinate point may be an obstacle point, such as a door, a wall or a table leg, or a non-obstacle point. Assuming the width of a door is P cm and the distance between two points in the grid map is Q cm, then when the distance between the two points equals P cm and both points are non-obstacle points, the two points can be determined to be the two corresponding points on the door frame. Correspondingly, the coordinates of all the points on the door frame can be determined, and the area where the door is located, that is, the position information of the door, is determined from the coordinate set formed by the coordinates of all the points on the door frame.
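The sketch below illustrates the door-width test described above; the nominal door width, the tolerance and the point representation are assumptions, not values from the patent.

```python
# Illustrative sketch (assumed parameters): finding pairs of grid points
# whose separation matches a nominal door width P, as candidate
# door-frame points.
import math

def find_door_candidates(points, door_width_cm=80.0, tol_cm=5.0):
    """points: list of (x, y, is_obstacle) grid points with coordinates in
    centimetres. Returns pairs of non-obstacle points whose distance is
    within tol_cm of door_width_cm."""
    candidates = []
    free = [(x, y) for x, y, is_obstacle in points if not is_obstacle]
    for i in range(len(free)):
        for j in range(i + 1, len(free)):
            d = math.dist(free[i], free[j])
            if abs(d - door_width_cm) <= tol_cm:
                candidates.append((free[i], free[j]))
    return candidates

points = [(0, 0, True), (10, 0, False), (90, 0, False), (120, 0, True)]
print(find_door_candidates(points))  # -> [((10, 0), (90, 0))]
```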
Step 506: in the environment map information, dividing the two areas connected at the position indicated by the position information of the division marker, to obtain at least two sub-cleaning areas.
Referring to Fig. 1, the children's room 107 and the hallway 104 connected by the children's room door may be divided from each other, the dining room 103 and the kitchen 101 connected by the kitchen door may be divided from each other, and so on.
In the embodiments of the present application, the position coordinates of the division marker are determined and the cleaning area is divided according to the position information of the division marker, so that the division of the cleaning area fits the actual house structure more closely.
An embodiment of the present application further provides a method for dividing a cleaning area, where the method may include steps 602 to 616:
step 602: positioning the position information of the sweeping robot in real time through a laser radar;
the laser radar is a radar system that detects a characteristic quantity such as a position and a velocity of a target by emitting a laser beam. The working principle is that a detection signal (laser beam) is transmitted to a target, then a received signal (target echo) reflected from the target is compared with the transmitted signal, and after appropriate processing, relevant information of the target, such as target distance, direction, height, speed, attitude, even shape and other parameters, can be obtained, so that the target is detected, tracked and identified.
Step 604: collecting environmental information of the environment where the sweeping robot is located in real time through a laser radar;
step 606: according to the position information and the environment information, environment map information of the sweeping robot is constructed;
step 608: acquiring pitch angle information of the sweeping robot in real time through a gyroscope;
the gyroscope can be an angular motion detection device which uses a momentum moment sensitive shell of a high-speed revolving body to rotate around one or two axes which are orthogonal to a rotation axis relative to an inertia space, and can be divided into a sensing gyroscope and an indicating gyroscope according to purposes.
Step 610: taking the timestamp for collecting each pitch angle information as the time information corresponding to the pitch angle information;
step 612: generating pitch angle change information according to each piece of pitch angle information and time information corresponding to each piece of pitch angle information;
step 614: the sweeping robot acquires environment map information and pitch angle change information in a one-time sweeping process;
step 616: and dividing the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
In the embodiments of the present application, on the one hand, the current position information of the sweeping robot and the environment information of its surroundings are collected in real time with the lidar, so that the constructed environment map information is more timely and accurate; on the other hand, the pitch angle information of the sweeping robot is collected in real time with the gyroscope, so that the collected pitch angle information is timely and accurate.
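A minimal sketch of steps 610 and 612 above, pairing each pitch reading with its collection timestamp to form a time series; the read_pitch callable stands in for a gyroscope driver and is hypothetical.

```python
# Minimal sketch (hypothetical driver call): pairing each pitch reading
# with its collection timestamp to form the pitch angle change
# information as a time series.
import time

def collect_pitch_series(read_pitch, n_samples=5, period_s=0.1):
    """read_pitch: callable returning the current pitch in degrees,
    standing in for a gyroscope driver. Returns (timestamp, pitch) pairs
    ordered by time."""
    series = []
    for _ in range(n_samples):
        series.append((time.time(), read_pitch()))
        time.sleep(period_s)
    return series

# Example with a stubbed gyroscope reading.
print(collect_pitch_series(lambda: 0.0, n_samples=3, period_s=0.01))
```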
An embodiment of the present application further provides a method for dividing a cleaning area, where the method may include steps 702 to 710:
step 702: the sweeping robot acquires environmental image information and pitch angle change information in a one-time sweeping process;
it should be noted that the environment map information of the sweeping robot can also be acquired in a manner of combining a laser radar and a vision sensor, so as to obtain environment image information, where the environment image information includes coordinate information, image information, and the like of each scanning point of the environment where the sweeping robot is located.
Step 704: determining a division marker for dividing the cleaning area;
Step 706: identifying a boundary body and the division marker in the environment image information;
the boundary body may be an indoor wall, the dividing sign body may be a door on the wall, referring to fig. 1, the boundary body may be a first wall between the living room 105 and the guest toilet 108, may be a second wall between the study room 102 and the corridor 104, may be a third wall between the main bed 111 and the main toilet 110, and the dividing sign body may be a door on the second wall, or a door on the third wall.
It should be noted that the division mark body and the boundary body in the environmental image information may be identified by image recognition software, and the image recognition software may include kannai vision, map intelligence, deep sea technology, and the like.
Step 708: when a boundary body and a division marker are recognized, determining coordinate information of the boundary body and coordinate information of the division marker from position information corresponding to the environment image information;
it should be noted that, assuming that the identified boundary bodies include a third wall body between the master 111 and the master 110 and a fourth wall body between the master 111 and the guest 108, a coordinate set formed by scanning points forming the third wall body is determined as coordinate information of the third wall body, and a coordinate set formed by scanning points forming the fourth wall body is determined as coordinate information of the fourth wall body; assuming that the identified division flag body is a gate between the master 111 and the master 110, a coordinate set composed of scanning points constituting the gate between the master 111 and the master 110 is determined as coordinate information of the gate between the master 111 and the master 110.
Step 710: and dividing the cleaning area of the sweeping robot according to the coordinate information of the boundary body and the coordinate information of the division marker.
In the embodiments of the present application, dividing the cleaning area of the sweeping robot according to the coordinate information of the boundary body and the coordinate information of the division marker makes the distribution of the divided sub-cleaning areas fit the actual floor plan of the house more closely, allows the sweeping robot to clean the house according to the actual layout of its areas, and improves cleaning efficiency.
An embodiment of the present application further provides a method for dividing a cleaning area, where the method may include steps 802 to 810:
step 802: the sweeping robot acquires environment map information and pitch angle change information in a one-time sweeping process;
step 804: dividing the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas;
step 806: respectively carrying out different marking treatments on each sub-cleaning area;
the marking process may be to fill different colors into different sub-cleaning regions, or to mark different serial numbers for different sub-cleaning regions, or to set outer edge boundary lines of different colors for different sub-cleaning regions.
Step 808: sending each sub-cleaning area subjected to marking processing to a user terminal;
the user terminal may be a terminal device used by a user, and the terminal device may be a notebook computer, a tablet computer, a desktop computer, a mobile phone, and the like.
Step 810: and receiving the identification respectively determined by the user for each sub-cleaning area.
It should be noted that the user may name the sub-cleaning areas according to personal preference and the actual layout of the house to determine the identifiers of the sub-cleaning areas; for example, the user may name area 111 "master bedroom", and may name area 109 "bedroom", "inner room", and so on.
In the embodiments of the present application, because the user determines the identifier of each sub-cleaning area, the naming of the sub-cleaning areas matches the user's habits better, and the sweeping robot can determine the corresponding cleaning area more accurately the next time the user issues a cleaning instruction.
The embodiments of the present application relate to intelligent application technologies such as optimizing the travel route of a household cleaning robot and SLAM (simultaneous localization and mapping with a lidar), and in particular to intelligent partitioning in homes with door thresholds: the household cleaning robot can identify the areas, optimize its travel route and achieve intelligent control.
With the development of household cleaning robots, making robots more intelligent has become the trend for household robots. At present, few domestic cleaning robots can properly divide the different rooms they clean; most treat the whole area as one region. As the degree of automation deepens, users place higher demands on the intelligence of cleaning robots. If the robot partitions the map while cleaning, it can optimize its cleaning route and complete tasks more efficiently. In addition, partitioning the map and labeling the areas give the user a better experience: the user can select an area to clean, which meets user requirements.
Fig. 4 shows a threshold-based multi-sensor fusion partitioning method for a sweeping robot provided in an embodiment of the present application. The sweeping robot is equipped with a gyroscope for recording the machine's angle information, a lidar module, an information processing unit, and a cloud service and an APP (application) for user interaction. As shown in Fig. 4, the method includes steps A402 to A422:
step A402: starting the sweeping robot;
step A404: the laser radar and the gyroscope of the sweeping robot start working, and the sweeping starts;
step A406: whether the pitch angle of the gyroscope changes or not;
if yes, executing step A408, otherwise executing step A404;
it should be noted that, the robot collects the pitch angle in the movement process, the walking state of the robot in the current time and a period of time is judged according to the collected pitch angle, the pitch angle of the internal gyroscope changes when the robot crosses an obstacle, and the change of the pitch angle is used as one of the conditions for judging whether the robot passes through the threshold;
step A408: whether two-dimensional environment coordinates established by a gyroscope and a laser radar meet the requirement of a door frame (namely a threshold) or not;
if yes, go to step A410; if not, executing the step A404;
step A410: entering a subarea;
it should be noted that, in steps a406 to a410, the lidar tests various environments within a certain distance range around, and establishes a two-dimensional environment coordinate system according to the SLAM algorithm. Judging the position of the robot according to the coordinate information of the collected surrounding environment, wherein the judgment is also one of conditions for judging whether the robot passes through a threshold at present; judging whether the obstacle crossed currently is a threshold or not by combining the collected pitch angle information and the established two-dimensional environment image, if so, dividing the area where the two parts are connected, otherwise, not dividing;
step A412: cleaning the whole environment area;
step A414: saving the map;
step A416: displaying the map after the partition;
it should be noted that the processor of the sweeping robot marks the divided regions with different colors and transmits the marked regions to the APP end through the cloud end to be presented to the user.
Step A418: the user marks the subarea (such as a kitchen);
it should be noted that the user may name each region according to different color regions.
Step A420: and the user transmits the labeling information (namely the naming result) to the cloud end and transmits the labeling information to the sweeping robot.
Step A422: optimizing the path or cleaning a specified area when executing the next task;
it should be noted that, when the user cleans the whole environment, the path is optimized according to the saved area map, and the cleaning efficiency is increased; when a user cleans a certain designated divided area, the user can directly reach the area to clean the area.
In the embodiment of the application, the cleaning area of the machine can be divided, so that the cleaning path is optimized, and the cleaning efficiency is increased; the cleaning device can clean the designated area, gives better experience to the user, and meets the requirements of the user.
Based on the foregoing embodiments, the embodiment of the present application provides an interaction device applied to a sweeping robot, where the device includes each unit and each module included in each unit, and can be implemented by a processor of the sweeping robot; of course, the implementation can also be realized through a specific logic circuit; in implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 5 is a schematic structural diagram of a cleaning control apparatus according to an embodiment of the present application, and as shown in fig. 5, the cleaning control apparatus 500 includes a receiving module 501, a responding module 502, a determining module 503, and a cleaning module 504, where:
a receiving module 501, configured to receive a cleaning instruction; the cleaning instruction carries an identifier of a sub-cleaning area; the sub-cleaning area is a sub-area obtained by dividing the cleaning area of the sweeping robot according to environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process;
a response module 502, configured to determine, in response to the cleaning instruction, a current starting position of the sweeping robot;
a determining module 503, configured to determine a target path to the sub-cleaning area according to the starting position;
a cleaning module 504, configured to reach the sub-cleaning area according to the target path, so as to clean the sub-cleaning area.
Fig. 6 is a schematic structural diagram of a cleaning area dividing device according to an embodiment of the present application, and as shown in fig. 6, the cleaning area dividing device includes an obtaining module 601 and a dividing module 602, where:
the acquisition module 601 is used for acquiring environmental map information and pitch angle change information in a primary cleaning process by the sweeping robot;
the dividing module 602 is configured to divide the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information, so as to obtain at least two sub-cleaning areas.
In some embodiments, the obtaining module 601 includes: the first acquisition unit is used for acquiring the constructed environment map information by the sweeping robot; the environment map information is constructed according to the environment information and the position information which are collected in the completed one-time cleaning process; and the second acquisition unit is used for acquiring pitch angle change information in the finished one-time cleaning process.
In some embodiments, the dividing module 602 includes: a first determining unit, configured to determine the position information of a division marker according to the environment map information and the pitch angle change information acquired in the completed at least one cleaning process; and a first dividing unit, configured to divide, in the environment map information, the two areas connected at the position indicated by the position information of the division marker, to obtain at least two sub-cleaning areas.
In some embodiments, the apparatus further comprises: the positioning module is used for positioning the position information of the sweeping robot in real time through a laser radar; the acquisition module is used for acquiring the environmental information of the environment where the sweeping robot is located in real time through a laser radar; and the construction module is used for constructing the environment map information of the sweeping robot according to the position information and the environment information.
In some embodiments, the environment map information is environment image information, and the dividing module 602 further includes: a second determining unit, configured to determine a division marker for dividing the cleaning area; an identification unit, configured to identify a boundary body and the division marker in the environment image information; a third determining unit, configured to determine, when a boundary body and a division marker are recognized, the coordinate information of the boundary body and the coordinate information of the division marker from the position information corresponding to the environment image information; and a second dividing unit, configured to divide the cleaning area of the sweeping robot according to the coordinate information of the boundary body and the coordinate information of the division marker.
In some embodiments, the apparatus further comprises: the marking module is used for respectively carrying out different marking processing on each sub-cleaning area; the sending module is used for sending each sub-cleaning area subjected to marking processing to a user terminal; and the receiving module is used for receiving the identification which is respectively determined by the user for each sub-cleaning area.
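Purely as a structural sketch (not the patent's implementation), the receiving, response, determining and cleaning modules of the cleaning control device 500 in Fig. 5 could be organised as one controller class; the planner and robot interfaces below are assumed stand-ins.

```python
class CleaningController:
    """Hypothetical sketch of the cleaning control device 500: the four
    modules (receive / respond / determine / clean) as one controller."""
    def __init__(self, planner, robot):
        self.planner = planner   # assumed path-planning component
        self.robot = robot       # assumed drive/cleaning interface

    def receive(self, instruction):           # receiving module
        return instruction["sub_area_id"]

    def respond(self):                        # response module
        return self.robot.current_position()

    def determine(self, start, sub_area_id):  # determining module
        return self.planner.plan(start, sub_area_id)

    def clean(self, path, sub_area_id):       # cleaning module
        self.robot.follow(path)
        self.robot.clean_area(sub_area_id)

    def handle(self, instruction):
        sub_area_id = self.receive(instruction)
        path = self.determine(self.respond(), sub_area_id)
        self.clean(path, sub_area_id)

# Tiny stand-in objects so the sketch can be exercised directly.
class StubPlanner:
    def plan(self, start, sub_area_id):
        return [start, ("door of", sub_area_id)]

class StubRobot:
    def current_position(self): return (0, 0)
    def follow(self, path): print("following", path)
    def clean_area(self, sub_area_id): print("cleaning sub-area", sub_area_id)

CleaningController(StubPlanner(), StubRobot()).handle({"sub_area_id": 102})
```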
The above description of the apparatus embodiments, similar to the above description of the method embodiments, has similar beneficial effects as the method embodiments. For technical details not disclosed in the embodiments of the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be noted that, in the embodiments of the present application, if the methods described above are implemented in the form of software functional modules and sold or used as independent products, they may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or in the parts contributing to the related art, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for enabling the sweeping robot to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, the embodiments of the present application are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present application provides a sweeping robot. Fig. 7 is a schematic diagram of a hardware entity of the sweeping robot of the embodiment of the present application. As shown in Fig. 7, the hardware entity of the sweeping robot 700 includes a memory 701 and a processor 702, where the memory 701 stores a computer program operable on the processor 702, and the processor 702 implements the steps of the cleaning control method or the cleaning area dividing method provided in the above embodiments when executing the computer program.
The memory 701 is configured to store instructions and applications executable by the processor 702, and may also buffer data (e.g., image data, audio data, voice communication data and video communication data) to be processed or already processed by the processor 702 and by the modules of the sweeping robot 700; it may be implemented by a FLASH memory or a Random Access Memory (RAM).
Correspondingly, the present application provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the cleaning control method or the cleaning area dividing method of the sweeping robot provided in the above embodiments.
Here, it should be noted that: the above description of the storage medium and device embodiments is similar to the description of the method embodiments above, with similar advantageous effects as the method embodiments. For technical details not disclosed in the embodiments of the storage medium and apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or parts contributing to the related art, may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes several instructions for enabling the sweeping robot to perform all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments. Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict. The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A sweeping control method, characterized by comprising:
receiving a cleaning instruction; the cleaning instruction carries an identifier of a sub-cleaning area; the sub-cleaning area is a sub-area obtained by dividing the cleaning area of the sweeping robot according to environment map information and pitch angle change information of the sweeping robot in the completed one-time cleaning process;
responding to the cleaning instruction, and determining a current starting position of the sweeping robot;
determining a target path reaching the sub-cleaning area according to the starting position;
and reaching the sub-cleaning area according to the target path so as to clean the sub-cleaning area.
2. A cleaning region dividing method, characterized by comprising:
acquiring, by the sweeping robot, environment map information and pitch angle change information during a single cleaning process;
and dividing the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
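The application does not state how the pitch angle change information is interpreted, but a natural reading is that driving over a raised door sill produces a brief pitch spike that marks a candidate division point. The sketch below is one such detector; the threshold and the minimum sample gap are illustrative values, not figures from the patent.

def detect_sill_crossings(pitch_log, pitch_threshold_deg=3.0, min_gap=20):
    """Return (x, y) positions where a pitch spike suggests the robot crossed
    a raised threshold (e.g. a door sill) during the cleaning run.

    pitch_log: list of (x, y, pitch_deg) samples recorded while cleaning.
    """
    crossings, last = [], -min_gap
    for i, (_, _, pitch) in enumerate(pitch_log):
        if abs(pitch) >= pitch_threshold_deg and i - last >= min_gap:
            crossings.append(i)   # candidate division-marker sample
            last = i
    return [(pitch_log[i][0], pitch_log[i][1]) for i in crossings]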
3. The method of claim 2, wherein the acquiring, by the sweeping robot, of environment map information and pitch angle change information during a single cleaning process comprises:
acquiring, by the sweeping robot, constructed environment map information, wherein the environment map information is constructed according to environment information and position information collected during a single completed cleaning process;
and acquiring pitch angle change information recorded during that completed cleaning process.
4. The method according to claim 2 or 3, wherein the dividing of the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas comprises:
determining position information of division marker bodies according to the environment map information and the pitch angle change information acquired during at least one completed cleaning process;
and separating, in the environment map information, the two areas connected at the position of each division marker body, to obtain at least two sub-cleaning areas.
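Claim 4 separates, within the map, the two areas joined at each division marker position. One way to realise this, assuming an occupancy-grid representation chosen here only for illustration, is to temporarily block the marker cells and label the remaining free space by connected components, each component becoming one sub-cleaning area.

def split_into_sub_areas(grid, marker_cells):
    """Label connected free regions after blocking the division-marker cells.

    grid: 2D list with 0 = free and 1 = occupied; marker_cells: iterable of (r, c).
    Returns a dict mapping each free cell to a sub-area label (1, 2, ...).
    """
    rows, cols = len(grid), len(grid[0])
    blocked = set(marker_cells)
    labels, next_label = {}, 1
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0 and (r, c) not in blocked and (r, c) not in labels:
                stack = [(r, c)]
                labels[(r, c)] = next_label
                while stack:
                    cr, cc = stack.pop()
                    for nr, nc in ((cr + 1, cc), (cr - 1, cc), (cr, cc + 1), (cr, cc - 1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] == 0
                                and (nr, nc) not in blocked
                                and (nr, nc) not in labels):
                            labels[(nr, nc)] = next_label
                            stack.append((nr, nc))
                next_label += 1
    return labels

At least two labels result whenever the marker cells actually disconnect the free space; otherwise the map remains a single area.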
5. The method according to any one of claims 2 to 4, further comprising:
locating position information of the sweeping robot in real time by means of a laser radar;
collecting, in real time by means of the laser radar, environment information of the environment in which the sweeping robot is located;
and constructing the environment map information of the sweeping robot according to the position information and the environment information.
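Claim 5 only states that the map is built from the lidar-derived positions and environment information; it does not fix a map representation. A minimal occupancy-grid update along those lines is sketched below; free-space ray tracing and probabilistic weighting, which a real mapper would also perform, are omitted for brevity.

import math

def update_occupancy_grid(grid, resolution, origin, pose, scan):
    """Mark lidar returns as occupied cells in a 2D occupancy grid.

    grid: 2D list of 0/1 cells; resolution: metres per cell.
    origin: (x0, y0) world coordinates of grid cell (0, 0).
    pose: (x, y, heading_rad) of the robot; scan: list of (angle_rad, range_m).
    """
    x, y, heading = pose
    x0, y0 = origin
    for angle, rng in scan:
        hx = x + rng * math.cos(heading + angle)   # beam endpoint in the world frame
        hy = y + rng * math.sin(heading + angle)
        col = int((hx - x0) / resolution)
        row = int((hy - y0) / resolution)
        if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
            grid[row][col] = 1                     # occupied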
6. The method according to claim 2 or 3, wherein the environment map information is environment image information, and the dividing of the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas comprises:
determining a division marker body for dividing the cleaning area;
identifying a boundary body and the division marker body in the environment image information;
when the boundary body and the division marker body are recognized, determining coordinate information of the boundary body and coordinate information of the division marker body from position information corresponding to the environment image information;
and dividing the cleaning area of the sweeping robot according to the coordinate information of the boundary body and the coordinate information of the division marker body.
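Claim 6 presupposes a detector that finds boundary bodies and division marker bodies in the environment image, together with position information associated with that image; neither the detector nor the transform is specified. The sketch below simply sorts hypothetical detections and converts their pixel positions into map coordinates through a caller-supplied pixel-to-map transform, after which the division can proceed as in claim 4. The field names and labels are invented for illustration.

def detections_to_map_coords(detections, pixel_to_map):
    """Convert image detections into map coordinates for area division.

    detections: list of dicts such as {"label": "division_marker", "pixel": (u, v)}.
    pixel_to_map: callable (u, v) -> (x, y) provided by the mapping pipeline.
    Returns (boundary_coords, marker_coords) as two lists of (x, y) tuples.
    """
    boundary_coords, marker_coords = [], []
    for det in detections:
        xy = pixel_to_map(*det["pixel"])
        if det["label"] == "boundary":
            boundary_coords.append(xy)
        elif det["label"] == "division_marker":
            marker_coords.append(xy)
    return boundary_coords, marker_coords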
7. The method of claim 2, wherein after said dividing the cleaning area of the sweeping robot into at least two sub-cleaning areas, the method further comprises:
marking each sub-cleaning area with a different marking treatment;
sending each marked sub-cleaning area to a user terminal;
and receiving an identifier determined by the user for each sub-cleaning area.
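Claim 7 leaves both the marking scheme and the message format to the implementation. One plausible exchange, assuming a JSON payload pushed to the companion app and a reply mapping each sub-area to a user-chosen name; every field name here is invented for illustration and does not appear in the application.

import json

def build_sub_area_preview(sub_area_cells):
    """Tag each sub-cleaning area with a distinct colour and serialise the
    result for the user terminal. sub_area_cells: {label: [(r, c), ...]}."""
    palette = ["red", "green", "blue", "orange", "purple"]
    areas = [
        {"area_id": label, "color": palette[i % len(palette)], "cell_count": len(cells)}
        for i, (label, cells) in enumerate(sorted(sub_area_cells.items()))
    ]
    return json.dumps({"type": "sub_area_preview", "areas": areas})

def apply_user_identifiers(reply_json, area_names):
    """Store the identifiers chosen by the user, e.g. {"1": "kitchen", "2": "hall"}."""
    for area_id, name in json.loads(reply_json)["identifiers"].items():
        area_names[int(area_id)] = name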
8. A sweeping control apparatus, characterized in that the apparatus comprises:
a receiving module, configured to receive a cleaning instruction, wherein the cleaning instruction carries an identifier of a sub-cleaning area, and the sub-cleaning area is a sub-area obtained by dividing the cleaning area of the sweeping robot according to environment map information and pitch angle change information acquired by the sweeping robot during a single completed cleaning process;
a response module, configured to determine, in response to the cleaning instruction, a current starting position of the sweeping robot;
a determining module, configured to determine, according to the starting position, a target path to the sub-cleaning area;
and a cleaning module, configured to travel to the sub-cleaning area along the target path to clean the sub-cleaning area.
9. A cleaning region dividing device, characterized in that the device comprises:
an acquisition module, configured to acquire environment map information and pitch angle change information of the sweeping robot during a single cleaning process;
and a dividing module, configured to divide the cleaning area of the sweeping robot according to the environment map information and the pitch angle change information to obtain at least two sub-cleaning areas.
10. A sweeping robot, comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor, when executing the program, implements the steps of the method of claim 1;
alternatively, the processor implements the steps of the method of any one of claims 2 to 7 when executing the program.
11. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method of claim 1;
alternatively, the computer program when executed by a processor implements the steps in the method of any one of claims 2 to 7.
CN202010694770.6A 2020-07-17 2020-07-17 Cleaning control method, cleaning area dividing method, device, equipment and storage medium Pending CN111920353A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010694770.6A CN111920353A (en) 2020-07-17 2020-07-17 Cleaning control method, cleaning area dividing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111920353A (en) 2020-11-13

Family

ID=73313326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010694770.6A Pending CN111920353A (en) 2020-07-17 2020-07-17 Cleaning control method, cleaning area dividing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111920353A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006252346A (en) * 2005-03-11 2006-09-21 Secom Co Ltd Mobile robot
CN106940560A (en) * 2010-07-01 2017-07-11 德国福维克控股公司 Surveying and mapping with region division
CN104825101A (en) * 2014-02-12 2015-08-12 Lg电子株式会社 Robot cleaner and controlling method thereof
CN105892457A (en) * 2015-02-13 2016-08-24 美国iRobot公司 Mobile Floor-Cleaning Robot With Floor-Type Detection
CN109947109A (en) * 2019-04-02 2019-06-28 北京石头世纪科技股份有限公司 Robot working area map construction method and device, robot and medium

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114608178A (en) * 2020-12-03 2022-06-10 佛山市顺德区美的电子科技有限公司 Direct blowing prevention control method and device and air conditioner
CN115040031A (en) * 2021-02-26 2022-09-13 云米互联科技(广东)有限公司 Environmental information acquisition method, sweeper, terminal equipment and readable storage medium
CN115040031B (en) * 2021-02-26 2024-03-29 云米互联科技(广东)有限公司 Environment information acquisition method, sweeper, terminal equipment and readable storage medium
CN113116232A (en) * 2021-04-06 2021-07-16 三峡大学 Floor sweeping robot with remote interaction function and use method
CN115177178A (en) * 2021-04-06 2022-10-14 美智纵横科技有限责任公司 Cleaning method, cleaning device and computer storage medium
CN113475977A (en) * 2021-06-22 2021-10-08 深圳拓邦股份有限公司 Robot path planning method and device and robot
CN113485335A (en) * 2021-07-02 2021-10-08 追觅创新科技(苏州)有限公司 Voice instruction execution method and device, storage medium and electronic device
WO2023273898A1 (en) * 2021-07-02 2023-01-05 追觅创新科技(苏州)有限公司 Method and apparatus for executing voice instruction, storage medium, and electronic apparatus
CN113749562A (en) * 2021-08-13 2021-12-07 珠海格力电器股份有限公司 Sweeping robot and control method, device, equipment and storage medium thereof
CN113749562B (en) * 2021-08-13 2022-08-16 珠海格力电器股份有限公司 Sweeping robot and control method, device, equipment and storage medium thereof
CN113616117B (en) * 2021-08-13 2022-08-16 珠海格力电器股份有限公司 Cleaning area determination method, cleaning area determination device, cleaning area determination equipment and storage medium
CN113616117A (en) * 2021-08-13 2021-11-09 珠海格力电器股份有限公司 Cleaning area determination method, cleaning area determination device, cleaning area determination equipment and storage medium
CN114237218A (en) * 2021-11-04 2022-03-25 深圳拓邦股份有限公司 Indoor robot threshold area identification method and indoor robot
CN114545939A (en) * 2022-02-18 2022-05-27 智橙动力(北京)科技有限公司 Driving control method and device for swimming pool cleaning robot and electronic equipment
WO2023179393A1 (en) * 2022-03-24 2023-09-28 追觅创新科技(苏州)有限公司 Region division method, device, and storage medium
CN116300974A (en) * 2023-05-18 2023-06-23 科沃斯家用机器人有限公司 Operation planning, partitioning, operation method, autonomous mobile device and cleaning robot

Similar Documents

Publication Publication Date Title
CN111920353A (en) Cleaning control method, cleaning area dividing method, device, equipment and storage medium
JP7184435B2 (en) Restrictions on movement of mobile robots
Taylor et al. Vision-based motion planning and exploration algorithms for mobile robots
JP6849330B2 (en) Map generation method, self-position estimation method, robot system, and robot
JP6705465B2 (en) Observability grid-based autonomous environment search
WO2019144541A1 (en) Cleaning robot
CN108673501B (en) Target following method and device for robot
JP4942733B2 (en) Self-localization method of robot based on object recognition and surrounding environment information including recognized object
Kim et al. Dynamic ultrasonic hybrid localization system for indoor mobile robots
DK2952993T3 (en) Method for constructing a probability map of the absence or presence of obstacles for an autonomous robot
US20200306989A1 (en) Magnetometer for robot navigation
KR102577785B1 (en) Cleaning robot and Method of performing task thereof
KR20240063820A (en) Cleaning robot and Method of performing task thereof
KR101731968B1 (en) Apparatus and method for relocation of robot
CN106264359B (en) Clean robot and its barrier-avoiding method
CN112867424A (en) Navigation and cleaning area dividing method and system, and moving and cleaning robot
JP5276931B2 (en) Method for recovering from moving object and position estimation error state of moving object
CN106708037A (en) Autonomous mobile equipment positioning method and device, and autonomous mobile equipment
CN108121333A (en) Shopping guide robot
WO2017038012A1 (en) Mapping method, localization method, robot system, and robot
CN106679647A (en) Method and device for initializing pose of autonomous mobile equipment
WO2021235100A1 (en) Information processing device, information processing method, and program
JP2006127355A (en) Position estimation system for radio tag
WO2022188333A1 (en) Walking method and apparatus, and computer storage medium
Chikhalikar et al. An object-oriented navigation strategy for service robots leveraging semantic information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210310

Address after: No.39 Caohu Avenue, Xiangcheng Economic Development Zone, Suzhou City, Jiangsu Province

Applicant after: Meizhizongheng Technology Co.,Ltd.

Address before: 39 Caohu Avenue, Xiangcheng Economic Development Zone, Suzhou, Jiangsu Province, 215144

Applicant before: JIANGSU MIDEA CLEANING APPLIANCES Co.,Ltd.

Applicant before: MIDEA GROUP Co.,Ltd.