CN114451816B - Cleaning policy generation method, cleaning policy generation device, computer device and storage medium - Google Patents
Info
- Publication number
- CN114451816B (Application CN202111589697.7A)
- Authority
- CN
- China
- Prior art keywords
- cleaned
- area
- cleaning
- determining
- target area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/24—Floor-sweeping machines, motor-driven
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/28—Floor-scrubbing machines, motor-driven
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4002—Installations of electric equipment
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4011—Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/06—Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning
Abstract
The present application relates to the field of sweeping robots, and in particular to a method, an apparatus, a computer device, and a storage medium for generating a cleaning strategy. The method includes: determining an area to be cleaned and an object to be cleaned in a target area based on a comparison between the target area in an acquired scene picture and a reference area in a reference picture of the same scene; and generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned. Compared with the prior art, because the cleaning strategy is generated from the area to be cleaned and the object to be cleaned, already-clean areas need not be cleaned and the cleaning mode need not be set manually, which improves both the cleaning efficiency and the cleaning effectiveness of the sweeping robot.
Description
Technical Field
The present disclosure relates to the field of sweeping robots, and in particular, to a method and apparatus for generating a sweeping policy, a computer device, and a storage medium.
Background
The floor-sweeping robot, also called an automatic sweeper, intelligent vacuum cleaner, or robot vacuum cleaner, is a type of intelligent household appliance that can automatically complete floor cleaning within a room using a certain degree of artificial intelligence. It generally uses brushing and vacuuming to draw debris on the floor into its own garbage storage box, thereby completing the floor-cleaning function. Robots for sweeping, vacuuming, and floor mopping are all generally classed as floor-sweeping robots. Their role in daily life and production is not to be ignored: they bring great convenience to users and have become an important part of production and daily life.
Existing sweeping robots have the following problems. First, path planning follows a fixed strategy: cleaning starts from the charging pile and proceeds in a fixed order according to preset rules, and because many areas along the path do not need cleaning, cleaning efficiency is low. Second, the robot cannot judge what object is to be cleaned in an area, so it cannot automatically decide which suction level and which cleaning mode to adopt.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a cleaning policy generation method, apparatus, computer device, and storage medium.
In a first aspect, an embodiment of the present invention provides a cleaning strategy generation method, the cleaning strategy being executed by a sweeping robot, the method comprising:
determining a region to be cleaned and an object to be cleaned in the target region based on a comparison result of the target region in the acquired scene picture and a reference region in a reference picture under the same scene;
and generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned.
In an embodiment, the determining the area to be cleaned and the object to be cleaned in the target area based on the comparison result of the target area in the acquired scene picture and the reference area in the reference picture under the same scene includes:
comparing a target area in the acquired scene picture with a reference area in a reference picture under the same scene to determine a difference area in the target area;
and determining the area to be cleaned and the object to be cleaned based on the identification result of the object in the difference area.
In an embodiment, the cleaning strategy includes a cleaning path and a cleaning mode corresponding to the object to be cleaned; the generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned comprises:
generating a corresponding cleaning path based on the area to be cleaned;
and determining a corresponding cleaning mode based on the object to be cleaned.
In an embodiment, the method further comprises:
determining the number of times each block is passed through, based on the motion trajectories of living beings appearing in the target area, the blocks being obtained by dividing the target area;
and when the number of times a certain block is passed through is greater than a set number of times, generating a corresponding cleaning strategy based on that block.
In a second aspect, an embodiment of the present invention proposes a cleaning policy generating device, the cleaning policy being executed by a cleaning robot, the device comprising:
the first determining module is used for determining a region to be cleaned and an object to be cleaned in the target region based on a comparison result of the target region in the acquired scene picture and a reference region in the reference picture under the same scene;
and the strategy generation module is used for generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned.
In an embodiment, the first determining module includes:
the first determining submodule is used for comparing a target area in the acquired scene picture with a reference area in a reference picture in the same scene to determine a difference area in the target area;
and the second determining submodule is used for determining the area to be cleaned and the object to be cleaned based on the identification result of the object in the difference area.
In an embodiment, the cleaning strategy includes a cleaning path and a cleaning mode corresponding to the object to be cleaned; the policy generation module comprises:
the path generation module is used for generating a corresponding cleaning path based on the area to be cleaned;
and the type determining module is used for determining a corresponding cleaning mode based on the object to be cleaned.
In an embodiment, the device further comprises:
the second determining module is used for determining the number of times each block is passed through, based on the motion trajectories of living beings appearing in the target area, the blocks being obtained by dividing the target area;
and the strategy generation module is further used for generating a corresponding cleaning strategy based on a certain block when the number of times that block is passed through is greater than a set number of times.
In a third aspect, an embodiment of the present invention provides a computer device comprising a memory storing a computer program and a processor which, when executing the computer program, implements the steps of the method of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the method of the first aspect.
The above method, apparatus, computer device, and storage medium determine an area to be cleaned and an object to be cleaned in a target area based on a comparison between the target area in an acquired scene picture and a reference area in a reference picture of the same scene, and generate a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned. Compared with the prior art, because the cleaning strategy is generated from the area to be cleaned and the object to be cleaned, already-clean areas need not be cleaned and the cleaning mode need not be set manually, which improves both the cleaning efficiency and the cleaning effectiveness of the sweeping robot.
Drawings
FIG. 1 is a schematic diagram of an application environment of a method of generating a cleaning policy according to one embodiment;
FIG. 2 is a flow diagram of a method of generating a cleaning policy in one embodiment;
FIG. 3 is a flow chart of a method for determining an area to be cleaned and an object to be cleaned in one embodiment;
FIG. 4 is a flow diagram of a policy generation method in one embodiment;
FIG. 5 is a flowchart of a strategy generation method according to another embodiment;
FIG. 6 is a flow diagram of a method of generating and executing a cleaning policy in one embodiment;
FIG. 7 is a flow diagram of a cleaning policy generation device in one embodiment;
FIG. 8 is a schematic diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The cleaning strategy generation method provided by the present application can be applied in the application environment shown in FIG. 1, in which the terminal 102 communicates with the sweeping robot 104 through a network. The terminal 102 determines an area to be cleaned and an object to be cleaned in a target area based on a comparison between the target area in an acquired scene picture and a reference area in a reference picture of the same scene, generates a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned, and sends the cleaning strategy to the sweeping robot 104, which then executes it.
The terminal 102 may include one or more processors (e.g., a single-chip processor or a multi-chip processor). By way of example only, the terminal 102 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
The network may be or include a public network (e.g., the internet), a private network (e.g., a local area network (LAN)), a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, or any combination thereof. For example, the network may include a cable network, a wired network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near-field communication (NFC) network, and the like, or any combination thereof.
The sweeping robot supports various cleaning modes, such as hair cleaning, vacuuming, mopping, and drying, as well as multiple suction levels, and realizes cleaning by executing a cleaning strategy.
In one embodiment, as shown in FIG. 2, a cleaning strategy generation method is provided. Taking its application in the environment of FIG. 1 as an example, the method includes the following steps:
S202: determining an area to be cleaned and an object to be cleaned in the target area based on a comparison result between the target area in the acquired scene picture and a reference area in a reference picture of the same scene.
The scene picture is acquired by a camera mounted in the scene.
The target area refers to an area which can be cleaned by the sweeping robot, and is usually a ground area.
The reference picture is a picture used for comparison with the scene picture under the same scene; the area to be cleaned in the target area and the object to be cleaned are determined by comparing the target area in the acquired scene picture with the reference area in the reference picture of the same scene.
S204: generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned.
The cleaning strategy in this embodiment is generated based on the area to be cleaned and the object to be cleaned. Compared with the prior art, there is no need to clean already-clean areas or to set the cleaning mode manually, so the cleaning efficiency and cleaning effectiveness of the sweeping robot are improved.
Secondly, execution of the cleaning strategy by the sweeping robot is triggered by the comparison result between the target area in the acquired scene picture and the reference area in the reference picture of the same scene, without manual control, making the sweeping robot more intelligent and more convenient to use.
In an embodiment, as shown in fig. 3, based on a comparison result of a target area in an acquired scene picture and a reference area in a reference picture under the same scene, determining a to-be-cleaned area in the target area and an object to be cleaned includes the following steps:
S302: comparing a target area in the acquired scene picture with a reference area in a reference picture of the same scene to determine a difference area in the target area;
S304: determining the area to be cleaned and the object to be cleaned based on the identification result of the object in the difference area.
First, the target area in the scene picture and the reference area in the reference picture are identified by an image recognition algorithm, and the two are compared by an image comparison algorithm to identify a difference area within the target area. The object in the difference area is then identified by the image recognition algorithm, and it is judged whether that object is an object to be cleaned. If so, the difference area is taken as the area to be cleaned and the object as the object to be cleaned; if not, the difference area is not taken as an area to be cleaned and the object is not taken as an object to be cleaned.
The image recognition algorithm may classify objects in the difference area, such as socks, wires, toys, pet feces, shoes, coils, books, and shells, based on a deep-learning algorithm.
In an example embodiment, if the object in the difference area is identified as a fruit peel and the peel is judged to be an object to be cleaned, the area where the peel is located is taken as the area to be cleaned and the peel as the object to be cleaned. If the object in the difference area is identified as a toy and the toy is judged not to be an object to be cleaned, the area where the toy is located is not taken as an area to be cleaned and the toy is not taken as an object to be cleaned.
In one embodiment, considering that dust and dirt on some grey or variegated floors or tiles are difficult to judge by deep-learning recognition, objects in the difference area that cannot be identified are determined to be dirt: if the object in the difference area cannot be identified, the difference area is taken as an area to be cleaned and the object as an object to be cleaned.
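The comparison step described above can be sketched in a few lines. In this illustrative sketch (not the patent's actual implementation), the scene and reference pictures are modeled as small grayscale grids, and cells whose pixel difference exceeds a threshold form the difference mask; the threshold value and all names are assumptions.

```python
# Hypothetical sketch of difference-area detection: compare the scene
# picture against the reference picture cell by cell and flag cells whose
# grayscale difference exceeds an assumed threshold.

def difference_mask(scene, reference, threshold=30):
    """Return a boolean grid marking cells that differ noticeably."""
    return [
        [abs(s - r) > threshold for s, r in zip(scene_row, ref_row)]
        for scene_row, ref_row in zip(scene, reference)
    ]

def has_difference_area(mask):
    """A difference area exists if any cell is flagged."""
    return any(any(row) for row in mask)

# A clean reference floor and a scene where one dark object has appeared.
reference = [[100, 100, 100],
             [100, 100, 100],
             [100, 100, 100]]
scene     = [[100, 100, 100],
             [100,  30, 100],   # object appeared at cell (1, 1)
             [100, 100, 100]]

mask = difference_mask(scene, reference)
print(has_difference_area(mask))  # True: cell (1, 1) differs by 70
```

A real system would of course operate on camera images (e.g. with an image-processing library) and would follow this mask with the object-recognition step described in the text.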
In one embodiment, the cleaning strategy includes a cleaning path and a cleaning mode corresponding to the object to be cleaned. As shown in fig. 4, based on the area to be cleaned and the object to be cleaned, generating the corresponding cleaning strategy includes the following steps:
S402: generating a corresponding cleaning path based on the area to be cleaned;
S404: determining a corresponding cleaning mode based on the object to be cleaned.
First, a coordinate system is established based on the target area, and the coordinates of each area to be cleaned are determined from its position within the target area. All areas to be cleaned are then placed in this coordinate system, and finally the shortest cleaning path through them is generated, saving the sweeping robot's cleaning time.
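One simple way to realize the path-generation step is a greedy nearest-neighbor tour over the to-be-cleaned coordinates, starting from the robot's position. The patent only requires a short cleaning path; nearest-neighbor is assumed here purely for illustration, and all names are hypothetical.

```python
# Illustrative sketch: order the to-be-cleaned areas (reduced to points in
# the target area's coordinate system) by greedy nearest-neighbor, so the
# robot always heads to the closest remaining area.
from math import hypot

def cleaning_path(start, areas):
    """Return the areas in greedy nearest-neighbor visiting order."""
    remaining = list(areas)
    path = []
    pos = start
    while remaining:
        nxt = min(remaining, key=lambda a: hypot(a[0] - pos[0], a[1] - pos[1]))
        remaining.remove(nxt)
        path.append(nxt)
        pos = nxt
    return path

areas = [(5, 5), (1, 0), (2, 2)]
print(cleaning_path((0, 0), areas))  # [(1, 0), (2, 2), (5, 5)]
```

Nearest-neighbor does not guarantee the globally shortest tour; an exact solver or a 2-opt refinement could be substituted where the number of areas is small.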
It will be appreciated that different objects to be cleaned require different cleaning modes: stains, for example, require mopping and drying, while hair requires a dedicated hair-cleaning mode. Different objects also require a corresponding suction level; cleaning fruit peel, for instance, requires higher suction than cleaning hair. Once the object to be cleaned is determined, the corresponding cleaning mode can be determined, and combining it with the cleaning path obtained in the preceding step yields the cleaning strategy.
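The object-to-mode mapping can be expressed as a lookup table. The class names and numeric suction levels below are illustrative assumptions that merely follow the examples in the text (stains → mop and dry, hair → hair-cleaning mode, peel needs stronger suction than hair, unidentifiable objects treated as dirt).

```python
# Hypothetical mapping from a recognized object class to a
# (cleaning_mode, suction_level) pair; values are assumed for illustration.
CLEANING_MODES = {
    "stain": ("mop_and_dry", 1),
    "hair":  ("hair_cleaning", 1),
    "peel":  ("vacuum", 3),        # higher suction than hair, per the text
    "dust":  ("vacuum", 2),
}

def cleaning_mode(object_class):
    # Unrecognized objects are treated as generic dirt, mirroring the
    # embodiment that marks unidentifiable difference areas as dirty.
    return CLEANING_MODES.get(object_class, ("vacuum", 2))

print(cleaning_mode("peel"))     # ('vacuum', 3)
print(cleaning_mode("unknown"))  # ('vacuum', 2)
```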
For dust and similar dirt that image recognition algorithms have difficulty recognizing, in one embodiment, as shown in FIG. 5, the cleaning strategy generation method further comprises the following steps:
S502: determining the number of times each block is passed through, based on the motion trajectories of living beings appearing in the target area, the blocks being obtained by dividing the target area;
S504: when the number of times a certain block is passed through is greater than a set number of times, generating a corresponding cleaning strategy based on that block.
In this embodiment, living beings appearing in the target area, including people and animals, are detected by a camera arranged in the scene; specifically, detection can be performed by a deep-learning algorithm. When a living being is detected, it is tracked to obtain a motion trajectory, from which the number of times each block is passed through is determined. When the number of times a certain block is passed through is greater than the set number, a corresponding cleaning strategy is generated based on that block, determining the cleaning path and cleaning mode.
Determining the number of times each block is passed through prevents areas from going uncleaned merely because dust or the like cannot be recognized.
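The pass-counting rule can be sketched as follows. Here the target area is divided into a grid of blocks, each tracked trajectory (a list of sampled positions) is mapped to the blocks it touches, and blocks crossed more than the set number of times are selected for cleaning. The block size, threshold, and sample trajectories are assumed values for illustration only.

```python
# Illustrative sketch of per-block pass counting from motion trajectories.
from collections import Counter

BLOCK_SIZE = 1.0  # metres per block side (assumed)

def block_of(point):
    """Map a (x, y) position to its block index in the grid."""
    x, y = point
    return (int(x // BLOCK_SIZE), int(y // BLOCK_SIZE))

def blocks_to_clean(trajectories, set_times=2):
    """Return blocks passed through more than `set_times` times."""
    counts = Counter()
    for trajectory in trajectories:
        # Count each block at most once per pass (per trajectory).
        counts.update({block_of(p) for p in trajectory})
    return [b for b, n in counts.items() if n > set_times]

walks = [
    [(0.2, 0.3), (0.8, 0.5)],   # stays inside block (0, 0)
    [(0.1, 0.1), (1.5, 0.5)],   # touches blocks (0, 0) and (1, 0)
    [(0.4, 0.9), (2.5, 0.2)],   # touches blocks (0, 0) and (2, 0)
]
print(blocks_to_clean(walks, set_times=2))  # [(0, 0)]
```

A real implementation would interpolate between trajectory samples so that blocks crossed between two sampled positions are also counted.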
In an embodiment, the camera captures the scene picture, and the area to be cleaned, the object to be cleaned, and the cleaning strategy are determined only when no living being is detected in the scene picture. This prevents the sweeping robot from disturbing people by cleaning while they are present.
In an embodiment, after the sweeping robot finishes cleaning, the scene picture is captured again and used as the new reference picture. Updating the reference picture in this way reduces the objects that must be identified in the difference area and increases the cleaning speed.
The process of generating and executing the cleaning strategy is shown in FIG. 6. The camera captures a scene picture, and whether a living being is present in the picture is detected. If so, the motion trajectory is obtained and the number of times each block is passed through is determined. If not, the scene picture is compared with the reference picture to determine whether a difference area exists. If there is no difference area, it is judged whether the number of times a block has been passed through is greater than the set number, and if so, a cleaning strategy is generated. If there is a difference area, the area to be cleaned and the object to be cleaned are determined and a cleaning strategy is generated. The cleaning strategy is sent to the sweeping robot, the robot executes it, and the reference picture is updated after cleaning is completed.
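The branching logic of this flow can be condensed into a small dispatcher. This is an illustrative reconstruction of the decision order only, not the patented implementation; the three boolean inputs stand in for the detection, comparison, and counting steps described above.

```python
# Hypothetical sketch of the FIG. 6 decision flow: living-being detection
# takes priority, then difference-area comparison, then the per-block
# pass-count fallback for hard-to-recognize dirt such as dust.

def decide(living_detected, has_difference, block_over_threshold):
    if living_detected:
        return "track_motion"      # only update per-block pass counts
    if has_difference:
        return "clean_difference"  # determine area/object, generate strategy
    if block_over_threshold:
        return "clean_block"       # clean a frequently traversed block
    return "idle"

print(decide(False, True, False))  # 'clean_difference'
print(decide(False, False, True))  # 'clean_block'
print(decide(True, True, True))    # 'track_motion'
```

Note that living-being detection gates everything else, matching the embodiment in which cleaning is deferred while people are present.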
It should be understood that each step in the above-described flowcharts is shown in order as indicated by the arrow, but the steps are not necessarily performed in order as indicated by the arrow. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described above may include a plurality of steps or stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of execution of the steps or stages is not necessarily sequential, but may be performed in turn or alternately with at least a part of other steps or stages.
In one embodiment, as shown in fig. 7, the present invention provides a cleaning policy generating device, including:
a first determining module 702, configured to determine an area to be cleaned and an object to be cleaned in a target area based on a comparison result between the target area in an acquired scene picture and a reference area in a reference picture of the same scene;
and the policy generation module 704 is configured to generate a corresponding cleaning policy based on the area to be cleaned and the object to be cleaned.
In an embodiment, the first determining module includes:
the first determining submodule is used for comparing a target area in the acquired scene picture with a reference area in a reference picture in the same scene to determine a difference area in the target area;
and the second determining submodule is used for determining the area to be cleaned and the object to be cleaned based on the identification result of the object in the difference area.
In an embodiment, the cleaning strategy includes a cleaning path and a cleaning mode corresponding to the object to be cleaned; the policy generation module comprises:
the path generation module is used for generating a corresponding cleaning path based on the area to be cleaned;
and the type determining module is used for determining a corresponding cleaning mode based on the object to be cleaned.
In an embodiment, the device further comprises:
the second determining module is used for determining the times of passing through each block based on the motion trail of the living things in the target area; the block is divided by the target area;
the strategy generation module is also used for generating a corresponding cleaning strategy based on a certain block when the number of times of passing through the block is larger than the set number of times.
For specific limitations of the cleaning strategy generation means, reference may be made to the above limitations of the cleaning strategy generation method, and no further description is given here. The respective modules in the above-described cleaning policy generation apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and whose internal structure may be as shown in fig. 8. The computer device includes a processor, a memory, and a network interface connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing motion detection data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements the steps of any of the embodiments of the cleaning strategy generation method described above.
It will be appreciated by those skilled in the art that the structure shown in fig. 8 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of any one of the cleaning policy generation method embodiments described above when the computer program is executed.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, implements the steps of any of the above-described cleaning policy generation method embodiments.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, or the like. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take various forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples represent only a few embodiments of the present application; although described in some detail, they are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art could make various modifications and improvements without departing from the spirit of the present application, all of which fall within its scope of protection. Accordingly, the scope of protection of the present application is to be determined by the appended claims.
Claims (8)
1. A cleaning strategy generation method, the cleaning strategy being performed by a cleaning robot, the method comprising:
determining the number of times each block is passed through, based on motion trajectories of living beings appearing in a target area, wherein the blocks are obtained by dividing the target area;
determining whether a difference area exists, based on a result of comparing the target area in an acquired scene picture with a reference area in a reference picture of the same scene;
when no difference area exists, and the number of times a block is passed through is greater than a set number of times and no living being is detected in the scene picture, generating a corresponding cleaning strategy based on that block;
and when a difference area exists and no living being is detected in the scene picture, determining an area to be cleaned and an object to be cleaned in the target area, and generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned.
2. The method of claim 1, wherein the determining the area to be cleaned and the object to be cleaned in the target area comprises:
comparing the target area in the acquired scene picture with a reference area in a reference picture of the same scene to determine a difference area in the target area;
and determining the area to be cleaned and the object to be cleaned based on the identification result of the object in the difference area.
3. The method of claim 1, wherein the cleaning strategy comprises a cleaning path and a cleaning mode corresponding to an object to be cleaned; and the generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned comprises:
generating a corresponding cleaning path based on the area to be cleaned;
and determining a corresponding cleaning mode based on the object to be cleaned.
4. A cleaning policy generation apparatus, the cleaning policy being executed by a cleaning robot, the apparatus comprising:
a second determining module, used for determining the number of times each block is passed through, based on motion trajectories of living beings appearing in a target area, wherein the blocks are obtained by dividing the target area;
a first determining module, used for determining whether a difference area exists based on a result of comparing the target area in an acquired scene picture with a reference area in a reference picture of the same scene, and, when a difference area exists, determining an area to be cleaned and an object to be cleaned in the target area;
a strategy generation module, used for, when no difference area exists, generating a corresponding cleaning strategy based on a block when the number of times that block is passed through is greater than a set number of times and no living being is detected in the scene picture;
and for, when a difference area exists and no living being is detected in the scene picture, determining an area to be cleaned and an object to be cleaned in the target area, and generating a corresponding cleaning strategy based on the area to be cleaned and the object to be cleaned.
5. The apparatus of claim 4, wherein the first determining module comprises:
a first determining submodule, used for comparing the target area in the acquired scene picture with a reference area in a reference picture of the same scene to determine a difference area in the target area;
and the second determining submodule is used for determining the area to be cleaned and the object to be cleaned based on the identification result of the object in the difference area.
6. The apparatus of claim 4, wherein the cleaning strategy comprises a cleaning path and a cleaning mode corresponding to an object to be cleaned; and the policy generation module comprises:
the path generation module is used for generating a corresponding cleaning path based on the area to be cleaned;
and the type determining module is used for determining a corresponding cleaning mode based on the object to be cleaned.
7. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, carries out the steps of the method of any one of claims 1 to 3.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 3.
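As an illustrative sketch only (not the patented implementation), the decision flow of claim 1 can be outlined in Python. All names, parameters, and the strategy dictionary shape below are hypothetical; the sketch assumes that block pass counts, a difference-area flag, and living-being detection results have already been produced by upstream perception steps:

```python
def generate_cleaning_strategy(pass_counts, has_difference_area,
                               living_detected, threshold,
                               area_to_clean=None, object_to_clean=None):
    """Hypothetical sketch of the decision flow in claim 1.

    pass_counts: dict mapping block id -> number of pass-throughs,
    derived from motion trajectories of living beings in the target area.
    """
    # No strategy is generated while a living being is in the scene picture.
    if living_detected:
        return None
    if not has_difference_area:
        # No difference area: clean any block whose pass-through count
        # exceeds the set number of times.
        blocks = [b for b, n in pass_counts.items() if n > threshold]
        return {"type": "block", "blocks": blocks} if blocks else None
    # A difference area exists: generate a strategy from the area to be
    # cleaned and the object to be cleaned identified in the target area.
    return {"type": "difference",
            "area": area_to_clean,
            "object": object_to_clean}
```

Per claim 3, the resulting strategy would then map the area to a cleaning path and the object to a cleaning mode; that mapping is omitted here.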
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111589697.7A CN114451816B (en) | 2021-12-23 | 2021-12-23 | Cleaning policy generation method, cleaning policy generation device, computer device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114451816A CN114451816A (en) | 2022-05-10 |
CN114451816B true CN114451816B (en) | 2024-02-09 |
Family
ID=81405243
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111589697.7A Active CN114451816B (en) | 2021-12-23 | 2021-12-23 | Cleaning policy generation method, cleaning policy generation device, computer device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114451816B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115715651A (en) * | 2022-12-29 | 2023-02-28 | 科大讯飞股份有限公司 | Sweeping robot control method, device, equipment and readable storage medium |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007323402A (en) * | 2006-06-01 | 2007-12-13 | Matsushita Electric Ind Co Ltd | Self-propelled equipment and its program |
CN105411491A (en) * | 2015-11-02 | 2016-03-23 | 中山大学 | Home intelligent cleaning system and method based on environment monitoring |
CN110703769A (en) * | 2019-11-12 | 2020-01-17 | 山东交通学院 | Automatic driving sweeper system based on cloud platform and control method |
CN111281274A (en) * | 2020-03-18 | 2020-06-16 | 苏宁智能终端有限公司 | Visual floor sweeping method and system |
CN111543898A (en) * | 2020-05-09 | 2020-08-18 | 小狗电器互联网科技(北京)股份有限公司 | Garbage classification cleaning method and system, electronic equipment, storage medium and sweeper |
CN111562777A (en) * | 2019-01-29 | 2020-08-21 | 北京奇虎科技有限公司 | Sweeping path planning method and device of sweeping robot |
CN111643017A (en) * | 2020-06-02 | 2020-09-11 | 深圳市杉川机器人有限公司 | Cleaning robot control method and device based on schedule information and cleaning robot |
CN111743463A (en) * | 2020-06-18 | 2020-10-09 | 小狗电器互联网科技(北京)股份有限公司 | Cleaning method and device for target object, readable medium and electronic equipment |
CN112022000A (en) * | 2020-07-30 | 2020-12-04 | 奇酷互联网络科技(深圳)有限公司 | Sweeping method of sweeping robot and related device |
CN112674650A (en) * | 2020-12-25 | 2021-04-20 | 北京小狗吸尘器集团股份有限公司 | Sweeping method and device of sweeping robot |
CN113367616A (en) * | 2021-05-19 | 2021-09-10 | 科沃斯机器人股份有限公司 | Robot control method, robot control device, robot, and storage medium |
CN113455964A (en) * | 2021-06-30 | 2021-10-01 | 青岛海尔科技有限公司 | Area cleaning method and device, storage medium and electronic device |
CN113509104A (en) * | 2021-04-25 | 2021-10-19 | 珠海格力电器股份有限公司 | Cleaning method, storage medium and cleaning robot |
Also Published As
Publication number | Publication date |
---|---|
CN114451816A (en) | 2022-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111568314B (en) | Cleaning method and device based on scene recognition, cleaning robot and storage medium | |
KR102522951B1 (en) | Control method of cleaning devices | |
CN107913039B (en) | Block selection method and device for cleaning robot and robot | |
CN110989630B (en) | Self-moving robot control method, device, self-moving robot and storage medium | |
CN109582015B (en) | Indoor cleaning planning method and device and robot | |
US11144062B2 (en) | Cleaning area selection method and device | |
CN108514381A (en) | Method, apparatus of sweeping the floor and sweeping robot | |
CN111609852A (en) | Semantic map construction method, sweeping robot and electronic equipment | |
CN114451816B (en) | Cleaning policy generation method, cleaning policy generation device, computer device and storage medium | |
CN110174888A (en) | Self-movement robot control method, device, equipment and storage medium | |
CN113219992B (en) | Path planning method and cleaning robot | |
CN111643017B (en) | Cleaning robot control method and device based on schedule information and cleaning robot | |
CN111679661A (en) | Semantic map construction method based on depth camera and sweeping robot | |
CN108836195A (en) | A kind of get rid of poverty method and the sweeping robot of sweeping robot | |
CN108803586A (en) | A kind of working method of sweeping robot | |
KR20210079610A (en) | Artificial intelligence cleaning robot and method thereof | |
KR101333496B1 (en) | Apparatus and Method for controlling a mobile robot on the basis of past map data | |
JP2021118757A (en) | Smart cleaning robot | |
CN111714028A (en) | Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium | |
CN114690769B (en) | Path planning method, electronic device, storage medium and computer program product | |
CN114594764A (en) | Cleaning route generation method, cleaning route generation system, cleaning robot, cleaning device, and storage medium | |
WO2023019922A1 (en) | Navigation method and self-propelled apparatus | |
CN114869175A (en) | Cleaning obstacle avoidance method and device, electronic equipment and storage medium | |
CN114967698A (en) | Cleaning method, cleaning device, electronic apparatus, and storage medium | |
Sugiyama et al. | Meta-strategy for cooperative tasks with learning of environments in multi-agent continuous tasks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||