WO2024022223A1 - Method and apparatus for processing cleaning image of cleaning device, system, and storage medium


Info

Publication number: WO2024022223A1
Application number: PCT/CN2023/108453
Authority: WIPO (PCT)
Prior art keywords: cleaning, region, preset, image, dirtiness
Other languages: French (fr)
Inventors: Jingwen HUANG, Xiaoqian SHEN
Original Assignees: Yunjing Intelligence (Shenzhen) Co., Ltd.; Yunjing Intelligence Innovation (Shenzhen) Co., Ltd.
Application filed by Yunjing Intelligence (Shenzhen) Co., Ltd. and Yunjing Intelligence Innovation (Shenzhen) Co., Ltd.
Publication of WO2024022223A1

Classifications

    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L11/4091 Storing or parking devices, arrangements therefor; Means allowing transport of the machine when it is not being used
    • A47L9/0063 External storing devices; Stands, casings or the like for the storage of suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2826 Parameters or conditions being sensed: the condition of the floor
    • A47L9/2857 User input or output elements for control, e.g. buttons, switches or displays
    • G06T11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/022 Docking stations; Docking operations: recharging of batteries
    • A47L2201/028 Docking stations; Docking operations: refurbishing floor engaging tools, e.g. cleaning of beating brushes
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Description

  • the present disclosure relates to the field of cleaning technologies, in particular to a method, an apparatus, a system, and a storage medium for processing cleaning images of a cleaning device.
  • Cleaning devices can be used to automatically clean the ground in the scenarios such as household indoor cleaning, large place cleaning, and the like.
  • in the related art, the cleaning device cannot detect the dirtiness degree of the ground while cleaning it, so the cleaning workload of the cleaning device cannot be reflected, which affects the user’s experience of using the cleaning device.
  • the present disclosure provides a method and an apparatus for processing a cleaning image of a cleaning device, a system, and a storage medium, aiming to solve the technical problem in the related art that the cleaning workload of the cleaning device cannot be reflected because the cleaning device cannot detect the dirtiness degree of the ground when cleaning it, which affects the user experience of the cleaning device.
  • an embodiment of the present disclosure provides a method for processing a cleaning image of a cleaning device, configured to generate the cleaning image after the cleaning device finishes cleaning at least one preset cleaning region through a cleaning member while performing a cleaning task, the method including:
  • an embodiment of the present disclosure provides an apparatus for processing a cleaning image of a cleaning device, the apparatus including a memory and a processor;
  • the memory is configured to store computer-executable instructions
  • the processor is configured to execute the computer-executable instructions to implement the operations in the foregoing method.
  • an embodiment of the present disclosure provides a system, including:
  • the cleaning device including a motion mechanism and a cleaning member, the motion mechanism being configured to drive the cleaning device to move to allow the cleaning member to clean a preset cleaning region;
  • the base station being at least configured to clean the cleaning member of the cleaning device
  • an embodiment of the present disclosure provides a system, including:
  • the cleaning device including a motion mechanism, a cleaning member, and a maintenance mechanism, the motion mechanism being configured to drive the cleaning device to move to allow the cleaning member to clean a preset cleaning region, and the maintenance mechanism being configured to clean the cleaning member;
  • an embodiment of the present disclosure provides a computer-readable storage medium storing computer-executable instructions, the computer-executable instructions, when being executed by a processor, causing the processor to implement the operations of the foregoing method.
  • the embodiments of the present disclosure provide a method and apparatus for processing a cleaning image of a cleaning device, a system, and a storage medium.
  • the method includes: acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and generating the cleaning image according to the dirtiness degree corresponding to at least one preset cleaning region. This realizes visualization of the cleaning workload of the cleaning device, thereby improving the user experience of the cleaning device.
  • FIG. 1 is a schematic flowchart of a method for processing a cleaning image of a cleaning device according to an embodiment of the present disclosure.
  • FIG. 2 is a schematic block diagram of a system according to an embodiment of the present disclosure.
  • FIG. 3 is a schematic block diagram of a system according to another embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of a variation of dirtiness degree of a mopping member over mopping time according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of preset cleaning regions and their respective image regions according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of cleaning images according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of a cleaning image according to another embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of a cleaning image according to still another embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram of a cleaning image according to another embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram of a cleaning image according to still another embodiment of the present disclosure.
  • FIG. 14 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 15 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 16 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 17 is a schematic diagram of a cleaning image according to another embodiment of the present disclosure.
  • FIG. 18 is a schematic diagram of a cleaning image according to still another embodiment of the present disclosure.
  • FIG. 19 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 20 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
  • FIG. 21 is a schematic diagram of a cleaning image involved in an embodiment of the present disclosure.
  • FIG. 22 is a schematic diagram of a room and its corresponding room region according to an embodiment of the present disclosure.
  • FIG. 23 is a schematic diagram of a room cleaning image according to an embodiment of the present disclosure.
  • FIG. 24 is a schematic diagram of a trajectory cleaning image according to an embodiment of the present disclosure.
  • FIG. 25 is a schematic diagram of a trajectory cleaning image according to an embodiment of the present disclosure.
  • FIG. 26 is a cleaning image according to an embodiment of the present disclosure.
  • FIG. 27 is a schematic block diagram of a processing apparatus for a cleaning image of a cleaning device according to an embodiment of the present disclosure.
  • FIG. 1 is a schematic flowchart of a method for processing a cleaning image of a cleaning device according to an embodiment of the present disclosure.
  • the method may be applied to a system for the cleaning device, and is configured to generate and display the cleaning image of the cleaning device, so as to visualize the cleaning workload of the cleaning device.
  • the preset cleaning region may be any region to be cleaned, such as a home space, a room in the home space, a partial region of the room, a large place, or a partial region of the large place. From another perspective, the preset cleaning region may refer to a relatively large region that will be initially cleaned, such as an entire room; or may refer to a partial region of the relatively large region that needs to be cleaned again after the initial cleaning, such as a region near a wall in the room, or an obstacle region in the room.
  • the system includes one or more cleaning devices 100, one or more base stations 200, and a processing apparatus 300.
  • the cleaning device 100 includes a motion mechanism and a cleaning member.
  • the motion mechanism of the cleaning device 100 is configured to drive the cleaning device 100 to move, so that the cleaning member cleans the preset cleaning region.
  • the cleaning member touches the preset cleaning region so as to clean the preset cleaning region along with the movement of the cleaning device 100.
  • the base station 200 is configured to cooperate with the cleaning device 100.
  • the base station 200 may be configured to charge the cleaning device 100, and provide a docking position for the cleaning device 100.
  • the base station 200 may also be configured to clean the cleaning member of the cleaning device 100.
  • the system includes one or more cleaning devices 100 and the processing apparatus 300.
  • the cleaning device 100 includes a motion mechanism, a cleaning member, and a maintenance mechanism.
  • the motion mechanism is configured to drive the cleaning device 100 to move to allow the cleaning member to clean the preset cleaning region.
  • the maintenance mechanism is configured to clean the cleaning member.
  • the processing apparatus 300 may be configured to perform the steps in the method according to the embodiments of the present disclosure.
  • the cleaning device 100 is provided with a device controller for controlling the cleaning device 100
  • the base station 200 is provided with a base station controller for controlling the base station 200.
  • the device controller of the cleaning device 100 and/or the base station controller of the base station 200 may separately or cooperatively serve as the processing apparatus 300, to implement the steps in the method according to the embodiments of the present disclosure.
  • the system includes a separate processing apparatus 300 which is configured to implement the steps of the method according to the embodiments of the present disclosure.
  • the processing apparatus 300 may be arranged on the cleaning device 100, or arranged on the base station 200.
  • the processing apparatus 300 may be an apparatus other than the cleaning device 100 and the base station 200, such as a home intelligent terminal, a general control apparatus, and the like.
  • the cleaning device 100 may be configured to automatically clean the preset cleaning region in the application scenarios such as household indoor cleaning, large space cleaning, and the like.
  • the cleaning member of the cleaning device 100 includes at least one of a mopping member and a dust suction member.
  • the cleaning device 100 or the base station 200 further includes a dirt detection apparatus which is configured to detect the dirtiness degree of the cleaning member.
  • the dirt detection apparatus includes at least one of the following: a visual sensor and a sewage detection sensor.
  • the visual sensor acquires image or color information of the cleaning member, and the dirtiness degree of the cleaning member is determined based on the image or color information of the cleaning member.
  • the darker the surface of the cleaning member (the mopping member), the greater the dirtiness degree of the mopping member.
  • the sewage detection sensor may acquire detection information of the sewage generated by cleaning the cleaning member (the mopping member) , and the dirtiness degree of the mopping member is determined based on the acquired detection information.
  • the sewage detection sensor includes at least one of the following: a visible light detection sensor, an infrared detection sensor, and a total dissolved solid detection sensor.
  • the infrared detection sensor acquires a turbidity of the sewage
  • the visible light detection sensor acquires a chroma of the sewage
  • the total dissolved solid detection sensor acquires a water conductivity of the sewage.
  • the dirtiness degree of the mopping member may be determined according to one or more of the turbidity, the chroma, and the water conductivity. For example, the greater the turbidity or the water conductivity of the sewage, the greater the dirtiness degree of the mopping member.
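  • For illustration, the following sketch (not part of the patent; the full-scale constants and the use of the maximum reading are assumptions) shows one way such sewage readings could be combined into a single dirtiness degree of the mopping member.

```python
# Illustrative sketch (not the patent's implementation): combining sewage sensor
# readings into a single dirtiness degree for the mopping member. The full-scale
# constants and the use of the maximum reading are assumptions.

def mopping_member_dirtiness(turbidity_ntu: float,
                             chroma: float,
                             conductivity_us_cm: float) -> float:
    """Return a dirtiness degree in [0, 1]; larger sewage readings mean a dirtier mop."""
    TURBIDITY_FULL_SCALE = 100.0      # NTU (assumed)
    CHROMA_FULL_SCALE = 500.0         # platinum-cobalt units (assumed)
    CONDUCTIVITY_FULL_SCALE = 2000.0  # uS/cm (assumed)

    normalized = [
        min(turbidity_ntu / TURBIDITY_FULL_SCALE, 1.0),
        min(chroma / CHROMA_FULL_SCALE, 1.0),
        min(conductivity_us_cm / CONDUCTIVITY_FULL_SCALE, 1.0),
    ]
    # Any single reading can characterize the dirtiness degree; here the largest
    # normalized reading is used as the most pessimistic estimate.
    return max(normalized)

# Example: turbid, colored, conductive sewage indicates a fairly dirty mop.
print(mopping_member_dirtiness(turbidity_ntu=40.0, chroma=100.0, conductivity_us_cm=600.0))  # 0.4
```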
  • the way of determining the dirtiness degree of the cleaning member of the cleaning device 100 is not limited thereto.
  • the method for processing a cleaning image of a cleaning device is configured to generate the cleaning image after the cleaning device finishes cleaning at least one preset cleaning region through a cleaning member.
  • the method includes step S110 to step S120.
  • Step S110: a dirtiness degree corresponding to one preset cleaning region is acquired after the cleaning device has cleaned the preset cleaning region one time through the cleaning member.
  • the preset cleaning region may be a region to be cleaned divided by the cleaning device based on a task map.
  • the task map may be created by the cleaning device in response to a map creation command to explore the space it is currently in, or may be updated by the cleaning device based on obstacles, carpets, and the like identified during the cleaning process.
  • the task map may be a map of the cleaning regions specified by a user. For example, in response to the user’s selection of a cleaning region on the map, such as one or more rooms, the one or more rooms are determined as the task map; or, in response to the user circling a cleaning region on the map, such as a portion of one or more rooms, the portion of the one or more rooms is determined as the task map.
  • the preset cleaning region may be determined based on room layout in the task map, and/or a workload threshold of the cleaning device.
  • the workload for each preset cleaning region is less than or equal to the workload threshold.
  • the workload threshold is configured to instruct the cleaning device to interrupt the current cleaning task and move to the base station for maintenance before finishing the workload corresponding to the workload threshold. The cleaning task refers to the task of cleaning all the preset cleaning regions corresponding to the task map by the cleaning device in response to a cleaning command. A sketch of splitting the task map under such a threshold follows below.
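```python
import math

# Hypothetical sketch: divide rooms of a task map into preset cleaning regions so
# that each region's workload (approximated here by its area in m^2) stays at or
# below the workload threshold. The room names, the area-based workload, and the
# greedy equal split are assumptions for illustration only.

def split_into_preset_regions(rooms: dict, workload_threshold: float) -> list:
    """Return (label, area) pairs; rooms above the threshold are split evenly."""
    regions = []
    for name, area in rooms.items():
        parts = max(1, math.ceil(area / workload_threshold))
        for k in range(parts):
            label = name if parts == 1 else f"{name}-{k + 1}"
            regions.append((label, area / parts))
    return regions

# Example: a 30 m^2 living room with a 12 m^2 threshold becomes three regions.
print(split_into_preset_regions({"living room": 30.0, "bedroom": 10.0}, 12.0))
```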
  • the cleaning device may clean different preset cleaning regions a different number of times: cleaner preset cleaning regions may be cleaned only once, while dirtier preset cleaning regions may be cleaned one or more additional times after the first cleaning.
  • one room may be one preset cleaning region, or one room includes a plurality of preset cleaning regions.
  • the present disclosure is certainly not limited thereto.
  • one preset cleaning region includes one room and at least partial region of another room.
  • the preset cleaning region may be determined according to a user’s segmentation operation on the task map, or may be defined according to a preset cleaning region segmentation rule.
  • the acquiring the dirtiness degree corresponding to one preset cleaning region includes: acquiring a dirtiness degree of the cleaning member after the cleaning device finishes cleaning one preset cleaning region through the cleaning member; and determining the dirtiness degree corresponding to the preset cleaning region according to the dirtiness degree of the cleaning member.
  • the dirtiness degree of the mopping member is acquired after the cleaning device finishes cleaning the preset cleaning region through the mopping member.
  • the cleaning member includes the mopping member.
  • the mopping member such as a mop
  • the mopping member has a limited ability to collect dirt.
  • FIG. 4 shows the relationship between the dirt amount collected by the mopping member (namely, a mopping member dirtiness value d) and the mopping time, assuming the cleaning robot moves forward at a constant speed without repeated mopping over a ground of infinite area with uniform dirt distribution, from the moment the mop is washed until the mopping member dirtiness value reaches its maximum.
  • the cleaning device may be controlled to move to the base station for maintenance, such as cleaning the mopping member, or replacing the mopping member with a clean mopping member.
  • alternatively, the maintenance mechanism of the cleaning device may be controlled to maintain the cleaning device, such as cleaning the mopping member or replacing the mopping member with a clean mopping member.
  • the maximum dirtiness value d_max of the mopping member is an empirical value, which may be measured, for example, in the laboratory.
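  • A minimal sketch of the FIG. 4 relationship follows; the exponential saturation form and the time constant are purely illustrative assumptions, since the patent only describes the curve qualitatively (the collected dirt grows with mopping time and approaches d_max).

```python
import math

# A minimal sketch of the FIG. 4 relationship: the dirt amount d collected by the
# mopping member grows with mopping time and saturates at the maximum dirtiness
# value d_max. The exponential form and the time constant tau are illustrative
# assumptions; the patent only describes the curve qualitatively.

def mopping_member_dirtiness_value(t_minutes: float,
                                   d_max: float = 100.0,
                                   tau_minutes: float = 10.0) -> float:
    """Dirt collected after mopping a uniformly dirty floor for t_minutes."""
    return d_max * (1.0 - math.exp(-t_minutes / tau_minutes))

# As the value approaches d_max, the device should return to the base station (or
# use its maintenance mechanism) to wash or replace the mopping member.
print(round(mopping_member_dirtiness_value(30.0), 1))  # about 95.0
```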
  • the cleaning member includes the dust suction member.
  • the dust suction member defines a certain dirt holding space. In case the dirt sucked by the dust suction member reaches the maximum capacity of the dirt holding space, the dust suction member cannot suck any more dirt and the vacuuming effect on the ground becomes very poor. It can then be determined that the accumulated work amount of the dust suction member, namely the amount of dirt, has reached the workload threshold, and the vacuuming needs to stop.
  • the cleaning device may be controlled to move to the base station for maintenance, such as removing the dirt from the dust suction member or replacing the dust suction member.
  • alternatively, the maintenance mechanism of the cleaning device may be controlled to maintain the cleaning device, such as removing the dirt from the dust suction member or replacing the dust suction member.
  • the maximum dirt amount of the dust suction member is an empirical value, which may be measured, for example, in the laboratory.
  • the dirtiness value corresponding to the preset cleaning region is positively correlated with the mopping member dirtiness value. That is, the greater the mopping member dirtiness value, the dirtier the preset cleaning region.
  • in case the mopping member dirtiness value is equal to the maximum dirtiness value d_max of the mopping member, it can be determined that the preset cleaning region is very dirty, and that there is likely still dirt left in the preset cleaning region that has not been removed by the mopping member after the current mopping of the preset cleaning region is finished.
  • the dirtiness degree corresponding to the preset cleaning region is positively correlated with the dirtiness degree of the dust suction member. That is, the greater the dirtiness degree of the dust suction member, the dirtier the preset cleaning region.
  • in case the dirt sucked by the dust suction member reaches an amount equal to the maximum capacity of the dust suction member, it can be determined that the preset cleaning region is very dirty, and it is more likely that there is still dirt left in the preset cleaning region that has not been sucked up by the dust suction member after the current cleaning of the preset cleaning region is finished, for example, after the vacuuming of the preset cleaning region.
  • the dirtiness degree of the preset cleaning region may be determined according to the dirtiness degree of the mopping member and/or the dirtiness degree of the dust suction member.
  • the dirtiness degree of the cleaning member is acquired by a dirt detection apparatus, such as a vision sensor which is arranged on the base station or the cleaning device.
  • the darker the mopping member, the greater the dirtiness degree of the mopping member; the closer the dirt inside the dust suction member is to the edge of the dust suction member, the greater the dirtiness degree of the dust suction member.
  • the present disclosure is certainly not limited thereto.
  • the dirtiness degree of the cleaning member of the cleaning device may be determined by acquiring the dirtiness degree of the mopping member by a vision sensor mounted on the cleaning device and towards the mopping member, or may be determined by acquiring the dirtiness degree of the dust suction member by a vision sensor mounted on the cleaning device and towards the inside of the dust suction member.
  • the acquiring the dirtiness degree of the cleaning member includes: acquiring detection information of sewage generated during cleaning the mopping member; and determining the dirtiness degree of the mopping member according to the detection information.
  • the dirt detection apparatus includes a sewage detection sensor which is configured to detect, for example, one or more of turbidity information, chroma information, and water conductivity information of the sewage generated by cleaning the mopping member.
  • the dirt amount cleaned off from the mopping member may be determined by the turbidity of the sewage, the chroma of the sewage, or the water conductivity of the sewage.
  • the larger the turbidity, the chroma, or the water conductivity of the sewage, the dirtier the sewage and the greater the dirt amount cleaned off from the mopping member; that is, the greater the dirt elution value of the mopping member (which is configured to characterize the dirt amount cleaned off from the mopping member), the greater the dirt amount absorbed on the mopping member before the mopping member is cleaned, i.e., the greater the dirtiness degree of the mopping member.
  • any one of the turbidity, the chroma, and the water conductivity of the sewage can be configured to characterize the dirt amount cleaned off from the mopping member, namely the dirtiness degree of the mopping member.
  • Each of the turbidity, the chroma, and the water conductivity of the sewage has a positive correlation or corresponding relationship with the dirt elution value, the dirt amount, or the dirtiness degree.
  • the turbidity of the sewage generated by the first cleaning of the mop is detected to be 1 NTU, and the corresponding dirt elution value or dirt amount is 100; the turbidity of the sewage generated by the second cleaning of the mop is detected to be 2 NTU, and the corresponding dirt elution value or dirt amount is 200.
  • the dirt amount cleaned off from the mopping member of the first cleaning is less than the dirt amount cleaned off from the mopping member of the second cleaning, that is, the dirtiness degree of the mopping member at the first cleaning is less than the dirtiness degree of the mopping member at the second cleaning.
  • the corresponding relationship between the chroma or the water conductivity of the sewage and the dirt elution value or the dirt amount is similar, which is not detailed herein.
  • the dirtiness degree may be characterized by any numeric value of the turbidity of the sewage, the chroma of the sewage, the water conductivity of the sewage, the dirt amount, and the dirt elution value; or, the dirtiness degree may be determined by any numeric value of the turbidity of the sewage, the chroma of the sewage, the water conductivity of the sewage, the dirt amount, and the dirt elution value.
  • for example, if the turbidity of the sewage generated during cleaning the mopping member is 1 NTU, the dirtiness degree of the mopping member may be characterized by 1; or, if the turbidity of the sewage generated during cleaning the mopping member is 1 NTU and the corresponding dirt amount is 100, the dirtiness degree of the mopping member is 100.
  • the sewage detection sensor may acquire the detection values at intervals.
  • the dirt amounts corresponding to the detection values may be accumulated to obtain an accumulated result of the dirt amount according to the time and/or the amount of water consumed for cleaning the mopping member.
  • the water amount may be determined according to an amount of clean water supplied to a cleaning tank that contains and cleans the mopping member and/or an amount of sewage discharged from the cleaning tank that contains and cleans the mopping member.
  • a cleaning operation performed on the mopping member between two ground cleaning operations may be viewed as one mopping member cleaning task.
  • the mopping member cleaning task for cleaning the mopping member may include, for example, the process of cleaning the mopping member after cleaning one preset cleaning region and before cleaning another preset cleaning region, and may also include the process of cleaning the mopping member after finishing the cleaning task.
  • the condition for ending the cleaning task is that the dirtiness value of each region in the task map is less than its corresponding dirt amount threshold.
  • the mopping member cleaning task includes one or more stage tasks.
  • clean water is supplied to the cleaning tank of the base station, or is directly supplied to the mopping member to clean the mopping member, and then the sewage generated after cleaning the mopping member is discharged out of the cleaning tank or recycled to a sewage container.
  • the sewage container may be disposed in the base station or in the cleaning robot. This process may be done repeatedly or not.
  • the supplying clean water to clean the mopping member and the discharging or recycling the sewage generated after cleaning the mopping member are performed simultaneously.
  • the present disclosure is certainly not limited thereto. For example, during supplying clean water to the cleaning tank, the sewage generated during cleaning the mopping member is intermittently discharged.
  • the time and/or the amount of water consumed for cleaning the mopping member corresponding to different stage tasks may be the same or different.
  • the dirt amounts corresponding to the detection values acquired during executing all the stage tasks may be accumulated to obtain an accumulated result d of the dirt amount based on the time and/or the amount of water corresponding to one or more stage tasks in the mopping member cleaning task.
  • the determining the dirtiness degree of the mopping member according to the detection information includes: accumulating the dirt amount corresponding to the detection information according to the time and/or the amount of water consumed for cleaning the mopping member, where the water amount may be determined according to the amount of the clean water supplied to the cleaning tank that contains and cleans the mopping member and/or the amount of the sewage discharged from the cleaning tank that contains and cleans the mopping member.
  • the detection information, such as the turbidity of the sewage, may be directly used as the dirt amount; that is, if the turbidity is 1 NTU, the dirt amount is 1.
  • the accumulated result d of the dirt amount may then be computed as d = Σ_{i=1}^{n} T_i · l_i, where T_i represents the turbidity of the sewage at the i-th sampling operation, l_i represents the water amount consumed between two adjacent sampling operations, i is any number of 1, 2, ..., n, and n is the total number of sampling operations.
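  • The accumulation above can be sketched as follows; the sample values are illustrative only.

```python
# A minimal sketch of the accumulation above, d = sum(T_i * l_i): each turbidity
# sample is weighted by the water amount consumed since the previous sample.

def accumulate_dirt(turbidity_samples: list, water_between_samples: list) -> float:
    """Return the accumulated result d of the dirt amount (dirt elution value)."""
    assert len(turbidity_samples) == len(water_between_samples)
    return sum(t * l for t, l in zip(turbidity_samples, water_between_samples))

# Example: three samples of 1.0, 2.0, and 1.5 NTU, each after 0.2 L of water.
print(round(accumulate_dirt([1.0, 2.0, 1.5], [0.2, 0.2, 0.2]), 3))  # 0.9
```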
  • the determining the dirtiness degree of the mopping member according to the detection information includes: pre-estimating the dirtiness degree of the mopping member according to a single piece of detection information. For example, after stopping supplying clean water to the cleaning tank, the sewage is discharged, the turbidity of the sewage is detected once during the discharging, and the amount of sewage discharged is acquired. The turbidity of the sewage and the amount of the sewage are multiplied to obtain the accumulated result d of the dirt amount.
  • the present disclosure is certainly not limited thereto.
  • the sewage may be detected multiple times to acquire a plurality of turbidity values, and an average value, a maximum value, or a minimum value of the plurality of turbidity values is multiplied with the amount of the sewage to obtain the accumulated result d of the dirt amount.
  • the dirt amounts corresponding to the detection information are accumulated according to the time and/or the amount of water consumed for cleaning the mopping member.
  • the accumulated result of the dirt amount represents the dirt amount cleaned off from the mopping member, which may be called a dirt elution value.
  • the dirt elution value of the mopping member cleaning task may be determined according to the dirt elution value of one or more stage tasks in the mopping member cleaning task. For example, the dirt elution values of all the stage tasks in the mopping member cleaning task are accumulated to obtain the dirt elution value of the mopping member cleaning task.
  • the detection information of the sewage may be acquired only once or multiple times.
  • the dirt elution value of the stage task is determined according to the single piece of detection information or the multiple pieces of detection information. For example, the dirt elution value of the stage task is determined according to the product of the average value of the multiple pieces of detection information and the amount of water consumed in the stage task.
  • the dirtiness degree of the mopping member may be determined according to the dirt elution value of one or more stage tasks, or the dirt elution value of the mopping member cleaning task.
  • the dirtiness degree of the mopping member is determined according to the dirt elution value of the first stage task in the mopping member cleaning task.
  • the greater the dirt elution value of the first stage task, the greater the mopping member dirtiness degree.
  • the dirtiness degree of the mopping member is determined based on a maximum value or an average value of the dirt elution values of multiple stage tasks. The greater the maximum value or the average value, the greater the mopping member dirtiness degree.
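  • A hedged sketch of these options follows; the strategy names and the function interface are assumptions, while the patent allows using the first stage, the maximum, the average, or the sum over all stage tasks.

```python
# Hedged sketch of deriving the mopping-member dirtiness degree from the dirt
# elution values of the stage tasks of one mopping member cleaning task.

def task_dirt_elution(stage_elution_values: list) -> float:
    """Dirt elution value of the whole mopping member cleaning task (sum of stages)."""
    return sum(stage_elution_values)

def mopping_dirtiness(stage_elution_values: list, strategy: str = "first") -> float:
    if strategy == "first":
        return stage_elution_values[0]
    if strategy == "max":
        return max(stage_elution_values)
    if strategy == "mean":
        return sum(stage_elution_values) / len(stage_elution_values)
    raise ValueError(f"unknown strategy: {strategy}")

# Example: three stage tasks eluting 3.0, 1.2, and 0.4 units of dirt.
print(task_dirt_elution([3.0, 1.2, 0.4]))           # 4.6
print(mopping_dirtiness([3.0, 1.2, 0.4], "first"))  # 3.0
```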
  • the mopping member cleaning task is performed after the cleaning device finishes mopping the preset cleaning region, for example, through the mopping member.
  • the dirt elution values of all the stage tasks in the mopping member cleaning task are accumulated to obtain the dirt elution value of the mopping member cleaning task.
  • the dirt elution value of the mopping member cleaning task is determined as the dirtiness degree corresponding to the preset cleaning region.
  • Step S120: the cleaning image is generated according to the dirtiness degree corresponding to at least one preset cleaning region.
  • the step of generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region further includes: determining a preset cleaning region with a dirtiness degree greater than or equal to a preset dirtiness degree threshold; and generating the cleaning image according to the preset cleaning region with the dirtiness degree greater than or equal to the preset dirtiness degree threshold.
  • the preset cleaning region with the dirtiness degree less than the preset dirtiness degree threshold may not be displayed in the cleaning image, which gives a more visual representation of the cleaning effect of the cleaning device on the preset cleaning regions.
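  • As a hedged illustration, the sketch below keeps only the preset cleaning regions whose dirtiness degree reaches the preset dirtiness degree threshold; the threshold value and the region labels are assumptions.

```python
# Illustrative sketch: keep only preset cleaning regions whose dirtiness degree is
# at or above the preset dirtiness degree threshold; regions below it are not
# displayed in the cleaning image. The threshold value is an assumption.

DIRTINESS_THRESHOLD = 50.0  # assumed preset dirtiness degree threshold

def regions_to_display(region_dirtiness: dict) -> dict:
    return {r: d for r, d in region_dirtiness.items() if d >= DIRTINESS_THRESHOLD}

# Example: A2 falls below the threshold and is therefore omitted from the image.
print(regions_to_display({"A1": 120.0, "A2": 30.0, "A3": 75.0}))
```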
  • the cleaning image may be generated after the cleaning device finishes cleaning all the preset cleaning regions in the task map, or may be generated after the cleaning device finishes at least one time of cleaning to each of all the preset cleaning regions in the task map, or may be generated after the cleaning device finishes one time of cleaning to at least one preset cleaning region in the task map, which is not limited herein.
  • the cleaning image includes an image region corresponding to the preset cleaning region.
  • the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining a target filling pattern of the image region according to a value range where the dirtiness degree corresponding to the preset cleaning region is located; and marking the image region according to the target filling pattern, where different value ranges correspond to different target filling patterns.
  • the target filling pattern may include at least one selected from the group of color, line, shade, pattern, numeric value, or other filling pattern.
  • the target filling pattern may be preset or may be set by a user, which is not limited herein. It should be understood that the target filling pattern may be expanded.
  • the value range where the dirtiness degree corresponding to the preset cleaning region is located is determined, and the target filling pattern of the image region is determined according to the value range where the dirtiness degree corresponding to the preset cleaning region is located. For example, the greater the dirtiness degree corresponding to the preset cleaning region, the darker the color in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the denser the lines in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the deeper the shade in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the denser the patterns in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the greater the numeric value in the target filling pattern.
  • other filling patterns, such as text, may also be extended, which is not limited herein.
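  • A minimal sketch of mapping a dirtiness degree to a target filling pattern via value ranges follows; the range boundaries and the colors are assumptions, chosen so that darker colors indicate greater dirtiness.

```python
# A minimal sketch of choosing a target filling pattern (here, a fill color) from
# the value range in which a region's dirtiness degree falls. The range boundaries
# and colors are assumptions; darker colors indicate greater dirtiness.

VALUE_RANGES = [               # (exclusive upper bound of the range, fill color)
    (100.0, "#f5e6c8"),        # lightly dirty
    (300.0, "#d9a441"),        # moderately dirty
    (float("inf"), "#7a4a12"), # heavily dirty
]

def target_fill_color(dirtiness: float) -> str:
    for upper_bound, color in VALUE_RANGES:
        if dirtiness < upper_bound:
            return color
    return VALUE_RANGES[-1][1]

def mark_image_region(cleaning_image: dict, region: str, dirtiness: float) -> None:
    """Mark the image region corresponding to `region` with its target fill color."""
    cleaning_image[region] = target_fill_color(dirtiness)

# Example: a region with dirtiness degree 250 gets the medium fill color.
image = {}
mark_image_region(image, "a1", 250.0)
print(image)  # {'a1': '#d9a441'}
```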
  • the dirtiness degree corresponding to the preset cleaning region can be characterized by the number of times the preset cleaning region has been cleaned in a single cleaning task. In this case, generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining a target filling pattern of the image region according to the number of times the preset cleaning region has been cleaned during a single cleaning task; and marking the image region with the target filling pattern, where different numbers of cleaning times correspond to different target filling patterns. For example, the preset cleaning region A1 has been cleaned twice in the cleaning task, the preset cleaning region A2 has been cleaned once in the cleaning task, the preset cleaning region A3 has been cleaned twice in the cleaning task, and the preset cleaning region A4 has been cleaned once in the cleaning task.
  • the preset cleaning regions A1 and A3 correspond to relatively high dirtiness degrees, and they have been repeatedly cleaned by the cleaning member of the cleaning device during the cleaning task. Meanwhile, the dirtiness degrees corresponding to the preset cleaning regions A2 and A4 are relatively low, and the preset cleaning regions A2 and A4 have been cleaned only once by the cleaning member of the cleaning device during the cleaning task. Therefore, the image region a1 corresponding to the preset cleaning region A1 and the image region a3 corresponding to the preset cleaning region A3 can be filled with a darker colored target filling pattern, while the image region a2 corresponding to the preset cleaning region A2 and the image region a4 corresponding to the preset cleaning region A4 can be filled with a lighter colored target filling pattern, as illustrated by the sketch below.
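```python
# Sketch of the count-based variant above: when the dirtiness degree is
# characterized by how many times a region was cleaned in a single task, regions
# cleaned more often get a darker fill. The colors are assumptions.

FILL_BY_CLEAN_COUNT = {1: "#f5e6c8", 2: "#7a4a12"}  # cleaned once: light; twice: dark

cleaned_times = {"A1": 2, "A2": 1, "A3": 2, "A4": 1}
image_fill = {f"a{region[1:]}": FILL_BY_CLEAN_COUNT[count]
              for region, count in cleaned_times.items()}
print(image_fill)  # {'a1': '#7a4a12', 'a2': '#f5e6c8', 'a3': '#7a4a12', 'a4': '#f5e6c8'}
```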
  • the determining the target filling pattern of the image region according to the value range where the dirtiness degree corresponding to the preset cleaning region is located includes: determining the target filling pattern of the image region corresponding to the preset cleaning region according to the value range where at least one dirtiness degree corresponding to the preset cleaning region is located, in case the number of cleaning times of the preset cleaning region is greater than 1.
  • a dirtiness degree corresponding to the preset cleaning region may be acquired after the cleaning device has cleaned the preset cleaning region one time.
  • the dirtiness degree corresponding to each cleaning may be acquired after each cleaning of the preset cleaning region.
  • the preset cleaning region has been cleaned five times, and a dirtiness degree is acquired after each time of cleaning, that is, five dirtiness degrees are acquired in total.
  • the target filling pattern of the image region corresponding to the preset cleaning region is determined by any one of the five dirtiness degrees, or determined by an accumulated value of at least two dirtiness degrees among the five dirtiness degrees.
  • the preset cleaning region has been cleaned five times, and the dirtiness degree is acquired after only two of the five cleanings, that is, two dirtiness degrees are acquired in total.
  • the target filling pattern of the image region corresponding to the preset cleaning region is determined by either of the two dirtiness degrees, or determined by an accumulated value of the two dirtiness degrees.
  • a remaining number of cleaning times of the preset cleaning region may be predicted according to the first acquired dirtiness degree corresponding to the preset cleaning region, for example, predicting the remaining number of cleaning times to be 4. In case the predicted remaining number of cleaning times of the preset cleaning region is greater than 1, the dirtiness degree of the preset cleaning region may be acquired after only a few of these cleanings, to generate the cleaning image.
  • the dirtiness degree of the preset cleaning region corresponding to each time of cleaning may be acquired, and the cleaning image is generated according to the dirtiness degree of the preset cleaning region corresponding to each time of cleaning.
  • the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: generating a first cleaning image according to a dirtiness degree corresponding to a first target preset cleaning region.
  • the first target preset cleaning region is one preset cleaning region of the at least two preset cleaning regions, and the first target preset cleaning region is the last cleaned preset cleaning region among the at least one preset cleaning region that has been cleaned by the cleaning device in a current cleaning task.
  • the first cleaning image includes the image regions corresponding to all the preset cleaning regions; at least the image region corresponding to the first target preset cleaning region is marked with a target filling pattern, and the target filling pattern is determined according to the last acquired dirtiness degree corresponding to the first target preset cleaning region.
  • the method further includes: skipping marking an image region corresponding to a non-first target preset cleaning region with a target filling pattern; or marking an image region corresponding to a non-first target preset cleaning region with a preset target filling pattern; or marking an image region corresponding to a non-first target preset cleaning region with a target filling pattern determined according to the last acquired dirtiness degree corresponding to the non-first target preset cleaning region, where the non-first target preset cleaning region is a preset cleaning region other than the first target preset cleaning region. These three options are illustrated by the sketch below.
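```python
# A hedged sketch of assembling such a first cleaning image; the fill colors, the
# option names, and the data layout are assumptions, not the patent's implementation.

from typing import Dict, List, Optional

PRESET_FILL = "#e0e0e0"  # assumed preset target filling pattern

def fill_for(dirtiness: float) -> str:
    # Darker fill for a dirtier region (range boundaries are illustrative).
    return "#7a4a12" if dirtiness >= 300 else "#d9a441" if dirtiness >= 100 else "#f5e6c8"

def first_cleaning_image(all_regions: List[str],
                         last_dirtiness: Dict[str, float],
                         first_target: str,
                         non_target_mode: str = "keep") -> Dict[str, Optional[str]]:
    """Build {image region: fill} for one first cleaning image.

    The first target region is filled according to its last acquired dirtiness
    degree; non-first-target regions are left unfilled ("skip"), given the
    preset fill ("preset"), or keep a fill from their own last dirtiness ("keep").
    """
    image: Dict[str, Optional[str]] = {}
    for region in all_regions:
        if region == first_target:
            image[region] = fill_for(last_dirtiness[region])
        elif non_target_mode == "skip":
            image[region] = None
        elif non_target_mode == "preset":
            image[region] = PRESET_FILL
        else:  # "keep"
            d = last_dirtiness.get(region)
            image[region] = fill_for(d) if d is not None else None
    return image

# Example: after a2 is cleaned, a1 keeps its earlier fill and a3/a4 stay empty.
print(first_cleaning_image(["a1", "a2", "a3", "a4"],
                           {"a1": 120.0, "a2": 350.0}, first_target="a2"))
```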
  • the cleaning device cleans the preset cleaning regions A1, A2, A3, and A4 according to a cleaning sequence of from A1 to A2 to A3 to A4.
  • the cleaning sequence of the preset cleaning regions is not limited thereto.
  • the cleaning sequence of from A1 to A2 to A3 to A4 is used as an example for the description below.
  • the preset cleaning regions have one-to-one correspondence to the image regions in the cleaning image.
  • the preset cleaning region A1 corresponds to the image region a1
  • the preset cleaning region A2 corresponds to the image region a2, and so on.
  • the cleaning device first cleans the preset cleaning region A1 one time, and a first cleaning image (a) , as shown in FIG. 6 (a) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A1.
  • the first target cleaning region is the preset cleaning region A1.
  • the first cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4.
  • the image region a1 corresponding to the preset cleaning region A1 is marked with color, which is determined by the dirtiness degree corresponding to the preset cleaning region A1.
  • the image regions a2 to a4 corresponding to the non-first target cleaning regions A2 to A4 other than the preset cleaning region A1 are marked with no target filling pattern.
  • a first cleaning image (b) may be generated according to the dirtiness degree corresponding to the preset cleaning region A2.
  • the first target cleaning region is the preset cleaning region A2.
  • the first cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4.
  • the image region a1 corresponding to the preset cleaning region A1 is marked with color
  • the image region a2 corresponding to the preset cleaning region A2 is marked with color.
  • the color marked in the image region a1 is determined by the last acquired dirtiness degree corresponding to the preset cleaning region A1.
  • the color marked in the image region a1 of the first cleaning image (b) is the same as the color marked in the image region a1 of the first cleaning image (a) .
  • the color marked in the image region a2 is determined by the dirtiness degree corresponding to the preset cleaning region A2.
  • the image regions a3 and a4 corresponding to the preset cleaning regions A3 and A4 are marked with no target filling pattern.
  • the image regions corresponding to the non-first target regions in the first cleaning image (a) and the first cleaning image (b) may be marked with no target filling pattern, or may be marked with the preset target filling pattern, which is not limited herein.
  • in the embodiment as shown in FIG. 6, the first cleaning image (b) retains the target filling pattern marked in the image region a1 corresponding to the preset cleaning region A1 in the first cleaning image (a).
  • the preset cleaning region A1 and the preset cleaning region A2 are different preset cleaning regions in the task map, and the dirtiness degree corresponding to the preset cleaning region A2 does not affect the dirtiness degree corresponding to the preset cleaning region A1. Therefore, in the first cleaning image (b) , the target filling pattern marked in the image region a2 does not affect the target filling pattern marked in the image region a1, and the target filling pattern marked in the image region a1 of the first cleaning image (b) is the same as the target filling pattern marked in the image region a1 of the first cleaning image (a) .
  • similarly, a first cleaning image (c) may be generated according to the dirtiness degree corresponding to the preset cleaning region A3; at this time, the first target cleaning region is the preset cleaning region A3. If the cleaning device then cleans the preset cleaning region A4 one time, a first cleaning image (d), as shown in FIG. 6 (d), may be generated according to the dirtiness degree corresponding to the preset cleaning region A4; at this time, the first target cleaning region is the preset cleaning region A4.
  • the first cleaning image is generated upon acquiring the dirtiness degree corresponding to the first target preset cleaning region, that is, a first cleaning image is displayed for each completed cleaning of one preset cleaning region (i.e., for each completed dirt detection) .
  • the first cleaning image (a) , the first cleaning image (b) , the first cleaning image (c) , and the first cleaning image (d) are displayed after the preset cleaning regions A1, A2, A3, and A4 have been cleaned one time, respectively.
  • at least one first cleaning image is generated after the cleaning task is finished.
  • any one of the first cleaning images (a) to (d) can be selected to be displayed by user’s clicking on any one of the labels on the screen, which are shown by the numbers 1 to 5 in FIG. 7.
  • at least two first cleaning images are generated successively or simultaneously after the cleaning task is finished, for example, at least two first cleaning images are displayed successively or simultaneously after the cleaning task is finished.
  • the first cleaning images may be selected to be displayed by the user clicking on at least two of the labels on the screen, which are shown by the numbers 1 to 5 in FIG. 7; or, as shown in FIG. 8, a plurality of first cleaning images may be displayed successively by way of the user clicking an icon “cleaning image” on the screen; or, as shown in FIG. 9, a plurality of first cleaning images are displayed on the screen by way of the user clicking an icon “cleaning image” on the screen.
  • the cleaning process of the cleaning device can be shown according to at least one first cleaning image, for example, the actual cleaning sequence of the cleaning device for each preset cleaning region and the dirtiness degree corresponding to each preset cleaning region can be shown.
  • for example, the cleaning device first starts to clean the preset cleaning region A1, and the target filling pattern marked in the image region a1 shows the dirtiness degree of the preset cleaning region A1 before cleaning and the dirt amount collected by the cleaning device from the preset cleaning region A1 in that cleaning.
  • the cleaning device cleans the preset cleaning regions A1, A2, A3, and A4 according to a cleaning sequence of from A1 to A2 to A3 to A4.
  • the cleaning sequence of the preset cleaning regions is not limited thereto.
  • the cleaning sequence of from A1 to A2 to A3 to A4 is used as an example for the description below.
  • the cleaning device first cleans the preset cleaning region A1 one time, and a first cleaning image (a), as shown in FIG. 10 (a), is generated according to the dirtiness degree corresponding to the preset cleaning region A1 acquired after the current cleaning of the preset cleaning region A1 by the cleaning device.
  • the preset cleaning region A1 is the first target cleaning region.
  • the first cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4.
  • the image region a1 corresponding to the preset cleaning region A1 is marked with color, which is determined by the dirtiness degree corresponding to the preset cleaning region A1. If the cleaning device needs to continue cleaning the preset cleaning region A1 for the second time after finishing the first cleaning of the preset cleaning region A1, the preset cleaning region A1 is still the first target cleaning region, and a first cleaning image (b) , as shown in FIG. 10 (b) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A1 after the second cleaning.
  • the first cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the color marked in the image region a1 corresponding to the preset cleaning region A1 is determined by the dirtiness degree corresponding to the preset cleaning region A1 acquired for the second time.
  • a first cleaning image (c) may be generated according to the dirtiness degree corresponding to the preset cleaning region A1 acquired for the third time. This process is continued until the dirtiness degree corresponding to the preset cleaning region A1 is lower than the preset dirtiness degree threshold.
  • the cleaning process of one preset cleaning region by the cleaning device can be shown according to at least one first cleaning image.
  • the number of times of the cleaning device cleaning the preset cleaning region and the change in the cleaning effect of the preset cleaning region after multiple times of cleaning can be shown.
  • the first cleaning image (a) and the first cleaning image (b) in FIG. 10 show that the preset cleaning region A1 has been cleaned twice, and the change in the cleaning effect of the preset cleaning region A1 is reflected by the change in the target filling pattern marked in the image region a1.
  • the cleaning device in case the cleaning device finishes cleaning one preset cleaning region, i.e., the dirtiness degree corresponding to the preset cleaning region is lower than the preset dirtiness degree threshold, the cleaning device continues to clean another preset cleaning region and generates at least one first cleaning image according to the dirtiness degree corresponding to the another preset cleaning region. For example, after the cleaning device has cleaned the preset cleaning region A1 three times, the dirtiness degree corresponding to the preset cleaning region A1 is lower than the preset dirtiness degree threshold, then the cleaning device continues to clean the preset cleaning region A2. At this time, the preset cleaning region A2 is the first target cleaning region.
  • at this time, a first cleaning image (d), as shown in FIG. 10 (d), may be generated according to the dirtiness degree corresponding to the preset cleaning region A2.
  • the cleaning process of the cleaning device can be reflected by at least one of the first cleaning images. For example, the number of times of the cleaning device cleaning each preset cleaning region, the change in the dirtiness degree of each preset cleaning region after each time of cleaning, and the cleaning sequence of the cleaning device cleaning the preset cleaning regions can be shown.
  • the preset cleaning regions A2 to A4 are the non-first target regions, and the image regions a2 to a4 corresponding to the preset cleaning regions A2 to A4 are marked with no target filling pattern, indicating that the preset cleaning regions A2 to A4 have not been cleaned yet.
  • in the first cleaning image (d) , the preset cleaning region A1 is the non-first target region, and the target filling pattern marked in the image region a1 corresponding to the preset cleaning region A1 is the same as the target filling pattern marked in the image region a1 of the first cleaning image (c) , indicating that the preset cleaning region A1 has not been cleaned again.
  • the image regions a2 to a4 of the first cleaning images (a) to (c) may also be marked with a preset target filling pattern, indicating that the preset cleaning regions A2 to A4 have not been cleaned yet.
  • the image region a1 of the first cleaning image (d) may be marked with no target filling pattern or may be marked with a preset target filling pattern, indicating that the preset cleaning region A1 is not cleaned any more, which is not detailed herein.
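The per-region control flow described above (the cleaning device repeatedly cleans the current first target cleaning region, acquires its dirtiness degree, generates a first cleaning image after each pass, and moves on once the dirtiness degree falls below the preset dirtiness degree threshold) can be sketched in Python. This is only an illustrative sketch: the helper names clean_once, render_image and pattern_for, as well as the threshold and color values, are assumptions for the example and not an implementation prescribed by the present disclosure.

```python
# Hypothetical sketch only: helper names, thresholds and colors are illustrative.

DIRTINESS_THRESHOLD = 300          # assumed preset dirtiness degree threshold

def pattern_for(dirtiness):
    """Map a dirtiness degree to a target filling pattern (here, a color name)."""
    if dirtiness >= 600:
        return "dark"
    if dirtiness >= DIRTINESS_THRESHOLD:
        return "medium"
    return "light"

def clean_regions(regions, clean_once, render_image):
    """clean_once(region) -> dirtiness degree acquired after one cleaning pass;
    render_image(patterns) -> one first cleaning image covering all image regions."""
    images, patterns = [], {r: None for r in regions}    # None = not marked yet
    for region in regions:                               # e.g. A1 -> A2 -> A3 -> A4
        while True:
            dirtiness = clean_once(region)
            patterns[region] = pattern_for(dirtiness)
            images.append(render_image(dict(patterns)))  # one image per pass
            if dirtiness < DIRTINESS_THRESHOLD:          # clean enough, move on
                break
    return images
```

Each pass over a region appends one image, so the number of first cleaning images per region equals the number of cleaning passes of that region.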
  • the first cleaning image is generated upon acquiring the dirtiness degree corresponding to the first target preset cleaning region, that is, a first cleaning image is displayed for each completed cleaning of one preset cleaning region (i.e., for each completed dirt detection) .
  • each of the first cleaning image (a) , the first cleaning image (b) , the first cleaning image (c) , and the first cleaning image (d) is displayed after each cleaning of the preset cleaning region A1 or the preset cleaning region A2 respectively.
  • at least two first cleaning images may be generated successively or simultaneously after the cleaning task is finished, that is, a plurality of first cleaning images are displayed successively or simultaneously after finishing the cleaning task.
  • for example, as shown in FIG. 11, the corresponding first cleaning image may be selected for display by the user clicking any of the labels on the screen, which are shown by the number 1 to the number 5; or, as shown in FIG. 12, a plurality of first cleaning images may be displayed successively by the user clicking an icon “cleaning image” on the screen; or, as shown in FIG. 13, a plurality of first cleaning images may be displayed on the screen by clicking an icon on the screen.
  • an animation or a short video is generated according to at least one first cleaning image.
  • the at least one first cleaning image is dynamically displayed with the playback of the animation or the short video; or, the animation or the short video shows the changes of the preset cleaning regions corresponding to a plurality of first cleaning images.
  • the plurality of first cleaning images (a) to (d) may be successively shown by the animation or the short video; the animation or the short video can also show that the target filling patterns marked in the image region a1, the image region a2, the image region a3, and the image region a4 change successively over time, where the target filling pattern marked in each image region is determined according to the dirtiness degree corresponding to each preset cleaning region.
  • the cleaning process of the cleaning device can be shown by the animation or the short video.
  • the actual cleaning sequence of the cleaning device cleaning the preset cleaning regions and the dirtiness degree corresponding to each preset cleaning region after being cleaned can be shown.
  • the first cleaning images (a) to (d) are displayed successively, to indicate that the cleaning device cleaned the preset cleaning regions A1, A2, A3 and A4 in the sequence from A1 to A2 to A3 to A4; also, it is possible to reflect the change in the cleaning effect of the preset cleaning regions A1 to A4 by the change in the target filling patterns marked in the image regions a1 to a4 of the first cleaning images (a) to (d) . This helps users to understand the cleaning process of the cleaning device and the dirtiness degree of the preset cleaning regions during the cleaning process.
  • the plurality of first cleaning images may be successively shown by the animation or the short video; also, it is possible to show the change in the target filling pattern marked in the image region a1 and the change of the target filling pattern marked in image region a2 over time through the animation or the short video, where the target filling pattern marked in each image region is determined according to the dirtiness degree corresponding to each preset cleaning region.
  • the cleaning process for each preset cleaning region can be shown by the animation or the short video. For example, the number of times of the cleaning device cleaning the preset cleaning region, the change of the dirtiness degree corresponding to each preset cleaning region after each time of cleaning, and the cleaning sequence for cleaning the preset cleaning regions are shown. This helps users to understand the cleaning process of the cleaning device, and the process of making the preset cleaning regions gradually become clean after multiple times of cleaning by the cleaning device.
  • the animation or the short video may be generated and displayed in real time during the cleaning process of the cleaning device; or may be generated and displayed after the cleaning process of the cleaning device is finished, to show the cleaning process of the cleaning device, which is not limited herein.
  • the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining at least one second target preset cleaning region, where the second target preset cleaning region is the preset cleaning region that has been cleaned i times, and i is an integer greater than or equal to 1; and generating an i-th second cleaning image according to at least one target dirtiness degree, where the target dirtiness degree is a dirtiness degree corresponding to the second target preset cleaning region acquired after the i-th cleaning of the second target preset cleaning region, the i-th second cleaning image includes the image regions corresponding to all the preset cleaning regions, the image region corresponding to each second target preset cleaning region is marked with a target filling pattern, and the target filling pattern marked in the image region corresponding to each second target preset cleaning region is determined according to the target dirtiness degree corresponding to each second target preset cleaning region acquired for the i-th time.
  • the method further includes: skipping marking an image region corresponding to a non-second target preset cleaning region with a target filling pattern; or marking an image region corresponding to a non-second target preset cleaning region with a preset target filling pattern; or marking an image region corresponding to a non-second target preset cleaning region with a target filling pattern determined according to a last acquired dirtiness degree corresponding to the non-second target preset cleaning region; where the non-second target preset cleaning region is a preset cleaning region other than the second target preset cleaning region.
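As a hedged illustration of the rule just described, an i-th second cleaning image could be assembled as follows; dirtiness_history, pattern_for and the non_target_mode options are hypothetical names introduced only for this sketch, not terminology or an API of the present disclosure.

```python
# Hypothetical sketch: given, for each preset cleaning region, the list of dirtiness
# degrees acquired after each of its cleanings, build the i-th second cleaning image.

def ith_second_cleaning_image(dirtiness_history, i, pattern_for,
                              non_target_mode="last"):
    """dirtiness_history: {region: [d_after_1st, d_after_2nd, ...]}.
    Regions cleaned at least i times are the second target preset cleaning regions."""
    patterns = {}
    for region, history in dirtiness_history.items():
        if len(history) >= i:                  # second target preset cleaning region
            patterns[region] = pattern_for(history[i - 1])
        elif non_target_mode == "last" and history:
            patterns[region] = pattern_for(history[-1])   # last acquired dirtiness
        elif non_target_mode == "preset":
            patterns[region] = "preset-pattern"           # fixed preset pattern
        else:
            patterns[region] = None                       # skip marking
    return patterns                            # one entry per image region
```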
  • a first second cleaning image such as the second cleaning image (a) as shown in FIG. 14 (a) or FIG. 15 (a) , may be generated according to the target dirtiness degrees corresponding to all the preset cleaning regions that have been cleaned one time, such as the dirtiness degrees corresponding to the preset cleaning regions A1 to A4 acquired after the preset cleaning regions A1 to A4 have been cleaned one time.
  • the second cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and each of the image regions a1 to a4 is marked with a target filling pattern.
  • the target filling patterns of the image regions a1 to a4 are determined according to the respective target dirtiness degree corresponding to each of the preset cleaning regions A1 to A4, to indicate the dirtiness degree of the preset cleaning regions A1 to A4 before the first time of cleaning.
  • a second second cleaning image, such as the second cleaning image (b) as shown in FIG. 14 (b) or FIG. 15 (b) , may be generated.
  • the second cleaning image (b) may be generated according to the target dirtiness degree corresponding to all the preset cleaning regions that have been cleaned two times, such as the dirtiness degrees acquired after each of the preset cleaning regions A1, A2 and A4 has been cleaned for the second time.
  • the second cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the target filling patterns marked in the image regions a1, a2, and a4 are determined according to the corresponding target dirtiness degrees respectively, to indicate the dirtiness degrees of the preset cleaning regions A1, A2, and A4 after the second time of cleaning.
  • since the preset cleaning region A3 is not cleaned for the second time, the preset cleaning region A3 is not a second target preset cleaning region determined at the time of generating the second second cleaning image (b) .
  • for example, in FIG. 14 (b) , the target filling pattern marked in the image region a3 is determined according to the corresponding dirtiness degree acquired after the first cleaning of the preset cleaning region A3, to show that there is no change in the dirtiness degree of the preset cleaning region A3, which indicates that the preset cleaning region A3 has not been cleaned for the second time.
  • the image region a3 may be marked with no target filling pattern or may be marked with a preset target filling pattern, to indicate that the preset cleaning region A3 has not been cleaned for the second time, which is not detailed herein.
  • a second cleaning image (c) as shown in FIG. 14 (c) or 15 (c) is generated according to the dirtiness degrees acquired after the cleaning device has cleaned the preset cleaning region A1 and the preset cleaning region A2 for the third time, to indicate the dirtiness degrees of the preset cleaning regions A1 and A2 after the third time of cleaning.
  • the workload of the cleaning device can be reflected by one second cleaning image.
  • the amount of dirt collected by the cleaning device from each preset cleaning region can be reflected.
  • the working process of the cleaning device can be reflected by at least two second cleaning images.
  • the change in the dirtiness degree of each preset cleaning region after multiple times of cleaning can be reflected.
  • a second cleaning image is generated after the cleaning task is finished. That is, only one second cleaning image is displayed after finishing the cleaning task.
  • the second cleaning image may be selected for display by, for example, clicking a label on the screen, such as the label “the first time” , the label “the second time” , or the label “the third time” as shown in FIG. 16. As shown in FIG. 16, any one of the second cleaning images (a) to (c) can be selected for display.
  • At least two second cleaning images are generated successively or simultaneously after the cleaning task is finished. That is, at least two second cleaning images are displayed successively or simultaneously after finishing the cleaning task. For example, after the cleaning task is finished, a user may click an icon on the screen, such as the icon “cleaning image” as shown in FIG. 17, to successively display the plurality of second cleaning images; or the at least two second cleaning images are displayed simultaneously on the screen, as shown in FIG. 18. In this way, the second cleaning images (a) to (c) can be displayed successively or simultaneously.
  • the i-th second cleaning image is generated according to at least one target dirtiness degree after determining that all the second target preset cleaning regions have been cleaned for the i-th time.
  • the cleaning device cleans each of the preset cleaning regions A1, A2, A3, and A4 one time according to a predetermined cleaning sequence, then cleans those of the preset cleaning regions A1, A2, A3, and A4 that need to be cleaned for the second time, and so on, until the dirtiness degree corresponding to each of the preset cleaning regions A1, A2, A3, and A4 is less than the dirtiness degree threshold.
  • in this case, the first second cleaning image, such as the second cleaning image (a) , is generated after the cleaning device has cleaned each of the preset cleaning regions A1 to A4 one time; and the second second cleaning image, such as the second cleaning image (b) , is generated after the cleaning device has cleaned the preset cleaning regions that need to be cleaned for the second time, such as the preset cleaning regions A1, A2, and A4.
  • an animation or a short video is generated according to at least one second cleaning image.
  • at least one second cleaning image is dynamically displayed with the playback of the animation or the short video; or, the changing processes of the preset cleaning regions corresponding to the plurality of second cleaning images are displayed by the animation or the short video.
  • the second cleaning images (a) , (b) , and (c) may be successively displayed by the animation or the short video, to show the changes in the dirtiness degree after the preset cleaning regions A1, A2, A3, and A4 have been cleaned at least one time. This helps the user to understand the cleaning process of the cleaning device and the process of the preset cleaning regions gradually becoming clean.
  • the animation or the short video may be generated and displayed in real time during the cleaning process of the cleaning device; or may be generated and displayed after the cleaning process of the cleaning device is finished, to reproduce the cleaning process of the cleaning device, which is not limited herein.
  • the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: generating a third cleaning image based on an accumulated amount of the acquired dirtiness degrees corresponding to the at least one preset cleaning region.
  • a third cleaning image is generated after each cleaning of one preset cleaning region.
  • each third cleaning image includes the image regions corresponding to all the preset cleaning regions; the target filling pattern marked in the image region corresponding to each preset cleaning region is determined by a sum dirtiness degree corresponding to that preset cleaning region, and the sum dirtiness degree corresponding to each preset cleaning region is the sum of the dirtiness degrees acquired after each cleaning of that preset cleaning region. The accumulation rule is sketched below.
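A hedged sketch of this accumulation rule follows (hypothetical Python; the running totals 500, 800 and 900 marked in the image region a1 of the example figures would result from such sums, with the individual per-cleaning values 500, 300 and 100 inferred from those totals).

```python
# Hypothetical sketch: each third cleaning image marks every cleaned region with the
# running sum of the dirtiness degrees acquired after its cleanings so far.

def third_cleaning_images(cleaning_log):
    """cleaning_log: ordered list of (region, dirtiness_acquired_after_this_cleaning)."""
    totals, images = {}, []
    for region, dirtiness in cleaning_log:
        totals[region] = totals.get(region, 0) + dirtiness   # sum dirtiness degree
        images.append(dict(totals))           # snapshot = one third cleaning image
    return images

# e.g. cleanings of A1 yielding 500, 300, 100 give marks 500 -> 800 -> 900 for a1,
# and the first cleaning of A2 then adds a mark of 500 for a2.
images = third_cleaning_images([("A1", 500), ("A1", 300), ("A1", 100), ("A2", 500)])
```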
  • the cleaning device cleans the preset cleaning regions A1, A2, A3, and A4 according to the cleaning sequence of from A1 to A2 to A3 to A4.
  • the cleaning sequence of the preset cleaning regions is not limited thereto.
  • the cleaning sequence of from A1 to A2 to A3 to A4 is used as an example for the description below.
  • the cleaning device has cleaned the preset cleaning region A1 a total of three times, has cleaned the preset cleaning region A2 a total of three times, has cleaned the preset cleaning region A3 one time, and has cleaned the preset cleaning region A4 a total of two times.
  • a third cleaning image (a) may be generated according to the dirtiness degree corresponding to the preset cleaning region A1 acquired after the first cleaning of the preset cleaning region A1 by the cleaning device.
  • the third cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the image region a1 corresponding to the preset cleaning region A1 is marked with a numeric value of 500, which is determined by the dirtiness degree of the preset cleaning region A1 acquired after the first cleaning. If the cleaning device needs to continue cleaning the preset cleaning region A1 for the second time after finishing the first cleaning, a third cleaning image (b) may be generated according to an accumulated value of the two dirtiness degrees corresponding to the two times of cleaning of the preset cleaning region A1.
  • the third cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4.
  • the numeric value of 800 marked in the image region a1 corresponding to the preset cleaning region A1 is determined by the accumulated value of the dirtiness degree corresponding to the preset cleaning region A1 acquired for the second time and the dirtiness degree corresponding to the preset cleaning region A1 acquired for the first time. It should be understood that, after the third time of cleaning of the preset cleaning region A1, a third cleaning image (c) may be generated according to the accumulated value of the three dirtiness degrees corresponding to the preset cleaning region A1.
  • a third cleaning image (d) is generated according to the accumulated value of the three dirtiness degrees corresponding to the preset cleaning region A1 and an accumulated dirtiness degree corresponding to the preset cleaning region A2.
  • the third cleaning image (d) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4.
  • the numeric value of 900 marked in the image region a1 corresponding to the preset cleaning region A1 is determined by the accumulated value of the three corresponding dirtiness degrees.
  • the numeric value of 500 marked in the image region a2 corresponding to the preset cleaning region A2 is determined by one corresponding dirtiness degree.
  • the image regions corresponding to the uncleaned preset cleaning regions are marked with no target filling pattern or marked with a preset target filling pattern.
  • the image regions a2 to a4 in the third cleaning images (a) to (c) are marked with no target filling pattern, or marked with a preset target filling pattern, which is not limited herein.
  • a plurality of third cleaning images are generated.
  • the cleaning process of one preset cleaning region by the cleaning device can be reflected based on at least two third cleaning images; for example, the number of cleaning times of each preset cleaning region and the change in the dirt amount collected by the cleaning device while cleaning the preset cleaning region can be reflected.
  • the dirtiness degree of each preset cleaning region before being cleaned and the accumulated dirt amount collected by the cleaning device from each preset cleaning region can be displayed based on the last third cleaning image, for example as follows.
  • the numeric value of 900 marked in the image region a1 corresponding to the preset cleaning region A1 is determined according to the accumulated dirtiness degree of the preset cleaning region A1 after three times of cleaning of the preset cleaning region A1; the numeric value of 900 marked in the image region a2 corresponding to the preset cleaning region A2 is determined according to the accumulated dirtiness degree of the preset cleaning region A2 after three times of cleaning of the preset cleaning region A2; the numeric value of 100 marked in the image region a3 corresponding to the preset cleaning region A3 is determined according to the accumulated dirtiness degree of the preset cleaning region A3 after one time of cleaning of the preset cleaning region A3; and the numeric value of 400 marked in the image region a4 corresponding to the preset cleaning region A4 is determined according to the accumulated dirtiness degree of the preset cleaning region A4 after two times of cleaning of the preset cleaning region A4. This shows the difference in the accumulated dirt amount collected by the cleaning device from each preset cleaning region after different numbers of cleaning times.
  • the target filling pattern may be a numeric value, or a color.
  • one third cleaning image (a) may be generated according to the acquired dirtiness degree corresponding to the preset cleaning region A1.
  • the third cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the image region a1 corresponding to the preset cleaning region A1 is filled with color, and the color is determined by the dirtiness degree corresponding to the first cleaning of the preset cleaning region A1.
  • the color is used as the target filling pattern to show the dirtiness degree of each preset cleaning region before it is cleaned and the accumulated dirt amount collected by the cleaning device from each preset cleaning region.
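Purely as an assumed illustration of the color variant, a mapping from dirtiness degree to a fill color might look like the following; the present disclosure does not fix any particular color scale or threshold values.

```python
# Hypothetical sketch: map an (accumulated) dirtiness degree to a fill color for the
# corresponding image region, dirt-heat-map style. Thresholds are illustrative only.

def color_for(dirtiness):
    if dirtiness is None:
        return "unfilled"        # region not cleaned yet
    if dirtiness >= 800:
        return "red"
    if dirtiness >= 400:
        return "orange"
    if dirtiness > 0:
        return "yellow"
    return "green"
```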
  • an animation or a short video is generated according to the third cleaning images.
  • the gradual change of the target filling patterns of the image regions corresponding to the preset cleaning regions included in the third cleaning images is dynamically displayed, to reflect the process of removing the dirt from the preset cleaning regions, and the accumulation process of the dirt elution values while the mopping member cleans the preset cleaning regions.
  • a plurality of third cleaning images may be dynamically shown by the animation or the short video successively according to the generation sequence of the third cleaning images, such that the cleaning process of the cleaning device can be shown by the animation or the short video.
  • the accumulation process of the dirt collected by the cleaning device from the preset cleaning regions is represented. This helps the user to understand the cleaning process of the cleaning device, and the accumulated dirtiness degree of each preset cleaning region after at least one cleaning of the preset cleaning region by the cleaning device.
  • the animation or the short video may be generated and displayed in real time during the cleaning process of the cleaning device; or may be generated and displayed after finishing the cleaning process of each preset cleaning region, to reproduce the cleaning process of the cleaning device, which is not limited herein.
  • the cleaning image includes a room region, and the room region corresponds to at least one preset cleaning region.
  • the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining a target filling pattern of the room region according to the dirtiness degree of the at least one preset cleaning region corresponding to the room region.
  • the cleaning device cleans at least one room according to the task map.
  • a room R includes one or more preset cleaning regions B.
  • the cleaning device may generate a cleaning image after cleaning the room R.
  • the cleaning image includes a room region r corresponding to the room R. It should be understood that the room region r corresponds to the preset cleaning regions B, and the target filling pattern of the room region r is determined according to the dirtiness degrees corresponding to the preset cleaning regions B.
  • a room R1 includes a preset cleaning region B1, a preset cleaning region B2, and a preset cleaning region B3.
  • the cleaning device may generate a cleaning image after cleaning the room R1.
  • the cleaning image includes a room region r1 corresponding to the room R1.
  • the room region r1 corresponds to the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3.
  • the target filling pattern of the room region r1 is determined according to the dirtiness degrees corresponding to the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3.
  • the determining the target filling pattern of the room region according to the dirtiness degree of the at least one preset cleaning region corresponding to the room region includes: determining the target filling pattern of the room region according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to the at least two preset cleaning regions corresponding to the room region, or according to the dirtiness degree of any preset cleaning region corresponding to the room region.
  • that is, the target filling pattern of the room region may be determined according to the average dirtiness degree, the total dirtiness degree, or the maximum dirtiness degree corresponding to the preset cleaning regions in the room, or according to the dirtiness degree of any preset cleaning region in the room.
  • for example, the dirtiness degree corresponding to the preset cleaning region B1, the dirtiness degree corresponding to the preset cleaning region B2, and the dirtiness degree corresponding to the preset cleaning region B3 may be accumulated and then divided by the number of the preset cleaning regions corresponding to this cleaning, to determine the target filling pattern of the room region r1 corresponding to the room R1 according to the average dirtiness degree of the preset cleaning regions in the room R1; or, the dirtiness degree corresponding to the preset cleaning region B1, the dirtiness degree corresponding to the preset cleaning region B2, and the dirtiness degree corresponding to the preset cleaning region B3 may be accumulated, to determine the target filling pattern of the room region r1 corresponding to the room R1 according to the total dirtiness degree of the preset cleaning regions in the room R1; or, the maximum of the dirtiness degrees corresponding to the preset cleaning regions B1 to B3, or the dirtiness degree corresponding to any one of the preset cleaning regions B1 to B3, may be used to determine the target filling pattern of the room region r1. These options are sketched below.
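A hedged sketch of these aggregation options (hypothetical Python; which statistic is used is a design choice left open by the embodiments, and the function name is an assumption):

```python
# Hypothetical sketch: derive the room region's representative dirtiness degree from
# the dirtiness degrees of the preset cleaning regions contained in the room.

def room_dirtiness(degrees, mode="average"):
    """degrees: dirtiness degrees of the regions in the room, e.g. [500, 600, 100]."""
    if mode == "average":
        return sum(degrees) / len(degrees)     # 400 for [500, 600, 100]
    if mode == "total":
        return sum(degrees)                    # 1200
    if mode == "maximum":
        return max(degrees)                    # 600
    return degrees[0]                          # any single region's dirtiness degree
```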
  • the target filling pattern of the room region may be determined according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to all the preset cleaning regions that correspond to the room region and have been cleaned for an i-th time, or according to the dirtiness degree of any preset cleaning region corresponding to the room region, to generate an i-th cleaning image including the room region.
  • for example, assume the cleaning device has cleaned the preset cleaning region B1 of the room R1 twice, has cleaned the preset cleaning region B2 of the room R1 twice, and has cleaned the preset cleaning region B3 of the room R1 one time. After the first time of cleaning, the dirtiness degrees of the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3 are 500, 600, and 100 respectively, with an average value of 400, a total value of 1200, and a maximum value of 600; then the target filling pattern of the room region r1 corresponding to the room R1 is determined according to any one of the average value of 400, the total value of 1200, the maximum value of 600, and the dirtiness degrees of 500, 600, and 100 corresponding to the preset cleaning regions B1 to B3, to generate the first cleaning image including the room region r1.
  • if the dirtiness degrees of the preset cleaning regions B1 and B2 acquired after the second time of cleaning are 100 and 200 respectively, the target filling pattern of the room region r1 corresponding to the room R1 is determined according to any one of the average value of 150, the total value of 300, the maximum value of 200, and the dirtiness degrees of 100 and 200 corresponding to the preset cleaning regions B1 and B2, to generate the second cleaning image including the room region r1.
  • a plurality of room cleaning images in correspondence to the number of the dirtiness degrees corresponding to the room region may be generated according to a plurality of dirtiness degrees corresponding to the room region. It should be understood that the determination of any one of the average dirtiness degree, the total dirtiness degree, the maximum dirtiness degree, and the dirtiness degree of any one of the preset cleaning regions may refer to the foregoing description, which is not detailed herein.
  • the target filling pattern of the room region may be determined according to an average sum dirtiness degree, a total sum dirtiness degree, or a maximum sum dirtiness degree corresponding to the preset cleaning regions corresponding to the room region, or according to a sum dirtiness degree corresponding to any preset cleaning region corresponding to the room region, to generate the cleaning image including the room region, and the sum dirtiness degree corresponding to one preset cleaning region is a sum of dirtiness degrees acquired after each cleaning of one preset cleaning region.
  • for example, assume the cleaning device has cleaned the preset cleaning region B1 of the room R1 twice, has cleaned the preset cleaning region B2 of the room R1 twice, and has cleaned the preset cleaning region B3 of the room R1 one time. If the dirtiness degrees of the preset cleaning region B1 acquired after each of the two times of cleaning are 500 and 100 respectively, the sum dirtiness degree corresponding to the two times of cleaning is 600; if the dirtiness degrees of the preset cleaning region B2 acquired after each of the two times of cleaning are 600 and 200 respectively, the sum dirtiness degree corresponding to the two times of cleaning is 800; and the dirtiness degree of the preset cleaning region B3 acquired after one time of cleaning is 100.
  • the target filling pattern of the room region r1 corresponding to the room R1 may be determined according to any one of the average sum dirtiness degree of 500, the total sum dirtiness degree of 1500, the maximum sum dirtiness degree of 800, and the sum dirtiness degrees of 600, 800, and 100 corresponding to the preset cleaning regions B1 to B3, to generate the cleaning image including the room region r1.
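Continuing the same numbers as a worked illustration (hypothetical Python; the per-cleaning values are those of the example above), the sum dirtiness degree is formed per region first and the room-level statistic is then taken over those sums:

```python
# Hypothetical sketch: per-region sum dirtiness degrees, then the room-level value.
history = {"B1": [500, 100], "B2": [600, 200], "B3": [100]}   # per-cleaning degrees
sums = {region: sum(values) for region, values in history.items()}
# sums == {"B1": 600, "B2": 800, "B3": 100}

values = list(sums.values())
average, total, maximum = sum(values) / len(values), sum(values), max(values)
# average == 500, total == 1500, maximum == 800, matching the example above
```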
  • the method further includes: acquiring sequential node positions for the cleaning task performed by the cleaning device, where the node positions include at least one of a start position, an interruption position, and an end position; and determining a region covered by a cleaning trajectory connecting two node positions adjacent to each other in a cleaning sequence as one preset cleaning region.
  • the cleaning device may interrupt the cleaning task for maintenance based on a workload threshold.
  • the workload threshold includes a cleaning area threshold, a power consumption threshold, a water consumption threshold, a dirt collection upper limit threshold of the mopping member, a low water-level threshold of the clean water tank, a high water-level threshold of the sewage tank, and the like.
  • the cleaning device continues to perform the cleaning task from the position where the cleaning task was interrupted last time.
  • the region covered by the cleaning trajectory connecting the current interruption position and the next interruption position may be determined as one preset cleaning region, and one cleaning image may be generated according to the dirtiness degree corresponding to the region covered by the cleaning trajectory.
  • the target filling pattern of the image region corresponding to the region covered by the cleaning trajectory is determined according to the dirtiness degree corresponding to the region covered by the cleaning trajectory.
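A minimal sketch of segmenting a cleaning task by node positions follows (hypothetical Python; how an interruption is detected, for example by reaching a workload threshold, is outside this snippet, and the function name is an assumption):

```python
# Hypothetical sketch: pair up consecutive node positions (start position,
# interruption positions, end position) and treat the part of the trajectory between
# each pair as one preset cleaning region.

def trajectory_regions(node_indices, trajectory_points):
    """node_indices: indices into trajectory_points where the task started, was
    interrupted, or ended, in cleaning order."""
    regions = []
    for begin, end in zip(node_indices, node_indices[1:]):
        covered = trajectory_points[begin:end + 1]   # points covered by S1, S2, ...
        regions.append(covered)
    return regions

# e.g. a start position O1, two interruption positions and an end position give
# three regions covered by the cleaning trajectories S1, S2 and S3.
```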
  • the cleaning device starts to clean the room R1 at a start position O1 in the room R1.
  • the position O2 is the interruption position O2.
  • the dirtiness degree corresponding to the region covered by the cleaning trajectory S1 connecting the start position O1 and the interruption position O2 is acquired, and a first cleaning image is generated.
  • the image region s1 in the cleaning image corresponds to the region covered by the cleaning trajectory S1, and the target filling pattern marked in the image region s1 is determined according to the dirtiness degree corresponding to the region covered by the cleaning trajectory S1.
  • a second cleaning image and a third cleaning image may be generated according to the dirtiness degrees corresponding to the regions covered by the cleaning trajectories S2 and S3. If the cleaning device further cleans the region covered by the cleaning trajectory for the second time, the target filling pattern marked in the image region corresponding to the region covered by the cleaning trajectory may be determined based on the dirtiness degree of the region covered by the cleaning trajectory acquired after the second cleaning, to generate one cleaning image.
  • the cleaning process of the cleaning device can be reflected by at least one cleaning image generated according to the dirtiness degree corresponding to the region covered by the cleaning trajectory, for example, at least one of the cleaning trajectory of the cleaning device, the dirtiness degrees corresponding to the regions covered by different cleaning trajectories, and the changes in the dirtiness degrees corresponding to the regions covered by different cleaning trajectories after multiple times of cleaning can be reflected.
  • the target filling patterns of the image regions corresponding to the regions covered by all the cleaning trajectories that have been cleaned for an i-th time are determined according to any one of the average dirtiness degree, the total dirtiness degree, and the maximum dirtiness degree of the dirtiness degrees corresponding to the regions covered by all the cleaning trajectories that have been cleaned for the i-th time, or according to the dirtiness degree of any one of the regions covered by the cleaning trajectories, to generate an i-th cleaning image.
  • the cleaning device cleans the room R1 and the room R2, and three cleaning trajectories, namely the cleaning trajectories S1 to S3, are formed after finishing the cleaning of the room R1.
  • the regions covered by the cleaning trajectories S1 and S2 have been cleaned twice, and the region covered by the cleaning trajectory S3 has been cleaned one time.
  • the corresponding dirtiness degree is acquired after each time of cleaning to each of the regions covered by the cleaning trajectories S1 to S3. For example, if the dirtiness degrees of the regions covered by the cleaning trajectories S1 to S3 acquired after the first time of cleaning are 500, 600 and 100 respectively, then the average value of the three dirtiness degrees is 400, the total value is 1200, and the maximum value is 600.
  • the target filling pattern of the image regions s1 to s3 corresponding to the regions covered by the cleaning trajectories S1 to S3 may be determined according to any one of the average value of 400, the total value of 1200, the maximum value of 600, and the dirtiness degrees of 500, 600 and 100 corresponding to the regions covered by the cleaning trajectories S1 to S3, to generate a first cleaning image, as shown in FIG. 25 (a) . If the dirtiness degrees of the regions covered by the cleaning trajectories S1 and S2 acquired after the second time of cleaning are 100 and 200 respectively, then the average value of the two dirtiness degrees is 150, the total value is 300, and the maximum value is 200.
  • the target filling pattern of the image regions s1 and s2 corresponding to the regions covered by the cleaning trajectories S1 and S2 may be determined according to any one of the average value of 150, the total value of 300, the maximum value 200, and the dirtiness degrees of 100 and 200 corresponding to the regions covered by the cleaning trajectories S1 and S2, to generate a second cleaning image, as shown in FIG. 25 (b) .
  • the working process of the cleaning device can be highlighted in the first and second cleaning images through the change in the overall dirtiness degrees of the preset cleaning regions with different numbers of cleaning times.
  • the cleaning region covered by the cleaning trajectory may be appropriately expanded according to a preset rule, to make the cleaning region covered by the cleaning trajectory more obvious for users to observe, thereby improving the user experience.
  • the cleaning image may be called a dirt heat map.
  • the method further includes: generating an animation or a short video according to the generated cleaning images.
  • the animation or the short video may be generated based on the plurality of generated cleaning images.
  • the cleaning images may be played frame by frame.
  • the present disclosure is certainly not limited thereto.
  • the user's experience of using the cleaning device can be enhanced through a variety of visualization manners to allow the user to understand the cleaning effect of the cleaning device.
  • FIG. 26 is a cleaning image involved in an embodiment of the present disclosure.
  • the cleaning image is displayed according to a selecting operation by a user.
  • the user can determine the cleaning image to be output by selecting a different number of cleaning times, and in response to the number of cleaning times selected by the user, the cleaning image corresponding to this number of cleaning times is output and displayed, as sketched below.
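A minimal sketch of this selection behaviour (hypothetical Python; the user interface and the way generated cleaning images are stored are not specified by the present disclosure):

```python
# Hypothetical sketch: look up the cleaning image generated for the selected number
# of cleaning times and hand it to the display layer.

def on_times_selected(images_by_times, selected_times, display):
    """images_by_times: {1: image_after_first_pass, 2: image_after_second_pass, ...}"""
    image = images_by_times.get(selected_times)
    if image is not None:
        display(image)           # e.g. show the "second time" cleaning image
```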
  • the present disclosure is certainly not limited thereto. The user may be prompted in various ways to select the previously generated cleaning images, which helps the user to understand the cleaning effect of the cleaning device on the floor in different cleaning stages.
  • the output cleaning image further includes cleaning information corresponding to the cleaning task performed by the cleaning device, such as a cleaning area and a cleaning duration. This helps the user to understand the working process of the cleaning device, thereby improving the user’s experience of using the cleaning device.
  • the method according to the embodiments of the present disclosure includes: acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region. This realizes visualization of the cleaning workload of the cleaning device, thereby improving the user’s experience of using the cleaning device.
  • FIG. 27 is a schematic block diagram of a processing apparatus 300 for a cleaning image of a cleaning device according to an embodiment of the present disclosure.
  • the processing apparatus 300 includes a processor 301 and a memory 302.
  • the processor 301 and the memory 302 are connected through a bus 303, such as an inter-integrated circuit (I2C) bus.
  • the processor 301 may be a micro-control unit (MCU) , a central processing unit (CPU) , or a digital signal processor (DSP) .
  • the memory 302 may be a flash chip, a read-only memory (ROM) disk, an optical disk, a USB flash drive, or a mobile hard drive.
  • the processor 301 is configured to run computer-executable instructions stored in the memory 302, and implement the steps in the foregoing method when executing the instructions.
  • the processor 301 is configured to run the computer-executable instructions stored in the memory 302, and implement the following steps when executing the instructions: acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region.
  • FIG. 2 is a schematic diagram of a system according to an embodiment of the present disclosure.
  • the system includes:
  • a cleaning device 100 which includes a motion mechanism and a cleaning member, the motion mechanism being configured to drive the cleaning device 100 to move to allow the cleaning member to clean a preset cleaning region;
  • a base station 200 which is at least configured to clean the cleaning member of the cleaning device 100; and the processing apparatus 300.
  • FIG. 3 is a schematic diagram of a system according to an embodiment of the present disclosure.
  • the system includes:
  • a cleaning device 100 which includes a motion mechanism, a cleaning member, and a maintenance mechanism, the motion mechanism being configured to drive the cleaning device 100 to move to allow the cleaning member to clean a preset cleaning region, and the maintenance mechanism being configured to clean the cleaning member; and the processing apparatus 300.
  • the cleaning device 100 includes at least one of a cleaning robot, a hand-held cleaning device, and other cleaning devices.
  • in some embodiments, the cleaning device 100 may clean the cleaning member by itself; in this case, the cleaning device 100 includes the maintenance mechanism.
  • in other embodiments, the cleaning device 100 may not clean the cleaning member by itself; in this case, the system further includes the base station 200, where the base station 200 is at least configured to clean an executive mechanism of the cleaning device.
  • the cleaning device 100 is provided with, for example, a device controller
  • the base station 200 is provided with, for example, a base station controller.
  • the device controller and/or the base station controller of the base station 200 may separately serve as the processing apparatus 300 or cooperatively serve as the processing apparatus 300, to implement the steps in the method according to the embodiments of the present disclosure.
  • the system includes a separate processing apparatus 300 which is configured to implement the steps in the method according to the embodiments of the present disclosure.
  • the processing apparatus 300 may be disposed on the cleaning device 100, or may be disposed on the base station 200, which is not limited herein.
  • the processing apparatus 300 may be an apparatus other than the cleaning device 100 and the base station 200, such as a home intelligent terminal, a general control apparatus, and the like.
  • An embodiment of the present disclosure further provides a computer-readable storage medium storing computer-executable instructions.
  • the computer-executable instructions, when executed by a processor, cause the processor to implement the steps of the foregoing method.
  • the computer-readable storage medium may be an internal storage unit of the processing apparatus according to any one of the foregoing embodiments, such as a hard disk or an internal memory of the processing apparatus.
  • the computer-readable storage medium may be an external storage device of the processing apparatus, such as a plug-in hard disk, a smart media card (SMC) , a secure digital (SD) card, or a flash card arranged on the processing apparatus.
  • the processing apparatus 300 may be configured to implement the steps in the method according to any of the embodiments of the present disclosure.


Abstract

A method and apparatus for processing a cleaning image of a cleaning device, a system, and a storage medium are provided. The method includes: acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member (S110); and generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region (S120). This realizes visualization of the cleaning workload of the cleaning device, thereby improving the user's experience of using the cleaning device.

Description

METHOD AND APPARATUS FOR PROCESSING CLEANING IMAGE OF CLEANING DEVICE, SYSTEM, AND STORAGE MEDIUM
TECHNICAL FIELD
The present disclosure relates to the field of cleaning technologies, in particular to a method, an apparatus, a system, and a storage medium for processing cleaning images of a cleaning device.
BACKGROUND
Cleaning devices can be used to automatically clean the ground in scenarios such as household indoor cleaning, large place cleaning, and the like. Currently, a cleaning device may be unable to detect the dirtiness degree of the ground when cleaning it, so the cleaning workload of the cleaning device cannot be reflected, which affects the user’s experience of using the cleaning device.
SUMMARY
The present disclosure provides a method and apparatus for processing a cleaning image of a cleaning device, a system, and a storage medium, aiming to solve the technical problem in the related art that the cleaning workload of the cleaning device cannot be reflected because the cleaning device cannot detect the dirtiness degree of the ground when cleaning the ground, which affects the user experience of the cleaning device.
According to a first aspect, an embodiment of the present disclosure provides a method for processing a cleaning image of a cleaning device, configured to generate the cleaning image after the cleaning device finishes cleaning at least one preset cleaning region through a cleaning member while performing a cleaning task, the method including:
acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region for one time through the cleaning member; and
generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region.
According to a second aspect, an embodiment of the present disclosure provides an apparatus for processing a cleaning image of a cleaning device, the apparatus including a memory and a processor;
the memory is configured to store computer-executable instructions; and
the processor is configured to execute the computer-executable instructions to implement the operations in the foregoing method.
According to a third aspect, an embodiment of the present disclosure provides a system, including:
a cleaning device, the cleaning device including a motion mechanism and a cleaning member, the motion mechanism being configured to drive the cleaning device to move to allow the cleaning member to clean a preset cleaning region;
a base station, the base station being at least configured to clean the cleaning member of the cleaning device; and
the foregoing apparatus.
According to a fourth aspect, an embodiment of the present disclosure provides a system, including:
a cleaning device, the cleaning device including a motion mechanism, a cleaning member, and a maintenance mechanism, the motion mechanism being configured to drive the cleaning device to move to allow the cleaning member to clean a preset cleaning region, and the maintenance mechanism  being configured to clean the cleaning member; and
the foregoing apparatus.
According to a fifth aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing computer-executable instructions, the computer-executable instructions, when executed by a processor, causing the processor to implement the operations of the foregoing method.
The embodiments of the present disclosure provide a method and apparatus for processing a cleaning image of a cleaning device, a system, and a storage medium. The method includes: acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and generating the cleaning image according to the dirtiness degree corresponding to at least one preset cleaning region. This realizes visualization of the cleaning workload of the cleaning device, thereby improving the user experience of the cleaning device.
It should be understood that the foregoing general description and the following detailed description are merely exemplary and explanatory, and do not limit the disclosed content of the embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to illustrate the technical solutions of the embodiments of the present disclosure more clearly, the accompanying drawings are briefly described below. The drawings described below are some of the embodiments, and it would be obvious for those skilled in the art to obtain other drawings based on these drawings without any creative efforts.
FIG. 1 is a schematic flowchart of a method for processing a cleaning image of a cleaning device according to an embodiment of the present disclosure.
FIG. 2 is a schematic block diagram of a system according to an embodiment of the present disclosure.
FIG. 3 is a schematic block diagram of a system according to another embodiment of the present disclosure.
FIG. 4 is a schematic diagram of a variation of dirtiness degree of a mopping member over mopping time according to an embodiment of the present disclosure.
FIG. 5 is a schematic diagram of preset cleaning regions and their respective image regions according to an embodiment of the present disclosure.
FIG. 6 is a schematic diagram of cleaning images according to an embodiment of the present disclosure.
FIG. 7 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 8 is a schematic diagram of a cleaning image according to another embodiment of the present disclosure.
FIG. 9 is a schematic diagram of a cleaning image according to still another embodiment of the present disclosure.
FIG. 10 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 11 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 12 is a schematic diagram of a cleaning image according to another embodiment of the present disclosure.
FIG. 13 is a schematic diagram of a cleaning image according to still another embodiment of the present disclosure.
FIG. 14 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 15 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 16 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 17 is a schematic diagram of a cleaning image according to another embodiment of the present disclosure.
FIG. 18 is a schematic diagram of a cleaning image according to still another embodiment of the present disclosure.
FIG. 19 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 20 is a schematic diagram of a cleaning image according to an embodiment of the present disclosure.
FIG. 21 is a schematic diagram of a cleaning image involved in an embodiment of the present disclosure.
FIG. 22 is a schematic diagram of a room and its corresponding room region according to an embodiment of the present disclosure.
FIG. 23 is a schematic diagram of a room cleaning image according to an embodiment of the present disclosure.
FIG. 24 is a schematic diagram of a trajectory cleaning image according to an embodiment of the present disclosure.
FIG. 25 is a schematic diagram of a trajectory cleaning image according to an embodiment of the present disclosure.
FIG. 26 is a cleaning image according to an embodiment of the present disclosure.
FIG. 27 is a schematic block diagram of a processing apparatus for a cleaning image of a cleaning device according to an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The technical solutions in the embodiments of the present disclosure are clearly and completely described with reference to the accompanying drawings in the embodiments of the present disclosure. The embodiments described herein are some rather than all of the embodiments of the present disclosure. Based on the embodiments in the present disclosure, all other embodiments obtained by those skilled in the art without any creative efforts shall fall within the protection scope of the present disclosure.
The flowcharts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they have to be executed in the order described herewith. For example, some operations/steps may be decomposed, combined, or partially combined, so the actual execution order may change depending on the actual situation.
Some embodiments of the present disclosure are described in detail with reference to the accompanying drawings. In the case of no conflict, the following embodiments and features in the embodiments can be combined with each other.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of a method for processing a cleaning image of a cleaning device according to an embodiment of the present disclosure. The method may be applied to a system for the cleaning device, and is configured to generate and display the cleaning image of the cleaning device, so as to visualize the cleaning workload of the cleaning device.
The preset cleaning region may be any region to be cleaned, such as a home space, a room in the home space, a partial region of the room, a large place, or a partial region of the large place. From another perspective, the preset cleaning region may refer to a relatively large region that will be initially cleaned, such as, an entire room; or may refer to partial region of the relatively large region that needs to be cleaned again after the initial cleaning, such as, a region near a wall in the room, or an obstacle region in the room.
As shown in FIG. 2, the system includes one or more cleaning devices 100, one or more base stations 200, and a processing apparatus 300. Illustratively, the cleaning device 100 includes a motion mechanism and a cleaning member. For example, the motion mechanism of the cleaning device 100 is configured to drive the cleaning device 100 to move, so that the cleaning member cleans the preset cleaning region. For example, while the motion mechanism drives the cleaning device 100 to move, the cleaning member touches the preset cleaning region so as to clean the preset cleaning region along with the movement of the cleaning device 100.
In some embodiments, the base station 200 is configured to cooperate with the cleaning device 100. For example, the base station 200 may be configured to charge the cleaning device 100,  and provide a docking position for the cleaning device 100. The base station 200 may also be configured to clean the cleaning member of the cleaning device 100.
As shown in FIG. 3, the system includes one or more cleaning devices 100 and the processing apparatus 300.
Illustratively, the cleaning device 100 includes a motion mechanism, a cleaning member, and a maintenance mechanism. For example, the motion mechanism is configured to drive the cleaning device 100 to move to allow the cleaning member to clean the preset cleaning region. The maintenance mechanism is configured to clean the cleaning member.
The processing apparatus 300 may be configured to perform the steps in the method according to the embodiments of the present disclosure.
Optionally, the cleaning device 100 is provided with a device controller for controlling the cleaning device 100, and the base station 200 is provided with a base station controller for controlling the base station 200. In some embodiments, the device controller of the cleaning device 100 and/or the base station controller of the base station 200 may separately serve as the control apparatus 300 or cooperatively serve as the control apparatus 300, to implement the steps in the method according to the embodiments of the present disclosure. In some other embodiments, the system includes a separate control apparatus 300 which is configured to implement the steps of the method according to the embodiments of the present disclosure. The control apparatus 300 may be arranged on the cleaning device 100, or arranged on the base station 200. The present disclosure is certainly not limited thereto. For example, the control apparatus 300 may be an apparatus other than the cleaning device 100 and the base station 200, such as a home intelligent terminal, a general control apparatus, and the like.
The cleaning device 100 may be configured to automatically clean the preset cleaning region in the application scenarios such as household indoor cleaning, large space cleaning, and the like.
Illustratively, the cleaning member of the cleaning device 100 includes at least one of a mopping member and a dust suction member. The present disclosure is certainly not limited thereto. In some embodiments, the cleaning device 100 or the base station 200 further includes a dirt detection apparatus which is configured to detect the dirtiness degree of the cleaning member. Illustratively, the dirt detection apparatus includes at least one of the following: a visual sensor and a sewage detection sensor. For example, the visual sensor acquires image or color information of the cleaning member, and the dirtiness degree of the cleaning member is determined based on the image or color information of the cleaning member. Illustratively, the darker the surface of the cleaning member (the mopping member) , the greater the dirtiness degree of the mopping member. Illustratively, the closer the dirt inside the cleaning member (the dust suction member) to the edge of the dust suction member, the greater the dirtiness degree of the dust suction member. For example, the sewage detection sensor may acquire detection information of the sewage generated by cleaning the cleaning member (the mopping member) , and the dirtiness degree of the mopping member is determined based on the acquired detection information. The sewage detection sensor includes at least one of the following: a visible light detection sensor, an infrared detection sensor, and a total dissolved solid detection sensor. For example, the infrared detection sensor acquires a turbidity of the sewage, the visible light detection sensor acquires a chroma of the sewage, and the total dissolved solid detection sensor acquires a water conductivity of the sewage. The dirtiness degree of the mopping member may be determined according to one or more of the turbidity, the chroma, and the water conductivity. For example, the greater the turbidity or the water conductivity of the sewage, the greater the dirtiness degree of the mopping member. Of course, the way of determining the dirtiness degree of the cleaning member of the cleaning device 100 is not limited thereto.
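As a hedged illustration of how such sensor readings might be fused into a single dirtiness degree of the mopping member, a weighted combination could be used. The weights and normalization below are assumptions introduced only for this sketch; the present disclosure only states that one or more of the turbidity, the chroma, and the water conductivity may be used.

```python
# Hypothetical sketch: combine sewage-sensor readings into a dirtiness degree of the
# mopping member. Weights and scaling are illustrative assumptions only.

def mopping_member_dirtiness(turbidity=None, chroma=None, conductivity=None,
                             weights=(0.5, 0.3, 0.2)):
    readings = [turbidity, chroma, conductivity]
    used = [(w, r) for w, r in zip(weights, readings) if r is not None]
    if not used:
        return 0.0
    total_weight = sum(w for w, _ in used)
    return sum(w * r for w, r in used) / total_weight   # higher reading -> dirtier
```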
As shown in FIG. 1, in the embodiments of the present disclosure, the method for processing a cleaning image of a cleaning device is configured to generate the cleaning image after the cleaning device finishes cleaning at least one preset cleaning region through a cleaning member. The method includes step S110 to step S120.
Step S110, a dirtiness degree corresponding to one preset cleaning region is acquired after the cleaning device has cleaned the preset cleaning region one time through the cleaning member.
In some embodiments, the preset cleaning region may be a region to be cleaned that is divided by the cleaning device based on a task map. Optionally, the task map may be created by the cleaning device exploring the space it is currently in, in response to a map creation command, or may be updated by the cleaning device based on obstacles, carpets, and the like identified during the cleaning process. Optionally, the task map may be a map of the cleaning regions specified by a user. For example, in response to the user’s selection of a cleaning region on the map, such as one or more rooms, the one or more rooms are determined as the task map; or, in response to the user circling a cleaning region on the map, such as a portion of one or more rooms, the portion of the one or more rooms is determined as the task map. The present disclosure is certainly not limited thereto. Illustratively, the preset cleaning region may be determined based on the room layout in the task map and/or a workload threshold of the cleaning device. For example, the workload for each preset cleaning region is less than or equal to the workload threshold. The workload threshold is configured to instruct the cleaning device to interrupt the current cleaning task and move to the base station for maintenance before finishing the workload corresponding to the workload threshold. The cleaning task refers to the task of cleaning all the preset cleaning regions corresponding to the task map by the cleaning device in response to a cleaning command. For example, in a cleaning task, depending on the dirtiness degree of each preset cleaning region, the cleaning device may clean each preset cleaning region a different number of times: cleaner preset cleaning regions may be cleaned only once, while dirtier preset cleaning regions may be cleaned at least once more after being cleaned for the first time. For example, one room may be one preset cleaning region, or one room may include a plurality of preset cleaning regions. The present disclosure is certainly not limited thereto. For example, one preset cleaning region may include one room and at least a partial region of another room. Optionally, the preset cleaning region may be determined according to a user’s segmentation operation on the task map, or may be defined according to a preset cleaning region segmentation rule.
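Purely as an illustration of one possible segmentation rule (not the disclosed implementation), the following Python sketch divides a task map, given here simply as a list of rooms with floor areas, into preset cleaning regions whose estimated workload never exceeds the workload threshold. The room names, the area-based workload estimate, and the even split of oversized rooms are all assumptions made for the example.

```python
import math
from typing import Dict, List


def divide_into_preset_regions(rooms: List[Dict], workload_threshold: float) -> List[Dict]:
    """Split rooms into preset cleaning regions so that the workload of each
    region (approximated here by floor area in m^2) stays at or below the
    workload threshold of the cleaning device."""
    regions = []
    for room in rooms:
        area = room["area_m2"]
        if area <= workload_threshold:
            # The whole room fits into a single preset cleaning region.
            regions.append({"rooms": [room["name"]], "area_m2": area})
        else:
            # Split an oversized room into several equally sized sub-regions.
            parts = math.ceil(area / workload_threshold)
            for k in range(parts):
                regions.append({"rooms": [f"{room['name']} (part {k + 1})"],
                                "area_m2": area / parts})
    return regions


# Example: a 45 m^2 living room is split so no region exceeds a 20 m^2 workload.
print(divide_into_preset_regions(
    [{"name": "living room", "area_m2": 45}, {"name": "bedroom", "area_m2": 12}],
    workload_threshold=20))
```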
Optionally, the acquiring the dirtiness degree corresponding to one preset cleaning region includes: acquiring a dirtiness degree of the cleaning member after the cleaning device finishes cleaning one preset cleaning region through the cleaning member; and determining the dirtiness degree corresponding to the preset cleaning region according to the dirtiness degree of the cleaning member. Illustratively, the dirtiness degree of the mopping member is acquired after the cleaning device finishes cleaning the preset cleaning region through the mopping member.
Illustratively, the cleaning member includes the mopping member. For example, the mopping member, such as a mop, has a limited ability to collect dirt. Referring to FIG. 4, it shows the relationship between the dirt amount collected by the mopping member (namely a mopping member dirtiness value d) and the mopping time, in a case where the cleaning robot moves forward at a constant speed and does not repeatedly mop a ground with a uniform dirt distribution (having infinite area), from the moment the mop was washed until the mopping member dirtiness value reaches its maximum. In case the mopping member dirtiness value d reaches its maximum d_max, the mopping member will not get any dirtier in the subsequent mopping and has a very poor cleaning effect on the ground; it can then be determined that the mopping member dirtiness value d has reached the workload threshold, and the mopping needs to be stopped. In addition, the cleaning device may be controlled to move to the base station for maintenance, such as cleaning the mopping member or replacing the mopping member with another cleaned mopping member. Alternatively, the maintenance mechanism of the cleaning device may be controlled to maintain the cleaning device, such as cleaning the mopping member or replacing the mopping member with another cleaned mopping member. Optionally, the maximum dirtiness value d_max of the mopping member is an empirical value, which may be measured, for example, in the laboratory.
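A minimal sketch of the saturation behaviour described for FIG. 4, assuming (only for illustration) an exponential approach to the maximum dirtiness value d_max; the rate constant, the threshold margin, and the function names are hypothetical.

```python
import math

D_MAX = 100.0   # maximum mopping member dirtiness value d_max (empirical value)
K = 0.05        # illustrative saturation rate constant per second (assumption)


def mop_dirtiness(mopping_time_s: float) -> float:
    """Dirtiness value d collected by the mop after mopping a uniformly dirty
    floor at constant speed: it rises with mopping time and levels off at D_MAX."""
    return D_MAX * (1.0 - math.exp(-K * mopping_time_s))


def should_stop_mopping(d: float, margin: float = 0.95) -> bool:
    """Once d is (nearly) saturated, mopping is stopped and the device returns
    to the base station to wash or replace the mopping member."""
    return d >= margin * D_MAX


for t in (10, 60, 300):
    d = mop_dirtiness(t)
    print(f"t={t:4d}s  d={d:6.1f}  stop={should_stop_mopping(d)}")
```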
Illustratively, the cleaning member includes the dust suction member. For example, the dust suction member defines a certain dirt holding space. In case the dirt sucked by the dust suction member reaches the maximum capacity of the dirt holding space, the dust suction member will not be able to suck any more dirt and thus has a very poor vacuuming effect on the ground; it can then be determined that the accumulated work amount of the dust suction member, namely the amount of the dirt, has reached the workload threshold, and the vacuuming needs to be stopped. In addition, the cleaning device may be controlled to move to the base station for maintenance, such as removing the dirt from the dust suction member or replacing the dust suction member. Alternatively, the maintenance mechanism of the cleaning device may be controlled to maintain the cleaning device, such as removing the dirt from the dust suction member or replacing the dust suction member. Optionally, the maximum dirt amount of the dust suction member is an empirical value, which may be measured, for example, in the laboratory.
For example, in case the mopping member dirtiness value is less than the maximum dirtiness value d_max, the dirtiness value corresponding to the preset cleaning region is positively correlated with the mopping member dirtiness value. That is, the greater the mopping member dirtiness value, the dirtier the preset cleaning region. In case the mopping member dirtiness value is equal to the maximum dirtiness value d_max of the mopping member, it can be determined that the preset cleaning region is very dirty, and that there is still dirt left in the preset cleaning region that has not been removed by the mopping member after the mopping of the preset cleaning region in step S110 is finished.
For example, in case the dirt sucked by the dust suction member is less than the maximum capacity of the dust suction member, the dirtiness degree corresponding to the preset cleaning region is positively correlated with the dirtiness degree of the dust suction member. That is, the greater the dirtiness degree of the dust suction member, the dirtier the preset cleaning region. In case the dirt sucked by the dust suction member reaches an amount equal to the maximum capacity of the dust suction member, it can be determined that the preset cleaning region is very dirty, and it is more likely that there is still dirt left in the preset cleaning region that has not been sucked by the dust suction member after the current cleaning of the preset cleaning region is finished, for example, after the vacuuming of the preset cleaning region.
In some embodiments, the dirtiness degree of the preset cleaning region may be determined according to the dirtiness degree of the mopping member and/or the dirtiness degree of the dust suction member.
In some embodiments, the dirtiness degree of the cleaning member is acquired by a dirt detection apparatus, such as a vision sensor arranged on the base station or the cleaning device. Illustratively, the darker the mopping member, the greater the dirtiness degree of the mopping member; the closer the dirt inside the dust suction member is to the edge of the dust suction member, the greater the dirtiness degree of the dust suction member. The present disclosure is certainly not limited thereto. For example, the dirtiness degree of the cleaning member of the cleaning device may be determined by acquiring the dirtiness degree of the mopping member through a vision sensor mounted on the cleaning device and facing the mopping member, or may be determined by acquiring the dirtiness degree of the dust suction member through a vision sensor mounted on the cleaning device and facing the inside of the dust suction member.
Illustratively, the acquiring the dirtiness degree of the cleaning member includes: acquiring detection information of sewage generated during cleaning the mopping member; and determining the dirtiness degree of the mopping member according to the detection information. Optionally, the dirt detection apparatus includes a sewage detection sensor which is configured to detect, for example, one or more of turbidity information, chroma information, and water conductivity information of the sewage generated by cleaning the mopping member. The dirt amount cleaned off from the mopping member may be determined by the turbidity of the sewage, the chroma of the sewage, or the water conductivity of the sewage. For example, the larger the turbidity, the chroma, or the water conductivity of the sewage, the dirtier the sewage, and the greater the dirt amount cleaned off from the mopping member; that is, the greater the dirt elution value of the mopping member, which is configured to characterize the dirt amount cleaned off from the mopping member, the greater the dirt amount absorbed on the mopping member before the mopping member is cleaned, i.e., the larger the dirtiness degree of the mopping member. It should be understood that any one of the turbidity, the chroma, and the water conductivity of the sewage can be configured to characterize the dirt amount cleaned off from the mopping member, namely the dirtiness degree of the mopping member. Each of the turbidity, the chroma, and the water conductivity of the sewage has a positive correlation or corresponding relationship with the dirt elution value, the dirt amount, or the dirtiness degree. For example, the turbidity of the sewage generated by the first cleaning of the mop is detected to be 1 NTU, and the corresponding dirt elution value or dirt amount is 100; the turbidity of the sewage generated by the second cleaning of the mop is detected to be 2 NTU, and the corresponding dirt elution value or dirt amount is 200. In this case, it can be determined that the dirt amount cleaned off from the mopping member at the first cleaning is less than the dirt amount cleaned off from the mopping member at the second cleaning, that is, the dirtiness degree of the mopping member at the first cleaning is less than the dirtiness degree of the mopping member at the second cleaning. The corresponding relationship between the chroma or the water conductivity of the sewage and the dirt elution value or the dirt amount is similar, and is not detailed herein. It should be understood that the dirtiness degree may be characterized by any numeric value of the turbidity of the sewage, the chroma of the sewage, the water conductivity of the sewage, the dirt amount, and the dirt elution value; or, the dirtiness degree may be determined by any numeric value of the turbidity of the sewage, the chroma of the sewage, the water conductivity of the sewage, the dirt amount, and the dirt elution value. For example, if the turbidity of the sewage generated by cleaning the mopping member is 1 NTU, then the dirtiness degree of the mopping member may be characterized by 1; or, if the turbidity of the sewage generated during cleaning the mopping member is 1 NTU and the corresponding dirtiness degree is 100, the dirtiness degree of the mopping member is 100.
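The correspondence above (1 NTU of turbidity mapping to a dirt elution value of 100, 2 NTU to 200) suggests a simple monotonically increasing mapping. The sketch below assumes a linear factor of 100 for turbidity and invented factors for chroma and conductivity purely for illustration; only the positive correlation is taken from the description.

```python
# Assumed linear correspondences between sewage readings and dirt elution values.
NTU_TO_DIRT = 100.0          # 1 NTU -> 100, 2 NTU -> 200 (from the example above)
CHROMA_TO_DIRT = 10.0        # hypothetical factor
CONDUCTIVITY_TO_DIRT = 0.5   # hypothetical factor (per uS/cm)


def dirtiness_from_sewage(turbidity_ntu=None, chroma=None, conductivity=None):
    """Return a dirtiness degree for the mopping member from whichever sewage
    readings are available; each reading is positively correlated with the
    dirt amount cleaned off the mop."""
    candidates = []
    if turbidity_ntu is not None:
        candidates.append(turbidity_ntu * NTU_TO_DIRT)
    if chroma is not None:
        candidates.append(chroma * CHROMA_TO_DIRT)
    if conductivity is not None:
        candidates.append(conductivity * CONDUCTIVITY_TO_DIRT)
    # Taking the maximum is one possible way to combine several readings.
    return max(candidates) if candidates else 0.0


print(dirtiness_from_sewage(turbidity_ntu=1.0))   # 100.0
print(dirtiness_from_sewage(turbidity_ntu=2.0))   # 200.0
```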
For example, during cleaning the mopping member, the sewage detection sensor may acquire the detection values at intervals. The dirt amounts corresponding to the detection values may be accumulated, according to the time and/or the amount of water consumed for cleaning the mopping member, to obtain an accumulated result of the dirt amount. The water amount may be determined according to an amount of clean water supplied to a cleaning tank that contains and cleans the mopping member and/or an amount of sewage discharged from the cleaning tank that contains and cleans the mopping member.
In some embodiments, a cleaning operation performed on the mopping member between two cleaning operations on the ground may be viewed as one mopping member cleaning task. The mopping member cleaning task for cleaning the mopping member may include, for example, the process of cleaning the mopping member after cleaning one preset cleaning region and before cleaning another preset cleaning region, and may also include the process of cleaning the mopping member after finishing the cleaning task. The condition for ending the cleaning task is that the dirtiness value of each region in the task map is less than its corresponding dirt amount threshold.
The mopping member cleaning task includes one or more stage tasks. In each stage task, clean water is supplied to the cleaning tank of the base station, or is directly supplied to the mopping member to clean the mopping member, and then the sewage generated after cleaning the mopping member is discharged out of the cleaning tank or recycled to a sewage container. The sewage container may be disposed in the base station or in the cleaning robot. This process may be done repeatedly or not. Alternatively, the supplying clean water to clean the mopping member and the discharging or recycling the sewage generated after cleaning the mopping member are performed simultaneously. The present disclosure is certainly not limited thereto. For example, during supplying clean water to the cleaning tank, the sewage generated during cleaning the mopping member is intermittently discharged.
The time and/or the amount of water consumed for cleaning the mopping member corresponding to different stage tasks may be the same or different. The dirt amounts corresponding to the detection values acquired during executing all the stage tasks may be accumulated to obtain an accumulated result d of the dirt amount based on the time and/or the amount of water corresponding to one or more stage tasks in the mopping member cleaning task.
The determining the dirtiness degree of the mopping member according to the detection information includes: accumulating the dirt amounts corresponding to the detection information according to the time and/or the amount of water consumed for cleaning the mopping member, where the water amount may be determined according to the amount of the clean water supplied to the cleaning tank that contains and cleans the mopping member and/or the amount of the sewage discharged from that cleaning tank; and determining the dirtiness degree of the mopping member according to the accumulated result of the dirt amount. Optionally, the detection information, such as the turbidity of the sewage, may be directly used as the dirt amount; that is, if the turbidity is 1 NTU, the dirt amount is 1. For example, the accumulated result d of the dirt amount may be obtained by integrating the turbidity T of the sewage over the water amount l consumed for cleaning the mopping member, which is expressed as follows:
d = ∫ T dl
In case the sewage detection sensor has limitations on the water volume it can sample and on its sampling frequency, the accumulated result d of the dirt amount may be determined according to the detection information acquired in one or more sampling operations and the water amount between sampling intervals, which is expressed as follows:
d = ∑ T_i × l_i
where T_i represents the turbidity of the sewage at the i-th sampling operation, l_i represents the water amount consumed between two adjacent sampling operations, i is any one of 1, 2, ..., n, and n is the total number of the sampling operations.
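A direct transcription of the sampled formula d = ∑ T_i × l_i into Python; the sample values are made up for the example.

```python
def accumulated_dirt(samples):
    """samples: iterable of (T_i, l_i) pairs, where T_i is the turbidity measured
    at the i-th sampling operation and l_i is the water amount consumed between
    two adjacent sampling operations. Returns d = sum(T_i * l_i)."""
    return sum(t_i * l_i for t_i, l_i in samples)


# Three samples taken while the mop is washed: turbidity in NTU, water in litres.
d = accumulated_dirt([(1.2, 0.3), (0.8, 0.3), (0.5, 0.4)])
print(d)   # 0.8
```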
For example, the determining the dirtiness degree of the mopping member according to the detection information includes: pre-judging the dirtiness degree of the mopping member according to a single piece of detection information. For example, after the supply of clean water to the cleaning tank is stopped, the sewage is discharged, the turbidity of the sewage is detected once during discharging the sewage, and the amount of the discharged sewage is acquired. The turbidity of the sewage and the amount of the sewage are multiplied to obtain the accumulated result d of the dirt amount. The present disclosure is certainly not limited thereto. For example, during discharging the sewage, the sewage may be detected multiple times to acquire a plurality of turbidity values, and an average value, a maximum value, or a minimum value of the plurality of turbidity values is multiplied by the amount of the sewage to obtain the accumulated result d of the dirt amount.
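A companion sketch for the discharge-time variant just described: the turbidity readings taken while the sewage is drained are reduced to a single value (average, maximum, or minimum) and multiplied by the discharged amount; the function and parameter names are assumptions.

```python
from statistics import mean


def dirt_from_discharge(turbidity_readings, discharged_litres, reduce=mean):
    """Estimate the accumulated dirt amount d from turbidity readings acquired
    while the sewage is discharged. `reduce` may be mean, max, or min."""
    if not turbidity_readings:
        return 0.0
    return reduce(turbidity_readings) * discharged_litres


print(dirt_from_discharge([1.0, 1.4, 1.2], discharged_litres=2.0))               # 2.4
print(dirt_from_discharge([1.0, 1.4, 1.2], discharged_litres=2.0, reduce=max))   # 2.8
```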
In some embodiments, the dirt amounts corresponding to the detection information are accumulated according to the time and/or the amount of water consumed for cleaning the mopping member. The accumulated result of the dirt amount represents the dirt amount cleaned off from the mopping member, which may be called a dirt elution value.
In some embodiments, the dirt elution value of the mopping member cleaning task may be determined according to the dirt elution value of one or more stage tasks in the mopping member cleaning task. For example, the dirt elution values of all the stage tasks in the mopping member cleaning task are accumulated to obtain the dirt elution value of the mopping member cleaning task.
In each stage task, the detection information of the sewage may be acquired only once or multiple times. The dirt elution value of the stage task is determined according to the single piece of detection information or the multiple pieces of detection information. For example, the dirt elution value of the stage task is determined according to the product of the average value of the multiple pieces of detection information and the amount of water consumed in the stage task.
Illustratively, the dirtiness degree of the mopping member may be determined according to the dirt elution value of one or more stage tasks, or the dirt elution value of the mopping member cleaning task. For example, the dirtiness degree of the mopping member is determined according to the dirt elution value of the first stage task in the mopping member cleaning task. For example, the greater the dirt elution value of the first stage task, the greater the mopping member dirtiness degree. Or, the dirtiness degree of the mopping member is determined based on a maximum value or an average value of the dirt elution values of multiple stage tasks. The greater the maximum value or the average value, the greater the mopping member dirtiness degree.
In some embodiments, the mopping member cleaning task is performed after the cleaning device finishes mopping the preset cleaning region, for example, through the mopping member. The dirt elution values of all the stage tasks in the mopping member cleaning task are accumulated to obtain the dirt elution value of the mopping member cleaning task. The dirt elution value of the mopping member cleaning task is determined as the dirtiness degree corresponding to the preset cleaning region.
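A short sketch tying the pieces together under the embodiment just described: the dirt elution value of each stage task is taken as the average of its detections times the water used in that stage, the stage values are summed into the elution value of the whole mop-cleaning task, and that sum is used as the dirtiness degree of the preset cleaning region that was just mopped. The per-stage rule and the names are assumptions.

```python
def stage_elution(detections, water_litres):
    """Dirt elution value of one stage task: average detection value times the
    amount of water consumed in the stage (one possible rule)."""
    if not detections:
        return 0.0
    return (sum(detections) / len(detections)) * water_litres


def region_dirtiness(stage_tasks):
    """stage_tasks: list of (detections, water_litres) tuples, one per stage task
    of the mop-cleaning task performed after the region was mopped. The summed
    elution value is taken as the dirtiness degree of the preset cleaning region."""
    return sum(stage_elution(d, w) for d, w in stage_tasks)


print(region_dirtiness([([1.5, 1.3], 0.5), ([0.9], 0.5), ([0.4, 0.2], 0.5)]))  # 1.3
```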
Step S120, the cleaning image is generated according to the dirtiness degree corresponding to at least one preset cleaning region.
In some embodiments, the step of generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region further includes: determining a preset cleaning region with a dirtiness degree greater than or equal to a preset dirtiness degree threshold; and generating the cleaning image according to the preset cleaning region with the dirtiness degree greater than or equal to the preset dirtiness degree threshold. For example, in case there is at least one preset cleaning region with a dirtiness degree less than the preset dirtiness degree threshold after the cleaning device has cleaned a plurality of preset cleaning regions, the preset cleaning region with the dirtiness degree less than the preset dirtiness degree threshold may not be displayed in the cleaning image, which gives a more intuitive representation of the cleaning effect of the cleaning device on the preset cleaning regions.
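A minimal filter matching this optional behaviour, assuming the dirtiness degrees are held in a plain dictionary keyed by region name:

```python
def regions_to_display(dirtiness_by_region: dict, threshold: float) -> dict:
    """Keep only the preset cleaning regions whose dirtiness degree is greater
    than or equal to the preset dirtiness degree threshold; cleaner regions are
    simply left out of the cleaning image."""
    return {name: degree
            for name, degree in dirtiness_by_region.items()
            if degree >= threshold}


print(regions_to_display({"A1": 900, "A2": 150, "A3": 100, "A4": 400}, threshold=200))
# {'A1': 900, 'A4': 400}
```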
Optionally, the cleaning image may be generated after the cleaning device finishes cleaning all the preset cleaning regions in the task map, or may be generated after the cleaning device finishes at least one time of cleaning to each of all the preset cleaning regions in the task map, or may be generated after the cleaning device finishes one time of cleaning to at least one preset cleaning region in the task map, which is not limited herein. Illustratively, the cleaning image includes an image region corresponding to the preset cleaning region.
For example, the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining a target filling pattern of the image region according to a value range where the dirtiness degree corresponding to the preset cleaning region is located; and marking the image region according to the target filling pattern, where different value ranges correspond to different target filling patterns.
Optionally, the target filling pattern may include at least one selected from the group of color, line, shade, pattern, numeric value, or other filling pattern. For example, the target filling pattern may be preset or may be set by a user, which is not limited herein. It should be understood that the target filling pattern may be expanded.
In some embodiments, the value range where the dirtiness degree corresponding to the preset cleaning region is located is determined, and the target filling pattern of the image region is determined according to the value range where the dirtiness degree corresponding to the preset cleaning  region is located. For example, the greater the dirtiness degree corresponding to the preset cleaning region, the darker the color in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the denser the lines in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the deeper the shade in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the denser the patterns in the target filling pattern; the greater the dirtiness degree corresponding to the preset cleaning region, the greater the numeric value in the target filling pattern. By analogy, other filling patterns, such as text, may be extended, which is not limited herein.
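One way to realise the range-to-pattern mapping described above is a small lookup table; the concrete value ranges and colours below are invented for the example (darker colours for dirtier regions).

```python
# Hypothetical dirtiness value ranges and fill colours; any pattern type
# (colour, line, shade, numeric value, text) could be substituted here.
FILL_BY_RANGE = [
    (0.0, 200.0, "#ffe5cc"),            # lightly dirty
    (200.0, 500.0, "#ff9955"),          # moderately dirty
    (500.0, float("inf"), "#cc3300"),   # heavily dirty
]


def target_fill(dirtiness: float) -> str:
    """Return the target filling pattern (here a colour) for the value range in
    which the dirtiness degree of the preset cleaning region falls."""
    for low, high, colour in FILL_BY_RANGE:
        if low <= dirtiness < high:
            return colour
    return "#ffffff"  # fallback for values outside all ranges


print(target_fill(150), target_fill(350), target_fill(900))
```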
In some embodiments, the dirtiness degree corresponding to the preset cleaning region can be characterized by the number of times the preset cleaning region has been cleaned in a single cleaning task. In this case, the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining a target filling pattern of the image region according to the number of times the preset cleaning region has been cleaned during a single cleaning task; and marking the image region with the target filling pattern, where different numbers of cleaning times correspond to different target filling patterns. For example, the preset cleaning region A1 has been cleaned twice in the cleaning task, the preset cleaning region A2 has been cleaned once in the cleaning task, the preset cleaning region A3 has been cleaned twice in the cleaning task, and the preset cleaning region A4 has been cleaned once in the cleaning task. This indicates that the preset cleaning regions A1 and A3 correspond to relatively high dirtiness degrees, and they have been repeatedly cleaned by the cleaning member of the cleaning device during the cleaning task, while the dirtiness degrees corresponding to the preset cleaning regions A2 and A4 are relatively low, and the preset cleaning regions A2 and A4 have been cleaned only once by the cleaning member of the cleaning device during the cleaning task. Therefore, the image region a1 corresponding to the preset cleaning region A1 and the image region a3 corresponding to the preset cleaning region A3 can be filled with a darker-colored target filling pattern, while the image region a2 corresponding to the preset cleaning region A2 and the image region a4 corresponding to the preset cleaning region A4 can be filled with a lighter-colored target filling pattern.
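When the number of cleanings in a single task stands in for the dirtiness degree, the same idea reduces to indexing a palette by the cleaning count, as in this small sketch (the palette itself is an assumption):

```python
# Hypothetical palette: regions cleaned more times in one task get darker fills.
FILL_BY_TIMES = {1: "#ffe5cc", 2: "#ff9955", 3: "#cc3300"}


def fill_for_cleaning_times(times_cleaned: int) -> str:
    """Map the number of times a preset cleaning region was cleaned in a single
    cleaning task to a target filling pattern, capping at the darkest entry."""
    return FILL_BY_TIMES[min(times_cleaned, max(FILL_BY_TIMES))]


# A1 and A3 were cleaned twice, A2 and A4 once, as in the example above.
print({r: fill_for_cleaning_times(n)
       for r, n in {"A1": 2, "A2": 1, "A3": 2, "A4": 1}.items()})
```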
In some embodiments, the determining the target filling pattern of the image region according to the value range where the dirtiness degree corresponding to the preset cleaning region is located includes: determining the target filling pattern of the image region corresponding to the preset cleaning region according to the value range where at least one dirtiness degree corresponding to the preset cleaning region is located, in case the number of cleaning times of the preset cleaning region is greater than 1.
Illustratively, a dirtiness degree corresponding to the preset cleaning region may be acquired after the cleaning device has cleaned the preset cleaning region one time. In case one preset cleaning region has been cleaned more than one time, the dirtiness degree corresponding to each cleaning may be acquired after each cleaning of the preset cleaning region. For example, the preset cleaning region has been cleaned five times, and a dirtiness degree is acquired after each time of cleaning, that is, five dirtiness degrees are acquired in total. In this case, the target filling pattern of the image region corresponding to the preset cleaning region is determined by any one of the five dirtiness degrees, or determined by an accumulated value of at least two dirtiness degrees among the five dirtiness degrees. For another example, the preset cleaning region has been cleaned five times, and the dirtiness degree is acquired after only two of the five cleanings, that is, two dirtiness degrees are acquired in total. In this case, the target filling pattern of the image region corresponding to the preset cleaning region is determined by either of the two dirtiness degrees, or determined by an accumulated value of the two dirtiness degrees. Illustratively, in case the dirtiness degree is acquired after the first cleaning, a remaining number of cleaning times of the preset cleaning region may be predicted according to the first acquired dirtiness degree corresponding to the preset cleaning region, for example, predicting the remaining number of cleaning times to be 4. In case the predicted remaining number of cleaning times of the preset cleaning region is greater than 1, the dirtiness degree of the preset cleaning region may be acquired after only a few of these cleanings, to generate the cleaning image.
Illustratively, based on the actual number of cleaning times, the dirtiness degree of the preset cleaning region corresponding to each time of cleaning may be acquired, and the cleaning image is generated according to the dirtiness degree of the preset cleaning region corresponding to each time of cleaning.
In some embodiments, the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: generating a first cleaning image according to a dirtiness degree corresponding to a first target preset cleaning region. The first target preset cleaning region is one preset cleaning region of the at least two preset cleaning regions, and the first target preset  cleaning region is the last cleaned preset cleaning region among the at least one preset cleaning region that has been cleaned by the cleaning device in a current cleaning task. The first cleaning image includes the image regions corresponding to all the preset cleaning regions, at least the image region corresponding to the first target preset cleaning region is marked with a target filling pattern, and the target filling pattern is determined according to a last acquired dirtiness degree corresponding to the first target preset cleaning region.
Optionally, the method further includes: skipping marking an image region corresponding to a non-first target preset cleaning region with a target filling pattern; or marking an image region corresponding to a non-first target preset cleaning region with a preset target filling pattern; or marking an image region corresponding to a non-first target preset cleaning region with a target filling pattern determined according to a last acquired dirtiness degree corresponding to the non-first target preset cleaning region, where the non-first target preset cleaning region is a preset cleaning region other than the first target preset cleaning region.
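The first-cleaning-image behaviour described here and illustrated in FIG. 5 and FIG. 6 can be sketched as follows, assuming the cleanings arrive as an ordered log of (region, dirtiness degree) pairs; by default each image marks the region just cleaned with its dirtiness value (a numeric value being one admissible filling pattern), keeps the marks of previously cleaned regions, and leaves not-yet-cleaned regions unmarked. The data layout and names are assumptions.

```python
def first_cleaning_images(cleaning_log, all_regions, fill=lambda degree: degree):
    """cleaning_log: ordered list of (region, dirtiness_degree) pairs, one entry per
    finished cleaning. Returns one first cleaning image per entry, each image being
    a dict mapping every image region to its target filling pattern (None = not
    marked). Earlier marks are retained, mirroring FIG. 6 (a) to (d)."""
    images = []
    marks = {region: None for region in all_regions}
    for region, degree in cleaning_log:
        marks = dict(marks)            # copy so earlier images stay unchanged
        marks[region] = fill(degree)   # mark the first target preset cleaning region
        images.append(marks)
    return images


log = [("A1", 500), ("A2", 300), ("A3", 100), ("A4", 400)]
for image in first_cleaning_images(log, ["A1", "A2", "A3", "A4"]):
    print(image)
```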
Illustratively, in case of performing a cleaning task, the cleaning device cleans the preset cleaning regions A1, A2, A3, and A4 according to a cleaning sequence of from A1 to A2 to A3 to A4. Certainly, the cleaning sequence of the preset cleaning regions is not limited thereto. The cleaning sequence of from A1 to A2 to A3 to A4 is used as an example for the description below. Referring to FIG. 5, the preset cleaning regions have one-to-one correspondence to the image regions in the cleaning image. For example, the preset cleaning region A1 corresponds to the image region a1, the preset cleaning region A2 corresponds to the image region a2, and so on. Referring to FIG. 6, the cleaning device first cleans the preset cleaning region A1 one time, and a first cleaning image (a) , as shown in FIG. 6 (a) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A1. At this time, the first target cleaning region is the preset cleaning region A1. The first cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4. The image region a1 corresponding to the preset cleaning region A1 is marked with color, which is determined by the dirtiness degree corresponding to the preset cleaning region A1. The image regions a2 to a4 corresponding to the non-first target cleaning regions A2 to A4 other than the preset cleaning region A1 are marked with no target filling pattern. If the cleaning device continues to clean the preset cleaning region A2 one time after finishing cleaning the preset cleaning region A1, a first cleaning image (b) , as shown in FIG. 6 (b) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A2. At this time, the first target cleaning region is the preset cleaning region A2. The first cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4. The image region a1 corresponding to the preset cleaning region A1 is marked with color, and the image region a2 corresponding to the preset cleaning region A2 is marked with color. The color marked in the image region a1 is determined by the last acquired dirtiness degree corresponding to the preset cleaning region A1. That is, the color marked in the image region a1 of the first cleaning image (b) is the same as the color marked in the image region a1 of the first cleaning image (a) . The color marked in the image region a2 is determined by the dirtiness degree corresponding to the preset cleaning region A2. The image regions a3 and a4 corresponding to the preset cleaning regions A3 and A4 are marked with no target filling pattern. Certainly, the image regions corresponding to the non-first target regions in the first cleaning image (a) and the first cleaning image (b) may be marked with no target filling pattern, or may be marked with the preset target filling pattern, which is not limited herein. In the embodiment as shown in FIG. 6, the first cleaning image (b) retains the target filling pattern marked in the image region a1 corresponding to the preset cleaning region A1 in the first cleaning image (a) . The preset cleaning region A1 and the preset cleaning region A2 are different preset cleaning regions in the task map, and the dirtiness degree corresponding to the preset cleaning region A2 does not affect the dirtiness degree corresponding to the preset cleaning region A1. 
Therefore, in the first cleaning image (b) , the target filling pattern marked in the image region a2 does not affect the target filling pattern marked in the image region a1, and the target filling pattern marked in the image region a1 of the first cleaning image (b) is the same as the target filling pattern marked in the image region a1 of the first cleaning image (a) . By analogy, if the cleaning device continues to clean the preset cleaning region A3 one time, a first cleaning image (c) , as shown in FIG. 6 (c) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A3, at this time, the first target cleaning region is the preset cleaning region A3; and if the cleaning device continues to clean the preset cleaning region A4 for one time, a first cleaning image (d) , as shown in FIG. 6 (d) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A4, at this time, the first target cleaning region is the preset cleaning region A4.
In some embodiments, the first cleaning image is generated upon acquiring the dirtiness degree corresponding to the first target preset cleaning region, that is, a first cleaning image is  displayed for each completed cleaning of one preset cleaning region (i.e., for each completed dirt detection) . For example, the first cleaning image (a) , the first cleaning image (b) , the first cleaning image (c) , and the first cleaning image (d) are displayed after the preset cleaning regions A1, A2, A3, and A4 have been cleaned one time, respectively. In some embodiments, at least one first cleaning image is generated after the cleaning task is finished. For example, only one first cleaning image will be displayed on the screen after the cleaning task is finished, such as selectively displaying the corresponding first cleaning image by the user’s clicking on a label on the screen. As shown in FIG. 7, any one of the first cleaning images (a) to (d) can be selected to be displayed by user’s clicking on any one of the labels on the screen, which are shown by the numbers 1 to 5 in FIG. 7. In some embodiments, at least two first cleaning images are generated successively or simultaneously after the cleaning task is finished, for example, at least two first cleaning images are displayed successively or simultaneously after the cleaning task is finished. After the cleaning task is finished, the first cleaning images may be selected to be displayed by user’s clicking on at least two of the labels on the screen, which are shown by the numbers 1 to 5 in FIG. 7; or as shown in FIG. 8, a plurality of first cleaning images may be displayed successively by way of clicking on an icon “cleaning image” on the screen by users; or as shown in FIG. 9, a plurality of first cleaning images are displayed on the screen by way of clicking an icon “cleaning image” on the screen by users. In this way, the cleaning process of the cleaning device can be shown according to at least one first cleaning image, for example, the actual cleaning sequence of the cleaning device for each preset cleaning region and the dirtiness degree corresponding to each preset cleaning region can be shown. For example, the first cleaning image (a) in FIG. 6 shows the cleaning device first starting to clean the preset cleaning region A1, and, the target filling pattern marked in the image region a1 shows the dirtiness degree of the preset cleaning region A1 before cleaning and the dirt amount collected by the cleaning device from the preset cleaning region A1 at that current cleaning.
Illustratively, in case of performing a cleaning task, the cleaning device cleans the preset cleaning regions A1, A2, A3, and A4 according to a cleaning sequence of from A1 to A2 to A3 to A4. Certainly, the cleaning sequence of the preset cleaning regions is not limited thereto. The cleaning sequence of from A1 to A2 to A3 to A4 is used as an example for the description below. Referring to FIG. 10, the cleaning device first cleans the preset cleaning region A1 one time, and a first cleaning image (a) , as shown in FIG. 10 (a) , is generated according to the dirtiness degree corresponding to the preset cleaning region A1 acquired after the current cleaning to the preset cleaning region A1 by the cleaning device. At this time, the preset cleaning region A1 is the first target cleaning region. The first cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4. The image region a1 corresponding to the preset cleaning region A1 is marked with color, which is determined by the dirtiness degree corresponding to the preset cleaning region A1. If the cleaning device needs to continue cleaning the preset cleaning region A1 for the second time after finishing the first cleaning of the preset cleaning region A1, the preset cleaning region A1 is still the first target cleaning region, and a first cleaning image (b) , as shown in FIG. 10 (b) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A1 after the second cleaning. The first cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the color marked in the image region a1 corresponding to the preset cleaning region A1 is determined by the dirtiness degree corresponding to the preset cleaning region A1 acquired for the second time. By analogy, if the cleaning device needs to continue cleaning the preset cleaning region A1 for the third time after finishing the second cleaning of the preset cleaning region A1, a first cleaning image (c) , as shown in FIG. 10 (c) , may be generated according to the dirtiness degree corresponding to the preset cleaning region A1 acquired for the third time. This process is continued until the dirtiness degree corresponding to the preset cleaning region A1 is lower than the preset dirtiness degree threshold. Then, other preset cleaning regions are cleaned, and other first cleaning images are generated. In this way, the cleaning process of one preset cleaning region by the cleaning device can be shown according to at least one first cleaning image. For example, the number of times of the cleaning device cleaning the preset cleaning region and the change in the cleaning effect of the preset cleaning region after multiple times of cleaning can be shown. For example, the first cleaning image (a) and the first cleaning image (b) in FIG. 10 show that the preset cleaning region A1 has been cleaned twice, and the change in the cleaning effect of the preset cleaning region A1 is reflected by the change in the target filling pattern marked in the image cleaning region a1. 
In some embodiments, in case the cleaning device finishes cleaning one preset cleaning region, i.e., the dirtiness degree corresponding to the preset cleaning region is lower than the preset dirtiness degree threshold, the cleaning device continues to clean another preset cleaning region and generates at least one first cleaning image according to the dirtiness degree corresponding to that preset cleaning region. For example, after the cleaning device has cleaned the preset cleaning region A1 three times, the dirtiness degree corresponding to the preset cleaning region A1 is lower than the preset dirtiness degree threshold, and the cleaning device then continues to clean the preset cleaning region A2. At this time, the preset cleaning region A2 is the first target cleaning region. The first cleaning image (d), as shown in FIG. 10 (d), may be generated according to the dirtiness degree of the preset cleaning region A2. By analogy, at least one first cleaning image is generated according to the dirtiness degree of the preset cleaning region A3, and at least one first cleaning image is generated according to the dirtiness degree of the preset cleaning region A4. In this way, the cleaning process of the cleaning device can be reflected by at least one of the first cleaning images. For example, the number of times the cleaning device has cleaned each preset cleaning region, the change in the dirtiness degree of each preset cleaning region after each time of cleaning, and the cleaning sequence of the cleaning device cleaning the preset cleaning regions can be shown. Illustratively, when the first cleaning images (a) to (c) are generated, the preset cleaning regions A2 to A4 are the non-first target regions, and the image regions a2 to a4 corresponding to the preset cleaning regions A2 to A4 are marked with no target filling pattern, indicating that the preset cleaning regions A2 to A4 have not been cleaned yet. When the first cleaning image (d) is generated, the preset cleaning region A1 is the non-first target region, and the target filling pattern marked in the image region a1 corresponding to the preset cleaning region A1 is the same as the target filling pattern marked in the image region a1 of the first cleaning image (c), indicating that the preset cleaning region A1 has not been cleaned again. Certainly, the image regions a2 to a4 of the first cleaning images (a) to (c) may also be marked with a preset target filling pattern, indicating that the preset cleaning regions A2 to A4 have not been cleaned yet. The image region a1 of the first cleaning image (d) may be marked with no target filling pattern or may be marked with a preset target filling pattern, indicating that the preset cleaning region A1 is not cleaned any more, which is not detailed herein.
In some embodiments, the first cleaning image is generated upon acquiring the dirtiness degree corresponding to the first target preset cleaning region, that is, a first cleaning image is displayed for each completed cleaning of one preset cleaning region (i.e., for each completed dirt detection) . For example, each of the first cleaning image (a) , the first cleaning image (b) , the first cleaning image (c) , and the first cleaning image (d) is displayed after each cleaning of the preset cleaning region A1 or the preset cleaning region A2 respectively. In some embodiments, at least two first cleaning images may be generated successively or simultaneously after the cleaning task is finished, that is, a plurality of first cleaning images is displayed successively or simultaneously after finishing the cleaning task. For example, as shown in FIG. 11, after the cleaning task is finished, the corresponding first cleaning image may be selected to display by way of clicking any of the labels on the screen by users, which are shown by the number 1 to the number 5; or as shown in FIG. 12, a plurality of first cleaning images may be displayed successively by way of clicking an icon “cleaning image” on the screen by users; or as shown in FIG. 13, a plurality of first cleaning images may be displayed on the screen by way of clicking an icon on the screen.
Optionally, an animation or a short video is generated according to at least one first cleaning image. For example, the at least one first cleaning image is dynamically displayed with the playback of the animation or the short video; or, the animation or the short video shows the changes of the preset cleaning regions corresponding to a plurality of first cleaning images.
Illustratively, the plurality of first cleaning images (a) to (d) may be successively shown by the animation or the short video; the animation or the short video can also show that the target filling patterns marked in the image region a1, the image region a2, the image region a3, and the image region a4 change successively over time, where the target filling pattern marked in each image region is determined according to the dirtiness degree corresponding to each preset cleaning region. In this way, the cleaning process of the cleaning device can be shown by the animation or the short video. For example, the actual cleaning sequence of the cleaning device cleaning the preset cleaning regions and the dirtiness degree corresponding to each preset cleaning region after being cleaned can be shown. For example, the first cleaning images (a) to (d) display successively, to indicate that the cleaning device cleaned the preset cleaning regions A1, A2, A3 and A4 in a sequence of from A1 to A2 to A3 to A4; also, it is possible to reflect the change in the cleaning effect of the preset cleaning regions A1 to A4 by the change in the target filling patterns marked in the image regions a1 to a4 of the first cleaning images (a) to (d) . This helps users to understand the cleaning process of the cleaning device and the dirtiness degree of the preset cleaning regions during the cleaning process.
Illustratively, the plurality of first cleaning images may be successively shown by the animation or the short video; also, it is possible to show the change in the target filling pattern marked  in the image region a1 and the change of the target filling pattern marked in image region a2 over time through the animation or the short video, where the target filling pattern marked in each image region is determined according to the dirtiness degree corresponding to each preset cleaning region. In this way, the cleaning process for each preset cleaning region can be shown by the animation or the short video. For example, the number of times of the cleaning device cleaning the preset cleaning region, the change of the dirtiness degree corresponding to each preset cleaning region after each time of cleaning, and the cleaning sequence for cleaning the preset cleaning regions are shown. This helps users to understand the cleaning process of the cleaning device, and the process of making the preset cleaning regions gradually become clean after multiple times of cleaning by the cleaning device.
In some embodiments, the animation or the short video may be generated and displayed in real time during the cleaning process of the cleaning device; or may be generated and displayed after the cleaning process of the cleaning device is finished, to show the cleaning process of the cleaning device, which is not limited herein.
Optionally, the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining at least one second target preset cleaning region, where the second target preset cleaning region is the preset cleaning region that has been cleaned i times, and i is an integer greater than or equal to 1; and generating an i-th second cleaning image according to at least one target dirtiness degree, where the target dirtiness degree is a dirtiness degree corresponding to the second target preset cleaning region acquired after the i-th cleaning of the second target preset cleaning region, the i-th second cleaning image includes the image regions corresponding to all the preset cleaning regions, the image region corresponding to each second target preset cleaning region is marked with a target filling pattern, and the target filling pattern marked in the image region corresponding to each second target preset cleaning region is determined according to the target dirtiness degree corresponding to each second target preset cleaning region acquired for the i-th time.
Optionally, the method further includes: skipping marking an image region corresponding to a non-second target preset cleaning region with a target filling pattern; or marking an image region corresponding to a non-second target preset cleaning region with a preset target filling pattern; or marking an image region corresponding to a non-second target preset cleaning region with a target filling pattern determined according to a last acquired dirtiness degree corresponding to the non-second target preset cleaning region; where the non-second target preset cleaning region is a preset cleaning region other than the second target preset cleaning region.
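The i-th second cleaning image can be sketched from a per-region history of dirtiness degrees (one entry per completed cleaning of that region); regions that have not reached an i-th cleaning either keep their last acquired mark, as in FIG. 14, or stay unmarked, as in FIG. 15. The data layout and names are assumptions.

```python
def second_cleaning_image(i, history, all_regions, keep_last=True):
    """history: dict mapping each preset cleaning region to the list of dirtiness
    degrees acquired for it, in cleaning order. Builds the i-th second cleaning
    image as a dict region -> mark (None = no target filling pattern)."""
    image = {}
    for region in all_regions:
        degrees = history.get(region, [])
        if len(degrees) >= i:
            image[region] = degrees[i - 1]   # target dirtiness degree (i-th cleaning)
        elif keep_last and degrees:
            image[region] = degrees[-1]      # last acquired dirtiness degree (FIG. 14)
        else:
            image[region] = None             # unmarked (FIG. 15)
    return image


history = {"A1": [500, 300, 100], "A2": [500, 300, 100], "A3": [100], "A4": [300, 100]}
print(second_cleaning_image(2, history, ["A1", "A2", "A3", "A4"]))          # FIG. 14 (b) style
print(second_cleaning_image(2, history, ["A1", "A2", "A3", "A4"], False))   # FIG. 15 (b) style
```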
Illustratively, referring to FIG. 14 and FIG. 15, in case a cleaning task is finished, the cleaning device has cleaned the preset cleaning region A1 a total of three times, has cleaned the preset cleaning region A2 a total of three times, has cleaned the preset cleaning region A3 one time, and has cleaned the preset cleaning region A4 a total of two times. In this case, the first of the second cleaning images, such as the second cleaning image (a) as shown in FIG. 14 (a) or FIG. 15 (a), may be generated according to the target dirtiness degrees corresponding to all the preset cleaning regions that have been cleaned one time, such as the dirtiness degrees corresponding to the preset cleaning regions A1 to A4 acquired after the preset cleaning regions A1 to A4 have been cleaned one time. The second cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and each of the image regions a1 to a4 is marked with a target filling pattern. The target filling patterns of the image regions a1 to a4 are determined according to the respective target dirtiness degrees corresponding to the preset cleaning regions A1 to A4, to indicate the dirtiness degrees of the preset cleaning regions A1 to A4 before the first time of cleaning. Similarly, the second of the second cleaning images, such as the second cleaning image (b) as shown in FIG. 14 (b) or FIG. 15 (b), may be generated according to the target dirtiness degrees corresponding to all the preset cleaning regions that have been cleaned two times, such as the dirtiness degrees acquired after each of the preset cleaning regions A1, A2, and A4 has been cleaned for the second time. The second cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the target filling patterns marked in the image regions a1, a2, and a4 are determined according to the corresponding target dirtiness degrees respectively, to indicate the dirtiness degrees of the preset cleaning regions A1, A2, and A4 after the second time of cleaning. Since the preset cleaning region A3 is not cleaned for the second time, the preset cleaning region A3 is not a second target preset cleaning region determined at the time of generating the second cleaning image (b). As shown in FIG. 14 (b), the target filling pattern marked in the image region a3 is determined according to the corresponding dirtiness degree acquired after the first cleaning of the preset cleaning region A3, to show that there is no change in the dirtiness degree of the preset cleaning region A3, which indicates that the preset cleaning region A3 has not been cleaned for the second time. Alternatively, as shown in FIG. 15 (b), the image region a3 may be marked with no target filling pattern or may be marked with a preset target filling pattern, to indicate that the preset cleaning region A3 has not been cleaned for the second time, which is not detailed herein. By analogy, a second cleaning image (c) as shown in FIG. 14 (c) or FIG. 15 (c) is generated according to the dirtiness degrees acquired after the cleaning device has cleaned the preset cleaning region A1 and the preset cleaning region A2 for the third time, to indicate the dirtiness degrees of the preset cleaning regions A1 and A2 after the third time of cleaning. In this way, the workload of the cleaning device can be reflected by one second cleaning image.
For example, the amount of dirt collected by the cleaning device from each preset cleaning region can be reflected. Also, the working process of the cleaning device can be reflected by at least two second cleaning images. For example, the change in the dirtiness degree of each preset cleaning region after multiple times of cleaning can be reflected.
In some embodiments, one second cleaning image is generated after the cleaning task is finished. That is, only one second cleaning image is displayed after finishing the cleaning task. The second cleaning image may be selected for display by, for example, clicking a label on a screen, such as the label “the first time”, the label “the second time”, or the label “the third time” as shown in FIG. 16. As shown in FIG. 16, any one of the second cleaning images (a) to (c) can be selected for display.
In some embodiments, at least two second cleaning images are generated successively or simultaneously after the cleaning task is finished. That is, at least two second cleaning images are displayed successively or simultaneously after finishing the cleaning task. For example, after the cleaning task is finished, a user may click an icon on the screen, such as the icon “cleaning image” as shown in FIG. 17, to successively display the plurality of second cleaning images; or the at least two second cleaning images are displayed simultaneously on the screen, as shown in FIG. 18. In this way, the second cleaning images (a) to (c) can be displayed successively or simultaneously.
In some embodiments, the i-th second cleaning image is generated according to at least one target dirtiness degree after it is determined that all the second target preset cleaning regions have been cleaned for the i-th time. Illustratively, referring to FIG. 14, in case the cleaning device cleans each of the preset cleaning regions A1, A2, A3, and A4 one time according to a predetermined cleaning sequence, then cleans those of the preset cleaning regions A1, A2, A3, and A4 that need to be cleaned for the second time, and so on, until the dirtiness degree corresponding to each of the preset cleaning regions A1, A2, A3, and A4 is less than the dirtiness degree threshold, the first of the second cleaning images, such as the second cleaning image (a), may be generated after the cleaning device has cleaned the preset cleaning regions A1, A2, A3, and A4 one time; and the second of the second cleaning images, such as the second cleaning image (b), is generated after the cleaning device has cleaned the preset cleaning regions that need to be cleaned for the second time, such as the preset cleaning regions A1, A2, and A4. By analogy, a plurality of second cleaning images are generated successively, to show the dirtiness degree corresponding to each preset cleaning region at different stages, and to reflect the dirtiness degree of each preset cleaning region and the change in the dirtiness degree of each preset cleaning region.
Optionally, an animation or a short video is generated according to at least one second cleaning image. For example, at least one second cleaning image is dynamically displayed as the animation or the short video plays; or, the changing processes of the preset cleaning regions corresponding to the plurality of second cleaning images are displayed by the animation or the short video.
Illustratively, the second cleaning image (a), the second cleaning image (b), and the second cleaning image (c) may be successively displayed by the animation or the short video, to show the changes in the dirtiness degree after the preset cleaning regions A1, A2, A3, and A4 have been cleaned at least one time. This helps the user to understand the cleaning process of the cleaning device and the process of the preset cleaning regions gradually becoming clean.
In some embodiments, the animation or the short video may be generated and displayed in real time during the cleaning process of the cleaning device; or may be generated and displayed after the cleaning process of the cleaning device is finished, to reproduce the cleaning process of the cleaning device, which is not limited herein.
In some embodiments, the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: generating a third cleaning image based on an accumulated value of the dirtiness degrees acquired for the at least one preset cleaning region. Illustratively, a third cleaning image is generated after each cleaning of one preset cleaning region. Each third cleaning image includes the image regions corresponding to all the preset cleaning regions, the target filling pattern marked in the image region corresponding to each preset cleaning region is determined by a summed dirtiness degree corresponding to that preset cleaning region, and the summed dirtiness degree corresponding to each preset cleaning region is the sum of the dirtiness degrees acquired after each cleaning of that preset cleaning region.
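A sketch of the accumulating third cleaning image, using the numeric marks of FIG. 19 and FIG. 20 (500, then 800, then 900 for region A1) as the example data; the log layout and names are assumptions.

```python
def third_cleaning_images(cleaning_log, all_regions):
    """cleaning_log: ordered (region, dirtiness_degree) pairs. After every cleaning
    a new third cleaning image is produced in which each already-cleaned region is
    marked with the running sum of the dirtiness degrees acquired for it so far;
    uncleaned regions stay unmarked (None)."""
    images = []
    totals = {region: None for region in all_regions}
    for region, degree in cleaning_log:
        totals = dict(totals)          # copy so earlier images stay unchanged
        totals[region] = (totals[region] or 0) + degree
        images.append(totals)
    return images


log = [("A1", 500), ("A1", 300), ("A1", 100), ("A2", 500)]
for image in third_cleaning_images(log, ["A1", "A2", "A3", "A4"]):
    print(image)
# The last image shows A1: 900 and A2: 500, matching FIG. 19 (d).
```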
Illustratively, referring to FIG. 19 and in combination with the foregoing embodiments, when performing a cleaning task, the cleaning device cleans the preset cleaning regions A1, A2, A3, and A4 according to the cleaning sequence from A1 to A2 to A3 to A4. Certainly, the cleaning sequence of the preset cleaning regions is not limited thereto; the cleaning sequence from A1 to A2 to A3 to A4 is used as an example for the description below. For example, the cleaning device has cleaned the preset cleaning region A1 a total of three times, the preset cleaning region A2 a total of three times, the preset cleaning region A3 one time, and the preset cleaning region A4 a total of two times. In this case, a third cleaning image (a), as shown in FIG. 19 (a), may be generated according to the dirtiness degree corresponding to the preset cleaning region A1 acquired after the first cleaning of the preset cleaning region A1 by the cleaning device. The third cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the image region a1 corresponding to the preset cleaning region A1 is marked with a numeric value of 500, which is determined by the dirtiness degree of the preset cleaning region A1 corresponding to the first cleaning. If the cleaning device needs to continue cleaning the preset cleaning region A1 for the second time after finishing the first cleaning, a third cleaning image (b), as shown in FIG. 19 (b), may be generated according to an accumulated value of the two dirtiness degrees corresponding to the two cleanings of the preset cleaning region A1. The third cleaning image (b) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4. The numeric value of 800 marked in the image region a1 corresponding to the preset cleaning region A1 is determined by the accumulated value of the dirtiness degree corresponding to the preset cleaning region A1 acquired for the second time and the dirtiness degree corresponding to the preset cleaning region A1 acquired for the first time. It should be understood that after the third cleaning of the preset cleaning region A1, a third cleaning image (c), as shown in FIG. 19 (c), is generated according to an accumulated value of the three dirtiness degrees corresponding to the preset cleaning region A1, and the numeric value marked in the image region a1 corresponding to the preset cleaning region A1 is 900. It should also be understood that the cleaning device continues to clean the preset cleaning region A2 after finishing cleaning the preset cleaning region A1. After the first cleaning of the preset cleaning region A2, a third cleaning image (d), as shown in FIG. 19 (d), is generated according to the accumulated value of the three dirtiness degrees corresponding to the preset cleaning region A1 and the dirtiness degree corresponding to the preset cleaning region A2. The third cleaning image (d) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4. The numeric value of 900 marked in the image region a1 corresponding to the preset cleaning region A1 is determined by the accumulated value of the three corresponding dirtiness degrees, and the numeric value of 500 marked in the image region a2 corresponding to the preset cleaning region A2 is determined by the one corresponding dirtiness degree.
The image regions corresponding to the uncleaned preset cleaning regions are marked with no target filling pattern or marked with a preset target filling pattern. For example, the image regions a2 to a4 in the third cleaning images (a) to (c) are marked with no target filling pattern, or marked with a preset target filling pattern, which is not limited herein. By analogy, a plurality of third cleaning images are generated. In this way, the cleaning process for cleaning one preset cleaning region by the cleaning device can be reflected based on at least two third cleaning images; for example, the number of cleaning times of each preset cleaning region and the change in the dirt amount collected by the cleaning device during cleaning of the preset cleaning region can be reflected. Also, the dirtiness degree of each preset cleaning region before being cleaned and the accumulated dirt amount collected by the cleaning device from each preset cleaning region can be displayed based on the last third cleaning image. As shown in FIG. 20, the numeric value of 900 marked in the image region a1 corresponding to the preset cleaning region A1 is determined according to the accumulated dirtiness degree of the preset cleaning region A1 after three cleanings of the preset cleaning region A1; the numeric value of 900 marked in the image region a2 corresponding to the preset cleaning region A2 is determined according to the accumulated dirtiness degree of the preset cleaning region A2 after three cleanings of the preset cleaning region A2; the numeric value of 100 marked in the image region a3 corresponding to the preset cleaning region A3 is determined according to the accumulated dirtiness degree of the preset cleaning region A3 after one cleaning of the preset cleaning region A3; and the numeric value of 400 marked in the image region a4 corresponding to the preset cleaning region A4 is determined according to the accumulated dirtiness degree of the preset cleaning region A4 after two cleanings of the preset cleaning region A4. It can thus be seen that the accumulated dirt amount collected by the cleaning device differs for each preset cleaning region after different numbers of cleanings. This provides users with an improved perception of the different dirtiness degrees of the preset cleaning regions and of the cleaning capability of the cleaning device.
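As an illustrative, non-limiting sketch of the accumulation underlying the third cleaning images, the following Python code shows one possible way to maintain the sum dirtiness degree per preset cleaning region; the names cleaning_events, all_regions, and render_third_image are hypothetical:
    def generate_third_images(cleaning_events, all_regions, render_third_image):
        # cleaning_events: (region, dirtiness_degree) pairs in cleaning order, e.g.
        # [("A1", 500), ("A1", 300), ("A1", 100), ("A2", 500), ...]
        sums = {}        # sum dirtiness degree accumulated per preset cleaning region
        images = []
        for region, dirtiness in cleaning_events:
            sums[region] = sums.get(region, 0) + dirtiness
            # Every third cleaning image covers all preset cleaning regions; a region
            # not cleaned yet keeps a sum of 0 (no or preset target filling pattern).
            snapshot = {r: sums.get(r, 0) for r in all_regions}
            images.append(render_third_image(snapshot))
        return images
With the example values above, the successive snapshots for the preset cleaning region A1 would be 500, 800, and 900, matching the third cleaning images (a) to (c).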
Optionally, the target filling pattern may be a numeric value or a color. The greater the dirtiness degree, the darker the filled color. Illustratively, after the cleaning device has cleaned the preset cleaning region A1 for the first time, one third cleaning image (a) may be generated according to the acquired dirtiness degree corresponding to the preset cleaning region A1. The third cleaning image (a) includes the image regions a1 to a4 corresponding to all the preset cleaning regions A1 to A4, and the image region a1 corresponding to the preset cleaning region A1 is filled with a color determined by the dirtiness degree corresponding to the first cleaning of the preset cleaning region A1. For another example, in the last third cleaning image, as shown in FIG. 21, the color is used as the target filling pattern to show the dirtiness degree of each preset cleaning region before it is cleaned and the accumulated dirt amount collected by the cleaning device from each preset cleaning region.
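As one non-limiting example of mapping a dirtiness degree to a fill color that darkens as the dirtiness degree increases, the following Python sketch assumes a simple grayscale mapping and an assumed upper bound max_dirtiness of 1000:
    def dirtiness_to_fill_color(dirtiness, max_dirtiness=1000):
        # Clamp the dirtiness degree into [0, max_dirtiness].
        d = max(0, min(dirtiness, max_dirtiness))
        # The greater the dirtiness degree, the darker the filled color:
        # 0 -> white (#FFFFFF), max_dirtiness -> black (#000000).
        shade = int(255 * (1 - d / max_dirtiness))
        return "#{0:02X}{0:02X}{0:02X}".format(shade)
Any other monotone color scale (for example, light yellow to dark brown) could be substituted without changing the principle.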
Optionally, an animation or a short video is generated according to the third cleaning images. For example, as the animation or the short video plays, the gradual change in the target filling patterns of the image regions corresponding to the preset cleaning regions included in the third cleaning images is dynamically displayed, to reflect the process of cleaning the dirt from the preset cleaning regions and the accumulation of the dirt elution values while the mopping member cleans the preset cleaning regions.
Illustratively, a plurality of third cleaning images may be dynamically shown by the animation or the short video successively according to the generation sequence of the third cleaning images, such that the cleaning process of the cleaning device can be shown by the animation or the short video. For example, after the cleaning device has cleaned a preset cleaning region at least one time, the accumulation of the dirt collected by the cleaning device from the preset cleaning region is represented. This helps the user to understand the cleaning process of the cleaning device and the accumulated dirtiness degree of each preset cleaning region after at least one cleaning of the preset cleaning region by the cleaning device.
In some embodiments, the animation or the short video may be generated and displayed in real time during the cleaning process of the cleaning device; or may be generated and displayed after finishing the cleaning process of each preset cleaning region, to reproduce the cleaning process of the cleaning device, which is not limited herein.
Illustratively, the cleaning image includes a room region, and the room region corresponds to at least one preset cleaning region. In some embodiments, the generating the cleaning image according to the dirtiness degree corresponding to the preset cleaning region includes: determining a target filling pattern of the room region according to the dirtiness degree of the at least one preset cleaning region corresponding to the room region.
Illustratively, the cleaning device cleans at least one room according to the task map. Referring to FIG. 22, a room R includes one or more preset cleaning regions B. The cleaning device may generate a cleaning image after cleaning the room R. The cleaning image includes a room region r corresponding to the room R. It should be understood that the room region r corresponds to the preset cleaning regions B, and the target filling pattern of the room region r is determined according to the dirtiness degrees corresponding to the preset cleaning regions B. Referring to FIG. 23, a room R1 includes a preset cleaning region B1, a preset cleaning region B2, and a preset cleaning region B3. The cleaning device may generate a cleaning image after cleaning the room R1. The cleaning image includes a room region r1 corresponding to the room R1. It should be understood that the room region r1 corresponds to the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3. The target filling pattern of the room region r1 is determined according to the dirtiness degrees corresponding to the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3.
In some embodiments, the determining the target filling pattern of the room region according to the dirtiness degree of the at least one preset cleaning region corresponding to the room region includes: determining the target filling pattern of the room region according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to the at least two preset cleaning regions corresponding to the room region, or according to the dirtiness degree of any preset cleaning region corresponding to the room region.
For example, after the cleaning device has cleaned all the preset cleaning regions in the room, the target filling pattern of the room region may be determined according to the average dirtiness degree, the total dirtiness degree, or the maximum dirtiness degree corresponding to the preset cleaning regions in the room, or according to the dirtiness degree of any preset cleaning region in the room. Illustratively, referring to FIG. 23, in a cleaning task in which the cleaning device has cleaned the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3 included in the room R1 one time: the dirtiness degree corresponding to the preset cleaning region B1, the dirtiness degree corresponding to the preset cleaning region B2, and the dirtiness degree corresponding to the preset cleaning region B3 may be accumulated and then divided by the number of the preset cleaning regions corresponding to this cleaning, to determine the target filling pattern of the room region r1 corresponding to the room R1 according to the average dirtiness degree of the preset cleaning regions in the room R1; or the dirtiness degrees corresponding to the preset cleaning regions B1, B2, and B3 may be accumulated, to determine the target filling pattern of the room region r1 corresponding to the room R1 according to the total dirtiness degree of the preset cleaning regions in the room R1; or the dirtiness degrees corresponding to the preset cleaning regions B1, B2, and B3 may be compared, to determine the target filling pattern of the room region r1 corresponding to the room R1 according to the maximum dirtiness degree of the preset cleaning regions in the room R1; or one of the dirtiness degrees corresponding to the preset cleaning regions B1, B2, and B3 may be randomly selected, to determine the target filling pattern of the room region r1 corresponding to the room R1 according to the randomly selected dirtiness degree, which may correspond to any preset cleaning region in the room R1.
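Merely as an illustrative sketch, the selection among the average, total, maximum, or any single dirtiness degree for a room region may be written in Python as follows; the function name room_dirtiness and the mode keywords are hypothetical and only restate the alternatives described above:
    import random

    def room_dirtiness(region_dirtiness, mode="average"):
        # region_dirtiness: dirtiness degrees of the preset cleaning regions in the
        # room, e.g. {"B1": 500, "B2": 600, "B3": 100}.
        values = list(region_dirtiness.values())
        if mode == "average":
            return sum(values) / len(values)   # e.g. 400 for 500, 600, 100
        if mode == "total":
            return sum(values)                 # e.g. 1200
        if mode == "maximum":
            return max(values)                 # e.g. 600
        # "any": the dirtiness degree of any (here randomly selected) preset region.
        return random.choice(values)
The returned value would then be mapped to the target filling pattern of the room region, for example by the color mapping sketched above.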
For example, in case at least one of the at least two preset cleaning regions corresponding to the room region has been cleaned more than one time, the target filling pattern of the room region may be determined according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to all the preset cleaning regions that correspond to the room region and have been cleaned for an i-th time, or according to the dirtiness degree of any preset cleaning region corresponding to the room region, to generate an i-th cleaning image including the room region. Illustratively, the cleaning device has cleaned the preset cleaning region B1 of the room R1 twice, has cleaned the preset cleaning region B2 of the room R1 twice, and has cleaned the preset cleaning region B3 of the room R1 one time. If, after the first cleaning, the dirtiness degrees of the preset cleaning region B1, the preset cleaning region B2, and the preset cleaning region B3 are 500, 600, and 100 respectively, with an average value of 400, a total value of 1200, and a maximum value of 600, then the target filling pattern of the room region r1 corresponding to the room R1 is determined according to any one of the average value of 400, the total value of 1200, the maximum value of 600, and the dirtiness degrees of 500, 600, and 100 corresponding to the preset cleaning regions B1 to B3, to generate the first cleaning image including the room region r1. If the dirtiness degrees corresponding to the preset cleaning region B1 and the preset cleaning region B2 after the second cleaning are 100 and 200 respectively, with an average value of 150, a total value of 300, and a maximum value of 200, then the target filling pattern of the room region r1 corresponding to the room R1 is determined according to any one of the average value of 150, the total value of 300, the maximum value of 200, and the dirtiness degrees of 100 and 200 corresponding to the preset cleaning regions B1 and B2, to generate the second cleaning image including the room region r1. In some embodiments, a plurality of room cleaning images, in correspondence to the number of the dirtiness degrees corresponding to the room region, may be generated according to a plurality of dirtiness degrees corresponding to the room region. It should be understood that the determination of any one of the average dirtiness degree, the total dirtiness degree, the maximum dirtiness degree, and the dirtiness degree of any one of the preset cleaning regions may refer to the foregoing description, which is not detailed herein.
For example, in case the at least two preset cleaning regions corresponding to the room region have been cleaned more than one time, the target filling pattern of the room region may be determined according to an average sum dirtiness degree, a total sum dirtiness degree, or a maximum sum dirtiness degree corresponding to the preset cleaning regions corresponding to the room region, or according to a sum dirtiness degree corresponding to any preset cleaning region corresponding to the room region, to generate the cleaning image including the room region, where the sum dirtiness degree corresponding to one preset cleaning region is a sum of the dirtiness degrees acquired after each cleaning of that preset cleaning region. Illustratively, the cleaning device has cleaned the preset cleaning region B1 of the room R1 twice, has cleaned the preset cleaning region B2 of the room R1 twice, and has cleaned the preset cleaning region B3 of the room R1 one time. If the dirtiness degrees of the preset cleaning region B1 acquired after the two cleanings are 500 and 100 respectively, the sum dirtiness degree corresponding to the two cleanings is 600; if the dirtiness degrees of the preset cleaning region B2 acquired after the two cleanings are 600 and 200 respectively, the sum dirtiness degree corresponding to the two cleanings is 800; and the dirtiness degree of the preset cleaning region B3 acquired after one cleaning is 100. That is, the sum dirtiness degree corresponding to the preset cleaning region B1 is 600, the sum dirtiness degree corresponding to the preset cleaning region B2 is 800, and the sum dirtiness degree corresponding to the preset cleaning region B3 is 100, so that the average sum dirtiness degree is 500, the total sum dirtiness degree is 1500, and the maximum sum dirtiness degree is 800. Then, the target filling pattern of the room region r1 corresponding to the room R1 may be determined according to any one of the average sum dirtiness degree of 500, the total sum dirtiness degree of 1500, the maximum sum dirtiness degree of 800, and the sum dirtiness degrees of 600, 800, and 100 corresponding to the preset cleaning regions B1 to B3, to generate the cleaning image including the room region r1.
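As a non-limiting sketch of the sum dirtiness degrees in this example, the following Python code reproduces the arithmetic above; the name per_region_degrees is hypothetical:
    def room_sum_dirtiness(per_region_degrees):
        # per_region_degrees: dirtiness degrees acquired after each cleaning of each
        # preset cleaning region, e.g. {"B1": [500, 100], "B2": [600, 200], "B3": [100]}.
        sums = {region: sum(degrees) for region, degrees in per_region_degrees.items()}
        # sums == {"B1": 600, "B2": 800, "B3": 100}
        values = list(sums.values())
        return {
            "average_sum": sum(values) / len(values),  # 500
            "total_sum": sum(values),                  # 1500
            "maximum_sum": max(values),                # 800
            "per_region_sum": sums,
        }
Any one of these aggregates, or the sum dirtiness degree of any single preset cleaning region, may then be used to determine the target filling pattern of the room region.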
Optionally, the method further includes: acquiring sequential node positions for the cleaning task performed by the cleaning device, where the node positions include at least one of a start position, an interruption position, and an end position; and determining a region covered by a cleaning trajectory connecting two node positions adjacent to each other in a cleaning sequence as one preset cleaning region.
In some embodiments, when performing a cleaning task on the room based on the task map, the cleaning device may interrupt the cleaning task to undergo maintenance when a workload threshold is reached. The workload threshold includes a cleaning area threshold, a power consumption threshold, a water consumption threshold, a dirt collection upper limit threshold of the mopping member, a low water-level threshold of the clean water tank, a high water-level threshold of the sewage tank, and the like. After finishing the maintenance, the cleaning device continues to perform the cleaning task from the position where the cleaning task was interrupted last time. The region covered by the cleaning trajectory connecting the current interruption position and the next interruption position may be determined as one preset cleaning region, and one cleaning image may be generated according to the dirtiness degree corresponding to the region covered by the cleaning trajectory. In the cleaning image, the target filling pattern of the image region corresponding to the region covered by the cleaning trajectory is determined according to the dirtiness degree corresponding to the region covered by the cleaning trajectory. As shown in FIG. 24, the cleaning device starts to clean the room R1 at a start position O1 in the room R1. In case the cleaning device needs to interrupt the cleaning task based on the workload threshold when arriving at the position O2 along a cleaning path, the position O2 is the interruption position O2. The dirtiness degree corresponding to the region covered by the cleaning trajectory S1 connecting the start position O1 and the interruption position O2 is acquired, and a first cleaning image is generated. The image region s1 in the cleaning image corresponds to the region covered by the cleaning trajectory S1, and the target filling pattern marked in the image region s1 is determined according to the dirtiness degree corresponding to the region covered by the cleaning trajectory S1. By analogy, a second cleaning image and a third cleaning image may be generated according to the dirtiness degrees corresponding to the regions covered by the cleaning trajectories S2 and S3. If the cleaning device further cleans the region covered by a cleaning trajectory for a second time, the target filling pattern marked in the image region corresponding to the region covered by that cleaning trajectory may be determined based on the dirtiness degree of the region covered by that cleaning trajectory acquired after the second cleaning, to generate one cleaning image. It should be understood that the cleaning process of the cleaning device can be reflected by at least one cleaning image generated according to the dirtiness degree corresponding to the region covered by the cleaning trajectory; for example, at least one of the cleaning trajectory of the cleaning device, the dirtiness degrees corresponding to the regions covered by different cleaning trajectories, and the changes in the dirtiness degrees corresponding to the regions covered by different cleaning trajectories after multiple cleanings can be reflected.
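Merely as an illustrative sketch, splitting a cleaning trajectory into preset cleaning regions at the sequential node positions (start, interruption, and end positions) may be written in Python as follows; the names trajectory and is_node_position are hypothetical, and the workload-threshold check is assumed to be performed by the caller:
    def segment_trajectory_by_nodes(trajectory, is_node_position):
        # trajectory: ordered list of positions visited while cleaning.
        # is_node_position(p): True if p is a start, interruption, or end position,
        # e.g. where a workload threshold such as a dirt collection upper limit is reached.
        segments, current = [], []
        for position in trajectory:
            current.append(position)
            if is_node_position(position) and len(current) > 1:
                # The region covered by the trajectory between two node positions
                # adjacent in the cleaning sequence is one preset cleaning region.
                segments.append(current)
                current = [position]  # the next segment starts at this node position
        if len(current) > 1:
            segments.append(current)
        return segments
Each returned segment corresponds to one region such as those covered by the cleaning trajectories S1 to S3 in FIG. 24, for which a dirtiness degree is then acquired.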
For example, the target filling patterns of the image regions corresponding to the regions covered by all the cleaning trajectories that have been cleaned for an i-th time are determined according to any one of the average dirtiness degree, the total dirtiness degree, and the maximum dirtiness degree of the dirtiness degrees corresponding to the regions covered by all the cleaning trajectories that have been cleaned for the i-th time, or according to the dirtiness degree of any one of the regions covered by the cleaning trajectories, to generate an i-th cleaning image. As shown in FIG. 25, the cleaning device cleans the room R1 and the room R2, and three cleaning trajectories, namely the cleaning trajectories S1 to S3, are formed after finishing the cleaning of the room R1. The regions covered by the cleaning trajectories S1 and S2 have been cleaned twice, and the region covered by the cleaning trajectory S3 has been cleaned one time. The corresponding dirtiness degree is acquired after each cleaning of each of the regions covered by the cleaning trajectories S1 to S3. For example, if the dirtiness degrees of the regions covered by the cleaning trajectories S1 to S3 acquired after the first cleaning are 500, 600, and 100 respectively, then the average value of the three dirtiness degrees is 400, the total value is 1200, and the maximum value is 600. The target filling patterns of the image regions s1 to s3 corresponding to the regions covered by the cleaning trajectories S1 to S3 may be determined according to any one of the average value of 400, the total value of 1200, the maximum value of 600, and the dirtiness degrees of 500, 600, and 100 corresponding to the regions covered by the cleaning trajectories S1 to S3, to generate a first cleaning image, as shown in FIG. 25 (a). If the dirtiness degrees of the regions covered by the cleaning trajectories S1 and S2 acquired after the second cleaning are 100 and 200 respectively, then the average value of the two dirtiness degrees is 150, the total value is 300, and the maximum value is 200. The target filling patterns of the image regions s1 and s2 corresponding to the regions covered by the cleaning trajectories S1 and S2 may be determined according to any one of the average value of 150, the total value of 300, the maximum value of 200, and the dirtiness degrees of 100 and 200 corresponding to the regions covered by the cleaning trajectories S1 and S2, to generate a second cleaning image, as shown in FIG. 25 (b). In this way, the first and second cleaning images highlight the working process of the cleaning device, together with the change in the overall dirtiness degrees of the preset cleaning regions cleaned different numbers of times. Illustratively, the cleaning region covered by the cleaning trajectory may be appropriately expanded according to a preset rule, to make the cleaning region covered by the cleaning trajectory more obvious for users to observe, thereby improving the use experience.
Optionally, the cleaning image may be called a dirt heat map. Optionally, the method further includes: generating an animation or a short video according to the generated cleaning images.
In some embodiments, the animation or the short video may be generated based on the plurality of generated cleaning images. For example, the cleaning images may be played frame by frame.
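As a non-limiting sketch of frame-by-frame playback, the following Python code assembles previously rendered cleaning images into an animation using the Pillow library; the use of Pillow, the file names, and the frame duration are assumptions made only for illustration:
    from PIL import Image

    def cleaning_images_to_animation(image_paths, output_path="cleaning_process.gif"):
        # image_paths: generated cleaning images in their generation sequence,
        # e.g. ["clean_1.png", "clean_2.png", "clean_3.png"] (hypothetical file names).
        frames = [Image.open(p).convert("RGB") for p in image_paths]
        # Play the cleaning images frame by frame; 500 ms per frame is an assumed value.
        frames[0].save(output_path, save_all=True, append_images=frames[1:],
                       duration=500, loop=0)
        return output_path
The same sequence of frames could equally be encoded as a short video; the GIF form is chosen here only to keep the sketch self-contained.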
The present disclosure is certainly not limited thereto. The user's experience of using the cleaning device can be enhanced through a variety of visualization manners to allow the user to understand the cleaning effect of the cleaning device.
Illustratively, referring to FIG. 26, FIG. 26 is a cleaning image involved in an embodiment of the present disclosure.
As shown in FIG. 26, the cleaning image is displayed according to a selecting operation by a user. In some embodiments, the user can determine the cleaning image to be output by selecting a number of cleaning times, and in response to the number of cleaning times selected by the user, the cleaning image corresponding to this number of cleaning times is output and displayed. Certainly, the present disclosure is not limited thereto. The user may be prompted in various ways to select the previously generated cleaning images, which helps the user to understand the cleaning effect of the cleaning device on the floor at different cleaning stages.
Optionally, according to the number of cleaning times selected by the user, the cleaning image corresponding to this number of cleaning times is displayed, which helps the user to understand the cleaning effect of the cleaning device on the floor at different cleaning stages.
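Merely as an illustrative sketch, outputting the cleaning image that corresponds to the number of cleaning times selected by the user may be written in Python as follows; the mapping images_by_cleaning_times is hypothetical:
    def select_cleaning_image(images_by_cleaning_times, selected_times):
        # images_by_cleaning_times: e.g. {1: first_cleaning_image, 2: second_cleaning_image}
        # selected_times: the number of cleaning times chosen by the user on the screen.
        image = images_by_cleaning_times.get(selected_times)
        if image is None:
            raise ValueError(f"No cleaning image for {selected_times} cleaning time(s)")
        return image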
In some embodiments, as shown in FIG. 26, the output cleaning image further includes cleaning information corresponding to the cleaning task performed by the cleaning device, such as a cleaning area and a cleaning duration. This helps the user to understand the working process of the cleaning device, thereby improving the user’s experience of using the cleaning device.
The method according to the embodiments of the present disclosure includes: acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region. This realizes visualization of the cleaning workload of the cleaning device, thereby improving the user’s experience of using the cleaning device.
In combination with the foregoing embodiments and referring to FIG. 27, FIG. 27 is a schematic block diagram of a processing apparatus 300 for a cleaning image of a cleaning device according to an embodiment of the present disclosure. The processing apparatus 300 includes a processor 301 and a memory 302.
Illustratively, the processor 301 and the memory 302 are connected through a bus 303, such as an inter-integrated circuit (I2C) bus.
Typically, the processor 301 may be a micro-control unit (MCU) , a central processing unit (CPU) , or a digital signal processor (DSP) .
Typically, the memory 302 may be a flash chip, a read-only memory (ROM) disk, an optical disk, a USB flash drive, or a mobile hard drive.
The processor 301 is configured to run computer-executable instructions stored in the memory 302, and implement the steps in the foregoing method when executing the instructions.
Illustratively, the processor 301 is configured to run the computer-executable instructions stored in the memory 302, and implement the following steps when executing the instructions:
acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and
generating the cleaning image according to the dirtiness degree corresponding to at least one preset cleaning region.
The specific principle and implementation of the processing apparatus 300 provided by the embodiments of the present disclosure are similar to the foregoing method, and will not be detailed herein.
In combination with the foregoing embodiments and referring to FIG. 2, FIG. 2 is a schematic diagram of a system according to an embodiment of the present disclosure.
As shown in FIG. 2, the system includes:
a cleaning device 100 which includes a motion mechanism and a cleaning member, the motion mechanism being configured to drive the cleaning device 100 to move to allow the cleaning member to clean a preset cleaning region;
a base station 200 which is at least configured to clean the cleaning member of the cleaning device 100; and
the processing apparatus 300.
In combination with the foregoing embodiments and referring to FIG. 3, FIG. 3 is a schematic diagram of a system according to an embodiment of the present disclosure.
As shown in FIG. 3, the system includes:
a cleaning device 100 which includes a motion mechanism, a cleaning member, and a maintenance mechanism, the motion mechanism being configured to drive the cleaning device 100 to move to allow the cleaning member to clean a preset cleaning region, and the maintenance mechanism being configured to clean the cleaning member; and
the processing apparatus 300.
Illustratively, the cleaning device 100 includes at least one of a cleaning robot, a hand-held cleaning device, and other cleaning devices.
Optionally, the cleaning device 100, for example, may clean the cleaning member by itself. For example, the cleaning device 100 includes the maintenance mechanism.
Optionally, the cleaning device 100, for example, may not clean the cleaning member by itself. For example, the system further includes the base station 200, where the base station 200 is at least configured to clean an executive mechanism of the cleaning device.
In some embodiments, the cleaning device 100 is provided with, for example, a device controller, and the base station 200 is provided with, for example, a base station controller. Illustratively, the device controller and/or the base station controller of the base station 200 may separately serve as the processing apparatus 300 or cooperatively serve as the processing apparatus 300, to implement the steps in the method according to the embodiments of the present disclosure. In some other embodiments, the system includes a separate processing apparatus 300 which is configured to implement the steps in the method according to the embodiments of the present disclosure. The processing apparatus 300 may be disposed on the cleaning device 100, or may be disposed on the base station 200, which is not limited herein. For example, the processing apparatus 300 may be an apparatus other than the cleaning device 100 and the base station 200, such as a home intelligent terminal, a general control apparatus, and the like.
The specific principle and implementation of the system provided by the embodiments of the present disclosure are similar to the foregoing method, and will not be detailed herein.
An embodiment of the present disclosure further provides a computer-readable storage medium storing computer-executable instructions. The computer-executable instructions, when executed by a processor, cause the processor to implement the steps of the foregoing method.
The computer-readable storage medium may be an internal storage unit of the processing apparatus according to any one of the foregoing embodiments, such as a hard disk or an internal memory of the processing apparatus. Alternatively, the computer-readable storage medium may be an external storage device of the processing apparatus, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card arranged on the processing apparatus.
In some embodiments, the processing apparatus 300 may be configured to implement the steps in the method according to any of the embodiments of the present disclosure.
The specific principle and implementation of the computer-readable storage medium provided by the embodiments of the present disclosure are similar to the foregoing method, and will not be detailed herein.
It should be understood that the terms used in the present disclosure are merely for describing the specific embodiments and are not intended to limit the present disclosure.
It is also to be understood that the term “and/or” as used in the present disclosure and the appended claims refers to any and all possible combinations of one or more of the items listed in association, and includes such combinations.
The above description merely describes the specific embodiments, but the protection scope of the present disclosure is not limited thereto. It would be obvious for those skilled in the art to obtain various equivalent modifications or replacements within the technical scope disclosed in the present disclosure, and these modifications or replacements are within the protection scope of the present disclosure. Therefore, the scope of the present disclosure should be based on the scope of the claims.

Claims (24)

  1. A method for processing a cleaning image of a cleaning device, configured to generate the cleaning image after the cleaning device finishes cleaning at least one preset cleaning region through a cleaning member during performance of a cleaning task, the method comprising:
    acquiring a dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member; and
    generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region.
  2. The method according to claim 1, wherein the cleaning image comprises an image region corresponding to the preset cleaning region; and
    the generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region comprises:
    determining a target filling pattern of the image region according to a value range where the dirtiness degree corresponding to the preset cleaning region is located; and
    marking the image region with the target filling pattern, wherein different value ranges correspond to different target filling patterns.
  3. The method according to claim 2, wherein the determining the target filling pattern of the image region according to the value range where the dirtiness degree corresponding to the preset cleaning region is located comprises:
    determining the target filling pattern of the image region corresponding to the preset cleaning region according to a value range where at least one dirtiness degree corresponding to the preset cleaning region is located, in case the number of cleaning times of the preset cleaning region is greater than 1.
  4. The method according to claim 1, wherein the generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region comprises:
    generating a first cleaning image according to a dirtiness degree corresponding to a first target preset cleaning region, wherein the first target preset cleaning region is one of the at least two preset cleaning regions, the first target preset cleaning region is the last cleaned preset cleaning region among the at least one preset cleaning region that has been cleaned by the cleaning device in a current cleaning task, the first cleaning image comprises each image region corresponding to each preset cleaning region, at least the image region corresponding to the first target preset cleaning region is marked with a target filling pattern, and the target filling pattern is determined according to a last acquired dirtiness degree corresponding to the first target preset cleaning region.
  5. The method according to claim 4, further comprising:
    skipping marking an image region corresponding to a non-first target preset cleaning region with a target filling pattern; or
    marking an image region corresponding to a non-first target preset cleaning region with a preset target filling pattern; or
    marking an image region corresponding to a non-first target preset cleaning region with a target filling pattern determined according to a last acquired dirtiness degree corresponding to the non-first target preset cleaning region;
    wherein the non-first target preset cleaning region is a preset cleaning region other than the first target preset cleaning region.
  6. The method according to claim 4 or 5, further comprising:
    generating the first cleaning image upon acquiring the dirtiness degree corresponding to the first target preset cleaning region; or
    generating one first cleaning image after finishing the cleaning task; or
    generating at least two first cleaning images successively or simultaneously after finishing the cleaning task.
  7. The method according to claim 1, wherein the generating the cleaning image according to the  dirtiness degree corresponding to the at least one preset cleaning region comprises:
    determining at least one second target preset cleaning region, the second target preset cleaning region being a preset cleaning region that has been cleaned i times, wherein i is an integer greater than or equal to 1; and
    generating an i-th second cleaning image according to at least one target dirtiness degree, wherein the target dirtiness degree is a dirtiness degree corresponding to the second target preset cleaning region acquired after the second target preset cleaning region has been cleaned for the i-th time, the i-th second cleaning image comprises each image region corresponding to each preset cleaning region, the image region corresponding to each second target preset cleaning region is marked with a target filling pattern, and the target filling pattern marked in the image region corresponding to each second target preset cleaning region is determined according to the target dirtiness degree corresponding to each second target preset cleaning region acquired for the i-th time.
  8. The method according to claim 7, further comprising:
    skipping marking an image region corresponding to a non-second target preset cleaning region with a target filling pattern; or
    marking an image region corresponding to a non-second target preset cleaning region with a preset target filling pattern; or
    marking an image region corresponding to a non-second target preset cleaning region with a target filling pattern determined according to a last acquired dirtiness degree corresponding to the non-second target preset cleaning region;
    wherein the non-second target preset cleaning region is a preset cleaning region other than the second target preset cleaning region.
  9. The method according to claim 7 or 8, further comprising:
    generating one second cleaning image after finishing the cleaning task; or
    generating at least two second cleaning images successively or simultaneously after finishing the cleaning task; or
    generating the i-th second cleaning image according to at least one target dirtiness degree after determining that all the second target preset cleaning regions have been cleaned for the i-th time.
  10. The method according to claim 1, wherein the generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region comprises:
    generating a third cleaning image based on an accumulated amount of the acquired dirtiness degree corresponding to the at least one preset cleaning region.
  11. The method according to claim 10, wherein the generating the third cleaning image based on the accumulated amount of the acquired dirtiness degree corresponding to the at least one preset cleaning region comprises:
    generating one third cleaning image after each cleaning of one preset cleaning region, wherein each third cleaning image comprises image regions corresponding to all the preset cleaning regions, and a target filling pattern of the image region corresponding to each preset cleaning region is determined by a sum dirtiness degree corresponding to each preset cleaning region, and the sum dirtiness degree corresponding to each preset cleaning region is a sum of dirtiness degrees acquired after each cleaning of one preset cleaning region.
  12. The method according to claim 11, wherein the generating the third cleaning image comprises:
    displaying at least two third cleaning images; or,
    displaying the last third cleaning image after finishing cleaning of all the preset cleaning regions.
  13. The method according to claim 1, wherein the cleaning image comprises a room region, and the room region corresponds to at least one preset cleaning region; and
    the generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region comprises:
    determining a target filling pattern of the room region according to the dirtiness degree corresponding to the at least one preset cleaning region corresponding to the room region.
  14. The method according to claim 13, wherein in case the room region corresponds to at least two preset cleaning regions, the determining the target filling pattern of the room region according to the dirtiness degree corresponding to the at least one preset cleaning region corresponding to the room region comprises:
    determining the target filling pattern of the room region according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to the at least two preset cleaning regions corresponding to the room region, or according to the dirtiness degree corresponding to any one preset cleaning region corresponding to the room region.
  15. The method according to claim 14, wherein in case at least one of the at least two preset cleaning regions corresponding to the room region has been cleaned more than one time;
    the determining the target filling pattern of the room region according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to the at least two preset cleaning regions corresponding to the room region, or according to the dirtiness degree corresponding to any one preset cleaning region corresponding to the room region comprises:
    determining the target filling pattern of the room region according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to all the preset cleaning regions that correspond to the room region and have been cleaned for an i-th time, or according to a dirtiness degree corresponding to any one preset cleaning region that corresponds to the room region and has been cleaned for the i-th time, to generate an i-th cleaning image comprising the room region, wherein i is an integer greater than or equal to 1.
  16. The method according to claim 14, wherein in case at least one of the at least two preset cleaning regions corresponding to the room region has been cleaned more than one time;
    the determining the target filling pattern of the room region according to an average dirtiness degree, a total dirtiness degree, or a maximum dirtiness degree corresponding to the at least two preset cleaning regions corresponding to the room region, or according to the dirtiness degree corresponding to any one preset cleaning region corresponding to the room region comprises:
    determining the target filling pattern of the room region according to an average sum dirtiness degree, a total sum dirtiness degree, or a maximum sum dirtiness degree corresponding to the preset cleaning regions corresponding to the room region, or according to a sum dirtiness degree corresponding to any preset cleaning region corresponding to the room region, to generate the cleaning image comprising the room region, wherein the sum dirtiness degree corresponding to one preset cleaning region is a sum of dirtiness degrees acquired after each cleaning of one preset cleaning region.
  17. The method according to any one of claims 1 to 16, further comprising:
    acquiring sequential node positions for the cleaning task performed by the cleaning device, the node positions comprising at least one of a start position, an interruption position, and an end position; and
    determining that a region covered by a cleaning trajectory connecting two node positions adjacent in cleaning sequence is a preset cleaning region.
  18. The method according to any one of claims 1 to 17, further comprising:
    generating an animation or a short video according to the generated cleaning image; or
    displaying the cleaning image according to a selecting operation by a user.
  19. The method according to any one of claims 1 to 18, wherein the generating the cleaning image according to the dirtiness degree corresponding to the at least one preset cleaning region comprises:
    determining a preset cleaning region with a dirtiness degree greater than or equal to a preset dirtiness degree threshold; and
    generating the cleaning image according to the preset cleaning region with the dirtiness degree greater than or equal to the preset dirtiness degree threshold.
  20. The method according to any one of claims 1 to 19, wherein the acquiring the dirtiness degree corresponding to one preset cleaning region after the cleaning device has cleaned the preset cleaning region one time through the cleaning member comprises:
    acquiring a dirtiness degree of a mopping member after the cleaning device finishes cleaning the preset cleaning region through the mopping member.
  21. An apparatus for processing a cleaning image of a cleaning device, comprising a memory and a processor; wherein,
    the memory is configured to store computer-executable instructions; and
    the processor is configured to execute the computer-executable instructions to implement the operations of the method according to any one of claims 1 to 20.
  22. A system, comprising:
    a cleaning device comprising a motion mechanism and a cleaning member, the motion mechanism being configured to drive the cleaning device to move to allow the cleaning member to clean a preset cleaning region;
    a base station, the base station being at least configured to clean the cleaning member of the cleaning device; and
    the apparatus of claim 21.
  23. A system, comprising:
    a cleaning device comprising a motion mechanism, a cleaning member, and a maintenance mechanism, the motion mechanism being configured to drive the cleaning device to move to allow the cleaning member to clean a preset cleaning region, and the maintenance mechanism being configured to clean the cleaning member; and
    the apparatus of claim 21.
  24. A computer-readable storage medium storing computer-executable instructions, wherein the computer-executable instructions, when executed by a processor, cause the processor to implement the operations of the method according to any one of claims 1 to 20.