CN109711744B - Cleaning task automatic generation and execution evaluation method, cleaning method and device - Google Patents


Publication number: CN109711744B
Authority: CN (China)
Prior art keywords: cleaning, standard, current, image, preset
Legal status: Active (granted)
Application number: CN201811642436.5A
Other languages: Chinese (zh)
Other versions: CN109711744A
Inventors: 张军锋 (Zhang Junfeng), 马锋 (Ma Feng), 马如明 (Ma Ruming)
Current and original assignee: Nanjing Tiansu Automation Control System Co., Ltd.
Events: application CN201811642436.5A filed by Nanjing Tiansu Automation Control System Co., Ltd.; publication of application CN109711744A; application granted; publication of CN109711744B

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a cleaning task automatic generation method, a cleaning task execution evaluation method, a cleaning method and corresponding devices. The cleaning task automatic generation method comprises the following steps: acquiring a current image of a preset cleaning supervision area; identifying the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; comparing the difference degree between the current characteristic information and the standard characteristic information of the cleaning object in the standard image, the standard characteristic information being obtained by identifying a standard image of the preset cleaning supervision area with the target detection model; and generating a cleaning task when the difference degree is larger than a preset threshold value. The method can automatically generate cleaning tasks for subsequent allocation to cleaning personnel, thereby improving the response speed to cleaning events and ensuring the cleaning service quality. In addition, it reduces the inspection time of cleaning personnel and saves labor cost.

Description

Cleaning task automatic generation and execution evaluation method, cleaning method and device
Technical Field
The present invention relates to the field of computers, and in particular, to a cleaning task automatic generation method, a cleaning task execution evaluation method, a cleaning task automatic generation device, a cleaning task execution evaluation device, a cleaning system, an electronic device, and a readable storage medium.
Background
Large-scale institutions such as hospitals, shopping malls, hotels and group companies have large operating areas, high personnel density, frequent personnel flow and high requirements on cleaning service quality, which puts great pressure on their cleaning work. To ensure cleaning service quality, the cleaning staff of a large-scale institution generally account for one third to one half of the total logistics headcount, a very large number and proportion.
However, cleaning events are to some extent uncertain and bursty. Even with a large cleaning staff, cleaning quality cannot be ensured unless cleaning tasks are generated in time from cleaning events, distributed to staff for execution, and responded to quickly. The current cleaning service approach, in which cleaning personnel are spread across areas and then find and complete cleaning tasks through resident patrols, obviously cannot solve these problems. In addition, because cleaning tasks cannot be generated in time from cleaning events, cleaning personnel must find cleaning events and execute tasks by themselves; non-working time and ineffective working time account for a very large proportion, the working efficiency of cleaning personnel is low, and the cleaning cost of a large-scale institution is high.
Disclosure of Invention
In view of this, embodiments of the present invention provide a cleaning task automatic generation method, a cleaning task execution evaluation method and a target detection model optimization method, which solve the problems that the existing cleaning service approach cannot generate cleaning tasks in time from cleaning events, responds to cleaning slowly, and incurs high cleaning cost.
According to a first aspect, an embodiment of the present invention provides a method for automatically generating a cleaning task, including the following steps: acquiring a current image of a preset cleaning supervision area; identifying the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image; the standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model; and when the difference degree of the current characteristic information and the standard characteristic information is larger than a preset threshold value, generating a cleaning task.
By comparing the difference degree between the current characteristic information of the cleaning object in the current image of the preset supervision area and the standard characteristic information of the cleaning object in the standard image, and automatically generating a cleaning task when the difference degree is larger than the preset threshold value, a cleaning task can be generated in time when a cleaning event occurs and subsequently distributed to cleaning personnel for execution. This improves the response speed to cleaning events and ensures the cleaning service quality. In addition, because the method generates cleaning tasks automatically, it reduces the inspection time of cleaning personnel and saves labor cost.
With reference to the first aspect, in a first implementation manner of the first aspect, the current feature information includes current location information of the cleaning object in the current image, and the standard feature information includes standard location information of the cleaning object in the standard image; the step of comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image comprises the following steps: and acquiring a plurality of corresponding cleaning objects in the current image and the standard image, and comparing the difference degree of the current position information and the standard position information of the corresponding cleaning objects.
With reference to the first embodiment of the first aspect, in a second embodiment of the first aspect, when a degree of difference between the current feature information and the standard feature information is greater than a preset threshold, a step of generating a cleaning task includes: and when the sum of the difference degrees of the current position information and the standard position information of each corresponding cleaning object is larger than a first preset threshold value, generating a cleaning task.
By calculating the sum of the difference degrees between the current position information and the standard position information of each corresponding cleaning object, the obtained result characterizes the current clutter degree of the preset cleaning supervision area. A cleaning task is generated when this sum is larger than the first preset threshold value, that is, a cleaning task is generated in time once the clutter degree of the preset cleaning supervision area reaches a certain level. This improves the responsiveness to clutter-type cleaning events and thus the cleaning service quality.
With reference to the first aspect, in a third implementation manner of the first aspect, the current feature information includes current area information of the cleaning object in the current image; the step of comparing the difference degree between the current characteristic information and the standard characteristic information of the cleaning object in the standard image comprises the following steps: acquiring the differential cleaning objects, namely the cleaning objects in the current image that differ from the cleaning objects in the standard image; and calculating the ratio of the sum of the areas of the differential cleaning objects to the total area of the current image.
With reference to the third implementation manner of the first aspect, in a fourth implementation manner of the first aspect, when a degree of difference between the current feature information and the standard feature information is greater than a preset threshold, a step of generating a cleaning task includes: and when the ratio is greater than a second preset threshold, generating a cleaning task.
By calculating the ratio of the sum of the areas of the differential cleaning objects (cleaning objects that exist in the current image but not in the standard image) to the total area of the current image, the obtained result characterizes the foreign-matter occupancy of the preset cleaning supervision area. A cleaning task is generated in time once the foreign-matter occupancy reaches a certain level, which improves the responsiveness to foreign matter appearing in the preset cleaning supervision area and thus the cleaning service quality.
With reference to any one of the first aspect to the fourth implementation of the first aspect, in a fifth implementation of the first aspect, the target detection model is optimized by: identifying the image of the preset cleaning supervision area by using the current target detection model so as to classify the targets in the image as cleaning objects and non-cleaning objects; when there exists a target that the target detection model cannot identify, acquiring labeling information of the unidentifiable target, the labeling information being used to mark the unidentifiable target as a cleaning object or a non-cleaning object; and optimizing the current target detection model based on the unidentifiable target and its corresponding labeling information.
The comprehensive accuracy of identifying the targets by the target detection model is continuously improved by continuously optimizing the target detection model, so that the timeliness of the generation of the cleaning task can be ensured, and the continuous improvement of the cleaning service quality is ensured.
According to a second aspect, an embodiment of the present invention provides a method for evaluating execution of a cleaning task, including the following steps: when receiving feedback information of completion of the cleaning task, acquiring a current image of a preset cleaning supervision area corresponding to the cleaning task; identifying the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image; the standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model; and evaluating the execution degree of the cleaning task according to the difference degree.
After the cleaning personnel feed back that the task is completed, the current image of the preset cleaning supervision area corresponding to the cleaning task is acquired and identified with the target detection model to obtain the current characteristic information of the cleaning object in the current image. The difference degree between the current characteristic information and the standard characteristic information (obtained by identifying the standard image with the target detection model) is then compared, and the execution degree of the cleaning task is evaluated from this difference degree: the lower the difference degree, the better the execution of the cleaning task. This improves the objectivity of the evaluation result of the execution degree. In addition, the method depends little on manpower, so labor cost can be saved.
With reference to the second aspect, in a first implementation manner of the second aspect, the target detection model is optimized by: identifying the image of the preset cleaning supervision area by using the current target detection model so as to identify the targets in the image as cleaning objects and non-cleaning objects; when a target which cannot be identified by the target detection model exists, acquiring labeling information of the target which cannot be identified; the labeling information is used for marking the unrecognizable target as a cleaning object or a non-cleaning object; and optimizing the current target detection model based on the unrecognizable target and the corresponding labeling information thereof.
According to a third aspect, an embodiment of the present invention provides a cleaning method, including the following steps: generating a cleaning task by using the cleaning task automatic generation method in the first aspect or any implementation manner of the first aspect; distributing the cleaning task to a certain cleaning person; when receiving the information of the cleaning task completion fed back by the cleaning personnel, the execution evaluation method of the cleaning task in the second aspect or any implementation mode of the second aspect is used for evaluating the execution degree of the cleaning task.
By automatically generating the cleaning task and automatically evaluating the execution degree of the cleaning task, the dependency on personnel is reduced, the waste of manpower and invalid work are also reduced, and the response timeliness and the effectiveness of the cleaning event can be improved.
According to a fourth aspect, an embodiment of the present invention provides an automatic cleaning task generating device, including: the first image acquisition module is used for acquiring a current image of a preset cleaning supervision area; the first image recognition module is used for recognizing the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; the first comparison module is used for comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image; the standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model; and the task generating module is used for generating a cleaning task when the difference degree of the current characteristic information and the standard characteristic information is larger than a preset threshold value.
According to a fifth aspect, an embodiment of the present invention provides an apparatus for evaluating execution of a cleaning task, including: the second image acquisition module is used for acquiring a current image of a preset cleaning supervision area corresponding to the cleaning task when receiving feedback information of completion of the cleaning task; the second image recognition module is used for recognizing the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; the second comparison module is used for comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image; the standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model; and the task evaluation module is used for evaluating the execution degree of the cleaning task according to the difference degree.
According to a sixth aspect, an embodiment of the present invention provides a cleaning system, including: the automatic cleaning task generating device according to the fourth aspect is used for generating cleaning tasks; the task allocation device is used for allocating the cleaning task to a certain cleaning person; the cleaning task execution evaluation device according to the fifth aspect is configured to evaluate the execution degree of the cleaning task.
According to a seventh aspect, an embodiment of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; the memory has stored therein computer instructions which, upon execution by the processor, cause the processor to perform the method of the first aspect, any implementation of the first aspect, the second aspect, any implementation of the second aspect, or the third aspect.
According to an eighth aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer instructions for causing a computer to perform the method of the first aspect, any implementation manner of the first aspect, the second aspect, any implementation manner of the second aspect, or the third aspect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a cleaning task automatic generation method provided by an embodiment of the invention;
FIG. 2 is a flowchart of a method for automatically generating cleaning tasks according to another embodiment of the present invention;
FIG. 3 is a flowchart of a method for automatically generating cleaning tasks according to another embodiment of the present invention;
FIG. 4 is a flowchart of an optimization process of the object detection model provided by the present invention;
FIG. 5 is a flowchart of a method for evaluating execution of cleaning tasks according to an embodiment of the present invention;
FIG. 6 is a flow chart of a cleaning method according to an embodiment of the present invention;
FIG. 7 is a schematic block diagram of an automatic cleaning task generating device according to an embodiment of the present invention;
FIG. 8 is a schematic block diagram of a cleaning task execution evaluation device provided by an embodiment of the present invention;
fig. 9 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
In the description of the present invention, it should be noted that the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Example 1
Fig. 1 shows a flowchart of a cleaning task automatic generation method according to an embodiment of the present invention, and as shown in fig. 1, the method may include the following steps:
s101, acquiring a current image of a preset cleaning supervision area. Here, the current image of the preset cleaning supervision area can be acquired through an image acquisition device such as a monitoring camera or a digital camera.
S102, identifying the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image. Here, the current characteristic information of the cleaning object may include current location information, current area information, and the like of the cleaning object.
Here, a cleaning object is an object that exists in the preset cleaning supervision area, affects the cleaning service quality, and can trigger generation of a cleaning task, for example tables, chairs, potted plants and flowers, garbage, dirt and sundries in a conference room. Specifically, cleaning objects can be further classified into standard objects and target objects. A standard object is a cleaning object that should exist in the preset cleaning supervision area but needs to be placed at a fixed position, such as a table, a chair or a potted plant; a target object is a cleaning object that should not exist in the preset cleaning supervision area, such as garbage, dirt or sundries.
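For readability, this classification can be represented with a small data structure. The following Python sketch is illustrative only; the type and field names (ObjectKind, DetectedObject and so on) are assumptions of this sketch, not part of the patent.

```python
# Illustrative sketch: the patent does not prescribe any data structure;
# all names here are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Tuple


class ObjectKind(Enum):
    STANDARD = auto()      # should exist, but at a fixed position (table, chair, potted plant)
    TARGET = auto()        # should not exist at all (garbage, dirt, sundries)
    NON_CLEANING = auto()  # temporary objects or people; they never trigger cleaning tasks


@dataclass
class DetectedObject:
    label: str                     # class name returned by the detector
    kind: ObjectKind               # standard / target / non-cleaning
    centroid: Tuple[float, float]  # current position information (x, y)
    area_ratio: float              # object area divided by the total image area
```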
Here, a target detection model based on the R-CNN algorithm may be used to identify the current image. The specific detection procedure includes:
1) scanning the input current image with a selective search algorithm to generate a predetermined number of candidate regions (region proposals) that may contain targets;
2) running a convolutional neural network (CNN) on each region proposal to extract its features;
3) feeding the output of each CNN into a support vector machine (SVM) to classify the region (here, the targets in the region are classified as cleaning objects or non-cleaning objects, mainly based on the characteristic information in the region), and tightening the bounding box around the target with a linear regression.
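As a rough structural illustration of these three stages, a Python sketch is given below. The functions are stubs that stand in for a trained detector; their names and signatures are assumptions of this sketch, not an implementation disclosed in the patent.

```python
# Structural sketch of the three R-CNN stages described above (stubs only).
from typing import List, Tuple

Region = Tuple[int, int, int, int]  # candidate region as (x, y, width, height)


def selective_search(image) -> List[Region]:
    """Stage 1: scan the input image and generate candidate regions (region proposals)."""
    raise NotImplementedError("plug in a selective-search implementation here")


def cnn_features(image, region: Region):
    """Stage 2: run a convolutional neural network on one region proposal to extract features."""
    raise NotImplementedError("plug in a trained CNN backbone here")


def classify_and_refine(features) -> Tuple[str, Region]:
    """Stage 3: SVM classification (cleaning object vs. non-cleaning object) plus
    bounding-box refinement by linear regression."""
    raise NotImplementedError("plug in the trained SVM and box regressor here")


def detect(image) -> List[Tuple[str, Region]]:
    detections = []
    for region in selective_search(image):
        features = cnn_features(image, region)
        label, refined_box = classify_and_refine(features)
        detections.append((label, refined_box))
    return detections
```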
S103, comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image. The standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model, wherein the standard image refers to a picture when standard objects in the preset cleaning supervision area are all located at positions which are required to be fixedly placed and no target object exists.
In this embodiment, the current feature information may include current position information of the cleaning object in the current image, and may further include current area information of the cleaning object in the current image; the standard feature information includes standard position information of the cleaning object in the standard image, and may further include standard area information of the cleaning object in the standard image.
The difference degree between the current position information and the standard position information of the cleaning object may be used as the difference degree between the current characteristic information and the standard characteristic information; or the difference degree between the current area information and the standard area information may be used; or the accumulated result of the two (the position difference degree plus the area difference degree) may be used as the difference degree between the current characteristic information and the standard characteristic information.
And S104, when the difference degree between the current characteristic information and the standard characteristic information is larger than a preset threshold value, generating a cleaning task. Here, the generated cleaning task may include the position information of the preset supervision area, information of the cleaning object that triggered the cleaning task (the cleaning object whose current characteristic information differs from the standard characteristic information), task difficulty information (derived from the difference degree), and the like.
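A minimal sketch of step S104 is given below; the field names and the choice of using the difference degree directly as the difficulty value are assumptions of the sketch, not requirements of the patent.

```python
# Sketch of step S104: generate a task only when the difference degree exceeds the threshold.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class CleaningTask:
    area_id: str                 # position information of the preset supervision area
    trigger_objects: List[str]   # cleaning objects whose current and standard features differ
    difficulty: float            # task difficulty information, derived from the difference degree


def maybe_generate_task(area_id: str, difference: float, threshold: float,
                        trigger_objects: List[str]) -> Optional[CleaningTask]:
    if difference > threshold:
        # Assumption of this sketch: difficulty grows with the difference degree.
        return CleaningTask(area_id=area_id,
                            trigger_objects=trigger_objects,
                            difficulty=difference)
    return None
```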
According to the automatic generation method of the cleaning task, the current characteristic information of the cleaning object in the current image of the preset supervision area and the difference degree of the standard characteristic information of the cleaning object in the standard image are compared, and when the difference degree is larger than the preset threshold value, the cleaning task is automatically generated, so that the cleaning task can be generated timely when a cleaning event is generated and is distributed to cleaning personnel for execution, the response speed to the cleaning event can be improved, and the cleaning service quality is guaranteed. In addition, the method can automatically generate the cleaning task, reduce the inspection time of cleaning personnel and save the labor cost.
Fig. 2 is a flowchart illustrating a cleaning task automatic generation method according to another embodiment of the present invention, in which the current feature information includes current position information of a cleaning object in a current image, and the standard feature information includes standard position information of the cleaning object in a standard image, as shown in fig. 2, the method may include the following steps:
s201, acquiring a current image of a preset cleaning supervision area. The specific content is described with reference to step S101.
S202, identifying the current image through a pre-trained target detection model to obtain the current position information of the cleaning object in the current image. The specific content is described with reference to step S102.
S203, acquiring a plurality of corresponding cleaning objects in the current image and the standard image, and comparing the difference degree between the current position information and the standard position information of each corresponding cleaning object. The standard position information is obtained by identifying the standard image of the preset cleaning supervision area with the target detection model. In this embodiment, because the corresponding cleaning objects are present in both the current image and the standard image, they are standard objects, and the difference degree between their current and standard position information is used to characterize the clutter degree of the preset cleaning supervision area.
S204, when the sum of the difference degrees of the current position information and the standard position information of each corresponding cleaning object is larger than a first preset threshold value, a cleaning task is generated.
Here, the clutter degree of the preset cleaning supervision area is calculated by the following formula:
D(L,T) = Σ_i ‖F(C_i(t_m)) − F(C_i(t_0))‖
wherein D(L,T) refers to the clutter degree of the preset cleaning supervision area L at the moment T; C_i refers to a cleaning object, specifically a standard object; F(C_i(t_m)) refers to the position of the cleaning object C_i at the moment t_m (the moment corresponding to the current image); and F(C_i(t_0)) refers to the position of the cleaning object C_i at the moment t_0 (the moment corresponding to the standard image). Along the above example, when the target detection model based on the R-CNN algorithm is used to perform target detection on the current image or the standard image, the centroid coordinates of each target can be obtained; therefore, the position information of the cleaning object here may be the centroid coordinates of the cleaning object.
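A short sketch of this computation, assuming the positions are the 2-D centroid coordinates mentioned above and that corresponding objects in the two images share an identifier, is shown below.

```python
# Sketch of D(L, T): sum of centroid displacements of the corresponding standard objects.
import math
from typing import Dict, Tuple

Point = Tuple[float, float]


def clutter_degree(current_positions: Dict[str, Point],
                   standard_positions: Dict[str, Point]) -> float:
    total = 0.0
    for obj_id, current in current_positions.items():
        # Only objects present in both the current image and the standard image are compared.
        if obj_id in standard_positions:
            total += math.dist(current, standard_positions[obj_id])  # |F(C_i(t_m)) - F(C_i(t_0))|
    return total
```

A cleaning task would then be generated when this value exceeds the first preset threshold, as in step S204.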
According to the cleaning task automatic generation method of this embodiment, the sum of the difference degrees between the current position information and the standard position information of each corresponding cleaning object is calculated, and the obtained result characterizes the current clutter degree of the preset cleaning supervision area. A cleaning task is generated when this sum is larger than the first preset threshold value, that is, in time, once the clutter degree of the preset cleaning supervision area reaches a certain level. This improves the responsiveness to clutter-type cleaning events and thus the cleaning service quality.
Fig. 3 is a flowchart illustrating a cleaning task automatic generation method according to another embodiment of the present invention, in which the current feature information includes current area information of a cleaning object in a current image, and the cleaning task automatic generation method according to the embodiment of the present invention is described as shown in fig. 3, and the method may include the steps of:
s301, acquiring a current image of a preset cleaning supervision area. The specific content is described with reference to step S101.
S302, identifying the current image through a pre-trained target detection model to obtain the current area information of the cleaning object in the current image. The specific content is described with reference to step S102.
S303, acquiring the differential cleaning objects, namely the cleaning objects in the current image that differ from the cleaning objects in the standard image. Here, a differential cleaning object refers to a cleaning object that exists in the current image but does not exist in the standard image; that is, it is a target object such as garbage, dirt or sundries.
S304, calculating the ratio of the sum of the areas of the different cleaning objects to the total area of the current image. Here, as described with reference to S303, the differential cleaning object refers to a target object, and the ratio of the sum of areas of the differential cleaning objects to the total area of the current image is used to characterize the foreign object occupancy.
And S305, generating a cleaning task when the ratio is greater than a second preset threshold.
Here, the foreign matter occupancy of the preset cleaning supervision area is calculated by the following formula:
R(L,T) = Σ_i S(C_i(t_m))
wherein R(L,T) refers to the foreign matter occupancy of the preset cleaning supervision area L at the moment T; C_i refers to a cleaning object, specifically a target object; and S(C_i(t_m)) refers to the ratio of the area of the cleaning object C_i at the moment t_m (the moment corresponding to the current image) to the total area of the current image.
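A correspondingly simple sketch of R(L, T), assuming the per-object area ratios S(C_i(t_m)) have already been obtained from the detector, is shown below.

```python
# Sketch of R(L, T): summed area share of the target objects (foreign matter) in the current image.
from typing import Iterable


def foreign_occupancy(target_object_area_ratios: Iterable[float]) -> float:
    # Each element is S(C_i(t_m)), the object's area divided by the total image area.
    return sum(target_object_area_ratios)
```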
According to the cleaning task automatic generation method of this embodiment, the ratio of the sum of the areas of the differential cleaning objects (cleaning objects that exist in the current image but not in the standard image) to the total area of the current image is calculated, and the obtained result characterizes the foreign-matter occupancy of the preset cleaning supervision area. A cleaning task is generated when this ratio is larger than the second preset threshold value, that is, in time, once the foreign-matter occupancy of the preset cleaning supervision area reaches a certain level. This improves the responsiveness to cleaning events in which foreign matter appears in the preset cleaning supervision area and thus the cleaning service quality.
As an optional implementation manner of this embodiment, the generating of the cleaning task may be completed by comprehensively calculating the clutter degree and the foreign object occupancy degree of the preset cleaning supervision area, and generating the cleaning task when the comprehensive calculation result is greater than the third threshold, that is, when the value of a×d (L, T) +b×r (L, T) is greater than the third threshold, generating the cleaning task. Here, the specific values of the parameter a and the parameter b may be adjusted according to the actual application scenario, which is not limited in any way.
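A one-line sketch of this combined trigger, with a and b as the adjustable weights mentioned above, could look as follows; the function name and the exact form are illustrative.

```python
# Sketch of the combined trigger: a*D(L,T) + b*R(L,T) compared against a third threshold.
def should_generate_task(clutter: float, foreign_occupancy: float,
                         a: float, b: float, third_threshold: float) -> bool:
    return a * clutter + b * foreign_occupancy > third_threshold
```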
FIG. 4 is a flowchart showing an optimization process of the target detection model in the cleaning task automatic generation method according to the embodiment of the present invention, and as shown in FIG. 4, the process may include the following steps:
s401, identifying the image of the preset cleaning supervision area by using the current target detection model so as to identify the targets in the image as cleaning objects and non-cleaning objects. Here, the cleaning object may be the standard object or the target object, and the non-cleaning object refers to an object that temporarily enters a preset cleaning and monitoring area or is temporarily placed in the preset cleaning and monitoring area, for example, an employee temporarily entering a meeting room for meeting, and when the non-cleaning object exists in the preset cleaning and monitoring area, no cleaning task is generated regardless of the state of other cleaning objects. Therefore, in order to accurately and timely generate the cleaning task, the target detection model is required to accurately identify the cleaning object and the non-cleaning object.
S402, when an object which cannot be identified by the object detection model exists, labeling information of the object which cannot be identified is obtained. Here, the marking information is used to identify an unidentifiable object as a cleaning object or a non-cleaning object.
S403, optimizing the current target detection model based on the unrecognizable targets and the corresponding labeling information.
Here, in order to prevent the optimization of the target detection model from being carried out too frequently, a start threshold may also be applied to the following quantity:
T(L,T) = Σ_i S(C_i(t_m))
wherein T(L,T) refers to the occupancy, in the image of the preset cleaning supervision area L at the moment T, of the targets that cannot be identified; C_i is an unidentifiable target; and S(C_i(t_m)) refers to the ratio of the area of C_i at the moment t_m to the total area of the current image.
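The quantity T(L, T) can be computed in the same way as the foreign-matter occupancy; the sketch below, including the comparison against a start threshold, reflects one plausible reading of this paragraph rather than a prescribed implementation.

```python
# Sketch: share of the current image occupied by targets the model cannot identify,
# used to decide whether another optimization round of the detection model is worthwhile.
from typing import Iterable


def unidentified_occupancy(unidentified_area_ratios: Iterable[float]) -> float:
    return sum(unidentified_area_ratios)  # each element is S(C_i(t_m))


def should_optimize_model(unidentified_area_ratios: Iterable[float],
                          start_threshold: float) -> bool:
    return unidentified_occupancy(unidentified_area_ratios) > start_threshold
```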
According to the method and the device for automatically generating the cleaning task, the object detection model is continuously optimized, and the comprehensive accuracy of the object detection model for identifying the object is continuously improved, so that the timeliness of the cleaning task generation can be ensured, and the continuous improvement of the cleaning service quality is ensured.
Example 2
Fig. 5 shows a flowchart of a method for evaluating execution of a cleaning task according to an embodiment of the present invention, and as shown in fig. 5, the method may include the following steps:
S501, when the feedback information indicating that the cleaning task is completed is received, acquiring the current image of the preset cleaning supervision area corresponding to the cleaning task. Here, the feedback information is the information, fed back by the cleaning personnel through a dedicated software terminal, a short message, WeChat or the like, indicating that the cleaning task has been completed.
S502, identifying the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image. The specific content is described with reference to step S102 in embodiment 1, and is not described herein.
S503, comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image. The standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model. The specific content is described with reference to step S103 in embodiment 1, and will not be described here again.
S504, evaluating the execution degree of the cleaning task according to the difference degree. Here, the lower the degree of difference, the better the degree of execution of the cleaning task.
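The patent only fixes the direction of the relationship (lower difference, better execution); the scoring function below is purely an illustrative assumption that normalizes the remaining difference against the threshold used when the task was generated.

```python
# Illustrative scoring only; the patent does not define a concrete formula.
def execution_score(difference: float, generation_threshold: float) -> float:
    """Return a score in [0, 1]; 1.0 means the area again matches the standard image."""
    if generation_threshold <= 0:
        return 1.0 if difference == 0 else 0.0
    return max(0.0, 1.0 - difference / generation_threshold)
```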
According to the execution evaluation method for the cleaning task of this embodiment, after the cleaning personnel feed back that the task is completed, the current image of the preset cleaning supervision area corresponding to the cleaning task is acquired, and the target detection model is used to identify the current image to obtain the current characteristic information of the cleaning object in the current image. The difference degree between the current characteristic information and the standard characteristic information (obtained by identifying the standard image with the target detection model) is then compared, and the execution degree of the cleaning task is evaluated from this difference degree: the lower the difference degree, the better the execution of the cleaning task. This improves the objectivity of the evaluation result of the execution degree of the cleaning task. In addition, the method depends little on manpower, so labor cost can be saved.
As an alternative implementation manner of this embodiment, the object detection model may be optimized, and the specific content thereof may be understood with reference to S401 to S403 in embodiment 1, which is not described herein.
Example 3
Fig. 6 shows a flowchart of a cleaning method according to an embodiment of the invention, as shown in fig. 6, the method may comprise the steps of:
s601, generating a cleaning task by using the cleaning task automatic generation method described in the embodiment 1 or any optional implementation manner thereof.
S602, distributing the cleaning task to a certain cleaning person.
S603, when receiving the information of the cleaning task completion fed back by the cleaning personnel, using the method for evaluating the execution of the cleaning task according to the embodiment 2 or any optional embodiment thereof to evaluate the execution degree of the cleaning task.
The details of the method in this embodiment can be understood with reference to embodiment 1 and embodiment 2, and will not be described herein.
According to the cleaning method, the cleaning task is automatically generated, the execution degree of the cleaning task is automatically evaluated, the dependence on personnel is reduced, the waste of manpower and invalid work are also reduced, and the response timeliness and the effectiveness of the cleaning event can be improved.
Example 4
Fig. 7 shows a schematic block diagram of a cleaning task automatic generating device according to an embodiment of the present invention, which may be used to implement the cleaning task automatic generating method according to embodiment 1 or any alternative embodiment thereof. As shown in fig. 7, the apparatus includes: the system comprises a first image acquisition module 10, a first image recognition module 20, a first comparison module 30 and a task generation module 40.
The first image acquisition module 10 is used for acquiring a current image of a preset cleaning supervision area.
The first image recognition module 20 is configured to recognize a current image through a pre-trained target detection model, so as to obtain current feature information of a cleaning object in the current image.
The first comparing module 30 is configured to compare the difference degree between the current feature information and the standard feature information of the cleaning object in the standard image. The standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model.
The task generating module 40 is configured to generate a cleaning task when the degree of difference between the current feature information and the standard feature information is greater than a preset threshold.
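The four modules can be wired together as a simple pipeline; the class below is a structural sketch only (module interfaces such as capture/recognize/compare/generate are assumptions), not the device disclosed in FIG. 7.

```python
# Structural sketch of FIG. 7: acquisition -> recognition -> comparison -> task generation.
class CleaningTaskGeneratorDevice:
    def __init__(self, image_acquirer, recognizer, comparator, task_generator):
        self.image_acquirer = image_acquirer    # first image acquisition module 10
        self.recognizer = recognizer            # first image recognition module 20
        self.comparator = comparator            # first comparison module 30
        self.task_generator = task_generator    # task generation module 40

    def run_once(self, area_id, standard_features, threshold):
        current_image = self.image_acquirer.capture(area_id)
        current_features = self.recognizer.recognize(current_image)
        difference = self.comparator.compare(current_features, standard_features)
        return self.task_generator.generate(area_id, difference, threshold)
```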
The automatic generating device of the cleaning task can be used for realizing the automatic generating method of the cleaning task in the embodiment 1 and automatically generating the cleaning task, so that the cleaning task can be generated in time when a cleaning event is generated and is distributed to cleaning personnel for execution subsequently, the response speed to the cleaning event can be improved, and the cleaning service quality is ensured. In addition, the method can automatically generate the cleaning task, reduce the inspection time of cleaning personnel and save the labor cost.
The embodiment of the invention also provides a cleaning task execution evaluation device, and fig. 8 shows a schematic block diagram of the cleaning task execution evaluation device, which can be used for realizing the cleaning task execution evaluation method in embodiment 2 or any optional embodiment thereof. As shown in fig. 8, the apparatus includes: a second image acquisition module 50, a second image recognition module 60, a second comparison module 70, and a task evaluation module 80.
The second image obtaining module 50 is configured to obtain a current image of a preset cleaning supervision area corresponding to the cleaning task when receiving feedback information that the cleaning task is completed.
The second image recognition module 60 is configured to recognize the current image through a pre-trained target detection model, so as to obtain the current feature information of the cleaning object in the current image.
The second comparing module 70 is configured to compare the difference degree between the current feature information and the standard feature information of the cleaning object in the standard image. The standard characteristic information is obtained by identifying a standard image of a preset cleaning supervision area by using a target detection model.
The task evaluation module 80 is used for evaluating the execution degree of the cleaning task according to the difference degree.
The cleaning task execution evaluation device provided by the embodiment of the invention can be used for realizing the cleaning task execution evaluation method in the embodiment 2, objectively evaluating the execution condition of the cleaning task, reducing the intervention of manpower and saving the manpower cost.
The embodiment of the invention also provides a cleaning system which can be used for realizing the cleaning method described in the embodiment 3 or any optional implementation mode thereof. The system comprises: the cleaning task automatic generation device, the task distribution device and the execution evaluation device.
The cleaning task automatic generating device is used for generating cleaning tasks.
The task allocation device is used for allocating the cleaning task to a certain cleaning person.
The cleaning task execution evaluation device is used for evaluating the execution degree of the cleaning task.
The embodiment of the present invention further provides an electronic device, as shown in fig. 9, which may include a processor 91 and a memory 92, where the processor 91 and the memory 92 may be connected by a bus or other means, and in fig. 9, the connection is exemplified by a bus.
The processor 91 may be a central processing unit (Central Processing Unit, CPU). The processor 91 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or a combination thereof.
The memory 92 is used as a non-transitory computer readable storage medium for storing non-transitory software programs, non-transitory computer executable programs and modules, such as the program instructions/modules corresponding to the cleaning task automatic generation device in the embodiment of the present invention (e.g., the first image acquisition module 10, the first image recognition module 20, the first comparison module 30 and the task generation module 40 shown in fig. 7). By running the non-transitory software programs, instructions and modules stored in the memory 92, the processor 91 executes various functional applications and data processing, i.e., implements the methods of the method embodiments described above.
Memory 92 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created by the processor 91, etc. In addition, the memory 92 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 92 may optionally include memory remotely located relative to processor 91, which may be connected to processor 91 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 92, which when executed by the processor 91, performs the methods of the embodiments shown in fig. 1-6.
The details of the electronic device may be understood in reference to the corresponding related descriptions and effects in the embodiments shown in fig. 1 to 6, which are not repeated herein.
It will be appreciated by those skilled in the art that implementing all or part of the above-described embodiment method may be implemented by a computer program to instruct related hardware, where the program may be stored in a computer readable storage medium, and the program may include the above-described embodiment method when executed. Wherein the storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a Flash Memory (Flash Memory), a Hard Disk (HDD), or a Solid State Drive (SSD); the storage medium may also comprise a combination of memories of the kind described above.
It is apparent that the above embodiments are given by way of illustration only and are not limiting. Other variations or modifications based on the above description will be apparent to those of ordinary skill in the art. It is neither necessary nor possible to exhaustively list all embodiments here. Obvious variations or modifications derived therefrom are still within the protection scope of the invention.

Claims (14)

1. The automatic cleaning task generating method is characterized by comprising the following steps:
acquiring a current image of a preset cleaning supervision area;
identifying the current image through a pre-trained target detection model to obtain current characteristic information of a cleaning object in the current image;
comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image; the standard characteristic information is obtained by identifying a standard image of the preset cleaning supervision area by using the target detection model; the cleaning objects comprise standard objects and target objects, wherein the standard objects are cleaning objects which are required to be fixedly placed and exist in a preset cleaning supervision area; the target object is a cleaning object which should not exist in a preset cleaning monitoring area; the difference degree of the current position information and the standard position information of the standard object is used for representing the disorder degree of a preset cleaning supervision area;
calculating the clutter degree of the preset cleaning supervision area by the following formula:
D(L,T) = Σ_i ‖F(C_i(t_m)) − F(C_i(t_0))‖
wherein D(L,T) refers to the clutter degree of the preset cleaning supervision area L at the moment T; C_i is a standard object; F(C_i(t_m)) is the position of the standard object C_i at the moment t_m, the moment t_m being the moment corresponding to the current image; and F(C_i(t_0)) is the position of the standard object C_i at the moment t_0, the moment t_0 being the moment corresponding to the standard image;
the ratio of the sum of the areas of the target objects to the total area of the current image is used for representing the occupancy of foreign objects;
the foreign matter occupancy of the preset cleaning supervision area is calculated by the following formula:
R(L,T) = Σ_i S(G_i(t_m))
wherein R(L,T) refers to the foreign matter occupancy of the preset cleaning supervision area L at the moment T; G_i refers to a target object; and S(G_i(t_m)) refers to the ratio of the area of the target object G_i at the moment t_m to the total area of the current image, the moment t_m being the moment corresponding to the current image;
and when the difference degree of the current characteristic information and the standard characteristic information is larger than a preset threshold value, generating a cleaning task.
2. The automatic cleaning task generating method according to claim 1, wherein the current feature information includes current position information of a cleaning object in the current image, and the standard feature information includes standard position information of the cleaning object in the standard image;
the step of comparing the difference degree between the current characteristic information and the standard characteristic information of the cleaning object in the standard image comprises the following steps:
And acquiring a plurality of corresponding cleaning objects in the current image and the standard image, and comparing the difference degree of the current position information and the standard position information of the corresponding cleaning objects.
3. The automatic cleaning task generating method according to claim 2, wherein the step of generating the cleaning task when the degree of difference between the current feature information and the standard feature information is greater than a preset threshold value comprises:
and when the sum of the difference degrees of the current position information and the standard position information of each corresponding cleaning object is larger than a first preset threshold value, generating a cleaning task.
4. The automatic cleaning task generating method according to claim 1, wherein the current feature information includes current area information of a cleaning object in the current image;
the step of comparing the difference degree between the current characteristic information and the standard characteristic information of the cleaning object in the standard image comprises the following steps:
acquiring a different cleaning object, which is different from the cleaning object in the standard image, of the cleaning object in the current image;
and calculating the ratio of the sum of the areas of the different cleaning objects to the total area of the current image.
5. The automatic cleaning task generating method according to claim 4, wherein the step of generating the cleaning task when the degree of difference between the current feature information and the standard feature information is greater than a preset threshold value comprises:
and generating a cleaning task when the ratio is greater than a second preset threshold.
6. The automatic cleaning task generating method according to any one of claims 1 to 5, wherein the object detection model is optimized by:
identifying the image of the preset cleaning supervision area by using a current target detection model so as to identify targets in the image as cleaning objects and non-cleaning objects;
when there exists a target that the target detection model cannot identify, acquiring labeling information of the unidentifiable target; the labeling information is used for marking the unidentifiable target as a cleaning object or a non-cleaning object;
and optimizing the current target detection model based on the unrecognizable target and the corresponding labeling information thereof.
7. The method for evaluating the execution of the cleaning task is characterized by comprising the following steps of:
when receiving feedback information of completion of the cleaning task, acquiring a current image of a preset cleaning supervision area corresponding to the cleaning task;
Identifying the current image through a pre-trained target detection model to obtain current characteristic information of a cleaning object in the current image; the cleaning objects comprise standard objects and target objects, wherein the standard objects are cleaning objects which are required to be fixedly placed and exist in a preset cleaning supervision area; the target object is a cleaning object which should not exist in a preset cleaning monitoring area;
comparing the difference degree of the current characteristic information and the standard characteristic information of the cleaning object in the standard image; the standard characteristic information is obtained by identifying a standard image of the preset cleaning supervision area by using the target detection model; the difference degree of the current position information and the standard position information of the standard object is used for representing the disorder degree of a preset cleaning supervision area;
calculating the clutter degree of the preset cleaning supervision area by the following formula:
D(L,T) = Σ_i ‖F(C_i(t_m)) − F(C_i(t_0))‖
wherein D(L,T) refers to the clutter degree of the preset cleaning supervision area L at the moment T; C_i is a standard object; F(C_i(t_m)) is the position of the standard object C_i at the moment t_m, the moment t_m being the moment corresponding to the current image; and F(C_i(t_0)) is the position of the standard object C_i at the moment t_0, the moment t_0 being the moment corresponding to the standard image;
the ratio of the sum of the areas of the target objects to the total area of the current image is used for representing the occupancy of foreign objects;
the foreign matter occupancy of the preset cleaning supervision area is calculated by the following formula:
R(L,T) = Σ_i S(G_i(t_m))
wherein R(L,T) refers to the foreign matter occupancy of the preset cleaning supervision area L at the moment T; G_i refers to a target object; and S(G_i(t_m)) refers to the ratio of the area of the target object G_i at the moment t_m to the total area of the current image, the moment t_m being the moment corresponding to the current image;
and evaluating the execution degree of the cleaning task according to the difference degree.
8. The method for evaluating the performance of a cleaning task according to claim 7, wherein the object detection model is optimized by:
identifying an image of the preset cleaning supervision area by using the current target detection model, so as to classify targets in the image as cleaning objects or non-cleaning objects;
when a target that the current target detection model cannot identify exists in the image, acquiring labeling information of the unrecognizable target; the labeling information is used for labeling the unrecognizable target as a cleaning object or a non-cleaning object;
and optimizing the current target detection model based on the unrecognizable target and its corresponding labeling information.
9. A cleaning method, characterized by comprising the following steps:
generating a cleaning task using the cleaning task automatic generation method according to any one of claims 1 to 6;
allocating the cleaning task to a cleaning person;
and when receiving information fed back by the cleaning person that the cleaning task is completed, evaluating the execution degree of the cleaning task by using the cleaning task execution evaluation method according to claim 7 or 8.
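The cleaning method of claim 9 is essentially a generate → allocate → evaluate loop. The sketch below wires those three steps together with injected callables; the function signatures, the dictionary-shaped task, and the 0.8 score threshold are hypothetical, introduced only to show the control flow.

```python
from typing import Callable, Optional


def run_cleaning_cycle(
    region_id: str,
    generate_task: Callable[[str], Optional[dict]],   # claims 1-6: automatic task generation
    allocate_task: Callable[[dict], str],              # allocation to a cleaning person
    evaluate_task: Callable[[dict], float],            # claims 7-8: execution evaluation
    execution_threshold: float = 0.8,
) -> Optional[dict]:
    """Run one cycle of the cleaning method: generate, allocate, then evaluate
    once completion is reported.  Returns the enriched task, or None if the
    monitored area still matches its standard image closely enough."""
    task = generate_task(region_id)
    if task is None:
        return None
    task["assignee"] = allocate_task(task)
    # In a real system the evaluation would run only after the cleaner's
    # completion feedback arrives; here it is called directly for brevity.
    score = evaluate_task(task)
    task["execution_score"] = score
    task["executed_well"] = score >= execution_threshold
    return task
```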
10. An automatic cleaning task generating device, characterized by comprising:
the first image acquisition module is used for acquiring a current image of a preset cleaning supervision area;
the first image recognition module is used for recognizing the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; the cleaning objects comprise standard objects and target objects, wherein a standard object is a cleaning object which is required to be fixedly placed in and exist in the preset cleaning supervision area, and a target object is a cleaning object which should not exist in the preset cleaning supervision area;
the first comparison module is used for comparing the difference degree between the current characteristic information and the standard characteristic information of the cleaning object in a standard image; the standard characteristic information is obtained by identifying the standard image of the preset cleaning supervision area by using the target detection model; the difference degree between the current position information and the standard position information of the standard object is used for representing the clutter degree of the preset cleaning supervision area;
calculating the clutter degree of the preset cleaning supervision area by the following formula:
wherein D(L, T) refers to the clutter degree of the preset cleaning supervision area L at time T; Cᵢ refers to a standard object; F(Cᵢ(tₘ)) refers to the position of the standard object Cᵢ at time tₘ, tₘ being the time corresponding to the current image; and F(Cᵢ(t₀)) refers to the position of the standard object Cᵢ at time t₀, t₀ being the time corresponding to the standard image;
the ratio of the sum of the areas of the target objects to the total area of the current image is used for representing the foreign matter occupancy;
the foreign matter occupancy rate of the preset cleaning supervision area is calculated by the following formula:
wherein R(L, T) refers to the foreign matter occupancy of the preset cleaning supervision area L at time T; Gᵢ refers to a target object; and S(Gᵢ(tₘ)) refers to the ratio of the area of the target object Gᵢ at time tₘ to the total area of the current image, tₘ being the time corresponding to the current image;
and the task generating module is used for generating a cleaning task when the difference degree of the current characteristic information and the standard characteristic information is larger than a preset threshold value.
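To make the task generating module's decision concrete, here is a minimal sketch that emits a cleaning task only when the clutter degree exceeds a first preset threshold or the foreign matter occupancy exceeds a second preset threshold, mirroring claims 4 and 5. The threshold values and the task fields are illustrative assumptions, not claimed values.

```python
from typing import Optional


def maybe_generate_task(
    region_id: str,
    clutter_degree: float,
    foreign_occupancy: float,
    first_threshold: float = 50.0,    # illustrative clutter-degree threshold
    second_threshold: float = 0.02,   # illustrative foreign-matter occupancy threshold
) -> Optional[dict]:
    """Emit a cleaning task when either difference measure exceeds its preset
    threshold; otherwise report that the area still matches its standard image."""
    if clutter_degree <= first_threshold and foreign_occupancy <= second_threshold:
        return None
    return {
        "region": region_id,
        "clutter_degree": clutter_degree,
        "foreign_occupancy": foreign_occupancy,
        "reason": "clutter" if clutter_degree > first_threshold else "foreign objects",
    }
```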
11. An execution evaluation device for a cleaning task, characterized by comprising:
the second image acquisition module is used for acquiring a current image of a preset cleaning supervision area corresponding to the cleaning task when receiving feedback information of completion of the cleaning task;
the second image recognition module is used for recognizing the current image through a pre-trained target detection model to obtain the current characteristic information of the cleaning object in the current image; the cleaning objects comprise standard objects and target objects, wherein a standard object is a cleaning object which is required to be fixedly placed in and exist in the preset cleaning supervision area, and a target object is a cleaning object which should not exist in the preset cleaning supervision area;
the second comparison module is used for comparing the difference degree between the current characteristic information and the standard characteristic information of the cleaning object in a standard image; the standard characteristic information is obtained by identifying the standard image of the preset cleaning supervision area by using the target detection model; the difference degree between the current position information and the standard position information of the standard object is used for representing the clutter degree of the preset cleaning supervision area;
calculating the clutter degree of the preset cleaning supervision area by the following formula:
wherein D(L, T) refers to the clutter degree of the preset cleaning supervision area L at time T; Cᵢ refers to a standard object; F(Cᵢ(tₘ)) refers to the position of the standard object Cᵢ at time tₘ, tₘ being the time corresponding to the current image; and F(Cᵢ(t₀)) refers to the position of the standard object Cᵢ at time t₀, t₀ being the time corresponding to the standard image;
the ratio of the sum of the areas of the target objects to the total area of the current image is used for representing the foreign matter occupancy;
the foreign matter occupancy rate of the preset cleaning supervision area is calculated by the following formula:
wherein R(L, T) refers to the foreign matter occupancy of the preset cleaning supervision area L at time T; Gᵢ refers to a target object; and S(Gᵢ(tₘ)) refers to the ratio of the area of the target object Gᵢ at time tₘ to the total area of the current image, tₘ being the time corresponding to the current image;
and the task evaluation module is used for evaluating the execution degree of the cleaning task according to the difference degree.
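One simple way the task evaluation module could turn the before/after difference degrees into an execution score is sketched below: it measures how much of the pre-cleaning clutter degree and foreign matter occupancy was removed and averages the two relative reductions. This scoring rule is an assumption for illustration only, not the claimed evaluation method.

```python
def evaluate_execution(
    clutter_before: float,
    clutter_after: float,
    occupancy_before: float,
    occupancy_after: float,
) -> float:
    """Return an execution score in [0, 1]: the average of how much of the
    pre-cleaning clutter degree and foreign matter occupancy was removed."""
    def relative_reduction(before: float, after: float) -> float:
        if before <= 0.0:
            return 1.0  # nothing needed cleaning along this dimension
        return max(0.0, min(1.0, (before - after) / before))

    return 0.5 * (
        relative_reduction(clutter_before, clutter_after)
        + relative_reduction(occupancy_before, occupancy_after)
    )
```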
12. A cleaning system, characterized by comprising:
the automatic cleaning task generating device according to claim 10, which is used for generating cleaning tasks;
the task allocation device is used for allocating the cleaning task to a cleaning person;
and the cleaning task execution evaluation device according to claim 11, which is used for evaluating the execution degree of the cleaning task.
13. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to cause the at least one processor to perform the method according to any one of claims 1 to 9.
14. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method according to any one of claims 1 to 9.
CN201811642436.5A 2018-12-29 2018-12-29 Cleaning task automatic generation and execution evaluation method, cleaning method and device Active CN109711744B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811642436.5A CN109711744B (en) 2018-12-29 2018-12-29 Cleaning task automatic generation and execution evaluation method, cleaning method and device


Publications (2)

Publication Number Publication Date
CN109711744A CN109711744A (en) 2019-05-03
CN109711744B true CN109711744B (en) 2024-02-06

Family

ID=66260384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811642436.5A Active CN109711744B (en) 2018-12-29 2018-12-29 Cleaning task automatic generation and execution evaluation method, cleaning method and device

Country Status (1)

Country Link
CN (1) CN109711744B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110245824A (en) * 2019-05-05 2019-09-17 浙江乌镇街科技有限公司 A kind of hotel keeps a public place clean supervisory systems
CN110363444A (en) * 2019-07-19 2019-10-22 河北冀联人力资源服务集团有限公司 Dust alleviation work quality evaluating method and system
CN110443484A (en) * 2019-07-26 2019-11-12 广州启盟信息科技有限公司 A kind of cleaning worker intelligent dispatching method and device entering and leaving data based on personnel
CN110706219B (en) * 2019-09-27 2021-01-26 北京海益同展信息科技有限公司 Animal waste monitoring method, monitoring device, inspection equipment and inspection system
CN111027118B (en) * 2019-11-19 2024-01-19 广东博智林机器人有限公司 Actual measurement real-quantity task point searching and task dispatching method and system
CN112528734A (en) * 2020-10-29 2021-03-19 长沙市到家悠享家政服务有限公司 Sorting score determining method, device, equipment and storage medium
JP6859006B1 (en) * 2020-12-24 2021-04-14 Rsmile株式会社 Information processing equipment, information processing methods and information processing programs
CN112686162B (en) * 2020-12-31 2023-12-15 鄂尔多斯市空港大数据运营有限公司 Method, device, equipment and storage medium for detecting clean state of warehouse environment
CN115174640B (en) * 2022-09-07 2022-12-16 深圳市奇果物联科技有限公司 Equipment data analysis management system and method based on Internet of things

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105512666A (en) * 2015-12-16 2016-04-20 天津天地伟业数码科技有限公司 River garbage identification method based on videos
CN105947476A (en) * 2016-05-04 2016-09-21 重庆特斯联智慧科技股份有限公司 Intelligent trash can monitoring method based on image recognition
CN106358021A (en) * 2016-11-01 2017-01-25 成都宏软科技实业有限公司 Smart community monitoring system
CN108229439A (en) * 2018-02-05 2018-06-29 谭希韬 A kind of artificial intelligence road health cloud computing management system and method
CN108875821A (en) * 2018-06-08 2018-11-23 Oppo广东移动通信有限公司 The training method and device of disaggregated model, mobile terminal, readable storage medium storing program for executing


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant