CN113469000B - Regional map processing method and device, storage medium and electronic device - Google Patents

Regional map processing method and device, storage medium and electronic device

Info

Publication number
CN113469000B
Authority
CN
China
Prior art keywords
target
target object
area map
information
map
Prior art date
Legal status
Active
Application number
CN202110701569.0A
Other languages
Chinese (zh)
Other versions
CN113469000A (en)
Inventor
杨飞雨
李建
Current Assignee
Dreame Innovation Technology Suzhou Co Ltd
Original Assignee
Dreame Innovation Technology Suzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Dreame Innovation Technology Suzhou Co Ltd filed Critical Dreame Innovation Technology Suzhou Co Ltd
Priority to CN202110701569.0A
Publication of CN113469000A
Priority to PCT/CN2022/094615 (WO2022267795A1)
Application granted
Publication of CN113469000B
Legal status: Active

Classifications

    • A47L11/24: Floor-sweeping machines, motor-driven (within A: Human Necessities; A47L: Domestic washing or cleaning; suction cleaners in general; A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings)
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02-A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • A47L11/4061: Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
    • G06N3/08: Learning methods (within G: Physics; G06N: Computing arrangements based on specific computational models; G06N3/02: Neural networks)
    • A47L2201/04: Automatic control of the travelling movement; automatic obstacle detection (within A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an area map processing method and apparatus, a storage medium, and an electronic apparatus. The method includes: acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by a sweeping robot; recognizing the target object according to the target object image to obtain a target recognition result, wherein the target recognition result includes description information of the target object; and labeling the target object in a target area map according to the target recognition result, wherein the labeled target object is the object corresponding to the target object in the target area map, and the target area map is an area map established for the target area by the sweeping robot. This technical solution solves the problem in the related art that information acquisition is inefficient when an area map is constructed by a sweeping robot.

Description

Regional map processing method and device, storage medium and electronic device
[ Field of technology ]
The present application relates to the field of communications, and in particular, to a method and apparatus for processing a regional map, a storage medium, and an electronic apparatus.
[ Background Art ]
Currently, users can clean their houses using a sweeping robot. The sweeping robot generally interacts with the user through a mobile-phone App (Application), which displays the house map established by the robot. However, the house map generally shows only whether an object is present at a given position, so the user cannot obtain accurate object information from the map.
Therefore, in the related art, the manner in which the sweeping robot constructs the area map results in low information-acquisition efficiency.
[ Invention ]
The application aims to provide an area map processing method and apparatus, a storage medium, and an electronic apparatus, so as to at least solve the problem in the related art that information acquisition is inefficient when an area map is constructed by a sweeping robot.
The aim of the application is achieved by the following technical solutions:
According to an aspect of an embodiment of the present application, there is provided a method for processing an area map, including: acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by a sweeping robot; identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result comprises description information of the target object; and marking a target object in a target area map according to the target identification result, wherein the target object is an object corresponding to the target object in the target area map, and the target area map is an area map established for the target area by the sweeping robot.
In an exemplary embodiment, before acquiring the target object image corresponding to the target object in the target area, the method further includes: and detecting that the target object exists in the target area in the process of cleaning the target area by the sweeping robot.
In an exemplary embodiment, identifying the target object according to the target object image, and obtaining the target identification result includes: and carrying out type recognition on the target object according to the target object image to obtain target type information, wherein the target type information is used for representing the object type of the target object, and the target recognition result comprises the target type information.
In an exemplary embodiment, performing the type recognition on the target object according to the target object image, and obtaining the target type information includes: and inputting the target object image into a target recognition model to obtain the target type information output by the target recognition model, wherein the target recognition model is obtained by training an initial recognition model by using an object image of a sample object, and the sample object is marked with a corresponding object type.
In an exemplary embodiment, identifying the target object according to the target object image, and obtaining the target identification result includes: and carrying out contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used for representing the object contour of the target object, and the target recognition result comprises the target contour information.
In an exemplary embodiment, labeling the target object in the target area map according to the target recognition result includes: and marking the outline of the target object in the target area map in the form of a virtual wall according to the target outline information.
In an exemplary embodiment, after labeling the target object in the target area map according to the target recognition result, the method further includes: and sending the target area map to a target application on a target terminal for display, wherein the target application uses a target account number bound with the sweeping robot to log in.
According to another aspect of the embodiment of the present application, there is also provided a method for processing an area map, including: receiving a target area map sent by a sweeping robot through a target application, wherein the target application logs in by using a target account bound with the sweeping robot, and the target area map is an area map established for a target area through the sweeping robot; and displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is an object corresponding to a target object in the target area map, and the target labeling information is used for describing the target object.
In one exemplary embodiment, displaying the target area map on the target display interface of the target application includes: and displaying the target object and the outline information of the target object in the target area map displayed on the target display interface, wherein the outline information of the target object is used for representing the outline of the target object, and the target annotation information comprises the outline information of the target object.
In one exemplary embodiment, displaying the target object and the contour information of the target object in the target area map displayed on the target display interface includes: and displaying the target object in the target area map displayed on the target display interface, and displaying the outline information of the target object in the form of a virtual wall.
In one exemplary embodiment, displaying the target area map on the target display interface of the target application includes: and displaying the target object and target type information in the target area map displayed on the target display interface, wherein the target type information is used for representing the object type of the target object, and the target labeling information comprises the target type information.
According to still another aspect of the embodiment of the present application, there is also provided a processing apparatus for an area map, including: an acquisition unit, configured to acquire a target object image corresponding to a target object in a target area, where the target area is an area cleaned by a sweeping robot; the identification unit is used for identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result comprises description information of the target object; the labeling unit is used for labeling target objects in a target area map according to the target identification result, wherein the target objects are objects corresponding to the target objects in the target area map, and the target area map is an area map established for the target area through the sweeping robot.
In an exemplary embodiment, the apparatus further comprises: a detection unit, configured to detect, before acquiring the target object image corresponding to the target object in the target area, that the target object exists in the target area in a process of cleaning the target area by the sweeping robot.
In an exemplary embodiment, the identification unit includes: the first recognition module is used for carrying out type recognition on the target object according to the target object image to obtain target type information, wherein the target type information is used for representing the object type of the target object, and the target recognition result comprises the target type information.
In one exemplary embodiment, the first identification module includes: the input sub-module is used for inputting the target object image into a target recognition model to obtain the target type information output by the target recognition model, the target recognition model is obtained by training an initial recognition model by using an object image of a sample object, and the sample object is marked with a corresponding object type.
In an exemplary embodiment, the identification unit includes: and the second recognition module is used for carrying out contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used for representing the object contour of the target object, and the target recognition result comprises the target contour information.
In an exemplary embodiment, the labeling unit includes: and the marking module is used for marking the outline of the target object in the target area map in the form of a virtual wall according to the target outline information.
In an exemplary embodiment, the apparatus further comprises: and the sending unit is used for sending the target area map to a target application on a target terminal for display after marking the target object in the target area map according to the target identification result, wherein the target application uses a target account number bound with the sweeping robot for login.
According to still another aspect of the embodiment of the present application, there is also provided a processing apparatus for an area map, including: the receiving unit is used for receiving a target area map sent by the sweeping robot through a target application, wherein the target application logs in by using a target account bound with the sweeping robot, and the target area map is an area map established for a target area through the sweeping robot; the display unit is used for displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is an object corresponding to a target object in the target area map, and the target labeling information is used for describing the target object.
In one exemplary embodiment, the display unit includes: the first display module is used for displaying the target object and the outline information of the target object in the target area map displayed on the target display interface, wherein the outline information of the target object is used for representing the outline of the target object, and the target annotation information comprises the outline information of the target object.
In one exemplary embodiment, the first display module includes: and the display sub-module is used for displaying the target object in the target area map displayed on the target display interface and displaying the outline information of the target object in a virtual wall mode.
In one exemplary embodiment, the display unit includes: the second display module is used for displaying the target object and target type information in the target area map displayed on the target display interface, wherein the target type information is used for representing the object type of the target object, and the target labeling information comprises the target type information.
In the embodiments of the application, the object corresponding to a target object is labeled on the area map according to the description information of the target object: a target object image corresponding to the target object in a target area is acquired, where the target area is an area cleaned by a sweeping robot; the target object is recognized according to the target object image to obtain a target recognition result containing description information of the target object; and the target object is labeled in the target area map according to the target recognition result, where the labeled target object is the object corresponding to the target object in the target area map, and the target area map is an area map established for the target area by the sweeping robot. Because the object is recognized from its object image, the recognition result contains description information of the object (such as the object type and the object contour), which ensures that the description accurately characterizes the object in the area. Meanwhile, the object corresponding to the target object and its annotation information can be displayed together in the area map, so that object information can be acquired from the area map quickly. This achieves the technical effect of improving the accuracy and efficiency of object-information acquisition and solves the problem in the related art that information acquisition is inefficient when an area map is constructed by a sweeping robot.
[ Description of the drawings ]
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a hardware environment of an alternative method of processing a regional map according to an embodiment of the present application;
FIG. 2 is a flow chart of an alternative method of processing a regional map according to an embodiment of the present application;
FIG. 3 is a flow chart of another alternative method of processing a regional map according to an embodiment of the present application;
FIG. 4 is a flow chart of yet another alternative method of processing a regional map in accordance with an embodiment of the present application;
FIG. 5 is a schematic illustration of an alternative regional map in accordance with an embodiment of the present application;
FIG. 6 is a block diagram of an alternative regional map processing apparatus in accordance with an embodiment of the present application;
FIG. 7 is a block diagram of an alternative regional map processing apparatus according to an embodiment of the present application;
FIG. 8 is a block diagram of an alternative electronic device according to an embodiment of the application.
[ Detailed Description ]
The application will be described in detail hereinafter with reference to the drawings in conjunction with embodiments. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
According to an aspect of an embodiment of the present application, there is provided a method for processing an area map. Alternatively, in the present embodiment, the above-described processing method of the area map may be applied to a hardware environment constituted by the terminal 102 and the server 104 as shown in fig. 1. As shown in fig. 1, the server 104 is connected to the terminal 102 through a network, and may be used to provide services (such as game services, application services, etc.) to the terminal or clients installed on the terminal, and a database may be provided on the server or independent of the server, for providing data storage services to the server 104.
The terminal 102 may include one or more terminals, and different terminals may be connected through a server or may communicate directly with each other without the server. Alternatively, the terminal 102 may include at least one of: a user terminal and a cleaning device, where the cleaning device may include a cleaning robot.
The network may include, but is not limited to, at least one of: a wired network or a wireless network. The wired network may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, or a local area network; the wireless network may include, but is not limited to, at least one of: Wi-Fi (Wireless Fidelity) or Bluetooth. The terminal 102 may be, but is not limited to, a PC, a mobile phone, a tablet computer, etc.
The method for processing the area map according to the embodiment of the present application may be performed by the server 104, by the terminal 102, or by the server 104 and the terminal 102 together. When the terminal 102 performs the method, it may be performed by a client installed on the terminal.
Taking the method for processing the area map in the present embodiment performed by the cleaning device as an example, fig. 2 is a schematic flow chart of an alternative method for processing the area map according to an embodiment of the present application, as shown in fig. 2, the flow of the method may include the following steps:
step S202, obtaining a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by the sweeping robot.
The processing method of the area map in this embodiment can be applied to scenarios in which an area map is constructed by a sweeping robot. "Sweeping robot" is a generic term for devices with a sweeping function; such a device may be an independent sweeping device or may be part of another intelligent device, which is not specifically limited in this embodiment.
The method for processing the area map in this embodiment may be executed by the sweeping robot, may be executed by an intelligent device to which the sweeping robot belongs, may be executed by a background server, or may be executed by other devices having data processing capability, and in this embodiment, the execution by the sweeping robot is described as an example.
The target user can log in a target application running on the terminal equipment (namely, the target terminal) of the target user by using the target account, the target application is an application matched with the sweeping robot, and the sweeping robot bound with the target account can be controlled by using the target application logged in by using the target account so as to realize interaction between the target user and the sweeping robot. For example, an area map established by the sweeping robot may be displayed to a target user through the target application, and the target user may send a sweeping instruction or the like to the sweeping robot through the target application.
The sweeping robot may be used to sweep a target area, which may be an enclosed area, such as a bedroom, living room, or bathroom, or an open area, such as an outdoor area. The target area may include a target object, which may be an object having a certain shape, such as a trash can, an electric wire, or a slipper, but is not limited thereto; the kind of the target object is not limited in this embodiment.
The sweeping robot may be provided with a data acquisition component, which may be a camera, an infrared sensor, etc., but is not limited thereto, and other components having an object image acquisition function may be applied to the present embodiment. The data acquisition means may perform a data acquisition operation during, before, or after cleaning the target area using the cleaning robot, and the acquired data may be a target object image of the target object.
Alternatively, the target object image may be an object image of the target object at a certain angle, or may be an object image of the target object at a plurality of angles. Various operations can be performed using object images of the target object at various angles, for example, a three-dimensional shape of the target object can be constructed, so that the accuracy of recognition can be improved.
Step S204, identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result contains description information of the target object.
The target object image may represent attribute information of the target object, such as shape, color, size, etc. of the object. In this embodiment, the target object may be identified according to the target object image, so as to obtain a target identification result. The obtained target recognition result may contain description information of the target object, for example, information describing the object type of the target object, the object profile of the target object, and the like.
As an alternative embodiment, the above-described identification operation may be performed by a sweeping robot. After the target object image is acquired, the sweeping robot may perform the recognition operation directly according to the target object image.
As another alternative embodiment, the above-mentioned identification operation may be performed by an intelligent device, a background server, or other devices to which the sweeping robot belongs. The robot may transmit the acquired target object image to other devices, and after receiving the target object image, the other devices may perform the above-described recognition operation according to the target object image.
Alternatively, the recognition operation may be performed in real time (for example, the recognition operation is performed immediately after the target object image is acquired), or may be performed in non-real time (for example, the recognition operation is performed again in idle time after the target object image is acquired), which is not limited in this embodiment.
The target object image can comprise a plurality of object images of different angles of the target object, and when the target object image is identified, each object image in the plurality of object images can be identified respectively to obtain an identification result of each object image; and then fusing the recognition results of the object images to obtain a target recognition result. Optionally, when the object image is identified, multiple object images can be identified at the same time, and the object identification result is obtained by fusing the image features of the object images.
Alternatively, recognizing the target object according to the target object image may proceed as follows: construct a target object model corresponding to the target object according to the target object image, where the target object model represents the three-dimensional shape of the target object; then recognize the target object according to the target object model.
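As a concrete illustration of the multi-view recognition described above, the following is a minimal sketch in which each captured view is classified independently and the per-view class probabilities are averaged. The `classify` interface and the fusion-by-averaging strategy are assumptions for illustration; the embodiments do not prescribe a particular fusion method.

```python
# Minimal sketch of multi-view fusion: classify each object image
# (captured from a different angle) and average the per-view class
# probabilities to obtain the fused target recognition result.
from typing import Callable, Sequence

import numpy as np

def fuse_multi_view(
    views: Sequence[np.ndarray],                  # object images from several angles
    classify: Callable[[np.ndarray], np.ndarray]  # assumed: probability vector per image
) -> int:
    """Return the index of the object type best supported by all views."""
    probs = np.stack([classify(img) for img in views])  # (n_views, n_classes)
    fused = probs.mean(axis=0)                          # simple average fusion
    return int(fused.argmax())
```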
In step S206, labeling the target object in the target area map according to the target recognition result, where the target object is an object corresponding to the target object in the target area map, and the target area map is an area map established for the target area by the sweeping robot.
The area map established for the target area by the sweeping robot is the target area map. "Building an area map by the sweeping robot" means establishing an area map of an area using data acquired by the sweeping robot in that area. The operation of creating the area map may be performed by the sweeping robot or by other devices (for example, an intelligent device to which the sweeping robot belongs, or a background server), which is not limited in this embodiment.
The processing method of the regional map in the embodiment may be an object labeling scheme in the regional map, and there may be multiple occasions for executing object labeling. For example, object annotation may be performed when a target region map is built. In this case, the map construction and the object annotation are performed synchronously.
Alternatively, the object annotation may be performed after the target area map has been established. In this case, the annotation may be applied to an object already present in the established target area map. This labeling mode is compatible with existing area map construction schemes and enriches the object information in the target area map. The annotation may also be applied to an object newly added to the established target area map. This labeling mode suits scenarios in which a new object appears in the target area, and it improves the accuracy of the object information in the area map, thereby improving the map's ability to represent the area.
Through steps S202 to S206, a target object image corresponding to a target object in the target area is acquired, where the target area is an area cleaned by the sweeping robot; the target object is recognized according to the target object image to obtain a target recognition result containing description information of the target object; and the target object is labeled in the target area map according to the target recognition result, where the labeled target object is the object corresponding to the target object in the target area map, and the target area map is an area map established for the target area by the sweeping robot. This solves the problem in the related art that information acquisition is inefficient when an area map is constructed by a sweeping robot, and improves the accuracy and efficiency of object-information acquisition.
In an exemplary embodiment, before acquiring the target object image corresponding to the target object in the target area, the method further includes:
S11, detecting that a target object exists in the target area in the process of cleaning the target area by the sweeping robot.
In the present embodiment, the acquisition of the target object image may be performed during the cleaning of the target area by the cleaning robot. If the sweeping robot encounters a target object during sweeping, the data acquisition component or other sensing components can detect that the target object exists in a target area. If the presence of a target object in the target area is detected, the step of acquiring an image of the target object may be triggered to be performed, irrespective of whether a target area map has been established for the target area.
Alternatively, to avoid repeating labeling of the object, the sweeping robot may determine whether an object corresponding to the target object exists in the target area map and whether the object corresponding to the target object has been labeled. When there is no object corresponding to the target object in the target area map or there is an object corresponding to the target object in the target area map, but the object corresponding to the target object is not marked, the step of acquiring the target object image is triggered.
According to this embodiment, if an object is found in the area while the sweeping robot is cleaning, acquisition of the object image is triggered and the corresponding object is then labeled, so objects are labeled in a timely manner.
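A minimal sketch of this trigger logic is given below. The map query `find_object_at`, the `labeled` flag, and `capture_object_images` are hypothetical names used only to illustrate the check that avoids repeated labeling; they are not part of the patent text.

```python
# Sketch of the capture trigger: during sweeping, an image is captured
# only if the detected object is absent from the map, or present but
# not yet annotated, to avoid labeling the same object repeatedly.
def on_object_detected(robot, area_map, pose):
    existing = area_map.find_object_at(pose)      # hypothetical map lookup
    if existing is None or not existing.labeled:
        return robot.capture_object_images(pose)  # one or more angles
    return None  # already labeled; skip acquisition
```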
In an exemplary embodiment, identifying the target object according to the target object image, and obtaining the target identification result includes:
S21, carrying out type recognition on the target object according to the target object image to obtain target type information, wherein the target type information is used for representing the object type of the target object, and the target recognition result comprises the target type information.
Compared with attributes such as the size, temperature, or color of an object, the object type is more convenient for the user when acquiring object information; that is, it provides richer object information. Therefore, in this embodiment, to make the provision of object information more convenient, the target recognition result may include target type information indicating the object type of the target object.
The sweeping robot can perform type recognition on the target object according to the target object image to obtain the target type information. Correspondingly, labeling the target object in the target area map according to the target recognition result may include: labeling the target type information at a target position matched with the target object in the target area map.
The target type information may be labeled in various ways: as text, or in forms other than text, such as symbols or patterns. The target position may be above, below, to the left of, or to the right of the target object, with the distance between the two no greater than a target distance threshold. This embodiment does not limit the labeling manner or the labeling position of the target type information.
According to the embodiment, the type identification is carried out according to the object image, so that the type of the object is marked on the regional map, the object information provided by the regional map can be enriched, and the convenience of object information acquisition is improved.
In an exemplary embodiment, performing category recognition on the target object according to the target object image, and obtaining the target category information includes:
s31, inputting the target object image into a target recognition model to obtain target type information output by the target recognition model, wherein the target recognition model is obtained by training an initial recognition model by using an object image of a sample object, and the sample object is marked with a corresponding object type.
When the type of the target object is recognized, a non-AI (Artificial Intelligence) recognition approach may be used. Alternatively, in this embodiment, the object type may be recognized based on AI; for example, the object type may be recognized using the target recognition model.
The target recognition model may be a neural network model or another type of AI model. The target recognition model can be obtained by training an initial recognition model with object images of sample objects, where each sample object is labeled with its corresponding object type; the model parameters of the recognition model can therefore be adjusted according to the model's output and the labeled object types, yielding the trained target recognition model.
In order to improve accuracy of model training, the object image of the sample object may be obtained by performing data acquisition on the sample object using the same kind of sweeping robot as that for acquiring the target object image. The device for training the recognition model may be a robot for sweeping floor, or may be another device, which is not limited in this embodiment.
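The following is a minimal training sketch matching this description, written in PyTorch: an initial recognition model is trained on sample-object images labeled with their object types, adjusting parameters from the difference between the model output and the labels. The dataset, loss, and optimizer choices are illustrative assumptions, not specified by the patent.

```python
# Sketch: train an initial recognition model on labeled sample-object
# images; the loss compares the model output with the labeled type, and
# backpropagation adjusts the model parameters.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

def train_recognition_model(model: nn.Module, loader: DataLoader,
                            epochs: int = 10) -> nn.Module:
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for images, labels in loader:     # labeled sample-object images
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()               # adjust model parameters
            optimizer.step()
    return model
```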
In the object type recognition, the target object image may be input to a target recognition model, and the target recognition model may recognize the target object based on the target object image, determine a probability that the object type of the target object is each of a plurality of candidate object types, and further determine an object type that best matches the target object from the plurality of candidate object types. The target type information indicates the type of object that best matches the target object.
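A matching inference sketch is shown below: the target object image is preprocessed, passed through the model, and the candidate type with the highest probability is reported as the target type information. The candidate type list, input size, and preprocessing are assumptions for illustration.

```python
# Sketch: input the target object image into the target recognition
# model and take the candidate object type with the highest probability.
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import transforms

CANDIDATE_TYPES = ["trash can", "electric wire", "slipper"]  # illustrative

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),   # assumed input size
    transforms.ToTensor(),
])

def recognize_type(model: torch.nn.Module, image: Image.Image) -> str:
    x = preprocess(image).unsqueeze(0)           # (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(x)
    probs = F.softmax(logits, dim=1).squeeze(0)  # probability per candidate type
    return CANDIDATE_TYPES[int(probs.argmax())]  # best-matching object type
```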
By this embodiment, the kind of the object is recognized based on AI, and the accuracy of the object kind recognition can be improved.
In an exemplary embodiment, identifying the target object according to the target object image, and obtaining the target identification result includes:
S41, carrying out contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used for representing the object contour of the target object, and the target recognition result comprises the target contour information.
In the related art, the contour of an obstacle is not drawn accurately in the area map, and it cannot be determined whether nearby similar objects are different parts of the same object. To improve the accuracy of object-information acquisition, in this embodiment the target recognition result may include target contour information representing the object contour of the target object.
The sweeping robot can perform contour recognition on the target object according to the target object image to obtain the target contour information. Contour recognition may be performed on the target object image in multiple ways, including but not limited to at least one of the following: image contour extraction and AI-based semantic image segmentation. This embodiment does not limit the manner of contour recognition.
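A minimal sketch of the image-contour-extraction route mentioned above, using OpenCV, is shown below. The Canny thresholds are illustrative, and AI-based semantic segmentation would be an alternative source of the object outline.

```python
# Sketch: extract the object contour from the object image by edge
# detection and keep the largest external contour as the outline.
import cv2
import numpy as np

def extract_object_contour(image_bgr: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)             # illustrative thresholds
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)    # largest external outline
```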
Alternatively, for an object having multiple distinct parts, the contour represented by the target contour information may include the specific contour of each part. When labeling, different parts belonging to the same object may be labeled in the same style to show that they belong to one object. The contour represented by the target contour information may instead be the overall contour of the object, that is, the contour enclosing all of its parts; when labeling, the object's contour may then contain all parts belonging to it, to show that those parts belong to that object.
In this embodiment, the contour of the object is recognized from the object image and labeled in the area map, which improves the convenience of object-information acquisition.
In one exemplary embodiment, labeling the target object in the target area map according to the target recognition result includes:
s51, marking the outline of the target object in the form of a virtual wall in the target area map according to the target outline information.
If the target recognition result includes the target contour information, labeling the target object in the target area map according to the target recognition result may include: labeling the contour of the target object in the target area map according to the target contour information. The labeled contour may be determined based on the object contour of the target object; for example, it may be obtained after scaling, translating, or otherwise transforming the object contour of the target object.
The contour of the target object may be labeled in various ways: in the form of a virtual wall, or with straight lines or curves other than a virtual wall. In this embodiment, the contour of the target object is labeled in the target map in the form of a virtual wall, and the region occupied by the target object can be set as a detour zone (that is, a no-clean zone) for the sweeping robot, so that the robot can be controlled to avoid the target object while cleaning.
In this embodiment, labeling the contour of the object corresponding to an object in the area as a virtual wall in the area map improves the convenience of contour labeling and also makes it easier to control the robot's sweeping.
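A sketch of marking a recognized contour as a virtual wall in a 2-D occupancy-grid area map follows. The grid encoding (0 for free space, 2 for virtual wall) and the image-to-map affine transform are assumptions for illustration.

```python
# Sketch: project the image-space contour into map cells and draw it
# as a closed polyline; the enclosed cells form the detour (no-clean)
# region for the sweeping robot.
import cv2
import numpy as np

FREE, VIRTUAL_WALL = 0, 2  # assumed grid encoding

def mark_virtual_wall(grid: np.ndarray, contour_px: np.ndarray,
                      to_map: np.ndarray) -> None:
    pts = cv2.transform(contour_px.reshape(-1, 1, 2).astype(np.float32),
                        to_map).astype(np.int32)   # 2x3 affine to map frame
    cv2.polylines(grid, [pts], isClosed=True,
                  color=VIRTUAL_WALL, thickness=1)
```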
In an exemplary embodiment, after labeling the target object in the target area map according to the target recognition result, the method further includes:
And S61, sending the target area map to a target application on the target terminal for display, wherein the target application uses a target account number bound with the sweeping robot to log in.
The established target area map can be stored on the sweeping robot so that the robot can sweep the target area. Optionally, the sweeping robot may also send the target area map to the target application logged in on the target terminal with the target account. There may be one or more occasions for sending the target area map: after the map is established, after the map is updated, after a map acquisition request from the target application is received, or at any other time when sending the area map is permitted.
The display interface for displaying the area map in the target application is a target display interface. After receiving the target area map, the target application may display the target area map on the target display interface. Alternatively, the target application may store the target area map on the target terminal and display the target area map on the target display interface after detecting the map display instruction.
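A minimal sketch of pushing the annotated map to the logged-in application is given below. The payload schema and endpoint are hypothetical; the patent only states that the robot sends the target area map to the target application for display.

```python
# Sketch: serialize the annotated area map and upload it so the app
# bound to the robot's account can display it.
import json
import urllib.request

def push_area_map(grid, objects, account_id: str,
                  endpoint: str = "https://example.invalid/api/area-map"):
    payload = json.dumps({
        "account": account_id,              # account bound to the robot
        "map": grid.tolist(),               # 2-D occupancy grid
        "objects": [{"type": o["type"], "contour": o["contour"]}
                    for o in objects],      # per-object annotations
    }).encode("utf-8")
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status                  # e.g. 200 on success
```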
Through the embodiment, the regional map is sent to the terminal App for display, so that a user can conveniently view the regional map constructed by the sweeping robot, and convenience in information display is improved.
According to another aspect of the embodiments of the application, a method for processing an area map is also provided. Alternatively, in this embodiment, the processing method of the area map may be applied to a hardware environment constituted by the terminal 102 and the server 104 as shown in fig. 1. This hardware environment has already been described and is not repeated here.
Taking the example that the user terminal executes the processing method of the area map in the present embodiment, fig. 3 is a schematic flow chart of another alternative processing method of the area map according to an embodiment of the present application, as shown in fig. 3, the flow of the method may include the following steps:
In step S302, a target area map sent by the sweeping robot is received by a target application, where the target application uses a target account bound to the sweeping robot to log in, and the target area map is an area map established for a target area by the sweeping robot.
The processing method of the area map in this embodiment can be applied to scenarios in which an area map is constructed by a sweeping robot. In this embodiment, the sweeping robot, the target application, and the target area map are the same as or similar to those in the foregoing embodiments; for example, the target area map may be an area map established or updated by the processing method of the area map in the foregoing embodiments. They have already been described and are not repeated here.
For the terminal device of the target user, i.e. the target terminal, a target application can be run on it which is logged in using the target account. The target application and the sweeping robot can be connected in a communication mode, and the target application can receive the target area map sent by the sweeping robot through the communication connection between the target application and the sweeping robot.
Step S304, a target area map is displayed on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is an object corresponding to the target object in the target area map, and the target labeling information is used for describing the target object.
The display interface for displaying the area map in the target application is a target display interface. After receiving the target area map, the target application may display the target area map on the target display interface. Alternatively, the target application may store the target area map on the target terminal and display the target area map on the target display interface after detecting the map display instruction.
The target object and the labeling information of the target object, that is, the target labeling information, may be contained on the displayed target area map. The target annotation information may be used to describe the target object, enriching the object information that can be provided in the region map. Optionally, the target labeling information may include labeling information obtained by labeling the target object in the target area map according to the target recognition result, or may include other labeling information, for example, labeling information obtained by labeling the target object in other manners, which is not limited in this embodiment.
Through the above steps, a target area map sent by the sweeping robot is received through the target application, where the target application is logged in with a target account bound to the sweeping robot and the target area map is an area map established for the target area by the sweeping robot; and the target area map is displayed on a target display interface of the target application, where the target object and its target annotation information are displayed on the map, the displayed target object is the object corresponding to the target object in the target area, and the target annotation information describes the target object. This solves the problem in the related art that information acquisition is inefficient when an area map is constructed by a sweeping robot, and improves the accuracy and efficiency of object-information acquisition.
In one exemplary embodiment, displaying a target area map on a target display interface of a target application includes:
s71, displaying a target object and outline information of the target object in a target area map displayed on a target display interface, wherein the outline information of the target object is used for representing the outline of the target object, and the target annotation information comprises the outline information of the target object.
The target annotation information may comprise contour information of the target object. The contour information of the target object may be used to represent the contour of the target object, and may include contour information obtained by labeling the contour of the target object in the target area map according to the target contour information, or may include contour information of the target object obtained by other means.
The target area map displayed on the target display interface may include the target object and contour information of the target object. If an object contains a plurality of parts, the parts can be marked by adopting the same marking mode so as to show that the parts belong to the same object; or the outline of the object may contain all the parts belonging to the object to show that the parts belong to the object.
By the method, the outline of the object is marked in the regional map, and convenience in acquiring the object information can be improved.
In one exemplary embodiment, displaying a target object and outline information of the target object in a target area map displayed on a target display interface includes:
s81, displaying the target object in the target area map displayed on the target display interface, and displaying the contour information of the target object in the form of a virtual wall.
The outline information of the target object may be displayed in various ways, for example, in the form of a virtual wall, or in the form of a straight line or a curve other than the virtual wall. In this embodiment, the outline information of the target object is displayed in the form of a virtual wall in the target map, and the area range where the target object is located can be set as the detour area of the sweeping robot, so that the sweeping robot can be controlled to avoid the target object during sweeping.
In this embodiment, displaying the contour of an object in the area map in the form of a virtual wall improves the convenience of contour display and also makes it easier to control the robot's sweeping.
In one exemplary embodiment, displaying a target area map on a target display interface of a target application includes:
S91, displaying a target object and target type information in a target area map displayed on a target display interface, wherein the target type information is used for indicating the object type of the target object, and the target labeling information comprises the target type information.
The target annotation information may comprise target category information. The target type information may be information indicating the type of the target object, and may be information indicating the type of the target object obtained by identifying the target object from the target object image, or information indicating the type of the target object obtained by other means.
The target area map displayed on the target display interface may include the target object and the target type information. The target type information may be displayed at a target position matched with the target object. It may be displayed in various forms: as text, or in forms other than text, such as symbols or patterns. The target position may be above, below, to the left of, or to the right of the target object, with the distance between the two no greater than a target distance threshold. This embodiment does not limit the display form or the display position of the target type information.
According to the embodiment, the object types corresponding to the objects in the regional map are displayed on the regional map, so that the object information provided by the regional map can be enriched, and the convenience of object information acquisition is improved.
The processing method of the area map in this embodiment is explained below with an alternative example. This example provides a virtual wall self-labeling scheme based on AI technology, which can automatically label the contour of an object (e.g., an obstacle) in the form of a virtual wall on the App map and label the object's kind.
As shown in fig. 4, the flow of the processing method of the area map in this alternative example may include the steps of:
Step S402, the sweeping robot detects an object in the sweeping process;
step S404, the sweeping robot recognizes the three-dimensional shape of the object and recognizes the kind of the object based on AI;
In step S406, the contour of the object is labeled in the App map in the form of a virtual wall, and the kind of the object is labeled, where the object is represented in the App map as a two-dimensional figure.
As shown in fig. 5, the App map displays two-dimensional figures of the objects; their kinds are indicated by text, and the dotted lines are virtual walls tracing the objects' contours. On the App map, the user can see the kinds of the objects in the room while seeing their real shapes more clearly and accurately.
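As a sketch of how the app-side view in fig. 5 could be rendered, the following draws the 2-D map as an image, each object's contour as a dashed polyline (the virtual wall), and the object kind as a text label. The use of matplotlib and the data layout are purely illustrative.

```python
# Sketch: render the annotated area map as in FIG. 5, with dashed
# virtual-wall contours and a text label for each object's kind.
import matplotlib.pyplot as plt
import numpy as np

def render_map(grid: np.ndarray, objects: list) -> None:
    fig, ax = plt.subplots()
    ax.imshow(grid, cmap="gray_r")                 # occupancy-grid backdrop
    for obj in objects:
        xs, ys = zip(*obj["contour"])              # contour cells in map frame
        ax.plot(list(xs) + [xs[0]], list(ys) + [ys[0]],
                linestyle="--")                    # dashed virtual wall
        ax.text(sum(xs) / len(xs), sum(ys) / len(ys),
                obj["type"], ha="center")          # object kind as text
    plt.show()
```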
Through the example, the types and the outlines of the objects are marked on the App map, so that a user can see more real and accurate object information on the App map, more accurate and rich interaction information is provided for the user, and experience is improved.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (such as ROM (Read-Only Memory)/RAM (Random Access Memory), magnetic disk, optical disk) and including instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present application.
According to still another aspect of the embodiment of the present application, there is also provided a processing apparatus for an area map for implementing the processing method of an area map. Fig. 6 is a block diagram of an alternative processing apparatus for an area map according to an embodiment of the present application, and as shown in fig. 6, the apparatus may include:
an acquiring unit 602, configured to acquire a target object image corresponding to a target object in a target area, where the target area is an area cleaned by the sweeping robot;
The identifying unit 604 is connected to the acquiring unit 602, and is configured to identify a target object according to the target object image, so as to obtain a target identification result, where the target identification result includes description information of the target object;
The labeling unit 606, connected to the identifying unit 604, is configured to label the target object in a target area map according to the target identification result, wherein the labeled target object is the object in the target area map corresponding to the detected target object, and the target area map is the area map established by the sweeping robot for the target area.
It should be noted that, the acquiring unit 602 in this embodiment may be used to perform the step S202 described above, the identifying unit 604 in this embodiment may be used to perform the step S204 described above, and the labeling unit 606 in this embodiment may be used to perform the step S206 described above.
Through the above modules, a target object image corresponding to a target object in a target area is acquired, wherein the target area is an area cleaned by the sweeping robot; the target object is identified according to the target object image to obtain a target identification result, wherein the target identification result includes description information of the target object; and the target object is labeled in a target area map according to the target identification result, wherein the labeled target object is the object in the target area map corresponding to the detected target object, and the target area map is the area map established by the sweeping robot for the target area. This solves the problem of low information-acquisition efficiency in the way area maps are built by sweeping robots in the related art, and improves the accuracy and efficiency of object information acquisition.
In an exemplary embodiment, the above apparatus further includes:
a detection unit, used for detecting, before the target object image corresponding to the target object in the target area is acquired, that the target object exists in the target area while the sweeping robot is cleaning the target area.
In one exemplary embodiment, the identification unit 604 includes:
The first recognition module is used for carrying out type recognition on the target object according to the target object image to obtain target type information, wherein the target type information is used for representing the object type of the target object, and the target recognition result comprises the target type information.
In one exemplary embodiment, the first identification module includes:
An input sub-module, used for inputting the target object image into a target recognition model to obtain the target type information output by the target recognition model, wherein the target recognition model is obtained by training an initial recognition model with object images of sample objects, and each sample object is labeled with its corresponding object type.
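As a rough, non-authoritative illustration of such a sub-module, a classifier could be queried along the following lines; the ResNet-18 backbone, the label set, and the input size are assumptions for illustration, and a real deployment would first train the initial model on the labeled sample-object images:

```python
import torch
import torchvision.models as models

# Assumed label set; the patent does not fix the object types.
OBJECT_TYPES = ["shoe", "wire", "pet bowl", "chair leg"]

# "Initial recognition model": a standard backbone with one output per type.
# Training on labeled sample-object images would turn it into the
# "target recognition model"; here it is used untrained, for shape only.
model = models.resnet18(num_classes=len(OBJECT_TYPES))
model.eval()

def classify(object_image: torch.Tensor) -> str:
    """Return target type information for a 3x224x224 image tensor."""
    with torch.no_grad():
        logits = model(object_image.unsqueeze(0))  # add a batch dimension
        probs = torch.softmax(logits, dim=1)
    return OBJECT_TYPES[int(probs.argmax())]

# A random tensor stands in for a camera image of the target object.
print(classify(torch.rand(3, 224, 224)))
```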
In one exemplary embodiment, the identification unit 604 includes:
A second recognition module, used for carrying out contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used for representing the object contour of the target object, and the target identification result includes the target contour information.
In one exemplary embodiment, the labeling unit 606 includes:
A marking module, used for marking the outline of the target object in the target area map in the form of a virtual wall according to the target contour information.
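One plausible realization of such a marking module (an assumption for illustration, not the claimed implementation) is the classic OpenCV contour pipeline: extract the object outline from a binary mask, simplify it into a polygon, and store the polygon as virtual-wall coordinates.

```python
import numpy as np
import cv2

def contour_to_virtual_wall(object_mask: np.ndarray) -> list:
    """Extract the object outline from a binary mask and return it as a
    polygon suitable for drawing as a virtual wall on the 2D area map."""
    contours, _ = cv2.findContours(object_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []
    largest = max(contours, key=cv2.contourArea)  # dominant outline
    # Simplify the polygon so the App map stores only a few vertices
    # (epsilon=2.0 pixels, closed curve).
    approx = cv2.approxPolyDP(largest, 2.0, True)
    return approx.reshape(-1, 2).tolist()         # [[x, y], ...]

# Toy mask: a filled rectangle standing in for a detected object.
mask = np.zeros((100, 100), dtype=np.uint8)
cv2.rectangle(mask, (20, 30), (70, 80), 255, -1)
print(contour_to_virtual_wall(mask))
```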
In an exemplary embodiment, the above apparatus further includes:
A sending unit, used for sending the target area map, after the target object has been labeled in it according to the target identification result, to a target application on a target terminal for display, wherein the target application logs in with a target account bound to the sweeping robot.
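Purely as a hedged sketch of such a sending unit (the endpoint URL, the payload layout, and the bearer-token authentication are invented for illustration), the labeled map could be pushed to the target application's backend over HTTP:

```python
import requests  # any HTTP client would do

def send_area_map(area_map: dict, account_token: str) -> None:
    """Upload the labeled target area map so the target application,
    logged in with the account bound to the robot, can display it.
    The URL and payload layout below are hypothetical."""
    resp = requests.post(
        "https://example.com/api/v1/area-map",  # placeholder endpoint
        headers={"Authorization": f"Bearer {account_token}"},
        json=area_map,
        timeout=5,
    )
    resp.raise_for_status()

# Example payload (hypothetical schema); uncomment to send for real:
# send_area_map({"virtual_walls": [[[0, 0], [1, 0], [1, 1]]],
#                "labels": [["shoe", [0, 0]]]}, account_token="demo-token")
```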
According to still another aspect of the embodiments of the present application, there is also provided a processing apparatus for an area map for implementing the above processing method of an area map. Fig. 7 is a block diagram of another alternative processing apparatus for an area map according to an embodiment of the present application; as shown in fig. 7, the apparatus may include:
A receiving unit 702, configured to receive, by using a target application, a target area map sent by the sweeping robot, where the target application uses a target account bound to the sweeping robot to log in, and the target area map is an area map established by the sweeping robot for a target area;
The display unit 704, connected to the receiving unit 702, is configured to display the target area map on a target display interface of the target application, wherein the target object and the target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the detected object in the target area, and the target labeling information is used to describe the target object.
It should be noted that, the receiving unit 702 in this embodiment may be used to perform the above-mentioned step S302, and the display unit 704 in this embodiment may be used to perform the above-mentioned step S304.
Through the above modules, the target area map sent by the sweeping robot is received by the target application, wherein the target application logs in with a target account bound to the sweeping robot, and the target area map is the area map established by the sweeping robot for the target area; the target area map is then displayed on a target display interface of the target application, wherein the target object and its target labeling information are shown, the target object corresponds to the detected object in the target area, and the target labeling information describes the target object. This solves the problem of low information-acquisition efficiency in the way area maps are built by sweeping robots in the related art, and improves the accuracy and efficiency of object information acquisition.
In one exemplary embodiment, the display unit 704 includes:
A first display module, used for displaying the target object and the contour information of the target object in the target area map displayed on the target display interface, wherein the contour information of the target object is used for representing the contour of the target object, and the target labeling information includes the contour information of the target object.
In one exemplary embodiment, the first display module includes:
A display sub-module, used for displaying the target object in the target area map displayed on the target display interface and displaying the contour information of the target object in the form of a virtual wall.
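To visualize what such a display sub-module produces (a sketch only; the application does not disclose the App's actual rendering stack), a dashed polygon plus a text label reproduces the virtual-wall presentation of fig. 5:

```python
import matplotlib.pyplot as plt

def draw_virtual_wall(contour, kind):
    """Render an object outline as a dashed 'virtual wall' and print
    its kind next to it, mimicking the App map display."""
    closed = contour + [contour[0]]      # close the polygon
    xs, ys = zip(*closed)
    plt.plot(xs, ys, linestyle="--", color="red")  # dotted virtual wall
    plt.text(xs[0], ys[0], kind)                   # object type as text
    plt.gca().set_aspect("equal")

draw_virtual_wall([(1, 1), (3, 1), (3, 2), (1, 2)], "shoe")
plt.show()
```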
In one exemplary embodiment, the display unit 704 includes:
A second display module, used for displaying the target object and target type information in the target area map displayed on the target display interface, wherein the target type information is used for representing the object type of the target object, and the target labeling information includes the target type information.
It should be noted that the above modules are the same as the corresponding steps in the examples they implement and the application scenarios to which they apply, but are not limited to what is disclosed in the above embodiments. It should also be noted that the above modules may be implemented in software or in hardware as part of the apparatus and may run in the hardware environment shown in fig. 1, where the hardware environment includes a network environment.
According to yet another aspect of an embodiment of the present application, there is also provided a storage medium. Alternatively, in the present embodiment, the storage medium may be used to execute the program code of the processing method of the area map of any one of the above-mentioned embodiments of the present application.
Alternatively, in this embodiment, the storage medium may be located on at least one network device of the plurality of network devices in the network shown in the above embodiment.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of:
S1, acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by the sweeping robot;
S2, identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result includes description information of the target object;
S3, labeling the target object in a target area map according to the target identification result, wherein the labeled target object is the object in the target area map corresponding to the detected target object, and the target area map is the area map established by the sweeping robot for the target area.
Alternatively, the storage medium is arranged to store program code for performing the steps of:
S1, receiving a target area map sent by a sweeping robot through a target application, wherein the target application logs in by using a target account bound with the sweeping robot, and the target area map is an area map established for a target area through the sweeping robot;
S2, displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the detected object, and the target labeling information is used for describing the target object.
Alternatively, specific examples in the present embodiment may refer to examples described in the above embodiments, which are not described in detail in the present embodiment.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to, various media capable of storing program code, such as a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disk.
According to still another aspect of the embodiment of the present application, there is also provided an electronic device for implementing the above-mentioned method for processing an area map, where the electronic device may be a server, a terminal, or a combination thereof.
Fig. 8 is a block diagram of an alternative electronic device according to an embodiment of the present application. As shown in fig. 8, the electronic device includes a processor 802, a communication interface 804, a memory 806, and a communication bus 808, and the processor 802, the communication interface 804, and the memory 806 communicate with each other via the communication bus 808, wherein:
A memory 806 for storing a computer program;
The processor 802, when executing the computer program stored on the memory 806, performs the following steps:
S1, acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by the sweeping robot;
S2, identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result includes description information of the target object;
S3, labeling the target object in a target area map according to the target identification result, wherein the labeled target object is the object in the target area map corresponding to the detected target object, and the target area map is the area map established by the sweeping robot for the target area.
Alternatively, the processor 802, when executing the computer program stored on the memory 806, performs the following steps:
S1, receiving a target area map sent by a sweeping robot through a target application, wherein the target application logs in by using a target account bound with the sweeping robot, and the target area map is an area map established for a target area through the sweeping robot;
S2, displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the detected object, and the target labeling information is used for describing the target object.
Alternatively, in the present embodiment, the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 8, but this does not mean there is only one bus or only one type of bus. The communication interface is used for communication between the electronic device and other equipment.
The memory may include RAM or non-volatile memory, such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
As an example, the memory 806 may include, but is not limited to, the acquiring unit 602, the identifying unit 604, and the labeling unit 606 of the above processing apparatus for the area map, as well as other module units of that apparatus, which are not described in detail in this example.
As another example, the memory 806 may include, but is not limited to, the receiving unit 702 and the display unit 704 of the above processing apparatus, as well as other module units of that apparatus, which are not described in detail in this example.
The processor may be a general-purpose processor, which may include, but is not limited to, a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Alternatively, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments, which are not repeated here.
It will be understood by those skilled in the art that the structure shown in fig. 8 is only schematic, and the device implementing the above method for processing the area map may be a terminal device, such as a smartphone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, and so on. Fig. 8 does not limit the structure of the electronic device. For example, the electronic device may also include more or fewer components (e.g., a network interface or a display device) than shown in fig. 8, or have a different configuration from that shown in fig. 8.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be completed by a program instructing the relevant hardware of a terminal device; the program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a ROM, a RAM, a magnetic disk, an optical disk, and the like.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments, if implemented in the form of software functional units and sold or used as independent products, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units is merely a logical function division, and there may be other division manners in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those of ordinary skill in the art may make several improvements and modifications without departing from the principles of the present application, and these improvements and modifications should also be regarded as falling within the scope of protection of the present application.

Claims (11)

1. A method for processing a region map, comprising:
acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by a sweeping robot;
identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result comprises description information of the target object;
marking a target object in a target area map according to the target identification result, wherein the target object is an object corresponding to the target object in the target area map, the target area map is an area map established for the target area by the sweeping robot, and the target object is represented in the target area map by a two-dimensional graph;
wherein identifying the target object according to the target object image includes: constructing a target object model corresponding to the target object according to the target object image, wherein the target object model is used for representing the three-dimensional shape of the target object; and identifying the target object according to the target object model;
wherein identifying the target object according to the target object image to obtain the target identification result includes: performing contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used for representing the object contour of the target object, and the target identification result includes the target contour information;
wherein labeling the target object in the target area map according to the target identification result includes: marking the outline of the target object in the target area map in the form of a virtual wall according to the target contour information.
2. The method of claim 1, wherein prior to acquiring the target object image corresponding to the target object in the target area, the method further comprises:
and detecting that the target object exists in the target area in the process of cleaning the target area by the sweeping robot.
3. The method of claim 1, wherein identifying the target object from the target object image comprises:
carrying out type recognition on the target object according to the target object image to obtain target type information, wherein the target type information is used for representing the object type of the target object, and the target identification result comprises the target type information.
4. The method of claim 3, wherein performing type recognition on the target object based on the target object image to obtain the target type information comprises:
inputting the target object image into a target recognition model to obtain the target type information output by the target recognition model, wherein the target recognition model is obtained by training an initial recognition model by using an object image of a sample object, and the sample object is marked with a corresponding object type.
5. The method of any of claims 1 to 4, wherein after labeling the target object in the target area map according to the target recognition result, the method further comprises:
sending the target area map to a target application on a target terminal for display, wherein the target application uses a target account bound with the sweeping robot to log in.
6. A method for processing a region map, comprising:
Receiving a target area map sent by a sweeping robot through a target application, wherein the target application logs in by using a target account bound with the sweeping robot, and the target area map is an area map established for a target area through the sweeping robot;
Displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is an object corresponding to the target object in the target area map, the target labeling information is used for describing the target object, and the target object is represented in a two-dimensional graph in the target area map;
wherein displaying the target area map on the target display interface of the target application comprises: displaying the target object and contour information of the target object in the target area map displayed on the target display interface, wherein the contour information of the target object is used for representing the contour of the target object, and the target annotation information comprises the contour information of the target object;
wherein displaying the target object and the contour information of the target object in the target area map displayed on the target display interface includes: displaying the target object in the target area map displayed on the target display interface, and displaying the contour information of the target object in the form of a virtual wall.
7. The method of claim 6, wherein displaying the target area map on the target display interface of the target application comprises:
displaying the target object and target type information in the target area map displayed on the target display interface, wherein the target type information is used for representing the object type of the target object, and the target labeling information comprises the target type information.
8. A processing apparatus for an area map, comprising:
An acquisition unit, configured to acquire a target object image corresponding to a target object in a target area, where the target area is an area cleaned by a sweeping robot;
an identification unit, used for identifying the target object according to the target object image to obtain a target identification result, wherein the target identification result includes description information of the target object;
a labeling unit, used for labeling a target object in a target area map according to the target identification result, wherein the target object is an object corresponding to the target object in the target area map, the target area map is an area map established for the target area by the sweeping robot, and the target object is represented in the target area map by a two-dimensional graph;
wherein the identification unit is used for identifying the target object according to the target object image by executing the following steps: constructing a target object model corresponding to the target object according to the target object image, wherein the target object model is used for representing the three-dimensional shape of the target object; and identifying the target object according to the target object model;
wherein the identification unit includes: a second recognition module, used for carrying out contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used for representing the object contour of the target object, and the target identification result includes the target contour information;
wherein the labeling unit includes: a marking module, used for marking the outline of the target object in the target area map in the form of a virtual wall according to the target contour information.
9. A processing apparatus for an area map, comprising:
a receiving unit, used for receiving a target area map sent by the sweeping robot through a target application, wherein the target application logs in by using a target account bound with the sweeping robot, and the target area map is an area map established for a target area through the sweeping robot;
a display unit, used for displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is an object corresponding to a target object in the target area map, the target labeling information is used for describing the target object, and the target object is represented in a two-dimensional graph in the target area map;
wherein the display unit includes: a first display module, used for displaying the target object and the contour information of the target object in the target area map displayed on the target display interface, wherein the contour information of the target object is used for representing the contour of the target object, and the target labeling information includes the contour information of the target object;
wherein the first display module includes: a display sub-module, used for displaying the target object in the target area map displayed on the target display interface and displaying the contour information of the target object in the form of a virtual wall.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program when run performs the method of any one of claims 1 to 5 or the method of any one of claims 6 to 7.
11. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, the processor being arranged to perform the method of any of claims 1 to 5 or the method of any of claims 6 to 7 by means of the computer program.
CN202110701569.0A 2021-06-23 2021-06-23 Regional map processing method and device, storage medium and electronic device Active CN113469000B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110701569.0A CN113469000B (en) 2021-06-23 2021-06-23 Regional map processing method and device, storage medium and electronic device
PCT/CN2022/094615 WO2022267795A1 (en) 2021-06-23 2022-05-24 Regional map processing method and apparatus, storage medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110701569.0A CN113469000B (en) 2021-06-23 2021-06-23 Regional map processing method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN113469000A CN113469000A (en) 2021-10-01
CN113469000B true CN113469000B (en) 2024-06-14

Family

ID=77872542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110701569.0A Active CN113469000B (en) 2021-06-23 2021-06-23 Regional map processing method and device, storage medium and electronic device

Country Status (2)

Country Link
CN (1) CN113469000B (en)
WO (1) WO2022267795A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113469000B (en) * 2021-06-23 2024-06-14 追觅创新科技(苏州)有限公司 Regional map processing method and device, storage medium and electronic device
CN116211168A (en) * 2021-12-02 2023-06-06 追觅创新科技(苏州)有限公司 Operation control method and device of cleaning equipment, storage medium and electronic device
CN114521841A (en) * 2022-03-23 2022-05-24 深圳市优必选科技股份有限公司 Cleaning area management method, system, intelligent terminal, robot and storage medium
CN116091607B (en) * 2023-04-07 2023-09-26 科大讯飞股份有限公司 Method, device, equipment and readable storage medium for assisting user in searching object

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108885459A (en) * 2018-06-08 2018-11-23 珊口(深圳)智能科技有限公司 Air navigation aid, navigation system, mobile control system and mobile robot
CN111839360A (en) * 2020-06-22 2020-10-30 珠海格力电器股份有限公司 Data processing method, device and equipment of sweeper and computer readable medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102061511B1 (en) * 2013-04-26 2020-01-02 삼성전자주식회사 Cleaning robot, home monitoring apparatus and method for controlling the same
CN112867424B (en) * 2019-03-21 2022-05-06 深圳阿科伯特机器人有限公司 Navigation and cleaning area dividing method and system, and moving and cleaning robot
CN110622085A (en) * 2019-08-14 2019-12-27 珊口(深圳)智能科技有限公司 Mobile robot and control method and control system thereof
CN111242994B (en) * 2019-12-31 2024-01-09 深圳优地科技有限公司 Semantic map construction method, semantic map construction device, robot and storage medium
CN111325136B (en) * 2020-02-17 2024-03-19 北京小马慧行科技有限公司 Method and device for labeling object in intelligent vehicle and unmanned vehicle
CN112307994A (en) * 2020-11-04 2021-02-02 深圳市普森斯科技有限公司 Obstacle identification method based on sweeper, electronic device and storage medium
CN112462780B (en) * 2020-11-30 2024-05-21 深圳市杉川致行科技有限公司 Sweeping control method and device, sweeping robot and computer readable storage medium
CN112783156A (en) * 2020-12-25 2021-05-11 北京小狗吸尘器集团股份有限公司 Sweeping robot and sweeping task planning method and device thereof
CN113469000B (en) * 2021-06-23 2024-06-14 追觅创新科技(苏州)有限公司 Regional map processing method and device, storage medium and electronic device

Also Published As

Publication number Publication date
WO2022267795A1 (en) 2022-12-29
CN113469000A (en) 2021-10-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant