WO2022267795A1 - Method and apparatus for processing an area map, storage medium, and electronic device

Method and apparatus for processing an area map, storage medium, and electronic device

Info

Publication number
WO2022267795A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
target object
area map
information
area
Prior art date
Application number
PCT/CN2022/094615
Other languages
English (en)
Chinese (zh)
Inventor
王生乐
Original Assignee
追觅创新科技(苏州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 追觅创新科技(苏州)有限公司 filed Critical 追觅创新科技(苏州)有限公司
Publication of WO2022267795A1

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061: Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection

Definitions

  • The present application relates to the field of communications and, in particular, to a method and device for processing an area map, a storage medium, and an electronic device.
  • The sweeping robot usually interacts with the user through a mobile App (application), and the house map created by the sweeping robot is displayed in the App.
  • the above-mentioned house map usually only shows whether there is an object at a position, and the user cannot obtain accurate object information from the map.
  • the purpose of the present application is to provide a method and device for processing an area map, a storage medium and an electronic device, so as to at least solve the problem of low information acquisition efficiency in the related art of constructing an area map by a sweeping robot.
  • A method for processing an area map is provided, including: acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by a sweeping robot; recognizing the target object according to the target object image to obtain a target recognition result, wherein the target recognition result includes description information of the target object; and marking the target object in the target area map according to the target recognition result, wherein the marked target object is the object corresponding to the target object in the target area map, and the target area map is an area map established by the sweeping robot for the target area.
  • In some embodiments, before acquiring the target object image corresponding to the target object in the target area, the method further includes: detecting, during the process of the sweeping robot cleaning the target area, that the target object exists in the target area.
  • Identifying the target object according to the target object image to obtain the target recognition result includes: performing type identification on the target object according to the target object image to obtain target type information, wherein the target type information is used to represent the object type of the target object, and the target recognition result includes the target type information.
  • Performing type recognition on the target object according to the target object image to obtain the target type information includes: inputting the target object image into a target recognition model to obtain the target type information output by the target recognition model, where the target recognition model is obtained by training an initial recognition model with object images of sample objects, and the sample objects are marked with their corresponding object types.
  • recognizing the target object according to the target object image, and obtaining the target recognition result includes: performing contour recognition on the target object according to the target object image to obtain target contour information, wherein, the target contour information is used to represent the object contour of the target object, and the target recognition result includes the target contour information.
  • Marking the target object in the target area map includes: marking an outline of the target object in the form of a virtual wall in the target area map according to the target outline information.
  • In some embodiments, the method further includes: sending the target area map to the target application on the target terminal for display, wherein the target application is logged in with the target account bound to the sweeping robot.
  • A method for processing an area map is further provided, including: receiving, through a target application, the target area map sent by the sweeping robot, wherein the target application is logged in with the target account bound to the sweeping robot, and the target area map is an area map established by the sweeping robot for the target area; and displaying the target area map on a target display interface of the target application, wherein a target object and target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the target object in the target area, and the target labeling information is used to describe the target object.
  • Displaying the target area map on the target display interface of the target application includes: displaying, in the target area map on the target display interface, the target object and the contour information of the target object, wherein the contour information of the target object is used to represent the contour of the target object, and the target annotation information includes the contour information of the target object.
  • Displaying the target object and the outline information of the target object in the target area map displayed on the target display interface includes: displaying the target object in the target area map on the target display interface, and displaying the contour information of the target object in the form of a virtual wall.
  • Displaying the target area map on the target display interface of the target application includes: displaying the target object and target type information on the target area map displayed on the target display interface, wherein the target type information is used to represent the object type of the target object, and the target label information includes the target type information.
  • An area map processing device is provided, including: an acquisition unit, configured to acquire a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by a sweeping robot; an identification unit, configured to recognize the target object according to the target object image to obtain a target recognition result, wherein the target recognition result includes description information of the target object; and a labeling unit, configured to mark the target object in the target area map according to the target recognition result, wherein the marked target object is the object corresponding to the target object in the target area map, and the target area map is an area map established by the sweeping robot for the target area.
  • In some embodiments, the device further includes: a detection unit, configured to detect, before the target object image corresponding to the target object in the target area is acquired, that the target object exists in the target area during the process of the sweeping robot cleaning the target area.
  • The identification unit includes: a first identification module, configured to identify the type of the target object according to the image of the target object to obtain target type information, wherein the target type information is used to indicate the object type of the target object, and the target recognition result includes the target type information.
  • The first recognition module includes: an input submodule, configured to input the image of the target object into a target recognition model to obtain the target type information output by the target recognition model, where the target recognition model is obtained by training an initial recognition model with object images of sample objects marked with corresponding object types.
  • The recognition unit includes: a second recognition module, configured to perform contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used to indicate the object contour of the target object, and the target recognition result includes the target contour information.
  • the labeling unit includes: a labeling module, configured to mark an outline of the target object in the form of a virtual wall in the target area map according to the target outline information.
  • In some embodiments, the device further includes: a sending unit, configured to send, after the target object is marked in the target area map according to the target recognition result, the target area map to the target application on the target terminal for display, wherein the target application is logged in with the target account bound to the sweeping robot.
  • An area map processing device is further provided, including: a receiving unit, configured to receive, through the target application, the target area map sent by the sweeping robot, wherein the target application is logged in with the target account bound to the sweeping robot, and the target area map is an area map established by the sweeping robot for the target area; and a display unit, configured to display the target area map on the target display interface of the target application, wherein a target object and target annotation information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the target object in the target area, and the target annotation information is used to describe the target object.
  • The display unit includes: a first display module, configured to display the target object and the outline information of the target object in the target area map displayed on the target display interface, wherein the contour information of the target object is used to represent the contour of the target object, and the target label information includes the contour information of the target object.
  • the first display module includes: a display submodule, configured to display the target object in the target area map displayed on the target display interface, and display the target object in the form of a virtual wall Outline information of the target object.
  • The display unit includes: a second display module, configured to display the target object and target type information in the target area map displayed on the target display interface, wherein the target type information is used to represent the object type of the target object, and the target label information includes the target type information.
  • In the embodiments of the present application, the corresponding object is marked on the area map according to the description information of the object: the target object image corresponding to the target object in the target area is obtained, where the target area is the area cleaned by the sweeping robot; the target object is identified according to the target object image to obtain the target recognition result, wherein the target recognition result contains the description information of the target object; and the target object is marked in the target area map according to the target recognition result, wherein the target object is the object corresponding to the target object in the target area map.
  • the target area map is the area map established for the target area by the sweeping robot.
  • the obtained recognition result contains the description information of the object (for example, object types, object outlines, etc.), which can ensure that the obtained description information can accurately describe the objects in the area.
  • Since the corresponding objects in the area map are marked according to the description information of the objects, the area map can display both the objects and their labeling information, so that object information can be obtained quickly from the area map. This achieves the technical effect of improving the accuracy and efficiency of object information acquisition, and solves the problem in the related art that the way of building area maps by sweeping robots suffers from low information acquisition efficiency.
  • Fig. 1 is a schematic diagram of the hardware environment of an optional area map processing method according to an embodiment of the present application;
  • FIG. 2 is a schematic flowchart of an optional method for processing an area map according to an embodiment of the present application
  • FIG. 3 is a schematic flowchart of another optional method for processing an area map according to an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of another optional method for processing an area map according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an optional area map according to an embodiment of the present application.
  • Fig. 6 is a structural block diagram of an optional area map processing device according to an embodiment of the present application.
  • Fig. 7 is a structural block diagram of another optional area map processing device according to an embodiment of the present application.
  • Fig. 8 is a structural block diagram of an optional electronic device according to an embodiment of the present application.
  • a method for processing an area map is provided.
  • the above method for processing an area map may be applied to a hardware environment composed of a terminal 102 and a server 104 as shown in FIG. 1 .
  • The server 104 is connected to the terminal 102 through a network and can be used to provide services (such as game services, application services, etc.) for the terminal or for a client installed on the terminal. A database, which may be provided on the server or independently of the server, is used to provide data storage services for the server 104.
  • the above-mentioned terminal 102 may include one or more terminals, and different terminals may be connected by communication through a server, or may be directly connected by communication without going through a server.
  • the terminal 102 may include at least one of the following: a user terminal, and a cleaning device, where the cleaning device may include a sweeping robot.
  • the foregoing network may include but not limited to at least one of the following: a wired network and a wireless network.
  • the above-mentioned wired network may include but not limited to at least one of the following: wide area network, metropolitan area network, and local area network
  • The above-mentioned wireless network may include, but is not limited to, at least one of the following: Wi-Fi (Wireless Fidelity) and Bluetooth.
  • The terminal 102 may be, but is not limited to, a PC, a mobile phone, a tablet computer, and the like.
  • the method for processing an area map in this embodiment of the present application may be executed by the server 104 , may also be executed by the terminal 102 , and may also be executed jointly by the server 104 and the terminal 102 .
  • the method for processing the area map in the embodiment of the present application executed by the terminal 102 may also be executed by a client installed on it.
  • FIG. 2 is a schematic flowchart of an optional area map processing method according to an embodiment of the present application. As shown in FIG. 2, the flow of the method may include the following steps:
  • Step S202: acquire an image of the target object corresponding to the target object in the target area, wherein the target area is the area cleaned by the cleaning robot.
  • a sweeping robot is a general term for a class of devices with a cleaning function, which may be a separate cleaning device or belong to other smart devices, which is not specifically limited in this embodiment.
  • The processing method of the area map in this embodiment may be executed by the sweeping robot, by the smart device to which the sweeping robot belongs, by a background server, or by other devices with data processing capabilities. In this embodiment, execution by the sweeping robot is taken as an example for description.
  • the target user can use the target account to log in to the target application running on the terminal device (that is, the target terminal).
  • the target application is an application that matches the sweeping robot.
  • Through the target application, the sweeping robot can be controlled, realizing interaction between the target user and the sweeping robot. For example, the area map created by the sweeping robot can be displayed to the target user through the target application, and the target user can send cleaning instructions to the sweeping robot through the target application.
  • the sweeping robot can be used to clean a target area.
  • the target area can be a closed area, such as a bedroom, living room, bathroom, etc., or an unenclosed area, such as an outdoor field.
  • the target area may contain a target object, and the target object may be an object with a certain shape, such as a trash can, a wire, slippers, etc., but is not limited thereto, and the type of the target object is not limited in this embodiment.
  • the sweeping robot can be equipped with a data acquisition component, which can be a camera, an infrared sensor, etc., but is not limited thereto. Other components capable of collecting object images can be applied to this embodiment.
  • the data collection component may perform a data collection operation, and the collected data may be a target object image of the target object.
  • the target object image may be an object image of the target object at a certain angle, or may be an object image of the target object at multiple angles.
  • Various operations can be performed by using object images of the target object at multiple angles, for example, the three-dimensional shape of the target object can be constructed, thereby improving the accuracy of recognition.
  • Step S204: recognize the target object according to the image of the target object to obtain a target recognition result, wherein the target recognition result includes description information of the target object.
  • the image of the target object can represent the attribute information of the target object, for example, the shape, color, size, etc. of the object.
  • the target object can be recognized according to the image of the target object to obtain a target recognition result.
  • the obtained target recognition result may include descriptive information of the target object, for example, information describing the object type of the target object, the object outline of the target object, and the like.
  • the above recognition operation may be performed by a sweeping robot. After acquiring the above image of the target object, the cleaning robot can directly perform the above recognition operation according to the image of the target object.
  • the above recognition operation may be performed by other devices such as a smart device to which the sweeping robot belongs, a background server, and the like.
  • the sweeping robot can send the acquired image of the target object to other devices, and after receiving the image of the target object, other devices can perform the above recognition operation according to the image of the target object.
  • The recognition operation can be performed in real time (for example, immediately after the image of the target object is acquired) or not in real time (for example, when the device is idle after the target object image has been acquired); this is not limited in the embodiments.
  • the target object image can contain multiple object images from different angles of the target object.
  • Each of the multiple object images can be recognized separately to obtain a recognition result for each object image; the recognition results of the object images are then fused to obtain the target recognition result.
  • multiple object images may also be recognized simultaneously, and the target recognition result may be obtained by fusing image features of each object image.
  • Identifying the target object according to the target object image may be done by constructing a target object model corresponding to the target object according to the target object image, where the target object model is used to represent the three-dimensional shape of the target object, and then recognizing the target object according to the target object model. A minimal recognition sketch is given below.
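  • The sketch below illustrates the per-view recognition and result-fusion option described above (it does not implement the 3D-model variant). It assumes a PyTorch/torchvision classifier and a hypothetical candidate label set; it is an illustrative sketch, not the implementation mandated by this application.

```python
# Minimal sketch: per-view classification with probability fusion.
# Assumptions: a torchvision classifier; CANDIDATE_TYPES is a hypothetical label set.
import torch
from torchvision import models, transforms

CANDIDATE_TYPES = ["trash_can", "wire", "slipper"]  # hypothetical

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def recognize_multi_view(views, model):
    """views: list of PIL images of the same object from different angles."""
    model.eval()
    per_view_probs = []
    with torch.no_grad():
        for img in views:
            x = preprocess(img).unsqueeze(0)          # add batch dimension
            per_view_probs.append(torch.softmax(model(x), dim=1))
    fused = torch.stack(per_view_probs).mean(dim=0)   # fuse by averaging
    best = int(fused.argmax(dim=1))
    return CANDIDATE_TYPES[best], float(fused[0, best])

# Usage (untrained weights, illustration only):
# model = models.resnet18(num_classes=len(CANDIDATE_TYPES))
# label, confidence = recognize_multi_view([img1, img2], model)
```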
  • Step S206: mark the target object in the target area map according to the target recognition result, wherein the target object is an object corresponding to the target object in the target area map, and the target area map is an area map established by the sweeping robot for the target area.
  • the area map established for the target area by the sweeping robot is the target area map.
  • Creating an area map through the sweeping robot refers to using data collected by the sweeping robot in a certain area to establish an area map of that area.
  • the operation of creating an area map may be performed by the cleaning robot, or may be performed by other devices (for example, a smart device to which the cleaning robot belongs, a background server, etc.), which is not limited in this embodiment.
  • the method for processing the area map in this embodiment may be an object labeling scheme in the area map, and there may be various timings for performing object labeling.
  • Object labeling may be performed while the target area map is being built; in this case, map building and object labeling are performed simultaneously.
  • Object labeling may also be performed after the target area map has been established.
  • object labeling may be performed on existing objects in an established target area map.
  • The above-mentioned object labeling method can be compatible with existing area map creation schemes, and enriches the object information in the target area map.
  • Object labeling can also be performed on newly added objects in an established target area map.
  • the object labeling method described above can be applied to the scene where a new object is added in the target area, which improves the accuracy of the object information in the area map, and further improves the ability of the area map to represent the area.
  • Through this embodiment, the target object image corresponding to the target object in the target area is acquired, wherein the target area is the area cleaned by the sweeping robot; the target object is recognized according to the target object image to obtain the target recognition result, wherein the target recognition result contains the description information of the target object; and the target object is marked in the target area map according to the target recognition result, where the target object is the object corresponding to the target object in the target area map, and the target area map is the area map established by the sweeping robot for the target area. This solves the problem of low information acquisition efficiency in the related art of building area maps by sweeping robots, and improves the accuracy and efficiency of object information acquisition.
  • In some embodiments, before the target object image corresponding to the target object in the target area is acquired, the above method further includes:
  • Acquiring the image of the target object may be performed while the sweeping robot cleans the target area. If the sweeping robot encounters a target object during cleaning, the above-mentioned data collection component or another sensing component can detect the presence of the target object in the target area. If a target object is detected in the target area, the step of acquiring an image of the target object may be triggered, regardless of whether a target area map has already been established for the target area.
  • The sweeping robot may determine whether there is an object corresponding to the target object in the target area map and whether that object has been marked. If there is no object corresponding to the target object in the target area map, or such an object exists but has not been marked, execution of the step of acquiring the target object image is triggered.
  • When the sweeping robot detects an object in the area during the cleaning process, it triggers the acquisition of the object image of that object and then marks the corresponding object, which can improve the timeliness of object marking; a minimal sketch of this trigger logic follows.
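  • Below is a minimal sketch of the detection-triggered acquisition logic described above. The robot and map interfaces (detect_object, contains, is_marked, capture_views) are hypothetical names invented for illustration.

```python
# Minimal sketch: trigger image acquisition only when the detected object has no
# marked counterpart in the area map yet. All interfaces here are hypothetical.
def cleaning_step(robot, area_map):
    obstacle = robot.detect_object()          # camera / infrared / bumper event
    if obstacle is None:
        return None                           # nothing detected, keep cleaning
    already_labeled = area_map.contains(obstacle) and area_map.is_marked(obstacle)
    if already_labeled:
        return None                           # counterpart exists and is marked
    return robot.capture_views(obstacle)      # acquire the target object image(s)
```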
  • the target object is recognized according to the target object image, and the target recognition result obtained includes:
  • S21: Perform type recognition on the target object according to the target object image to obtain target type information, wherein the target type information is used to indicate the object type of the target object, and the target recognition result includes the target type information.
  • Marking the type of an object makes it more convenient for the user to obtain information about the object, that is, richer object information can be provided. Therefore, in this embodiment, in order to improve the convenience of providing object information, the target recognition result may include target type information used to indicate the object type of the target object.
  • the sweeping robot can identify the type of the target object according to the image of the target object, and obtain the information of the target type.
  • marking the target object in the target area map according to the target recognition result may include: marking target type information on a target position matching the target object in the target area map.
  • The target type information may be expressed in text or in forms other than text, for example, symbols, patterns, and so on.
  • The above-mentioned target position can be above, below, to the left of, or to the right of the target object, with the distance between the two less than or equal to a target distance threshold.
  • the type recognition is performed according to the object image, so as to mark the object type on the area map, which can enrich the object information provided by the area map, and improve the convenience of object information acquisition.
  • the target object type is identified according to the target object image, and the target type information obtained includes:
  • the identification method adopted may be a non-AI (Artificial Intelligence, artificial intelligence) identification method.
  • the type of the object may be identified based on AI, for example, the type of the object may be identified using a target recognition model.
  • the object recognition model is a neural network model, and it can also be other types of AI models.
  • the target recognition model can be obtained by using the object image of the sample object to train the initial recognition model.
  • The sample objects are marked with their corresponding object types, so that the model parameters of the recognition model can be adjusted according to the output of the recognition model and the marked object types, thereby obtaining a trained target recognition model (a minimal training sketch is given after this discussion).
  • the object image of the sample object may be obtained by collecting data on the sample object using a sweeping robot of the same type as the sweeping robot that acquires the image of the target object.
  • the device for training the recognition model may be a sweeping robot or other devices, which is not limited in this embodiment.
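  • The training sketch below assumes a PyTorch image classifier and sample-object images arranged in folders named by object type (an ImageFolder layout); the architecture and hyperparameters are illustrative assumptions, not this application's specification.

```python
# Minimal sketch: train an "initial recognition model" on labeled sample-object
# images. Assumed layout: sample_dir/<object_type>/<image files>.
import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

def train_recognition_model(sample_dir, epochs=5):
    tf = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
    data = datasets.ImageFolder(sample_dir, transform=tf)
    loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)
    model = models.resnet18(num_classes=len(data.classes))   # initial model
    opt = optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)  # compare output vs. marked type
            loss.backward()
            opt.step()                             # adjust model parameters
    return model, data.classes                     # trained target recognition model
```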
  • The target object image can be input into the target recognition model, and the target recognition model can recognize the type of the target object based on the image: it determines, for each of multiple candidate object types, the probability that the object type of the target object is that candidate type, and then determines the object type that best matches the target object from the multiple candidate object types.
  • the target type information indicates the object type that best matches the target object.
  • the type of object is recognized based on AI, and the accuracy of object type recognition can be improved.
  • the target object is recognized according to the target object image, and the target recognition result obtained includes:
  • S41: Perform contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used to represent the object contour of the target object, and the target recognition result includes the target contour information.
  • the target recognition result may include target outline information used to represent the object outline of the target object.
  • the sweeping robot can recognize the outline of the target object according to the image of the target object, and obtain the target outline information.
  • There are many ways to perform contour recognition on the target object image, including but not limited to at least one of the following: image contour extraction, image segmentation, AI-based semantic segmentation, etc.; the contour recognition method is not limited here. A minimal contour-extraction sketch is given after this discussion.
  • the contour represented by the target contour information may include specific contours of different parts.
  • different parts belonging to the same object can be marked with the same labeling method to reflect that these parts belong to the same object.
  • the outline represented by the target outline information may also include the overall outline of the object, that is, the outline that can include all parts of the object.
  • marking the outline of an object can include all the parts belonging to the object, so as to reflect that these parts belong to the object.
  • contour recognition is performed according to the object image, so that the contour of the object is marked on the area map, and the convenience of object information acquisition can be improved.
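  • The sketch below illustrates the classical image-contour-extraction option listed above, using OpenCV; the thresholds and the largest-contour heuristic are illustrative assumptions.

```python
# Minimal sketch: contour recognition via edge detection + contour extraction.
# Assumes OpenCV (cv2); thresholds are illustrative, not tuned values.
import cv2

def extract_object_contour(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)  # assume the object dominates the view
    return cv2.approxPolyDP(largest, 2.0, True)   # simplified outline polygon
```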
  • marking the target object on the target area map according to the target recognition result includes:
  • marking the target object on the target area map according to the target recognition result may include: marking the target object's outline on the target area map according to the target outline information.
  • the contour of the target object may be determined according to the object contour of the target object. For example, the contour of the target object is obtained after operations such as scaling and translation are performed on the object contour of the target object.
  • the outline of the target object can be marked in the form of a virtual wall, or it can be marked in the form of other lines and curves other than the virtual wall.
  • When the outline of the target object is marked in the form of a virtual wall in the target map, the area where the target object is located can be set as a detour area (in other words, a no-clean area) for the sweeping robot, so that the sweeping robot can be controlled to avoid the target object during cleaning; a grid-marking sketch follows this discussion.
  • the object outlines corresponding to the objects in the area are marked on the area map in the form of a virtual wall, which can improve the convenience of outline marking and facilitate the cleaning control of the sweeping robot.
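  • Below is a minimal sketch of rasterizing an object outline into a grid area map as a virtual-wall (no-go) region; the grid encoding and the world-to-grid transform are illustrative assumptions, not a format defined by this application.

```python
# Minimal sketch: draw an object outline into an occupancy grid as a virtual wall.
# Assumed encoding: 0 = free, 1 = occupied, 2 = virtual wall (illustrative).
import numpy as np
import cv2

VIRTUAL_WALL = 2

def mark_virtual_wall(grid, contour_world, resolution=0.05, origin=(0.0, 0.0)):
    """grid: 2D uint8 map; contour_world: Nx2 array of (x, y) in metres."""
    cells = np.round((np.asarray(contour_world) - origin) / resolution)
    cells = cells.astype(np.int32).reshape(-1, 1, 2)
    # A closed polyline makes the planner treat the enclosed object as impassable.
    cv2.polylines(grid, [cells], True, VIRTUAL_WALL, 1)
    return grid
```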
  • the above method further includes:
  • The established target area map can be stored on the sweeping robot and used by the sweeping robot when cleaning the target area.
  • the sweeping robot may also send the map of the target area to the target application logged in with the target account on the target terminal.
  • There can be one or more timings for sending the target area map, for example, after the target area map is established, after the target area map is updated, after a map acquisition request from the target application is received, and other timings at which sending the area map is allowed; a serialization sketch of such a hand-off follows.
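  • A minimal hand-off sketch is shown below, serializing the annotated map as JSON over HTTP; the endpoint path, payload fields, and bearer-token scheme are assumptions for illustration, not a protocol defined by this application.

```python
# Minimal sketch: push the annotated area map from the robot to the App backend.
# The URL, payload schema, and auth header are hypothetical.
import base64
import json
import requests  # third-party: pip install requests

def push_area_map(map_png_bytes, annotations, server_url, account_token):
    payload = {
        "map": base64.b64encode(map_png_bytes).decode("ascii"),
        # e.g. [{"type": "wire", "outline": [[x0, y0], [x1, y1], ...]}]
        "annotations": annotations,
    }
    resp = requests.post(
        f"{server_url}/area-map",
        headers={"Authorization": f"Bearer {account_token}",
                 "Content-Type": "application/json"},
        data=json.dumps(payload),
        timeout=10,
    )
    resp.raise_for_status()  # surface transport errors to the caller
```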
  • the display interface for displaying the area map in the target application is the target display interface.
  • the target application can display the target area map on the target display interface.
  • the target application may store the target area map on the target terminal, and display the target area map on the target display interface after detecting the map display instruction.
  • a method for processing an area map is also provided.
  • the above method for processing an area map may be applied to a hardware environment composed of a terminal 102 and a server 104 as shown in FIG. 1 . What has already been described will not be repeated here.
  • FIG. 3 is a schematic flowchart of another optional method for processing an area map according to an embodiment of the present application. As shown in FIG. 3, the flow of the method may include the following steps:
  • Step S302: receive, through the target application, the target area map sent by the sweeping robot, wherein the target application is logged in with the target account bound to the sweeping robot, and the target area map is an area map established by the sweeping robot for the target area.
  • the method for processing an area map in this embodiment can be applied to a scene where a sweeping robot is used to construct an area map.
  • the sweeping robot, the target application, and the target area map are the same as or similar to those in the foregoing embodiments.
  • The target area map may be an area map created or updated by the area map processing method in the foregoing embodiments; what has already been described will not be repeated here.
  • A target application logged in with the target account may run on the target terminal.
  • a communication connection may be established between the target application and the sweeping robot, and through the communication connection between the two, the target application may receive the aforementioned target area map sent by the sweeping robot.
  • Step S304: display the target area map on the target display interface of the target application, wherein the target object and the target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the target object in the target area, and the target annotation information is used to describe the target object.
  • the display interface for displaying the area map in the target application is the target display interface.
  • the target application can display the target area map on the target display interface.
  • the target application may store the target area map on the target terminal, and display the target area map on the target display interface after detecting the map display instruction.
  • the displayed target area map may include the above-mentioned target object and label information of the target object, that is, target label information.
  • the target annotation information can be used to describe the target object, thereby enriching the object information that can be provided in the area map.
  • The target labeling information may include the labeling information obtained by labeling the target object in the target area map according to the aforementioned target recognition result, and may also contain other labeling information, for example, labeling information obtained by labeling the target object in other ways; this is not limited in this embodiment.
  • Through the above steps, the target application receives the target area map sent by the sweeping robot, wherein the target application is logged in with the target account bound to the sweeping robot, and the target area map is the area map established by the sweeping robot for the target area; the target area map is then displayed on the target display interface of the target application, wherein the target object and the target label information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the target object in the target area, and the target labeling information is used to describe the target object. This solves the problem of low information acquisition efficiency in the related art of building area maps by sweeping robots, and improves the accuracy and efficiency of object information acquisition.
  • displaying the target area map on the target display interface of the target application includes:
  • the target annotation information may include outline information of the target object.
  • The contour information of the target object is used to represent the contour of the target object; it may include contour information obtained by marking the contour of the target object in the target area map according to the aforementioned target contour information, or contour information of the target object obtained by other means.
  • The target area map displayed on the target display interface may include the target objects and their outline information. If an object contains multiple parts, the parts can be marked with the same labeling style to reflect that they belong to the same object; alternatively, the outline of the object can enclose all parts belonging to the object to reflect that these parts belong to the same object.
  • the outline of the object is marked on the area map, which can improve the convenience of obtaining object information.
  • displaying the target object and the outline information of the target object in the target area map displayed on the target display interface includes:
  • the outline information of the target object may be displayed in the form of a virtual wall, or in the form of other straight lines or curves other than the virtual wall.
  • the outline information of the target object is displayed in the form of a virtual wall in the target map, and the area where the target object is located can be set as the detour area of the sweeping robot, so that the sweeping robot can be controlled to avoid the target object.
  • the outline of objects in the area map is displayed in the form of a virtual wall, which can improve the convenience of outline display and facilitate the cleaning control of the sweeping robot.
  • displaying the target area map on the target display interface of the target application includes:
  • S91: Display the target object and target type information on the target area map displayed on the target display interface, wherein the target type information is used to indicate the object type of the target object, and the target label information includes the target type information.
  • the target labeling information may include target type information.
  • the target type information may be used to indicate the object type of the target object, which may be the target type information obtained by identifying the type of the target object according to the target object image as described above, or may be the target type information obtained in other ways.
  • the target area map displayed on the target display interface may include target object and target type information.
  • The target type information may be displayed at a target position matching the target object.
  • The above-mentioned target position can be above, below, to the left of, or to the right of the target object, with the distance between the two less than or equal to a target distance threshold; in this embodiment, the display method and marking position of the target type information are not limited.
  • displaying object types corresponding to objects in the area map on the area map can enrich the object information provided by the area map and improve the convenience of object information acquisition.
  • This example provides a self-labeling solution for virtual walls based on AI technology, which can automatically mark the outline of objects (for example, obstacles) on the App map in the form of virtual walls, and mark their types.
  • the flow of the method for processing an area map in this optional example may include the following steps:
  • Step S402: the sweeping robot detects an object during the cleaning process;
  • Step S404: the sweeping robot recognizes the three-dimensional shape of the object, and recognizes the type of the object based on AI;
  • Step S406: mark the outline of the object in the form of a virtual wall in the App map, and mark the type of the object, wherein the object is represented by a two-dimensional image in the App map.
  • In the App map, a two-dimensional image of the object is displayed, the type of the object is represented by text, and the dotted line is the virtual wall outlining the object.
  • Through the App map, users can see the types of objects in the room and, at the same time, see their real shapes more clearly and accurately; a minimal rendering sketch follows.
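  • The rendering sketch below reproduces this display style with matplotlib (the 2D map, a dashed outline as the virtual wall, and a text label for the object type); the coordinates and styling are illustrative assumptions about the App's rendering, not its actual implementation.

```python
# Minimal sketch: render the annotated map view described above.
import matplotlib.pyplot as plt
import numpy as np

def draw_map_view(grid, outline, label, label_pos):
    """grid: 2D array; outline: Nx2 polygon; label_pos: (x, y) near the object."""
    fig, ax = plt.subplots()
    ax.imshow(grid, cmap="gray_r", origin="lower")   # the 2D area map
    pts = np.vstack([outline, outline[:1]])          # close the polygon
    ax.plot(pts[:, 0], pts[:, 1], linestyle="--")    # dashed line = virtual wall
    ax.annotate(label, xy=label_pos)                 # object type shown as text
    return fig
```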
  • FIG. 6 is a structural block diagram of an optional area map processing device according to an embodiment of the present application. As shown in Fig. 6, the device may include:
  • An acquisition unit 602 configured to acquire an image of a target object corresponding to a target object in a target area, where the target area is an area cleaned by the sweeping robot;
  • the recognition unit 604 is connected to the acquisition unit 602, and is used to recognize the target object according to the image of the target object to obtain a target recognition result, wherein the target recognition result includes description information of the target object;
  • The labeling unit 606 is connected to the recognition unit 604, and is used to mark the target object in the target area map according to the target recognition result, wherein the target object is the object corresponding to the target object in the target area map, and the target area map is an area map established by the sweeping robot for the target area.
  • the obtaining unit 602 in this embodiment can be used to perform the above step S202
  • the identification unit 604 in this embodiment can be used to perform the above step S204
  • the labeling unit 606 in this embodiment can be used to perform the above Step S206.
  • Through the above modules, the target object image corresponding to the target object in the target area is obtained, wherein the target area is the area cleaned by the sweeping robot; the target object is recognized according to the target object image to obtain the target recognition result, wherein the target recognition result contains the description information of the target object; and the target object is marked in the target area map according to the target recognition result, where the target object is the object corresponding to the target object in the target area map, and the target area map is the area map established by the sweeping robot for the target area. This solves the problem of low information acquisition efficiency in the related art of building area maps by sweeping robots, and improves the accuracy and efficiency of object information acquisition.
  • the above-mentioned device also includes:
  • the detection unit is configured to detect the existence of the target object in the target area during the process of cleaning the target area by the cleaning robot before acquiring the image of the target object corresponding to the target object in the target area.
  • the identification unit 604 includes:
  • the first identification module is configured to identify the type of the target object according to the image of the target object to obtain target type information, wherein the target type information is used to indicate the object type of the target object, and the target recognition result includes the target type information.
  • the first identification module includes:
  • the input sub-module is used to input the image of the target object into the target recognition model to obtain the target type information output by the target recognition model.
  • the target recognition model is obtained by using the object image of the sample object to train the initial recognition model, and the sample object is marked with the corresponding type of object.
  • the identification unit 604 includes:
  • the second recognition module is configured to perform contour recognition on the target object according to the target object image to obtain target contour information, wherein the target contour information is used to represent the object contour of the target object, and the target recognition result includes the target contour information.
  • the labeling unit 606 includes:
  • the labeling module is used to mark the outline of the target object in the form of a virtual wall in the target area map according to the target outline information.
  • the above-mentioned device also includes:
  • The sending unit is configured to, after the target object is marked in the target area map according to the target recognition result, send the target area map to the target application on the target terminal for display, wherein the target application is logged in with the target account bound to the sweeping robot.
  • FIG. 7 is a structural block diagram of another optional area map processing device according to an embodiment of the present application. As shown in Fig. 7, the device may include:
  • the receiving unit 702 is configured to receive the target area map sent by the sweeping robot through the target application, wherein the target application uses the target account bound to the sweeping robot to log in, and the target area map is an area map established for the target area by the sweeping robot;
  • The display unit 704 is connected to the receiving unit 702, and is used to display the target area map on the target display interface of the target application, wherein the target object and the target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the target object in the target area, and the target annotation information is used to describe the target object.
  • receiving unit 702 in this embodiment can be used to perform the above step S302
  • display unit 704 in this embodiment can be used to perform the above step S304.
  • Through the above modules, the target application receives the target area map sent by the sweeping robot, wherein the target application is logged in with the target account bound to the sweeping robot, and the target area map is the area map established by the sweeping robot for the target area; the target area map is displayed on the target display interface of the target application, wherein the target object and the target labeling information of the target object are displayed on the target area map, the target object is the object in the target area map corresponding to the target object in the target area, and the target labeling information is used to describe the target object. This solves the problem of low information acquisition efficiency in the related art of building area maps by sweeping robots, and improves the accuracy and efficiency of object information acquisition.
  • the display unit 704 includes:
  • the first display module is used to display the target object and the outline information of the target object in the target area map displayed on the target display interface, wherein the outline information of the target object is used to represent the outline of the target object, and the target label information includes the target object profile information.
  • the first display module includes:
  • the display sub-module is used for displaying the target object in the target area map displayed on the target display interface, and displaying the outline information of the target object in the form of a virtual wall.
  • the display unit 704 includes:
  • the second display module is used to display the target object and target type information in the target area map displayed on the target display interface, wherein the target type information is used to indicate the object type of the target object, and the target label information includes target type information.
  • the above modules can run in the hardware environment shown in FIG. 1 , and can be implemented by software or by hardware, wherein the hardware environment includes a network environment.
  • a storage medium is also provided.
  • the above-mentioned storage medium may be used to execute the program code of any one of the above-mentioned area map processing methods in the embodiments of the present application.
  • the foregoing storage medium may be located on at least one network device among the plurality of network devices in the network shown in the foregoing embodiments.
  • The storage medium is configured to store program code for performing the following steps: acquiring a target object image corresponding to a target object in a target area, wherein the target area is an area cleaned by the sweeping robot; recognizing the target object according to the target object image to obtain a target recognition result, wherein the target recognition result includes description information of the target object; and marking the target object in the target area map according to the target recognition result, wherein the target object is an object corresponding to the target object in the target area map, and the target area map is an area map established by the sweeping robot for the target area.
  • the above-mentioned storage medium may include, but not limited to, various media capable of storing program codes such as a U disk, ROM, RAM, removable hard disk, magnetic disk, or optical disk.
  • an electronic device for implementing the above method for processing an area map is also provided, and the electronic device may be a server, a terminal, or a combination thereof.
  • Fig. 8 is a structural block diagram of an optional electronic device according to an embodiment of the present application. As shown in Fig. 8, the electronic device may include a processor, a communication interface 804, and a memory 806, which complete mutual communication through a communication bus 808. The memory stores a computer program, and the processor executes the steps of the above area map processing method through the computer program, wherein the target object is an object corresponding to the target object in the target area map, and the target area map is an area map established by the sweeping robot for the target area.
  • the communication bus may be a PCI (Peripheral Component Interconnect, Peripheral Component Interconnect Standard) bus, or an EISA (Extended Industry Standard Architecture, Extended Industry Standard Architecture) bus, etc.
  • the communication bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is used in FIG. 8 , but it does not mean that there is only one bus or one type of bus.
  • the communication interface is used for communication between the above-mentioned electronic device and other devices.
  • The above-mentioned memory may include RAM, and may also include non-volatile memory, for example, at least one disk memory.
  • the memory may also be at least one storage device located away from the aforementioned processor.
  • the above-mentioned memory 806 may include, but is not limited to, the acquiring unit 602, the identifying unit 604, and the labeling unit 606 in the control device of the above-mentioned device. In addition, it may also include but not limited to other module units in the control device of the above equipment, which will not be described in detail in this example.
  • the memory 806 may include, but is not limited to, the receiving unit 702 and the display unit 704 in the control device of the above-mentioned device. In addition, it may also include but not limited to other module units in the control device of the above equipment, which will not be described in detail in this example.
  • The processor may be a general-purpose processor, including but not limited to a CPU (Central Processing Unit) or an NP (Network Processor); it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • the device implementing the above method for processing an area map may be a terminal device, and the terminal device may be a smartphone (such as an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or other terminal equipment.
  • the structure shown in FIG. 8 does not limit the structure of the above-mentioned electronic device.
  • the electronic device may also include more or fewer components than those shown in FIG. 8 (such as a network interface or a display device), or have a configuration different from that shown in FIG. 8.
  • if the integrated units in the above embodiments are implemented in the form of software functional units and are sold or used as independent products, they may be stored in the above-mentioned computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, and the computer software product is stored in a storage medium.
  • the storage medium includes several instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the disclosed client can be implemented in other ways.
  • the device embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and there may be other ways of division in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through certain interfaces, and the indirect coupling or communication connection between units or modules may be electrical or take other forms.
  • the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution provided in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
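For illustration, the stored program steps named in the first bullet of the list above (acquire the target object image, recognize it, mark the map) can be pictured with the following minimal Python sketch. It is a sketch only, assuming hypothetical camera, recognizer, and map APIs (capture(), recognize(), AreaMap); it is not the implementation disclosed in this application.

    # Illustrative sketch of the stored program steps; all APIs here
    # (capture(), recognize(), AreaMap) are assumed for the example.
    from dataclasses import dataclass, field

    @dataclass
    class RecognitionResult:
        description: str   # description information of the target object
        confidence: float  # recognizer confidence score

    @dataclass
    class AreaMap:
        # maps a grid position to the description of the object marked there
        labels: dict = field(default_factory=dict)

        def mark(self, position, result):
            # mark the object corresponding to the target object in the map
            self.labels[position] = result.description

    def process_area_map(camera, recognizer, area_map, position):
        image = camera.capture()              # acquire the target object image
        result = recognizer.recognize(image)  # obtain the target recognition result
        area_map.mark(position, result)       # label the target area map
        return area_map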
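Similarly, the functional units held in memory 806 (the acquiring unit 602, identifying unit 604, and labeling unit 606 on the device side, and the receiving unit 702 and display unit 704 on the terminal side) can be pictured as plain classes. All class and method names below are assumptions made for the sketch, not the disclosed module interfaces.

    # Illustrative layout of the module units held in memory 806;
    # names and signatures are assumptions made for this sketch.
    class AcquiringUnit:                # unit 602: obtains the target object image
        def acquire(self, camera):
            return camera.capture()

    class IdentifyingUnit:              # unit 604: recognizes the target object
        def identify(self, image, recognizer):
            return recognizer.recognize(image)

    class LabelingUnit:                 # unit 606: marks the object in the area map
        def label(self, area_map, position, result):
            area_map.mark(position, result)

    class ReceivingUnit:                # unit 702: terminal side, receives the marked map
        def receive(self, channel):
            return channel.read()

    class DisplayUnit:                  # unit 704: terminal side, renders the marked map
        def display(self, area_map):
            for position, description in area_map.labels.items():
                print(position, description)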

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method and apparatus for processing an area map, a storage medium, and an electronic device. The method includes: acquiring a target object image corresponding to a target object in a target area, where the target area is an area cleaned by a sweeping robot (S202); recognizing the target object according to the target object image to obtain a target recognition result, where the target recognition result includes description information of the target object (S204); and marking an object in a target area map according to the target recognition result, where the marked object is the object in the target area map corresponding to the target object, and the target area map is an area map established for the target area by the sweeping robot (S206). The method solves the prior-art problem of low information-acquisition efficiency in the way a sweeping robot establishes an area map.
PCT/CN2022/094615 2021-06-23 2022-05-24 Method and apparatus for processing area map, storage medium and electronic device WO2022267795A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110701569.0A CN113469000B (zh) 2021-06-23 2021-06-23 Method and apparatus for processing area map, storage medium and electronic device
CN202110701569.0 2021-06-23

Publications (1)

Publication Number Publication Date
WO2022267795A1 true WO2022267795A1 (fr) 2022-12-29

Family

ID=77872542

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/094615 WO2022267795A1 (fr) Method and apparatus for processing area map, storage medium and electronic device

Country Status (2)

Country Link
CN (1) CN113469000B (fr)
WO (1) WO2022267795A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113469000B (zh) * 2021-06-23 2024-06-14 追觅创新科技(苏州)有限公司 Method and apparatus for processing area map, storage medium and electronic device
CN116211168A (zh) * 2021-12-02 2023-06-06 追觅创新科技(苏州)有限公司 Operation control method and apparatus for cleaning device, storage medium and electronic device
CN114521841A (zh) * 2022-03-23 2022-05-24 深圳市优必选科技股份有限公司 Cleaning area management method, system, intelligent terminal, robot and storage medium
CN116091607B (zh) * 2023-04-07 2023-09-26 科大讯飞股份有限公司 Method, apparatus, device and readable storage medium for assisting user in finding object

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102061511B1 (ko) * 2013-04-26 2020-01-02 삼성전자주식회사 Cleaning robot, home monitoring apparatus, and control method therefor
WO2020186493A1 (fr) * 2019-03-21 2020-09-24 珊口(深圳)智能科技有限公司 Method and system for navigating and dividing a cleaning region, mobile robot, and cleaning robot
CN111839360B (zh) * 2020-06-22 2021-09-14 珠海格力电器股份有限公司 Sweeper data processing method, apparatus, device, and computer-readable medium
CN112307994A (zh) * 2020-11-04 2021-02-02 深圳市普森斯科技有限公司 Obstacle recognition method based on sweeper, electronic apparatus, and storage medium
CN112462780B (zh) * 2020-11-30 2024-05-21 深圳市杉川致行科技有限公司 Sweeping control method and apparatus, sweeping robot, and computer-readable storage medium
CN112783156A (zh) * 2020-12-25 2021-05-11 北京小狗吸尘器集团股份有限公司 Sweeping robot and cleaning task planning method and apparatus therefor

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108885459A (zh) * 2018-06-08 2018-11-23 珊口(深圳)智能科技有限公司 Navigation method, navigation system, movement control system, and mobile robot
US20210049376A1 (en) * 2019-08-14 2021-02-18 Ankobot (Shenzhen) Smart Technologies Co., Ltd. Mobile robot, control method and control system thereof
CN111242994A (zh) * 2019-12-31 2020-06-05 深圳优地科技有限公司 Semantic map construction method and apparatus, robot, and storage medium
CN111325136A (zh) * 2020-02-17 2020-06-23 北京小马智行科技有限公司 Method and apparatus for labeling objects in intelligent vehicle, and unmanned vehicle
CN113469000A (zh) * 2021-06-23 2021-10-01 追觅创新科技(苏州)有限公司 Method and apparatus for processing area map, storage medium, and electronic device

Also Published As

Publication number Publication date
CN113469000B (zh) 2024-06-14
CN113469000A (zh) 2021-10-01

Similar Documents

Publication Publication Date Title
WO2022267795A1 (fr) Method and apparatus for processing area map, storage medium and electronic device
CN111832447B (zh) Building drawing component recognition method, electronic device, and related products
CN109995601B (zh) Network traffic identification method and apparatus
CN113095434B (zh) Target detection method and apparatus, electronic device, and storage medium
WO2020223975A1 (fr) Method for locating device on map, server, and mobile robot
CN108416003A (zh) Picture classification method and apparatus, terminal, and storage medium
CN108875667B (zh) Target recognition method and apparatus, terminal device, and storage medium
CN110175223A (zh) Method and apparatus for implementing question generation
CN109117760A (zh) Image processing method and apparatus, electronic device, and computer-readable medium
WO2022022292A1 (fr) Method and device for recognizing handheld object
CN111708366A (zh) Robot, action control method and apparatus therefor, and computer-readable storage medium
US20200074175A1 (en) Object cognitive identification solution
WO2019223056A1 (fr) Gesture recognition-based teaching and learning method and apparatus
CN113632097B (zh) Method, apparatus, device, and storage medium for predicting correlation between objects
CN111783561A (zh) Drawing review result correction method, electronic device, and related products
CN111832579A (zh) Map point-of-interest data processing method and apparatus, electronic device, and readable medium
US20220300774A1 (en) Methods, apparatuses, devices and storage media for detecting correlated objects involved in image
CN111703278B (zh) Fragrance release method, apparatus, vehicle end, cloud end, system, and storage medium
CN113469138A (zh) Object detection method and apparatus, storage medium, and electronic device
CN112380951B (zh) Method, apparatus, computer device, and storage medium for identifying abnormal behavior
CN106778449B (zh) Object recognition method for dynamic video and interactive video creation method for automatically capturing target images
CN114694257A (zh) Multi-person real-time three-dimensional action recognition and evaluation method, apparatus, device, and medium
CN114549968A (zh) Target detection method and apparatus, and electronic device
CN111061451A (zh) Information processing method and apparatus, and system
CN111382626B (zh) Method, apparatus, device, and storage medium for detecting illegal images in video

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22827292

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22827292

Country of ref document: EP

Kind code of ref document: A1