CN105796002A - Indoor cleaning method for cleaning robot, cleaning robot and mobile terminal - Google Patents

Indoor cleaning method for cleaning robot, cleaning robot and mobile terminal

Info

Publication number
CN105796002A
CN105796002A (application CN201610202459.9A)
Authority
CN
China
Prior art keywords
indoor map
clean
cleaning dead zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610202459.9A
Other languages
Chinese (zh)
Other versions
CN105796002B (en)
Inventor
王达
谢焱
成悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201610202459.9A
Publication of CN105796002A
Application granted
Publication of CN105796002B
Legal status: Active

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 - Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 - Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an indoor cleaning method for a cleaning robot, a cleaning robot, and a mobile terminal. The method comprises the steps of acquiring a two-dimensional indoor map drawn by the cleaning robot; acquiring three-dimensional data of indoor objects detected by the cleaning robot; generating a three-dimensional indoor map from the two-dimensional indoor map and the three-dimensional data of the indoor objects, and identifying and marking cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions; and sending the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead zones to a mobile terminal, so that the cleaning dead zones can be cleaned according to an instruction received from the mobile terminal. With this scheme, indoor cleaning dead zones can be identified and cleaned.

Description

Indoor cleaning processing method for a cleaning robot, cleaning robot, and mobile terminal
Technical field
The present disclosure relates to the field of smart home technology, and in particular to an indoor cleaning processing method for a cleaning robot, a cleaning robot, and a mobile terminal.
Background technology
With the development of science and technology, a variety of cleaning robots have appeared. A cleaning robot is a type of smart household appliance, mainly used for household cleaning work; relying on a certain degree of artificial intelligence, it can automatically complete floor cleaning in a room. Cleaning robots generally work by brushing and vacuuming: debris on the floor is first collected into the robot's own dust container, thereby completing floor cleaning.
However, existing cleaning robots generally rely on the two-dimensional map drawn during cleaning, so some indoor layouts cannot be effectively recognized, and cleaning dead zones (corners or spots that the cleaning never reaches) cannot be identified.
Summary of the invention
Present disclose provides a kind of clean robot indoor cleaning processing method, clean robot and mobile terminal, can recognize that the corner or spot as yet untouched by a clean-up campaign of the cleaning of indoor and be cleaned.
First aspect according to disclosure embodiment, it is provided that a kind of clean robot indoor cleaning processing method, including:
Obtain the two-dimentional indoor map that clean robot self is drawn;
Obtain the stereo data of the indoor object that clean robot self detects;
Stereo data according to described two dimension indoor map and described indoor object generates three-dimensional indoor map, according to pre-conditioned in described two dimension indoor map and/or identify and identify the corner or spot as yet untouched by a clean-up campaign of cleaning in described three-dimensional indoor map;
The two-dimentional indoor map and three-dimensional indoor map that identify described corner or spot as yet untouched by a clean-up campaign are sent to mobile terminal so that described corner or spot as yet untouched by a clean-up campaign can be cleaned according to the instruction of the mobile terminal received.
Optionally, the stereo data of the indoor objects is detected by the cleaning robot itself using at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera.
Optionally, identifying and marking cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to the predetermined conditions includes:
identifying and marking cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map.
Optionally, this identifying and marking includes:
when the adjacent distance between different objects in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance, identifying the adjacent area between the different objects as a cleaning dead zone; or
when the height of the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height, identifying the region enclosed by the supporting parts as a cleaning dead zone; or
when the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area, identifying the region enclosed by the supporting parts as a cleaning dead zone; and
marking the identified cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map.
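The three marking rules just listed (adjacent distance, support height, support area) can be sketched as simple threshold checks. This is a hypothetical illustration, not the patent's algorithm: the object representation, the use of center-to-center distance as the "adjacent distance", and all threshold values are assumptions.

```python
from dataclasses import dataclass

# Hypothetical object record; field names and threshold values are
# illustrative assumptions, not taken from the patent.
@dataclass
class IndoorObject:
    name: str
    center: tuple          # (x, y) position on the 2D indoor map, meters
    support_height: float  # height of the supporting parts (e.g. bed legs), m
    support_area: float    # floor area enclosed by the supporting parts, m^2

SET_DISTANCE = 0.15  # gap narrower than this is unreachable by the robot
SET_HEIGHT = 0.10    # clearance tall enough that dirt collects underneath
SET_AREA = 1.0       # enclosed area small enough to trap debris

def adjacent_dead_zones(objects, set_distance=SET_DISTANCE):
    """Rule 1: the area between two different objects is a dead zone when
    their adjacent distance is less than or equal to the set distance."""
    zones = []
    for i, a in enumerate(objects):
        for b in objects[i + 1:]:
            dx = a.center[0] - b.center[0]
            dy = a.center[1] - b.center[1]
            if (dx * dx + dy * dy) ** 0.5 <= set_distance:
                zones.append((a.name, b.name))
    return zones

def support_dead_zones(objects, set_height=SET_HEIGHT, set_area=SET_AREA):
    """Rules 2 and 3: the region enclosed by one object's supporting parts
    is a dead zone when the support height is >= the set height, or the
    enclosed area is <= the set area."""
    return [o.name for o in objects
            if o.support_height >= set_height or o.support_area <= set_area]
```

For example, two cabinets 0.1 m apart would trip rule 1, and a bed whose legs leave 0.2 m of clearance would trip rule 2.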
Optionally, the method further includes: when the distance between two cleaning dead zones identified in the two-dimensional indoor map and the three-dimensional indoor map is less than or equal to a set threshold, merging the two cleaning dead zones into a single cleaning dead zone mark in the two-dimensional indoor map and/or the three-dimensional indoor map.
According to a second aspect of embodiments of the present disclosure, there is provided an indoor cleaning processing method for a cleaning robot, including:
receiving a two-dimensional indoor map and a three-dimensional indoor map marked with cleaning dead zones and sent by a cleaning robot, wherein the three-dimensional indoor map is generated by the cleaning robot from the two-dimensional indoor map drawn by the robot itself and stereo data of indoor objects detected by the robot itself, and the cleaning dead zones are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions; and
sending to the cleaning robot an instruction to clean the cleaning dead zones, so that the cleaning robot cleans the cleaning dead zones according to the received instruction.
Optionally, after receiving the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead zones sent by the cleaning robot, the method further includes:
switching between and zooming the received three-dimensional indoor map and two-dimensional indoor map on the mobile terminal screen through gesture operations.
According to a third aspect of embodiments of the present disclosure, there is provided a cleaning robot, including:
a first acquisition module, configured to acquire a two-dimensional indoor map drawn by the cleaning robot itself;
a second acquisition module, configured to acquire stereo data of indoor objects detected by the cleaning robot itself;
a conversion and marking module, configured to generate a three-dimensional indoor map from the two-dimensional indoor map acquired by the first acquisition module and the stereo data of the indoor objects acquired by the second acquisition module, and to identify and mark cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions; and
a sending module, configured to send the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead zones to a mobile terminal, so that the cleaning dead zones can be cleaned according to an instruction received from the mobile terminal.
Optionally, the cleaning robot further includes a detection module comprising at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera. The detection module detects the stereo data of the indoor objects through the components it comprises, and the second acquisition module acquires the stereo data of the indoor objects detected by the detection module.
Optionally, the conversion and marking module includes:
a conversion submodule, configured to generate the three-dimensional indoor map from the two-dimensional indoor map acquired by the first acquisition module and the stereo data of the indoor objects acquired by the second acquisition module;
an identification submodule, configured to identify cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map; and
a marking submodule, configured to mark, in the two-dimensional indoor map and/or the three-dimensional indoor map, the cleaning dead zones identified by the identification submodule.
Optionally, the identification submodule includes:
a first identification submodule, configured to identify the adjacent area between different objects as a cleaning dead zone when the adjacent distance between the different objects in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance; or
a second identification submodule, configured to identify the region enclosed by the supporting parts of a single object as a cleaning dead zone when the height of the supporting parts in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height; or
a third identification submodule, configured to identify the region enclosed by the supporting parts of a single object as a cleaning dead zone when the area enclosed by the supporting parts in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area.
According to a fourth aspect of embodiments of the present disclosure, there is provided a cleaning robot, including:
a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquire a two-dimensional indoor map drawn by the cleaning robot itself;
acquire stereo data of indoor objects detected by the cleaning robot itself;
generate a three-dimensional indoor map from the two-dimensional indoor map and the stereo data of the indoor objects, and identify and mark cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions; and
send the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead zones to a mobile terminal, so that the cleaning dead zones can be cleaned according to an instruction received from the mobile terminal.
According to a fifth aspect of embodiments of the present disclosure, there is provided a mobile terminal, including:
a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to:
receive a two-dimensional indoor map and a three-dimensional indoor map marked with cleaning dead zones and sent by a cleaning robot, wherein the three-dimensional indoor map is generated by the cleaning robot from the two-dimensional indoor map drawn by the robot itself and stereo data of indoor objects detected by the robot itself, and the cleaning dead zones are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions; and
send to the cleaning robot an instruction to clean the cleaning dead zones, so that the cleaning robot cleans the cleaning dead zones according to the received instruction.
The technical solutions provided by embodiments of the present disclosure may include the following beneficial effects:
After acquiring the two-dimensional indoor map it has drawn, the cleaning robot of the present disclosure also acquires stereo data of the indoor objects it has detected, generates a three-dimensional indoor map from the two-dimensional indoor map and the stereo data, and identifies and marks cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions. The layout of the room, such as the placement of beds, tables, and chairs, can thus be judged more intuitively and accurately from the three-dimensional indoor map, which benefits the robot's cleaning; cleaning dead zones can further be identified, so that every indoor region can be cleaned without omission, improving the indoor cleaning effect.
The present disclosure can identify cleaning dead zones according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map.
The present disclosure can identify cleaning dead zones in several ways: for instance, when the adjacent distance between different objects is less than or equal to a set distance, the adjacent area between the objects is identified as a cleaning dead zone; when the height of the supporting parts of a single object is greater than or equal to a set height, the region enclosed by the supporting parts is identified as a cleaning dead zone; and when the area enclosed by the supporting parts of a single object is less than or equal to a set area, the region enclosed by the supporting parts is identified as a cleaning dead zone.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings here are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and serve, together with the specification, to explain the principles of the disclosure.
Fig. 1 is a flowchart of an indoor cleaning processing method for a cleaning robot according to an exemplary embodiment of the present disclosure.
Fig. 2 is a flowchart of another indoor cleaning processing method for a cleaning robot according to an exemplary embodiment of the present disclosure.
Fig. 3 is a flowchart of another indoor cleaning processing method for a cleaning robot according to an exemplary embodiment of the present disclosure.
Fig. 4 is a schematic diagram of identifying a cleaning dead zone according to an exemplary embodiment of the present disclosure.
Fig. 5 is another schematic diagram of identifying a cleaning dead zone according to an exemplary embodiment of the present disclosure.
Fig. 6 is a block diagram of a cleaning robot according to an exemplary embodiment of the present disclosure.
Fig. 7 is a block diagram of another cleaning robot according to an exemplary embodiment of the present disclosure.
Fig. 8 is a block diagram of another cleaning robot according to an exemplary embodiment of the present disclosure.
Fig. 9 is a block diagram of a mobile terminal according to an exemplary embodiment of the present disclosure.
Fig. 10 is a structural block diagram of a device according to an exemplary embodiment of the present disclosure.
Detailed description of the invention
Exemplary embodiments will be described in detail here, with examples illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the disclosure as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the disclosure. The singular forms "a", "said", and "the" used in the disclosure and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present disclosure to describe various pieces of information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, the first information could also be called the second information, and similarly, the second information could be called the first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
The present disclosure provides an indoor cleaning processing method for a cleaning robot that can identify indoor cleaning dead zones and have them cleaned.
Fig. 1 is a flowchart of an indoor cleaning processing method for a cleaning robot according to an exemplary embodiment of the present disclosure.
As shown in Fig. 1, the method may be performed by a cleaning robot and may include the following steps.
In step 101, a two-dimensional indoor map drawn by the cleaning robot itself is acquired.
In step 102, stereo data of indoor objects detected by the cleaning robot itself is acquired.
The stereo data of the indoor objects is detected by the cleaning robot using at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera.
In step 103, a three-dimensional indoor map is generated from the two-dimensional indoor map and the stereo data of the indoor objects, and cleaning dead zones are identified and marked in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions.
In this step, cleaning dead zones may be identified and marked in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object.
Identifying and marking the cleaning dead zones in this step may include:
when the adjacent distance between different objects in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance, identifying the adjacent area between the different objects as a cleaning dead zone; or
when the height of the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height, identifying the region enclosed by the supporting parts as a cleaning dead zone; or
when the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area, identifying the region enclosed by the supporting parts as a cleaning dead zone; and
marking the identified cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map.
In step 104, the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead zones are sent to the mobile terminal, so that the cleaning dead zones can be cleaned according to an instruction received from the mobile terminal.
It should be noted that the method may further include: when the distance between two cleaning dead zones identified in the two-dimensional indoor map and the three-dimensional indoor map is less than or equal to a set threshold, merging the two cleaning dead zones into a single cleaning dead zone mark in the two-dimensional indoor map and/or the three-dimensional indoor map.
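The merging rule just noted, where two marked dead zones closer than a set threshold collapse into one mark, can be sketched as follows. The greedy averaging strategy and the threshold value are illustrative assumptions; the patent states only the merge condition, not the algorithm.

```python
def merge_dead_zones(zones, set_threshold=0.3):
    """Merge dead-zone marks whose centers lie within the set threshold.

    `zones` is a list of (x, y) mark centers on the indoor map. Each
    incoming mark is folded into the first existing merged mark within
    the threshold (averaging positions), otherwise it starts a new mark.
    """
    merged = []  # entries are (sum_x, sum_y, count)
    for x, y in zones:
        for i, (sx, sy, n) in enumerate(merged):
            # compare against the running mean of the merged mark
            if ((sx / n - x) ** 2 + (sy / n - y) ** 2) ** 0.5 <= set_threshold:
                merged[i] = (sx + x, sy + y, n + 1)  # fold into existing mark
                break
        else:
            merged.append((x, y, 1))  # start a new mark
    return [(sx / n, sy / n) for sx, sy, n in merged]
```

Two marks at (0, 0) and (0.2, 0) would merge into a single mark at their midpoint, while a mark several meters away stays separate.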
It can be seen from the above scheme that, after acquiring the two-dimensional indoor map it has drawn, the cleaning robot of the present disclosure also acquires stereo data of the indoor objects it has detected, generates a three-dimensional indoor map from the two-dimensional indoor map and the stereo data, and identifies and marks cleaning dead zones in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions. The layout of the room, such as the placement of beds, tables, and chairs, can thus be judged more intuitively and accurately from the three-dimensional indoor map, which benefits the robot's cleaning; cleaning dead zones can further be identified, so that every indoor region can be cleaned without omission, improving the indoor cleaning effect.
Fig. 2 is a flowchart of another indoor cleaning processing method for a cleaning robot according to an exemplary embodiment of the present disclosure.
As shown in Fig. 2, the method may be performed by a mobile terminal and includes the following steps.
In step 201, a two-dimensional indoor map and a three-dimensional indoor map marked with cleaning dead zones and sent by a cleaning robot are received, wherein the three-dimensional indoor map is generated by the cleaning robot from the two-dimensional indoor map drawn by the robot itself and stereo data of indoor objects detected by the robot itself, and the cleaning dead zones are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to predetermined conditions.
In step 202, an instruction to clean the cleaning dead zones is sent to the cleaning robot, so that the cleaning robot cleans the cleaning dead zones according to the received instruction.
It should be noted that, after receiving the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead zones, the method may further include:
switching between and zooming the received three-dimensional indoor map and two-dimensional indoor map on the mobile terminal screen through gesture operations.
After the mobile terminal receives the three-dimensional indoor map, zooming of the indoor map (three-dimensional or two-dimensional) or switching between the three-dimensional and two-dimensional indoor maps can be performed through gesture operations; for instance, a two-finger upward swipe switches the two-dimensional indoor map to the three-dimensional indoor map, and spreading or pinching two fingers zooms the map (three-dimensional or two-dimensional) in or out.
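The gesture handling described above can be pictured as a small view-state machine. This is a hypothetical sketch: the text describes only a two-finger upward swipe (2D to 3D) and pinch zoom, so the downward-swipe gesture, the gesture names, the zoom factor, and the zoom limits are illustrative assumptions; a real app would hook these handlers into the platform's touch and gesture APIs.

```python
class MapView:
    """Toy model of the mobile terminal's indoor-map screen."""

    def __init__(self):
        self.mode = "2d"  # currently displayed map: "2d" or "3d"
        self.zoom = 1.0   # current zoom factor

    def on_gesture(self, gesture):
        if gesture == "two_finger_swipe_up":      # 2D map -> 3D map
            self.mode = "3d"
        elif gesture == "two_finger_swipe_down":  # 3D map -> 2D map (assumed)
            self.mode = "2d"
        elif gesture == "pinch_out":              # spread two fingers: zoom in
            self.zoom = min(self.zoom * 1.25, 8.0)
        elif gesture == "pinch_in":               # pinch two fingers: zoom out
            self.zoom = max(self.zoom / 1.25, 0.25)
        return self.mode, self.zoom
```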
Fig. 3 is the disclosure another kind of clean robot indoor cleaning process flow figure according to an exemplary embodiment.
This embodiment describe in more detail the technical scheme of the disclosure relative to Fig. 1 and Fig. 2.As it is shown on figure 3, the method may comprise steps of:
In step 301, clean robot draws out the two-dimentional indoor map for cleaning when being cleaned work.
The clean robot of the disclosure, it is possible to include following assembly: touch sensor, gyroscope, range sensor, airborne laser range finder, shooting are first-class.
Wherein, touch sensor is for detecting whether have barrier in clean robot course;Gyroscope is for providing clean robot direction of rotation and angle information in real time;Range sensor and airborne laser range finder are for detecting the distance of clean robot and object;Photographic head is used for shooting and identifying object.By touch sensor, gyroscope, range sensor, airborne laser range finder, the first-class obstruction thing (such as wall, desk etc.) that may determine that in cleaning region of shooting.
In this step, the cleaning robot can draw its search movement track starting from a reference point according to the distance information and angle information collected during its search movement, and generate the two-dimensional indoor map according to the search movement track and the boundary information of each region. This generation process can be implemented with existing techniques, which the disclosure does not limit.
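A minimal sketch of this map-drawing step, assuming simple dead reckoning from the gyroscope heading and the traveled distance (the disclosure leaves the concrete mapping method to existing techniques, so the data layout and cell size here are illustrative):

```python
import math

def trace_path(start, moves):
    """Integrate (heading_deg, distance) pairs from a reference point
    into a search movement track of (x, y) positions."""
    x, y = start
    track = [(x, y)]
    for heading_deg, dist in moves:
        rad = math.radians(heading_deg)
        x += dist * math.cos(rad)
        y += dist * math.sin(rad)
        track.append((round(x, 6), round(y, 6)))
    return track

def to_grid(track, cell=0.1):
    """Mark every visited position as a free cell in a coarse 2D grid map."""
    return {(int(round(x / cell)), int(round(y / cell))) for x, y in track}
```

In practice the track would be fused with boundary detections from the touch sensor and rangefinders before the map is finalized; this sketch only shows the trajectory-to-map step.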
In step 302, the cleaning robot detects the stereo data of indoor objects while performing cleaning work.
While cleaning, the robot detects objects and records their stereo data, including data such as length and height, or ground inclination data (which can be detected by the gyroscope). These stereo data can be obtained through the robot's own camera, laser rangefinder, distance sensor, gyroscope, touch sensor, and so on.
In step 303, the cleaning robot obtains the two-dimensional indoor map it has drawn and the stereo data of the detected indoor objects, generates a three-dimensional indoor map according to the two-dimensional indoor map and the stereo data, and identifies and marks cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions.
That is, in this step a three-dimensional model can be generated from the two-dimensional indoor map according to the stereo data of the detected indoor objects, yielding the three-dimensional indoor map. It should be noted that this three-dimensional model can be a simple one; it does not need to be very fine, as long as information such as the approximate area and height of each object can be expressed. In other words, generating the three-dimensional indoor map from the two-dimensional one is, in its visual representation, mainly a matter of the cleaning robot recording data on the objects detected while cleaning and drawing three-dimensional views such as each object's height, width, and angle from the stereo data. Based on the two-dimensional and three-dimensional indoor maps of the disclosure, higher-level cleaning services such as precise cleaning can be realized.
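As a sketch of such a coarse model: assuming each detected object is reduced to an axis-aligned 2D footprint plus a measured height (all field names are illustrative assumptions; the disclosure only requires that approximate area and height be expressible), the 3D map is essentially the 2D footprints extruded by height:

```python
def make_3d_model(map_2d, stereo):
    """Attach a height from the stereo data to each 2D footprint,
    producing a crude box model per object.

    map_2d: {name: (x, y, width, depth)} footprints from the 2D map.
    stereo: {name: {"height": h, ...}} detected stereo data.
    """
    model = []
    for name, (x, y, w, d) in map_2d.items():
        h = stereo.get(name, {}).get("height", 0.0)  # unknown height -> flat box
        model.append({"name": name, "x": x, "y": y, "w": w, "d": d, "h": h})
    return model
```

A box model like this is deliberately "naive" in the sense of the paragraph above: it preserves the area and height needed for dead-corner reasoning without modeling fine object geometry.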
In this step, cleaning dead corners can be identified and marked in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object in those maps.
The cleaning robot of the disclosure can thereby identify the cleaning dead corners of a room area. Dead corners can be identified from the two-dimensional indoor map or the three-dimensional indoor map separately, or by combining the two maps.
In addition, when the distance between two dead corners identified respectively in the two-dimensional indoor map and the three-dimensional indoor map is less than or equal to a set threshold, the two dead corners can be merged and recorded as one. For example, dead corners can be matched by their recorded positions in the two maps: if dead corner A exists in the two-dimensional indoor map and dead corner B exists in the three-dimensional indoor map, and the distance between A and B is within a certain set threshold, it can be determined that they should be merged into a single dead corner.
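A sketch of the merge step described above, under stated assumptions: dead corners are represented as (x, y) points, the merged position is the midpoint (the disclosure specifies only the distance threshold, not the merge rule), and unmatched corners from either map are kept:

```python
import math

def merge_dead_corners(corners_2d, corners_3d, threshold=0.5):
    """Merge a dead corner found in the 2D map with one found in the 3D
    map when they lie within the set threshold; keep the rest as-is.
    The 0.5 m default threshold is an illustrative assumption."""
    merged, used = [], set()
    for a in corners_2d:
        partner = None
        for i, b in enumerate(corners_3d):
            if i not in used and math.dist(a, b) <= threshold:
                partner = i
                break
        if partner is None:
            merged.append(a)                       # only seen in the 2D map
        else:
            b = corners_3d[partner]
            used.add(partner)
            merged.append(((a[0] + b[0]) / 2,      # record the pair once,
                           (a[1] + b[1]) / 2))     # at the midpoint
    merged.extend(b for i, b in enumerate(corners_3d) if i not in used)
    return merged
```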
The identification and marking of cleaning dead corners in the above maps (the two-dimensional indoor map and/or the three-dimensional indoor map) according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object may include:
when the adjacent distance between different objects in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance, identifying the adjacent area between the objects as a dead corner, where the set distance may be, but is not limited to, 1 meter; or,
when the height of the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height, identifying the region formed by the supporting parts as a dead corner, where the set height may be, but is not limited to, 20 centimeters; or,
when the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area, identifying the region formed by the supporting parts as a dead corner, where the set area may be, but is not limited to, 1 square meter;
marking the identified dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map.
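The three preset conditions above can be sketched as a single classifier. The data layout (`objects` with per-object support height/area, `pairs` with pairwise adjacent distances) and the returned labels are illustrative assumptions; the thresholds are the example values given in the disclosure (1 m, 20 cm, 1 m²):

```python
def find_dead_corners(objects, pairs,
                      set_distance=1.0, set_height=0.20, set_area=1.0):
    """Apply the three preset dead-corner conditions.

    objects: {name: {"support_height": m, "support_area": m2}}
    pairs:   [(name_a, name_b, adjacent_distance_m), ...]
    """
    dead = set()
    # Condition 1: narrow gap between two different objects
    for a, b, dist in pairs:
        if dist <= set_distance:
            dead.add(("between", a, b))
    for name, obj in objects.items():
        # Condition 2: supporting parts too high for the robot to cross
        if obj["support_height"] >= set_height:
            dead.add(("under", name))
        # Condition 3: area enclosed by the supporting parts too small to enter
        if obj["support_area"] <= set_area:
            dead.add(("under", name))
    return dead
```

Note the conditions are alternatives ("or") in the claims, so any one of them suffices to mark a region.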
For example, from the distances between object edges in the two-dimensional or three-dimensional indoor map, if the adjacent distance between the edges of two tables, or between a table and a stool, is less than or equal to 1 meter, the narrow region between the table and chair can be identified as a dead corner and marked. See Fig. 4, a schematic diagram of identifying a dead corner according to an exemplary embodiment of the disclosure.
For another example, when the area of the region enclosed by the four legs of an indoor table or stool is less than or equal to 1 square meter, that region is identified as a dead corner. See Fig. 5, another schematic diagram of identifying a dead corner according to an exemplary embodiment of the disclosure.
While mapping, the cleaning robot can determine through the touch sensor and similar components that four identically sized legs belong to the same object, such as the same table; it can also determine the positions of and distances between the four legs through laser ranging or imaging, so that the area they enclose can be calculated and compared with the set area to decide whether the table or chair forms a narrow-region dead corner. That is, by recording the positional relationship between the legs, for example four legs A, B, C, and D whose pairwise distances are equal (A to B, B to C, C to D, D to A) and whose cross-sectional sizes and heights are consistent, it can be determined that they are the four identically sized legs of one table. Whether an object is a table can also be computed from images taken by the camera on the cleaning robot.
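The grouping heuristic just described can be sketched as follows. The tolerance value and the point representation are assumptions; the ring-distance and height-consistency checks, and the enclosed-area comparison, follow the text above (the area uses the shoelace formula over the four legs in ring order):

```python
import math

def is_same_table_legs(legs, tol=0.05):
    """Check whether four detected pillars plausibly belong to one table:
    ring distances A-B, B-C, C-D, D-A equal within tol, heights consistent.
    legs: [(x, y, height), ...] in ring order."""
    if len(legs) != 4:
        return False
    pts = [(x, y) for x, y, _ in legs]
    ring = [math.dist(pts[i], pts[(i + 1) % 4]) for i in range(4)]
    heights = [h for _, _, h in legs]
    return (max(ring) - min(ring) <= tol and
            max(heights) - min(heights) <= tol)

def enclosed_area(legs):
    """Shoelace area of the quadrilateral formed by the four legs,
    for comparison against the set area (e.g. 1 square meter)."""
    pts = [(x, y) for x, y, _ in legs]
    s = 0.0
    for i in range(4):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % 4]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2
```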
For another example, an indoor table or stool has four supporting foot rails that the cleaning robot may not be able to cross. When the height of the rails is greater than or equal to a set height, such as 20 centimeters, the cleaning robot cannot cross them, so the region enclosed by the four rails is identified as a dead corner.
In step 304, the cleaning robot sends the two-dimensional indoor map and the three-dimensional indoor map marked with dead corners to the mobile terminal.
In this step, the cleaning robot sends the marked two-dimensional and three-dimensional indoor maps to the mobile terminal connected to it. The mobile terminal can connect to the cleaning robot through WiFi, Bluetooth, infrared, and other means.
In step 305, the mobile terminal switches between and zooms the received three-dimensional and two-dimensional indoor maps on its screen through gesture operations, and sends an instruction to clean the dead corners to the cleaning robot according to the dead corners marked in the two maps.
As described above, after the mobile terminal receives the three-dimensional indoor map, zooming of an indoor map (three-dimensional or two-dimensional) or switching between the three-dimensional and two-dimensional indoor maps can be performed through gesture operations: for example, a two-finger upward swipe converts the two-dimensional indoor map into the three-dimensional one, and a two-finger outward or inward pinch zooms the map in or out.
After the mobile terminal is connected to the cleaning robot, it can send a cleaning instruction according to the dead corners on the received maps, requesting the cleaning robot to give the identified dead corners focused cleaning, repeated cleaning, and so on.
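A sketch of what such a cleaning instruction could look like on the wire. The message format, field names, "focused" mode, and pass count are all hypothetical illustrations; the disclosure describes only that an instruction targeting the marked dead corners is sent, not a protocol:

```python
import json

def make_clean_command(dead_corners, passes=2):
    """Serialize a hypothetical dead-corner cleaning instruction
    for transmission to the robot (e.g. over WiFi or Bluetooth).

    dead_corners: [(x, y), ...] positions of marked dead corners.
    passes: how many times each dead corner should be swept.
    """
    return json.dumps({
        "command": "clean_dead_corners",
        "mode": "focused",
        "passes": passes,
        "targets": [{"x": x, "y": y} for x, y in dead_corners],
    })
```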
In step 306, the cleaning robot cleans the dead corners according to the instruction received from the mobile terminal.
In this step, the cleaning robot gives the dead corners focused cleaning according to the received instruction and with reference to the three-dimensional indoor map.
It can be seen that the cleaning robot of the disclosure converts the two-dimensional indoor map into a three-dimensional indoor map, so the layout of the room, for example the placement of beds, tables, and chairs, can be judged more accurately. It identifies and marks cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions, so the identified dead corners can be given focused cleaning, every indoor region can be cleaned, omissions are avoided, and the indoor cleaning effect is improved.
Corresponding to the foregoing method embodiments, the disclosure also provides a cleaning robot, a mobile terminal, and corresponding embodiments.
Fig. 6 is a block diagram of a cleaning robot according to an exemplary embodiment of the disclosure.
As shown in Fig. 6, a cleaning robot may include: a first acquisition module 61, a second acquisition module 62, a conversion and marking module 63, and a sending module 64.
The first acquisition module 61 is configured to obtain the two-dimensional indoor map drawn by the cleaning robot itself.
The second acquisition module 62 is configured to obtain the stereo data of the indoor objects detected by the cleaning robot itself. The cleaning robot can detect the stereo data of indoor objects through at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera.
The conversion and marking module 63 is configured to generate the three-dimensional indoor map according to the two-dimensional indoor map obtained by the first acquisition module 61 and the stereo data obtained by the second acquisition module 62, and to identify and mark cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions.
The sending module 64 is configured to send the two-dimensional and three-dimensional indoor maps marked with dead corners to the mobile terminal, so that the dead corners can be cleaned according to the instruction received from the mobile terminal.
It can be seen that after the cleaning robot of the disclosure obtains the two-dimensional indoor map it has drawn, it also obtains the stereo data of the indoor objects it has detected, generates a three-dimensional indoor map from the two, and identifies and marks cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions. The room layout, for example the placement of beds, tables, and chairs, can thus be judged more intuitively and accurately from the three-dimensional indoor map, which benefits the robot's cleaning; the cleaning dead corners can also be identified, so that every indoor region can be cleaned, omissions are avoided, and the indoor cleaning effect is improved.
Fig. 7 is a block diagram of another cleaning robot according to an exemplary embodiment of the disclosure.
As shown in Fig. 7, a cleaning robot may include: a first acquisition module 61, a second acquisition module 62, a conversion and marking module 63, a sending module 64, and a detection module 65.
For the functions of the first acquisition module 61, the second acquisition module 62, the conversion and marking module 63, and the sending module 64, see Fig. 6.
The detection module 65 includes at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera. The detection module 65 detects the stereo data of indoor objects through the components it comprises, and the second acquisition module 62 obtains the stereo data detected by the detection module 65.
The conversion and marking module 63 may include: a conversion submodule 631, an identification submodule 632, and a marking submodule 633.
The conversion submodule 631 is configured to generate the three-dimensional indoor map according to the two-dimensional indoor map obtained by the first acquisition module 61 and the stereo data of the indoor objects obtained by the second acquisition module 62.
The identification submodule 632 is configured to identify cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object.
The marking submodule 633 is configured to mark, in the two-dimensional indoor map and/or the three-dimensional indoor map, the dead corners identified by the identification submodule 632.
The identification submodule 632 may further include: a first identification submodule 6331, a second identification submodule 6332, or a third identification submodule 6333.
The first identification submodule 6331 is configured to identify the adjacent area between different objects as a dead corner when their adjacent distance in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance, where the set distance may be, but is not limited to, 1 meter.
The second identification submodule 6332 is configured to identify the region formed by the supporting parts of a single object as a dead corner when their height in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height, where the set height may be, but is not limited to, 20 centimeters.
The third identification submodule 6333 is configured to identify the region formed by the supporting parts of a single object as a dead corner when the area they enclose in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area, where the set area may be, but is not limited to, 1 square meter.
For the implementation of the functions and effects of the units in the above device, refer to the implementation of the corresponding steps in the above method; details are not repeated here.
Since the device embodiments essentially correspond to the method embodiments, refer to the relevant descriptions of the method embodiments. The device embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected as actually needed to achieve the purpose of the disclosure. Those of ordinary skill in the art can understand and implement this without creative effort.
Fig. 8 is a block diagram of another cleaning robot according to an exemplary embodiment of the disclosure.
As shown in Fig. 8, a cleaning robot includes: a processor 801 and a memory 802 for storing instructions executable by the processor;
wherein the processor 801 is configured to:
obtain the two-dimensional indoor map drawn by the cleaning robot itself;
obtain the stereo data of the indoor objects detected by the cleaning robot itself;
generate a three-dimensional indoor map according to the two-dimensional indoor map and the stereo data of the indoor objects, and identify and mark cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
send the two-dimensional and three-dimensional indoor maps marked with dead corners to a mobile terminal, so that the dead corners can be cleaned according to the instruction received from the mobile terminal.
It should be noted that the memory 802 also stores other programs; for details, refer to the descriptions in the preceding method flows, which are not repeated here. The processor 801 is further configured to execute the other programs stored in the memory 802.
Fig. 9 is a block diagram of a mobile terminal according to an exemplary embodiment of the disclosure.
As shown in Fig. 9, a mobile terminal includes: a processor 91 and a memory 92 for storing instructions executable by the processor;
wherein the processor 91 is configured to:
receive the two-dimensional indoor map and three-dimensional indoor map marked with cleaning dead corners sent by the cleaning robot, where the three-dimensional indoor map is generated by the cleaning robot according to the two-dimensional indoor map it has drawn and the stereo data of the indoor objects it has detected, and the cleaning dead corners are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
send an instruction to clean the dead corners to the cleaning robot, so that the cleaning robot cleans the dead corners according to the received instruction.
It should be noted that the memory 92 also stores other programs; for details, refer to the descriptions in the preceding method flows, which are not repeated here. The processor 91 is further configured to execute the other programs stored in the memory 92.
Figure 10 is a structural block diagram of a device according to an exemplary embodiment of the disclosure.
For example, the device 900 may be a terminal device, a server, or the like.
Referring to Figure 10, the device 900 may include one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
The processing component 902 generally controls the overall operation of the device 900. The processing component 902 may include one or more processors 920 to execute instructions so as to complete all or part of the steps of the above method. In addition, the processing component 902 may include one or more modules to facilitate interaction between the processing component 902 and other components. For example, the processing component 902 may include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support operation at the device 900. Examples of such data include instructions for any application or method operated on the device 900, contact data, phone book data, messages, pictures, videos, and so on. The memory 904 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disc.
The power component 906 provides power to the various components of the device 900. The power component 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 900.
The multimedia component 908 includes a screen providing an output interface between the device 900 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action but also detect the duration and pressure related to the touch or slide operation. In some embodiments, the multimedia component 908 includes a front camera and/or a rear camera. When the device 900 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, the audio component 910 includes a microphone (MIC) configured to receive external audio signals when the device 900 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 904 or sent via the communication component 916. In some embodiments, the audio component 910 also includes a speaker for outputting audio signals.
The I/O interface 912 provides an interface between the processing component 902 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and so on. The buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessments of various aspects of the device 900. For example, the sensor component 914 can detect the open/closed state of the device 900 and the relative positioning of components, for example the display and keypad of the device 900; it can also detect a change in position of the device 900 or one of its components, the presence or absence of user contact with the device 900, the orientation or acceleration/deceleration of the device 900, and a change in its temperature. The sensor component 914 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate wired or wireless communication between the device 900 and other devices. The device 900 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 900 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, for example the memory 904 including instructions, where the instructions can be executed by the processor 920 of the device 900 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on.
A non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by a processor of a terminal device, the terminal device is enabled to perform a cleaning robot indoor cleaning processing method, the method including:
obtaining the two-dimensional indoor map drawn by the cleaning robot itself;
obtaining the stereo data of the indoor objects detected by the cleaning robot itself;
generating a three-dimensional indoor map according to the two-dimensional indoor map and the stereo data of the indoor objects, and identifying and marking cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
sending the two-dimensional and three-dimensional indoor maps marked with dead corners to a mobile terminal, so that the dead corners can be cleaned according to the instruction received from the mobile terminal.
A non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by a processor of a terminal device, the terminal device is enabled to perform a cleaning robot indoor cleaning processing method, the method including:
receiving the two-dimensional indoor map and three-dimensional indoor map marked with cleaning dead corners sent by the cleaning robot, where the three-dimensional indoor map is generated by the cleaning robot according to the two-dimensional indoor map drawn by itself and the stereo data of the indoor objects detected by itself, and the cleaning dead corners are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
sending an instruction to clean the dead corners to the cleaning robot, so that the cleaning robot cleans the dead corners according to the received instruction.
Those skilled in the art will readily conceive of other embodiments of the disclosure after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles and include common general knowledge or conventional technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure indicated by the following claims.
It should be understood that the disclosure is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the disclosure is limited only by the appended claims.

Claims (13)

1. A cleaning robot indoor cleaning processing method, characterized by comprising:
obtaining a two-dimensional indoor map drawn by a cleaning robot itself;
obtaining stereo data of indoor objects detected by the cleaning robot itself;
generating a three-dimensional indoor map according to the two-dimensional indoor map and the stereo data of the indoor objects, and identifying and marking cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
sending the two-dimensional indoor map and the three-dimensional indoor map marked with the dead corners to a mobile terminal, so that the dead corners can be cleaned according to an instruction received from the mobile terminal.
2. The method according to claim 1, characterized in that:
the stereo data of the indoor objects detected by the cleaning robot is detected by the cleaning robot through at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera.
3. The method according to claim 1 or 2, characterized in that identifying and marking the cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to the preset conditions comprises:
identifying and marking the cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the supporting parts of a single object, or the area enclosed by the supporting parts of a single object in the two-dimensional indoor map and/or the three-dimensional indoor map.
4. The method according to claim 3, characterized in that the identifying and marking of cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the support components of a same object, or the area formed by the support components of a same object comprises:
when the adjacent distance between different objects in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance, identifying the adjacent area between the different objects as a cleaning dead corner; or,
when the height of the support components of a same object in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height, identifying the region formed by the support components as a cleaning dead corner; or,
when the area formed by the support components of a same object in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area, identifying the region formed by the support components as a cleaning dead corner; and
marking the identified cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map.
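The three threshold conditions above (adjacent distance between objects, support-component height, support-component footprint area) could be sketched as a single predicate. The threshold values, parameter names, and units below are assumptions for illustration only; the patent does not fix concrete numbers.

```python
# Illustrative sketch of the three dead-corner conditions of claim 4.
# Threshold values and data shapes are assumptions, not from the patent.

SET_DISTANCE = 10   # cm: max gap between two different objects
SET_HEIGHT = 8      # cm: min height of one object's support components
SET_AREA = 100      # cm^2: max footprint formed by the support components

def is_dead_corner(gap_between_objects=None, support_height=None, support_area=None):
    """Return True if any of the claim-4 conditions marks a dead corner."""
    if gap_between_objects is not None and gap_between_objects <= SET_DISTANCE:
        return True   # narrow gap between two different objects
    if support_height is not None and support_height >= SET_HEIGHT:
        return True   # region formed by tall support components (e.g. table legs)
    if support_area is not None and support_area <= SET_AREA:
        return True   # small area enclosed by an object's support components
    return False

print(is_dead_corner(gap_between_objects=6))   # narrow gap → True
print(is_dead_corner(support_height=3))        # below set height → False
```

Each condition is an "or" branch, matching the claim's alternatives; a region satisfying any one of them would be marked in the map.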
5. The method according to claim 3, characterized in that the method further comprises:
when the distance between two cleaning dead corners marked in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set threshold, merging the two cleaning dead corners into one cleaning dead corner mark in the two-dimensional indoor map and/or the three-dimensional indoor map.
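The merging step of claim 5 could be sketched as a greedy pass over dead-corner marks: any mark closer than the set threshold to an already-kept mark is folded into it. The threshold value, point format, and averaging rule are assumptions; the patent only requires that nearby marks become one mark.

```python
# Illustrative sketch of claim 5: merge dead-corner marks whose centers
# lie within a set threshold of each other. Details are assumed.
import math

SET_THRESHOLD = 20  # cm, assumed

def merge_dead_corners(corners, threshold=SET_THRESHOLD):
    """corners: list of (x, y) mark centers. Greedily merges marks closer
    than threshold into a single averaged mark."""
    merged = []
    for x, y in corners:
        for i, (mx, my) in enumerate(merged):
            if math.hypot(x - mx, y - my) <= threshold:
                merged[i] = ((mx + x) / 2, (my + y) / 2)  # fold into one mark
                break
        else:
            merged.append((x, y))
    return merged

marks = merge_dead_corners([(0, 0), (10, 0), (100, 100)])
print(len(marks))  # the two nearby marks merge, so 2 marks remain
```

Merging keeps the map readable on a phone screen: the user sees one mark per cluttered spot instead of several overlapping ones.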
6. An indoor cleaning processing method for a cleaning robot, characterized by comprising:
receiving a two-dimensional indoor map and a three-dimensional indoor map marked with cleaning dead corners sent by a cleaning robot, wherein the three-dimensional indoor map is generated by the cleaning robot according to a two-dimensional indoor map drawn by the robot itself and stereo data of indoor objects detected by the robot itself, and the cleaning dead corners are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
sending an instruction to the cleaning robot to clean the cleaning dead corners, so that the cleaning robot cleans the cleaning dead corners according to the received instruction.
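The terminal-side exchange of claim 6 (receive marked maps, send back a clean instruction) could be sketched with an in-memory stub. The transport, message fields, and class names are all assumptions; a real implementation would use an app-specific network protocol between phone and robot.

```python
# Illustrative stub of the claim-6 exchange between mobile terminal and
# robot. Message format and field names are assumed, not from the patent.

class CleaningRobotStub:
    def __init__(self, dead_corners):
        self.dead_corners = list(dead_corners)
        self.cleaned = []

    def send_maps(self):
        # robot -> terminal: maps with the marked dead corners
        return {"map_2d": "<2d map>", "map_3d": "<3d map>",
                "dead_corners": self.dead_corners}

    def handle_instruction(self, instruction):
        # terminal -> robot: clean a selected dead corner
        if instruction["action"] == "clean":
            self.cleaned.append(instruction["target"])

robot = CleaningRobotStub(dead_corners=[(1, 2), (5, 5)])
maps = robot.send_maps()                      # terminal receives marked maps
robot.handle_instruction({"action": "clean",  # user picks a dead corner
                          "target": maps["dead_corners"][0]})
print(robot.cleaned)
```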
7. The method according to claim 6, characterized in that, after the receiving of the two-dimensional indoor map and the three-dimensional indoor map marked with cleaning dead corners sent by the cleaning robot, the method further comprises:
switching between, and zooming of, the received three-dimensional indoor map and two-dimensional indoor map on the screen of the mobile terminal through gesture operations.
8. A cleaning robot, characterized by comprising:
a first acquisition module, configured to acquire a two-dimensional indoor map drawn by the cleaning robot itself;
a second acquisition module, configured to acquire stereo data of indoor objects detected by the cleaning robot itself;
a conversion and marking module, configured to generate a three-dimensional indoor map according to the two-dimensional indoor map acquired by the first acquisition module and the stereo data of the indoor objects acquired by the second acquisition module, and to identify and mark cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
a sending module, configured to send the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead corners to a mobile terminal, so that the cleaning dead corners can be cleaned according to an instruction received from the mobile terminal.
9. The cleaning robot according to claim 8, characterized in that the cleaning robot further comprises:
a detection module, comprising at least one of the following components: a touch sensor, a gyroscope, a distance sensor, a laser rangefinder, and a camera,
wherein the detection module detects the stereo data of indoor objects through the components it comprises, and the second acquisition module acquires the stereo data of the indoor objects detected by the detection module.
10. The cleaning robot according to claim 8, characterized in that the conversion and marking module comprises:
a conversion submodule, configured to generate a three-dimensional indoor map according to the two-dimensional indoor map acquired by the first acquisition module and the stereo data of the indoor objects acquired by the second acquisition module;
an identification submodule, configured to identify cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to the adjacent distance between different objects, the height of the support components of a same object, or the area formed by the support components of a same object in the two-dimensional indoor map and/or the three-dimensional indoor map;
a marking submodule, configured to mark, in the two-dimensional indoor map and/or the three-dimensional indoor map, the cleaning dead corners identified by the identification submodule.
11. The robot according to claim 10, characterized in that the identification submodule comprises:
a first identification submodule, configured to identify the adjacent area between different objects as a cleaning dead corner when the adjacent distance between the different objects in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set distance; or,
a second identification submodule, configured to identify the region formed by the support components as a cleaning dead corner when the height of the support components of a same object in the two-dimensional indoor map and/or the three-dimensional indoor map is greater than or equal to a set height; or,
a third identification submodule, configured to identify the region formed by the support components as a cleaning dead corner when the area formed by the support components of a same object in the two-dimensional indoor map and/or the three-dimensional indoor map is less than or equal to a set area.
12. A cleaning robot, characterized by comprising:
a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquire a two-dimensional indoor map drawn by the cleaning robot itself;
acquire stereo data of indoor objects detected by the cleaning robot itself;
generate a three-dimensional indoor map according to the two-dimensional indoor map and the stereo data of the indoor objects, and identify and mark cleaning dead corners in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
send the two-dimensional indoor map and the three-dimensional indoor map marked with the cleaning dead corners to a mobile terminal, so that the cleaning dead corners can be cleaned according to an instruction received from the mobile terminal.
13. A mobile terminal, characterized by comprising:
a processor and a memory for storing processor-executable instructions;
wherein the processor is configured to:
receive a two-dimensional indoor map and a three-dimensional indoor map marked with cleaning dead corners sent by a cleaning robot, wherein the three-dimensional indoor map is generated by the cleaning robot according to a two-dimensional indoor map drawn by the robot itself and stereo data of indoor objects detected by the robot itself, and the cleaning dead corners are identified and marked by the cleaning robot in the two-dimensional indoor map and/or the three-dimensional indoor map according to preset conditions;
send an instruction to the cleaning robot to clean the cleaning dead corners, so that the cleaning robot cleans the cleaning dead corners according to the received instruction.
CN201610202459.9A 2016-03-31 2016-03-31 Clean robot indoor cleaning processing method, clean robot and mobile terminal Active CN105796002B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610202459.9A CN105796002B (en) 2016-03-31 2016-03-31 Clean robot indoor cleaning processing method, clean robot and mobile terminal


Publications (2)

Publication Number Publication Date
CN105796002A true CN105796002A (en) 2016-07-27
CN105796002B CN105796002B (en) 2018-09-18

Family

ID=56459865

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610202459.9A Active CN105796002B (en) 2016-03-31 2016-03-31 Clean robot indoor cleaning processing method, clean robot and mobile terminal

Country Status (1)

Country Link
CN (1) CN105796002B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070106864A (en) * 2006-05-01 2007-11-06 주식회사 한울로보틱스 The control method of cleaning action for cleaning robot
CN102053623A (en) * 2009-11-10 2011-05-11 德国福维克控股公司 Method for controlling robot
US8428776B2 (en) * 2009-06-18 2013-04-23 Michael Todd Letsky Method for establishing a desired area of confinement for an autonomous robot and autonomous robot implementing a control system for executing the same
CN103236141A (en) * 2013-04-23 2013-08-07 宋学平 Multifunctional operating method of intelligent cleaning robot device
CN103439973A (en) * 2013-08-12 2013-12-11 桂林电子科技大学 Household cleaning robot capable of establishing map by self and cleaning method
CN104586322A (en) * 2013-10-31 2015-05-06 Lg电子株式会社 Moving robot and operating method
CN105074600A (en) * 2013-02-27 2015-11-18 夏普株式会社 Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106737709B (en) * 2016-11-25 2019-10-15 北京小米移动软件有限公司 Cleaning method and device
CN106737709A (en) * 2016-11-25 2017-05-31 北京小米移动软件有限公司 Cleaning method and device
CN106774338A (en) * 2017-01-11 2017-05-31 深圳市鑫益嘉科技股份有限公司 A kind of robot control method and device
CN107280590A (en) * 2017-08-03 2017-10-24 佛山市盈智轩科技有限公司 Family cleaning assembly and clean method
CN107280590B (en) * 2017-08-03 2020-10-27 安徽省行千里知识产权服务有限公司 Household cleaning device and cleaning method
CN107272708A (en) * 2017-08-03 2017-10-20 佛山市盈智轩科技有限公司 Home-use floor cleaning system and floor cleaning method
WO2019032684A1 (en) * 2017-08-08 2019-02-14 Diversey, Inc. Creation and loading of mapping data on autonomous robotic devices
US11487013B2 (en) 2017-08-08 2022-11-01 Diversey, Inc. Creation and loading of mapping data on autonomous robotic devices
CN107544495A (en) * 2017-08-17 2018-01-05 北京小米移动软件有限公司 cleaning method and device
US10571918B2 (en) 2017-08-17 2020-02-25 Beijing Xiaomi Mobile Software Co., Ltd. Cleaning method and device
CN108436921A (en) * 2018-06-14 2018-08-24 于小龙 A kind of sweeping robot intelligent control method
CN108436921B (en) * 2018-06-14 2020-04-24 广东盈科机器人产业有限公司 Intelligent control method for sweeping robot
CN110648038B (en) * 2018-06-27 2022-07-19 广达电脑股份有限公司 Task area allocation method and system for cleaning device and cleaning device
CN110648038A (en) * 2018-06-27 2020-01-03 广达电脑股份有限公司 Task area allocation method and system for cleaning device and cleaning device
CN112714684A (en) * 2018-09-20 2021-04-27 三星电子株式会社 Cleaning robot and method for performing task thereof
CN110561459A (en) * 2018-09-28 2019-12-13 成都家有为力机器人技术有限公司 cleaning robot with binocular camera and working method thereof
CN109965785A (en) * 2019-03-04 2019-07-05 深圳市银星智能科技股份有限公司 Mobile terminal with display and information processing method
CN109920424A (en) * 2019-04-03 2019-06-21 北京石头世纪科技股份有限公司 Robot voice control method and device, robot and medium
CN114096930A (en) * 2019-09-19 2022-02-25 苏州宝时得电动工具有限公司 Self-moving equipment and automatic moving and working method thereof
CN114096930B (en) * 2019-09-19 2023-11-07 苏州宝时得电动工具有限公司 Self-moving device and automatic moving and working method thereof
WO2022171090A1 (en) * 2021-02-10 2022-08-18 北京石头创新科技有限公司 Map display method and apparatus, medium, and electronic device
CN114610820A (en) * 2021-12-31 2022-06-10 北京石头创新科技有限公司 Optimization method and device for three-dimensional map display
CN114061424A (en) * 2022-01-17 2022-02-18 广东奥迪威传感科技股份有限公司 Collision positioning method of robot

Also Published As

Publication number Publication date
CN105796002B (en) 2018-09-18

Similar Documents

Publication Publication Date Title
CN105796002A (en) Indoor cleaning method for cleaning robot, cleaning robot and mobile terminal
CN107544495B (en) Cleaning method and device
EP3447599B1 (en) Timed cleaning method, device and storage medium
JP6718954B2 (en) Room layout drawing method and device
CN105279898A (en) Alarm method and device
CN107049155B (en) Control method and device of sweeping robot
CN105928504A (en) Barrier positioning method, device and system
CN108606740A (en) Control the method and device of cleaning equipment operation
CN105607499A (en) Equipment grouping method and apparatus
CN105222802A (en) navigation, navigation video generation method and device
CN106580193A (en) Intelligent floor sweeping method and device and floor sweeping robot
CN105631803A (en) Method and device for filter processing
CN105828201A (en) Video processing method and device
CN105491523A (en) Method and device for acquiring position information
CN105407433A (en) Method and device for controlling sound output equipment
CN105718056A (en) Gesture recognition method and device
CN105912258A (en) Method and device for operation processing
CN105549944A (en) Device display method and device
CN105513137A (en) Three dimensional model and scene creating method and apparatus based on mobile intelligent terminal
CN105739834A (en) Menu displaying method and device
CN112179352A (en) Space map construction method and device, movement control method and device, and medium
CN105763552A (en) Transmission method, device and system in remote control
CN105517144A (en) Equipment positioning method and device
CN105912202A (en) Application sharing method and device
CN103995844A (en) Information search method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant