CN108107886A - Travel control method and device for a sweeping robot, and sweeping robot - Google Patents
- Publication number
- CN108107886A CN108107886A CN201711230750.8A CN201711230750A CN108107886A CN 108107886 A CN108107886 A CN 108107886A CN 201711230750 A CN201711230750 A CN 201711230750A CN 108107886 A CN108107886 A CN 108107886A
- Authority
- CN
- China
- Prior art keywords
- sweeping robot
- spatial image
- travel route
- image information
- mentioned
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—... with means for defining a desired trajectory
- G05D1/0221—... with means for defining a desired trajectory involving a learning process
- G05D1/0231—... using optical position detecting means
- G05D1/0242—... using non-visible light signals, e.g. IR or UV signals
- G05D1/0246—... using a video camera in combination with image processing means
- G05D1/0253—... extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
- G05D1/0255—... using acoustic signals, e.g. ultrasonic signals
- G05D1/0276—... using signals provided by a source external to the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Acoustics & Sound (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
The invention discloses a travel control method and device for a sweeping robot, and a sweeping robot. The method includes: obtaining spatial image information of the space where the sweeping robot is located, where the spatial image information includes a target object to be cleaned; obtaining position information of the target object in the spatial image information; determining a first travel route of the sweeping robot according to the spatial image information and the position information, and controlling the sweeping robot to travel along the first travel route. The invention solves the technical problem in the related art that sweeping robots suffer from measurement blind areas or cleaning blind areas.
Description
Technical field
The present invention relates to the field of robots, and in particular to a travel control method and device for a sweeping robot, and a sweeping robot.
Background technology
A traditional sweeping robot measures the distance between itself and an obstacle by means of an infrared sensor or ultrasonic technology, so as to avoid the obstacle and plan its route. However, smaller objects to be cleaned are often not detected, which leaves many measurement blind areas or cleaning blind areas and degrades the cleaning effect.
For the above problem, no effective solution has currently been proposed.
The content of the invention
Embodiments of the present invention provide a travel control method and device for a sweeping robot, and a sweeping robot, so as to at least solve the technical problem in the related art that sweeping robots suffer from measurement blind areas or cleaning blind areas.
According to one aspect of the embodiments of the present invention, a travel control method for a sweeping robot is provided, including: obtaining spatial image information of the space where the sweeping robot is located, where the spatial image information includes a target object to be cleaned; obtaining position information of the target object in the spatial image information; determining a first travel route of the sweeping robot according to the spatial image information and the position information, and controlling the sweeping robot to travel along the first travel route.
Optionally, determining the first travel route of the sweeping robot according to the spatial image information and the position information includes: taking the spatial image information and the position information as the input of a preset model, and determining the first travel route corresponding to that spatial image information and position information, where the preset model is obtained by machine-learning training on multiple groups of data in a first database, and each group of data in the first database includes: spatial image information, position information of a target object, and the first travel route corresponding to that spatial image information and target-object position information.
Optionally, the target object includes a first class of objects and a second class of objects, where the first class includes target objects close to an obstacle in the spatial image information, and the second class consists of the target objects other than the first class.
Optionally, while controlling the sweeping robot to travel along the first travel route, the method further includes: detecting whether the first class of objects is present on the travel route; when the first class of objects is determined to be present, obtaining a second travel route that contains no first-class objects; determining the priorities of the first travel route and the second travel route; and selecting, from the first travel route and the second travel route, the route with the higher priority as the travel route of the sweeping robot.
Optionally, determining the priorities of the first travel route and the second travel route includes: obtaining the cleaning type of the first-class object, where the cleaning type reflects the difficulty grade of cleaning that object; when the difficulty grade corresponding to the first-class object exceeds a preset threshold, setting the priority of the first travel route below that of the second travel route.
Optionally, the larger the difficulty grade, the longer the time required to clean the first-class object corresponding to that grade.
Optionally, when the second travel route is selected as the travel route of the sweeping robot, the method further includes: issuing a warning when the distance between the sweeping robot and the obstacle is less than a preset threshold; and, after bypassing the obstacle, continuing to travel along the second travel route.
Optionally, obtaining the spatial image information of the space where the sweeping robot is located includes: obtaining multiple images collected by image collection devices arranged at multiple positions in the space, and combining the multiple images to obtain the spatial image information.
According to another aspect of the embodiments of the present invention, a travel control device for a sweeping robot is provided, including: a first acquisition module, for obtaining the spatial image information of the space where the sweeping robot is located, where the spatial image information includes a target object to be cleaned; a second acquisition module, for obtaining the position information of the target object in the spatial image information; and a control module, for determining the first travel route of the sweeping robot according to the spatial image information and the position information, and controlling the sweeping robot to travel along the first travel route.
According to another aspect of the embodiments of the present invention, a sweeping robot is provided, including: an image collection device, for obtaining the spatial image information of the space where the sweeping robot is located, where the spatial image information includes a target object to be cleaned; and a processor, for obtaining the position information of the target object in the spatial image information, determining the first travel route of the sweeping robot according to the spatial image information and the position information, and controlling the sweeping robot to travel along the first travel route.
According to another aspect of the embodiments of the present invention, a storage medium is provided. The storage medium includes a stored program, where, when the program runs, the device where the storage medium is located is controlled to perform the travel control method of the sweeping robot described above.
According to another aspect of the embodiments of the present invention, a processor is provided. The processor is configured to run a program, where the program, when running, performs the travel control method of the sweeping robot described above.
In the embodiments of the present invention, image recognition technology is employed: the first travel route of the sweeping robot is determined from the spatial image information of the space where the robot is located and the position information of the target object to be cleaned, and the robot is controlled to travel along that route. Cleaning blind areas and distance-measurement blind areas are thereby avoided, solving the technical problem in the related art that sweeping robots suffer from measurement blind areas or cleaning blind areas.
Description of the drawings
The accompanying drawings described herein are provided for a further understanding of the present invention and form a part of this application. The schematic embodiments of the present invention and their description are used to explain the invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a structural diagram of a sweeping robot according to an embodiment of the present invention;
Fig. 2 is a flow diagram of a travel control method of a sweeping robot according to an embodiment of the present invention;
Fig. 3 is a structural diagram of a travel control device of a sweeping robot according to an embodiment of the present invention.
Specific embodiment
In order to enable those skilled in the art to better understand the solution of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art, based on the embodiments of the present invention and without creative work, shall fall within the scope of protection of the present invention.
It should be noted that the terms "first", "second", etc. in the description, claims and drawings of this application are used to distinguish similar objects, not to describe a specific order or sequence. It should be understood that data so used may be interchanged where appropriate, so that the embodiments of the present invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "comprising" and "having" and any variants thereof are intended to cover non-exclusive inclusion: for example, a process, method, system, product or device containing a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed, or inherent to that process, method, product or device.
First, for ease of understanding the embodiments of the present invention, some of the terms and nouns involved in the present invention are explained below:
Pixel: the smallest unit that can be displayed on a computer screen, used to represent an image as a horizontal and vertical array of dots; the more pixels on the screen, the higher the resolution of the picture and the finer and more lifelike the image. Pixel value: the numerical value of a pixel.
Binarization: most pictures taken by a camera are color images, which carry a huge amount of information. For the purposes of analysis, the content of a picture can be divided simply into foreground and background: the color image is processed so that only foreground information and background information remain, with the foreground defined as black and the background as white. The result is a binary image.
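The binarization described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the grayscale weights are the standard BT.601 luminance weights, and the threshold of 128 is an assumed example value.

```python
import numpy as np

def binarize(rgb_image, threshold=128):
    """Reduce a color image to foreground/background as described above:
    foreground rendered black (0), background white (255).
    `threshold` is an assumed example value."""
    # Luminance-weighted grayscale conversion (ITU-R BT.601 weights).
    gray = (0.299 * rgb_image[..., 0]
            + 0.587 * rgb_image[..., 1]
            + 0.114 * rgb_image[..., 2])
    # Dark pixels become foreground (black), bright pixels background (white).
    return np.where(gray < threshold, 0, 255).astype(np.uint8)
```

A real system might instead use an adaptive threshold, since indoor lighting varies across the floor.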
CNN: a convolutional neural network describes operations on an input image and outputs a set of classes, or class probabilities, describing the picture content; that is, the input image is recognized and the probabilities of the objects in the image are output. Increasingly abstract concepts are built up through a series of convolutional layers: multiple neurons are created, with corresponding input and output layers, and the input nodes are continually connected through the neurons to reach an optimized result. A CNN generally includes convolutional layers and pooling (filter) layers. Forward propagation, the loss function, backward propagation and parameter updates form one learning cycle; for each training picture, the program repeats this cycle a fixed number of times to keep refining the learned result.
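The core operation of the convolutional layer mentioned above can be sketched directly. This is a bare "valid" 2-D convolution (technically cross-correlation, as CNN libraries implement it) plus the non-linearity applied between layers; it illustrates the mechanism only, not the patent's network.

```python
import numpy as np

def conv2d(image, kernel):
    """Minimal 'valid' 2-D convolution: slide the kernel over the image
    and take the weighted sum at each offset."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Element-wise non-linearity applied between convolutional layers."""
    return np.maximum(x, 0.0)
```

Stacking several `conv2d` + `relu` stages, with pooling in between, is what builds the "more abstract concepts" the paragraph refers to.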
Search by image: after an image is obtained, the results are ranked by deep learning, and the model's ranking loss function is trained with triplet data from user records (query picture, clicked picture, non-clicked picture), so as to obtain ranking results. After an image is input, the model detects the main subject automatically and then outputs the results for related objects in descending order of ranking score.
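The triplet-based ranking loss described above can be written compactly. This is a generic hinge-style formulation over embedding vectors, offered as an illustrative assumption; the margin value and the use of Euclidean distance are not specified by the source.

```python
import numpy as np

def triplet_ranking_loss(query, clicked, not_clicked, margin=1.0):
    """Hinge ranking loss over one (query, clicked, non-clicked) triplet
    of embedding vectors: the clicked image should sit closer to the query
    than the non-clicked one by at least `margin` (an assumed value)."""
    d_pos = np.linalg.norm(query - clicked)      # distance to clicked image
    d_neg = np.linalg.norm(query - not_clicked)  # distance to non-clicked image
    return max(0.0, margin + d_pos - d_neg)
```

Summing this loss over many recorded triplets and minimizing it is what teaches the model the ranking the users implied by their clicks.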
Transfer learning: in essence, image matching. Through transfer learning a model can be applied across fields; concretely, the vector representation X of a picture in the database is mapped by a linear transformation onto an image X1 in another field, and by introducing random Fourier features the transfer mapping becomes a non-linear function, through which the required image is obtained.
Naive Bayes: given a picture, the classifier returns an object value, treating picture recognition as a simple classification task so as to obtain the corresponding object.
Dependency grammar: builds the relations between a subject word and the words that describe it. Dependency grammar has no phrase level; each node corresponds to a word in the sentence, so the relations between words can be processed directly, which is convenient for analysis and information extraction.
Decision tree: classification according to features. Each node poses a question that splits the data into two classes, and questioning continues down the tree; the questions are learned from existing data, so that when new data arrives it is routed by those questions to the corresponding leaf.
Deep learning: a family of machine-learning methods based on representation learning of data. The concept comes from research on artificial neural networks, the motivation being to build neural networks that simulate the analytic learning of the human brain; it imitates the mechanisms of the human brain to interpret data such as images, sound and text. Low-level features are combined to form more abstract high-level attribute classes or feature representations, so as to discover distributed feature representations of the data; a multilayer perceptron with several hidden layers is one kind of deep learning structure.
KNN algorithm: if, among the k samples most similar to a given sample in feature space (i.e. its nearest neighbours), the majority belong to some category, then the sample also belongs to that category. In KNN, the selected neighbours are objects that have already been correctly classified.
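The KNN rule just described fits in a few lines. The example labels ("dust", "toy") are invented for illustration only; the algorithm itself is exactly the majority vote over the k nearest neighbours.

```python
from collections import Counter

def knn_classify(sample, labeled_points, k=3):
    """Classify `sample` by majority vote among its k nearest labeled
    neighbours (Euclidean distance), per the KNN description above.
    `labeled_points` is a list of ((features...), label) pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    neighbours = sorted(labeled_points, key=lambda p: dist(sample, p[0]))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]
```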
According to an embodiment of the present invention, a sweeping robot is provided. As shown in Fig. 1, the sweeping robot includes:
an image collection device 10, for obtaining the spatial image information of the space where the sweeping robot is located, where the spatial image information includes a target object to be cleaned;
a processor 12, for obtaining the position information of the target object in the spatial image information, determining the first travel route of the sweeping robot according to the spatial image information and the position information, and controlling the sweeping robot to travel along the first travel route.
Based on the structure of the above sweeping robot, an embodiment of the present invention further provides a travel control method for a sweeping robot. The method can be applied to, but is not limited to, the structure shown in Fig. 1. It should be noted that the steps illustrated in the flow chart of the accompanying drawings may be performed in a computer system such as a set of computer-executable instructions, and, although a logical order is shown in the flow chart, in some cases the steps shown or described may be performed in an order different from the one here.
Fig. 2 is a flow diagram of a travel control method of a sweeping robot according to an embodiment of the present invention. As shown in Fig. 2, the method includes the following steps:
Step S202: obtain the spatial image information of the space where the sweeping robot is located, where the spatial image information includes a target object to be cleaned;
Step S204: obtain the position information of the target object in the spatial image information;
Optionally, the target object includes a first class of objects and a second class of objects, where the first class includes target objects close to an obstacle in the spatial image information, and the second class consists of the target objects other than the first class. For example, the first class includes, but is not limited to, objects to be cleaned that are close to a wall or a door. Here "close" can mean that the distance between the object to be cleaned and the obstacle is less than a preset distance, which can be set flexibly according to actual conditions, for example 1 cm, 2 cm or 3 cm.
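The two-class split above reduces to a distance test against the nearest obstacle. A minimal sketch, assuming objects and obstacles are given as (x, y) positions on the floor plane in metres (the 2 cm default matches one of the example values):

```python
def split_target_objects(objects, obstacles, near_threshold=0.02):
    """Split detected target objects into the first class (within
    `near_threshold` metres of some obstacle, e.g. a wall or door)
    and the second class (all remaining objects)."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    first, second = [], []
    for obj in objects:
        if any(dist(obj, obs) < near_threshold for obs in obstacles):
            first.append(obj)   # close to an obstacle
        else:
            second.append(obj)  # everything else
    return first, second
```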
As an alternative embodiment of this application, while controlling the sweeping robot to travel along the first travel route, the following process may also be performed: detect whether the first class of objects is present on the travel route; when it is determined to be present, obtain a second travel route that contains no first-class objects; determine the priorities of the first and second travel routes; and select the route with the higher priority from the two as the travel route of the sweeping robot. The detection can use the traditional infrared or ultrasonic methods; when the route determined the traditional way conflicts with the one obtained by image recognition, the priorities of the two can be considered, determined as follows: obtain the cleaning type of the first-class object, where the cleaning type reflects the difficulty grade of cleaning it; when the difficulty grade corresponding to the first-class object exceeds a preset threshold, set the priority of the first travel route below that of the second. The cleaning difficulty includes, but is not limited to, the time taken to clean the object: the larger the difficulty grade, the longer the time required to clean the corresponding first-class object. That time can be obtained empirically, or determined by machine learning.
Designing the travel route with the above two methods allows the traditional approach and the intelligent approach to complement each other.
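The priority rule above can be sketched as a small decision function. The numeric grade scale and the threshold value are assumptions for illustration; the source only specifies that exceeding the threshold demotes the first route.

```python
def choose_route(first_route, second_route, difficulty_grade, grade_threshold=5):
    """Pick between the first travel route (which may pass first-class
    objects) and the second (which avoids them): when the first-class
    object's cleaning-difficulty grade exceeds the threshold, the first
    route's priority drops below the second's."""
    if difficulty_grade > grade_threshold:
        priorities = {first_route: 1, second_route: 2}  # avoid the hard object
    else:
        priorities = {first_route: 2, second_route: 1}  # clean it on the way
    return max(priorities, key=priorities.get)
```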
In an alternative embodiment, when the second travel route is selected as the travel route of the sweeping robot and an object that is hard to clean is encountered, i.e. when the distance between the sweeping robot and the obstacle is less than a preset threshold, a warning is issued; after bypassing the obstacle, the robot continues to travel along the second travel route.
Step S206: determine the first travel route of the sweeping robot according to the spatial image information and the position information, and control the sweeping robot to travel along the first travel route.
Optionally, determining the first travel route of the sweeping robot according to the spatial image information and the position information can be realized as follows: take the spatial image information and the position information as the input of a preset model, and determine the first travel route corresponding to them, where the preset model is obtained by machine-learning training on multiple groups of data in a first database, and each group of data includes: spatial image information, position information of a target object, and the first travel route corresponding to that spatial image information and target-object position information.
Optionally, the spatial image information of the space where the sweeping robot is located can be obtained as follows: obtain multiple images collected by image collection devices arranged at multiple positions in the space, and combine the multiple images to obtain the spatial image information.
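The combining step above is not specified in detail; a production system would stitch views by matching overlapping features. As a stand-in under that caveat, this sketch simply tiles equally-sized camera views side by side into one "spatial image":

```python
import numpy as np

def combine_images(images):
    """Combine images from cameras at multiple positions into one array of
    spatial image information by tiling them horizontally (a simplifying
    assumption; real stitching would align overlapping regions)."""
    return np.concatenate(images, axis=1)
```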
In the embodiments of this application, one or more image collection devices (for example, cameras) can be installed in designated regions of the room where the sweeping robot works; this application does not limit the installation position of the cameras. For example, for a sweeping robot used in an ordinary household, a camera can be, but is not limited to being, installed facing each direction at the top of the living room.
In an alternative embodiment, the image collection device can also be arranged on the sweeping robot itself, for example on its side, or on other home appliances, for example on an air conditioner or a refrigerator.
The cameras arranged at different positions can each collect spatial images of the regions where they are located. When collecting, an image can be taken at preset intervals (for example, every minute); the status information of the current region (such as the amount of debris to be cleaned) is then generated from the analysis of the image, the control instruction of the sweeping robot is determined according to that status information, and whether the sweeping robot starts working is controlled according to the control instruction.
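The status-to-instruction step above is a simple mapping. A minimal sketch, assuming the status information is reduced to a debris count and assuming an example start threshold (neither number is specified by the source):

```python
def control_instruction(debris_count, start_threshold=1):
    """Decide the sweeping robot's control instruction from the status
    information derived from a periodically captured frame (here, the
    number of pieces of debris awaiting cleaning)."""
    return "start" if debris_count >= start_threshold else "stay_idle"
```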
It should be noted that this application does not limit the type of image taken, which includes but is not limited to black-and-white (grayscale) images and color (RGB) images. When analyzing an image, the information in it can be analyzed in the binary-image manner: specifically, the positions of pixels in the image are compared with those in a history image to determine which pixels differ, and the differing pixels are then singled out to obtain the image information of what has newly appeared in the image.
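The pixel-by-pixel comparison against a history image described above is plain frame differencing. A minimal sketch over grayscale frames; the difference threshold is an assumed value:

```python
import numpy as np

def changed_pixels(current, history, diff_threshold=30):
    """Compare a grayscale frame against a history frame pixel by pixel
    and return a boolean mask of the positions that differ by more than
    `diff_threshold`, i.e. where something new has appeared."""
    diff = np.abs(current.astype(int) - history.astype(int))
    return diff > diff_threshold
```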
Among these options, a CNN algorithm can be used to extract multiple pieces of feature information of the target object from the captured image information. For extraction, the image is fed into a neural network: corresponding neurons are created, and image features and feature maps are determined according to preset functions between neurons (such as the Sigmoid function), so that multiple features of the image are output according to the determined feature maps. In addition, when analyzing the image, deep learning may be employed: after a preset model is built, images in the database similar to the currently captured image can be found with the search-by-image method, and the feature information of the target object in the image extracted. Furthermore, the feature information of the target object can also be extracted with the naive Bayes algorithm.
When the target object in the spatial image is analyzed, deep learning or a KNN algorithm can be used to filter out the portions of the image information that share the same features, leaving the feature information that differs, so as to obtain the difference between the image information of the target object and a template image. For example, the captured spatial image information is compared with a predetermined template image: if it is determined that no target object exists in the image, the sweeping robot may not be started, and if the sweeping robot is already running, it may also be chosen to shut it down; if it is determined that a target object exists, the control instruction of the sweeping robot can be determined according to the distribution of the target object.
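The template-comparison decision in this paragraph could be sketched as follows; the difference ratio and the instruction names are hypothetical, introduced only for illustration.

```python
def decide_control(frame, template, diff_ratio=0.1):
    """Compare a captured spatial image against a template of the clean
    scene; if enough pixels deviate, assume a target object exists and
    return a 'start' instruction, otherwise keep the robot idle."""
    total = sum(len(row) for row in frame)
    differing = sum(p != t for f_row, t_row in zip(frame, template)
                           for p, t in zip(f_row, t_row))
    return "start" if differing / total > diff_ratio else "idle"

clean = [[0, 0], [0, 0]]   # predetermined template image
dirty = [[0, 9], [0, 0]]   # captured frame with one deviating pixel
instruction = decide_control(dirty, clean)
```

A frame matching the template yields "idle", mirroring the case in the text where no target object is found and the robot is not started.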
When analyzing a spatial image containing the target object, the present application may, but is not limited to, use a search-by-image manner: the images in the model that have features similar to the current image are extracted, and the closest image is determined through a transfer learning algorithm. When the control instruction of the sweeping robot corresponding to the distribution information of the target object is determined, and the working state of the sweeping robot is controlled according to the control instruction, the working state of the sweeping robot corresponding to the control instruction can be extracted by a KNN algorithm.
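The KNN step — mapping an extracted feature vector to a working state — can be sketched as a plain k-nearest-neighbour majority vote. The feature vectors and state labels below are invented for the example.

```python
from collections import Counter

def knn_state(samples, states, query, k=3):
    """Label a query feature vector with the working state held by the
    majority of its k nearest training samples (squared Euclidean
    distance)."""
    ranked = sorted(
        (sum((a - b) ** 2 for a, b in zip(vec, query)), state)
        for vec, state in zip(samples, states))
    return Counter(state for _, state in ranked[:k]).most_common(1)[0][0]

# hypothetical 2-D feature vectors and their associated working states
samples = [(0, 0), (0, 1), (5, 5), (6, 5)]
states = ["standby", "standby", "sweeping", "sweeping"]
```

For a query near the "sweeping" cluster, two of the three nearest samples carry that label, so the vote selects it.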
Fig. 3 is a structural diagram of a travel control device of a sweeping robot according to an embodiment of the present invention. As shown in Fig. 3, the travel control device of the sweeping robot includes:
a first acquisition module 30, for acquiring spatial image information of the space where the sweeping robot is located, wherein the spatial image information contains a target object to be cleaned;
a second acquisition module 32, for acquiring location information of the target object in the spatial image information;
a control module 34, for determining a first travel route of the sweeping robot according to the spatial image information and the location information, and controlling the sweeping robot to travel according to the first travel route.
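A hypothetical sketch of the three modules in Fig. 3, with deliberately simple stand-ins for image merging, target location, and route planning — the strategies here are illustrative placeholders, not the patented method:

```python
class TravelControlDevice:
    """Toy counterpart of the device in Fig. 3; each method is an
    illustrative placeholder for the corresponding module."""

    def acquire_spatial_image(self, camera_frames):
        # first acquisition module (30): stack the rows of all camera frames
        return [row for frame in camera_frames for row in frame]

    def locate_targets(self, spatial_image):
        # second acquisition module (32): positions of nonzero (dirty) pixels
        return [(i, j) for i, row in enumerate(spatial_image)
                       for j, value in enumerate(row) if value]

    def plan_first_route(self, targets):
        # control module (34): visit targets in row-major order (simplistic)
        return sorted(targets)

device = TravelControlDevice()
image = device.acquire_spatial_image([[[0, 1]], [[1, 0]]])
route = device.plan_first_route(device.locate_targets(image))
```

Splitting acquisition, localization, and control into separate modules mirrors the division of logical functions discussed later in the embodiment.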
In addition, it should be noted that for optional or preferred implementations of this embodiment, reference may be made to the relevant description in Embodiment 1, and details are not repeated here.
Through the above scheme, the effect of avoiding cleaning blind areas and distance-measurement blind areas can be achieved, thereby solving the technical problem in the related art that a sweeping robot has measurement blind areas or cleaning blind areas.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for any part not described in detail in one embodiment, reference may be made to the relevant description of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The device embodiments described above are merely illustrative. For example, the division of the units may be a division of logical functions, and other division manners are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a removable hard disk, a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art may make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications shall also be regarded as falling within the protection scope of the present invention.
Claims (12)
1. A travel control method of a sweeping robot, characterized in that the method comprises:
acquiring spatial image information of a space where the sweeping robot is located, wherein the spatial image information contains a target object to be cleaned;
acquiring location information of the target object in the spatial image information;
determining a first travel route of the sweeping robot according to the spatial image information and the location information, and controlling the sweeping robot to travel according to the first travel route.
2. The method according to claim 1, wherein determining the first travel route of the sweeping robot according to the spatial image information and the location information comprises:
taking the spatial image information and the location information as inputs of a preset model, and determining the first travel route corresponding to the spatial image information and the location information, wherein the preset model is trained by machine learning using multiple groups of data in a first database, and each group of data among the multiple groups of data in the first database comprises: spatial image information, location information of a target object, and a first travel route corresponding to the spatial image information and the location information of the target object.
3. The method according to claim 1, wherein the target object comprises a first-class object and a second-class object, the first-class object comprises a target object close to an obstacle in the spatial image information, and the second-class object is a target object other than the first-class object among the target objects.
4. The method according to claim 3, wherein, in the process of controlling the sweeping robot to travel according to the first travel route, the method further comprises:
detecting whether a first-class object exists in the travel route;
when it is determined that a first-class object exists, acquiring a second travel route, wherein no first-class object lies in the second travel route;
determining priorities of the first travel route and the second travel route, and selecting, from the first travel route and the second travel route, the travel route with the highest priority as the travel route of the sweeping robot.
5. The method according to claim 4, wherein determining the priorities of the first travel route and the second travel route comprises:
acquiring a cleaning type to which the first-class object belongs, wherein the cleaning type reflects a difficulty level of cleaning the first-class object;
when the difficulty level corresponding to the first-class object is greater than a preset threshold, determining that the priority of the first travel route is lower than the priority of the second travel route.
6. The method according to claim 5, wherein the greater the difficulty level, the longer the time required to clean the first-class object corresponding to that difficulty level.
7. The method according to claim 4, wherein, when the second travel route is selected as the travel route of the sweeping robot, the method further comprises:
issuing an alarm when the distance between the sweeping robot and the obstacle is less than a preset threshold, and, after bypassing the obstacle, continuing to travel according to the second travel route.
8. The method according to any one of claims 1 to 7, wherein acquiring the spatial image information of the space where the sweeping robot is located comprises:
acquiring a plurality of images from image acquisition devices arranged at a plurality of positions in the space, and combining the plurality of images to obtain the spatial image information.
9. A travel control device of a sweeping robot, characterized in that the device comprises:
a first acquisition module, for acquiring spatial image information of a space where the sweeping robot is located, wherein the spatial image information contains a target object to be cleaned;
a second acquisition module, for acquiring location information of the target object in the spatial image information;
a control module, for determining a first travel route of the sweeping robot according to the spatial image information and the location information, and controlling the sweeping robot to travel according to the first travel route.
10. A sweeping robot, characterized in that the sweeping robot comprises:
an image acquisition device, for acquiring spatial image information of a space where the sweeping robot is located, wherein the spatial image information contains a target object to be cleaned;
a processor, for acquiring location information of the target object in the spatial image information, determining a first travel route of the sweeping robot according to the spatial image information and the location information, and controlling the sweeping robot to travel according to the first travel route.
11. A storage medium, characterized in that the storage medium comprises a stored program, wherein, when the program runs, a device where the storage medium is located is controlled to perform the travel control method of the sweeping robot according to any one of claims 1 to 8.
12. A processor, characterized in that the processor is configured to run a program, wherein, when the program runs, the travel control method of the sweeping robot according to any one of claims 1 to 8 is performed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711230750.8A CN108107886B (en) | 2017-11-29 | 2017-11-29 | Driving control method and device of sweeping robot and sweeping robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108107886A true CN108107886A (en) | 2018-06-01 |
CN108107886B CN108107886B (en) | 2020-07-10 |
Family
ID=62208802
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711230750.8A Active CN108107886B (en) | 2017-11-29 | 2017-11-29 | Driving control method and device of sweeping robot and sweeping robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108107886B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109171571A (en) * | 2018-09-18 | 2019-01-11 | 格力电器(武汉)有限公司 | Method for cleaning, device and the clean robot of rubbish |
CN109760054A (en) * | 2019-01-30 | 2019-05-17 | 重庆两江微链智能科技有限公司 | Robot autonomous learning system and robot control method |
CN110780664A (en) * | 2018-07-25 | 2020-02-11 | 格力电器(武汉)有限公司 | Robot control method and device and sweeping robot |
CN110837829A (en) * | 2018-08-17 | 2020-02-25 | 珠海格力电器股份有限公司 | Control method and system of sweeping robot |
CN111481113A (en) * | 2019-01-29 | 2020-08-04 | 北京奇虎科技有限公司 | Method and device for judging slippage of sweeping robot |
CN111643011A (en) * | 2020-05-26 | 2020-09-11 | 深圳市杉川机器人有限公司 | Cleaning robot control method and device, cleaning robot and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1720852A (en) * | 2004-07-14 | 2006-01-18 | 三洋电机株式会社 | Cleaner |
CN103120573A (en) * | 2012-12-06 | 2013-05-29 | 深圳市圳远塑胶模具有限公司 | Working method and working system of intelligent cleaning robot |
CN103605365A (en) * | 2013-11-07 | 2014-02-26 | 成都赛康信息技术有限责任公司 | Fully automatic operation method of substation equipment pollution inspection, determination and cleaning |
US20150150429A1 (en) * | 2013-12-04 | 2015-06-04 | Samsung Electronics Co., Ltd. | Cleaning robot and control method thereof |
CN105182981A (en) * | 2015-10-14 | 2015-12-23 | 珠海格力电器股份有限公司 | Robot walking method, control method, control system, and server |
CN105380575A (en) * | 2015-12-11 | 2016-03-09 | 美的集团股份有限公司 | Control method and system for sweeping robot, cloud server and sweeping robot |
CN105411491A (en) * | 2015-11-02 | 2016-03-23 | 中山大学 | Home intelligent cleaning system and method based on environment monitoring |
CN205375188U (en) * | 2015-12-31 | 2016-07-06 | 浙江同筑科技有限公司 | Formula AGV navigation dolly slips into |
CN106292659A (en) * | 2016-07-30 | 2017-01-04 | 许琴琴 | A kind of sweeping robot method for searching |
CN107063257A (en) * | 2017-02-05 | 2017-08-18 | 安凯 | A kind of separate type sweeping robot and its paths planning method |
Non-Patent Citations (3)
Title |
---|
WOONG KEUN UYUN: "A Sweeping Path Planner for a Smearing Robot", 《COPYRIGHT IFAC MOBILE ROBOT TECHNOLOGY》 * |
WAN JUN: "Research on the design of the obstacle-avoidance *** of an intelligent sweeping robot based on STM32", 《CHINA HIGH-TECH ZONE》 *
WANG FEI: "Design and implementation of a global coverage strategy for autonomous mobile robots", 《PROCEEDINGS OF THE 30TH CHINESE CONTROL CONFERENCE》 *
Also Published As
Publication number | Publication date |
---|---|
CN108107886B (en) | 2020-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108107886A (en) | Travel control method and device, the sweeping robot of sweeping robot | |
WO2021043193A1 (en) | Neural network structure search method and image processing method and device | |
CN108088235B (en) | The control method and device of dryer, dryer, memory, processor | |
CN109614985B (en) | Target detection method based on densely connected feature pyramid network | |
Alzubaidi et al. | Review of deep learning: concepts, CNN architectures, challenges, applications, future directions | |
CN106570477B (en) | Vehicle cab recognition model building method and model recognizing method based on deep learning | |
Elngar et al. | Image classification based on CNN: a survey | |
CN108050674A (en) | Control method and device, the terminal of air-conditioning equipment | |
Razmjooy et al. | A hybrid neural network Imperialist Competitive Algorithm for skin color segmentation | |
Marstaller et al. | The evolution of representation in simple cognitive networks | |
CN111797895B (en) | Training method, data processing method, system and equipment for classifier | |
CN107860100A (en) | The air-out control method and terminal of air-conditioning | |
CN108052199A (en) | Control method, device and the smoke exhaust ventilator of smoke exhaust ventilator | |
CN108105136A (en) | Control method, device and the fan of fan | |
CN107816781A (en) | The control method and device of air-conditioning | |
CN108921879A (en) | The motion target tracking method and system of CNN and Kalman filter based on regional choice | |
CN107423721A (en) | Interactive action detection method, device, storage medium and processor | |
CN107654406A (en) | Fan air-supply control device, fan air-supply control method and device | |
CN108596256B (en) | Object recognition classifier construction method based on RGB-D | |
JP6778842B2 (en) | Image processing methods and systems, storage media and computing devices | |
WO2022007867A1 (en) | Method and device for constructing neural network | |
CN107560090A (en) | Air blowing control method and device, the terminal of air-conditioning | |
Aitkenhead et al. | A neural network face recognition system | |
KR102597787B1 (en) | A system and method for multiscale deep equilibrium models | |
CN112069916B (en) | Face beauty prediction method, device and system and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||