CN108634874A - Sweeping robot and its cleaning method - Google Patents
Sweeping robot and its cleaning method
- Publication number
- CN108634874A (application CN201810450480.XA)
- Authority
- CN
- China
- Prior art keywords
- image data
- target
- robot
- scene
- depth image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/24—Floor-sweeping machines, motor-driven
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0219—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a sweeping robot and a cleaning method thereof, including: an image acquisition module for acquiring depth image data and color image data of a target scene in real time; a locating module for determining the current pose of the robot based on the depth image data and the color image data; a map-building module for building a current scene map of the target scene based on the depth image data and the current pose of the robot; and a suction module for performing suction cleaning of the target scene based on the current scene map. With the sweeping robot and cleaning method of the invention, the scene map that is built is more accurate; moreover, the sweeping robot is simple in structure and rich in features.
Description
Technical field
The present invention relates to the field of robot technology, and in particular to a sweeping robot and a cleaning method for a sweeping robot.
Background technology
As living standards rise, sweeping robots, being simple to operate and convenient to use, are entering more and more homes and offices, becoming an important and well-received member of the small-appliance family.
At present, the odometer of a sweeping robot is prone to data errors caused by slipping, collisions, and the like, which easily lead to disordered, random bumping. Moreover, such robots have no function for recognizing critical objects, so important small items (cash, diamonds, etc.) can be sucked into the dust box. Sweeping robots generally work in one of two ways. The first uses color images and algorithms that recognize the home environment to navigate, localize, and build a map; this approach requires complex algorithms, places high demands on hardware, is expensive, and is strongly affected by the environment. The second uses laser radar for navigation, localization, and mapping; a low-precision laser radar is cheap but navigates and maps inaccurately, with large errors, while a high-precision laser radar is too expensive for the consumption level of ordinary consumers.
Summary of the invention
The present invention aims to solve at least one of the technical problems existing in the prior art, and proposes a sweeping robot and a cleaning method for a sweeping robot.
To achieve the above object, a first aspect of the present invention provides a sweeping robot, including:
an image acquisition module for acquiring depth image data and color image data of a target scene in real time;
a locating module for determining the current pose of the robot based on the depth image data and the color image data;
a map-building module for building a current scene map of the target scene based on the depth image data and the current pose of the robot; and
a suction module for performing suction cleaning of the target scene based on the current scene map.
Optionally, the image acquisition module includes an RGBD camera.
Optionally, the map-building module is further configured to:
measure, based on the depth image data, a target distance between the current pose of the robot and a preset target obstacle in the depth image data; and
build the current scene map of the target scene based on the target distance and the current pose of the robot.
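As an illustration of this optional measuring step, the target distance can be read directly out of the depth image. The sketch below is not from the patent; the pinhole intrinsics and the median-depth choice are assumptions. It back-projects an obstacle's bounding box into a metric distance from the camera.

```python
import numpy as np

def obstacle_distance(depth_m, bbox, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Estimate the straight-line distance (metres) from the camera to an
    obstacle, given a depth image and the obstacle's pixel bounding box.

    depth_m : HxW array of per-pixel depth in metres (0 = no reading).
    bbox    : (u0, v0, u1, v1) pixel box around the obstacle.
    fx, fy, cx, cy : pinhole intrinsics (illustrative VGA defaults).
    """
    u0, v0, u1, v1 = bbox
    patch = depth_m[v0:v1, u0:u1]
    valid = patch[patch > 0]
    if valid.size == 0:
        return None                      # no depth reading inside the box
    z = float(np.median(valid))          # robust depth of the obstacle surface
    u = (u0 + u1) / 2.0                  # box centre in pixels
    v = (v0 + v1) / 2.0
    x = (u - cx) * z / fx                # back-project to camera coordinates
    y = (v - cy) * z / fy
    return float(np.sqrt(x * x + y * y + z * z))
```

With a box whose pixels all read 2 m near the image centre, the returned distance is roughly 2 m; a box with no valid depth returns `None`, so a caller can fall back to other sensing.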
Optionally, the robot further includes an object recognition module and a judgment module, wherein:
the object recognition module is configured to recognize each target object in the color image data;
the judgment module is configured to match each target object against pre-stored critical objects and, upon a match, to determine that target object to be a critical object; and
the suction module is configured to perform suction cleaning of the target scene based on the current scene map while avoiding the critical objects during cleaning.
Optionally, the robot further includes:
a logging module for recording the location information of the critical objects and outputting a prompt message.
A second aspect of the present invention provides a cleaning method of a sweeping robot, including:
Step S110: acquiring depth image data and color image data of a target scene in real time;
Step S120: determining the current pose of the robot based on the depth image data and the color image data;
Step S130: building a current scene map of the target scene based on the depth image data and the current pose of the robot;
Step S140: performing suction cleaning of the target scene based on the current scene map.
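Taken together, steps S110 to S140 form a simple acquire-localize-map-sweep loop. The sketch below shows only that data flow; the `Frame` type, the function names, and the callable-injection style are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    depth: object   # depth image data acquired in step S110
    color: object   # color image data acquired in step S110

def cleaning_cycle(frame, localize, build_map, sweep):
    """One iteration of the claimed method with the steps injected as callables."""
    pose = localize(frame.depth, frame.color)   # S120: determine current pose
    scene_map = build_map(frame.depth, pose)    # S130: build current scene map
    return sweep(scene_map)                     # S140: suction cleaning

# Minimal stand-ins showing only the data flow between steps:
pose_of = lambda depth, color: ("pose", depth, color)
map_of = lambda depth, pose: ("map", depth, pose)
sweep_on = lambda scene_map: ("cleaned", scene_map)

result = cleaning_cycle(Frame(depth="D", color="C"), pose_of, map_of, sweep_on)
```

The point of the sketch is that the pose feeds the map and the map feeds the cleaner, exactly the ordering S110 → S120 → S130 → S140.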
Optionally, in step S110, an RGBD camera is used to acquire the depth image data and color image data of the target scene in real time.
Optionally, step S130 specifically includes:
measuring, based on the depth image data, a target distance between the current pose of the robot and a preset target obstacle in the depth image data; and
building the current scene map of the target scene based on the target distance and the current pose of the robot.
Optionally, step S140 further includes:
recognizing each target object in the color image data;
matching each target object against pre-stored critical objects and, upon a match, determining the target object to be a critical object; and
performing suction cleaning of the target scene based on the current scene map while avoiding the critical objects during cleaning.
Optionally, determining the target object to be a critical object specifically includes:
recording the location information of the critical object and outputting a prompt message.
With the sweeping robot and its cleaning method of the present invention, the image acquisition module first acquires depth image data and color image data of the target scene in real time; next, the locating module determines the current pose of the robot based on the depth image data and the color image data; then, the map-building module builds the current scene map of the target scene based on the depth image data and the current pose of the robot; and finally, the suction module performs suction cleaning of the target scene based on the current scene map. As a result, the scene map that is built is more accurate; moreover, the sweeping robot is simple in structure and rich in features.
Description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the specification. Together with the following specific embodiments, they serve to explain the present invention, but they are not to be construed as limiting the invention. In the drawings:
Fig. 1 is the structural schematic diagram of sweeping robot in one embodiment of the invention;
Fig. 2 is the flow chart of the cleaning method of sweeping robot in one embodiment of the invention.
Description of reference signs
100: sweeping robot;
110: image acquisition module;
120: locating module;
130: map-building module;
140: suction module;
150: object recognition module;
160: judgment module;
170: logging module.
Specific embodiments
Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here are merely intended to illustrate and explain the present invention, and are not intended to limit it.
As shown in Fig. 1, a first aspect of the present invention relates to a sweeping robot 100, including:
an image acquisition module 110 for acquiring depth image data and color image data of a target scene in real time;
a locating module 120 for determining the current pose of the robot based on the depth image data and the color image data;
a map-building module 130 for building a current scene map of the target scene based on the depth image data and the current pose of the robot; and
a suction module 140 for performing suction cleaning of the target scene based on the current scene map.
In the sweeping robot 100 of this embodiment, the image acquisition module 110 first acquires depth image data and color image data of the target scene in real time; next, the locating module 120 determines the current pose of the robot based on the depth image data and the color image data; then, the map-building module 130 builds the current scene map of the target scene based on the depth image data and the current pose of the robot; and finally, the suction module 140 performs suction cleaning of the target scene based on the current scene map. As a result, the scene map built by the sweeping robot 100 of this embodiment is more accurate; moreover, the sweeping robot 100 is simple in structure and rich in features.
It should be noted that the concrete structure of the image acquisition module 110 is not limited, as long as it can acquire the depth image data and color image data of the target scene. For example, as an optional structure, the image acquisition module 110 may be an RGBD camera.
Optionally, the map-building module 130 is further configured to:
measure, based on the depth image data, a target distance between the current pose of the robot and a preset target obstacle in the depth image data; and
build the current scene map of the target scene based on the target distance and the current pose of the robot.
With the map-building module 130, the robot 100 of this embodiment can measure, from the depth image data, the target distance between the robot's current pose and the preset target obstacle in the depth image data. In other words, in this embodiment the depth image data can stand in for the laser radar of a traditional sweeping robot: measuring distance with depth image data yields a more accurate target distance to the target obstacle, so the scene map that is built is more accurate and the cleaning performance of the sweeping robot 100 is improved.
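The "depth image as laser radar" idea can be sketched by converting one row of the depth image into bearing/range pairs, the form a 2D laser scanner would report. The camera intrinsics used here are illustrative defaults, not values from the patent.

```python
import numpy as np

def depth_row_to_scan(depth_m, fx=525.0, cx=319.5, row=None):
    """Convert one row of a depth image into a 2D-lidar-like scan.

    depth_m : HxW array of per-pixel depth in metres (0 = invalid pixel).
    fx, cx  : horizontal pinhole intrinsics (illustrative VGA defaults).
    row     : image row to scan along; defaults to the centre line.

    Returns (angles, ranges): bearing in radians (0 = optical axis) and
    ray length in metres; invalid pixels yield range = inf.
    """
    h, w = depth_m.shape
    if row is None:
        row = h // 2                      # scan along the image centre line
    z = depth_m[row].astype(float)
    u = np.arange(w)
    angles = np.arctan((u - cx) / fx)     # bearing of each pixel column
    ranges = np.where(z > 0, z / np.cos(angles), np.inf)
    return angles, ranges
```

Each column of the chosen row becomes one "beam": its bearing comes from the pixel's horizontal offset through the pinhole model, and its range is the depth divided by the cosine of that bearing.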
Optionally, the robot further includes an object recognition module 150 and a judgment module 160, wherein:
the object recognition module 150 is configured to recognize each target object in the color image data;
the judgment module 160 is configured to match each target object against pre-stored critical objects and, upon a match, to determine that target object to be a critical object; and
the suction module 140 is configured to perform suction cleaning of the target scene based on the current scene map while avoiding the critical objects during cleaning.
The sweeping robot 100 of this embodiment uses the object recognition module 150 to recognize each target object in the color image data, and the judgment module 160 to match each target object against pre-stored critical objects, determining a target object to be a critical object upon a match. In this way, the suction module 140 can avoid the critical objects during suction cleaning, effectively protecting the user's property.
It should be noted that the object recognition module 150 and the judgment module 160 may be the same module; in other words, a single module may integrate both functions, recognizing and then judging the critical objects in the color image data.
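A minimal sketch of this recognize-then-match behavior, assuming the recognition stage emits (label, position) pairs; the label set, function name, and data shapes are hypothetical, not from the patent.

```python
def classify_detections(detections, critical_labels):
    """Split detected objects into critical items (to be avoided and reported)
    and ordinary, sweepable targets.

    detections      : list of (label, position) pairs from a recognizer.
    critical_labels : set of pre-stored critical-object labels.
    """
    critical, sweepable = [], []
    for label, position in detections:
        if label in critical_labels:      # matches a pre-stored critical object
            critical.append((label, position))
        else:
            sweepable.append((label, position))
    return critical, sweepable

CRITICAL = {"cash", "ring", "diamond"}    # example pre-stored critical objects
found = [("dust", (1, 2)), ("ring", (3, 4)), ("crumb", (5, 6))]
critical, sweepable = classify_detections(found, CRITICAL)
```

A downstream suction planner would then treat the `critical` list as keep-out positions while cleaning everything in `sweepable`.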
Optionally, the robot further includes a logging module 170 for recording the location information of the critical objects and outputting a prompt message.
With the logging module 170, the sweeping robot 100 of this embodiment can effectively record the location information of the recognized critical objects and output a prompt message. In this way, the user is reminded to pay attention to the locations of the critical items, preventing them from being cleaned up by mistake and the user's property from being damaged.
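The record-and-prompt behavior of the logging module can be sketched as follows; the log record format and the message wording are assumptions for illustration only.

```python
def record_and_prompt(critical_items, log):
    """Record each critical object's location and build user prompts.

    critical_items : list of (label, (x, y)) pairs for detected critical objects.
    log            : mutable list acting as the persistent location record.
    """
    prompts = []
    for label, (x, y) in critical_items:
        log.append({"label": label, "x": x, "y": y})   # persist the location
        prompts.append(f"Critical item '{label}' left in place at ({x}, {y})")
    return prompts

log = []
msgs = record_and_prompt([("ring", (3, 4))], log)
```

The prompts could be surfaced through whatever notification channel the robot supports (app push, voice, indicator light); the patent only requires that a prompt message be output.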
In a second aspect of the present invention, as shown in Fig. 2, a cleaning method S100 of a sweeping robot is provided, including:
S110: acquiring depth image data and color image data of a target scene in real time;
S120: determining the current pose of the robot based on the depth image data and the color image data;
S130: building a current scene map of the target scene based on the depth image data and the current pose of the robot;
S140: performing suction cleaning of the target scene based on the current scene map.
In the cleaning method S100 of this embodiment, the depth image data and color image data of the target scene are first acquired in real time; next, the current pose of the robot is determined based on the depth image data and the color image data; then, the current scene map of the target scene is built based on the depth image data and the current pose of the robot; and finally, suction cleaning of the target scene is performed based on the current scene map. As a result, the scene map built by the cleaning method S100 of this embodiment is more accurate; moreover, the sweeping robot is simple in structure and rich in features.
Optionally, in step S110, an RGBD camera is used to acquire the depth image data and color image data of the target scene in real time.
Optionally, step S130 specifically includes:
measuring, based on the depth image data, a target distance between the current pose of the robot and a preset target obstacle in the depth image data; and
building the current scene map of the target scene based on the target distance and the current pose of the robot.
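One way to picture this version of step S130 is to mark each measured target distance into an occupancy grid anchored at the robot's current pose. The grid representation, pose convention, and resolution below are assumptions for illustration, not the patent's method.

```python
import math

def mark_obstacle(grid, pose, bearing, distance, resolution=0.05):
    """Mark one measured obstacle in a sparse 2D occupancy grid.

    grid       : dict mapping (ix, iy) cell index -> "occupied"
    pose       : (x, y, theta) robot pose in metres / radians (world frame)
    bearing    : obstacle bearing relative to the robot heading (radians)
    distance   : measured target distance to the obstacle (metres)
    resolution : grid cell size in metres
    """
    x, y, theta = pose
    ox = x + distance * math.cos(theta + bearing)   # obstacle in world frame
    oy = y + distance * math.sin(theta + bearing)
    cell = (int(round(ox / resolution)), int(round(oy / resolution)))
    grid[cell] = "occupied"
    return cell

grid = {}
cell = mark_obstacle(grid, pose=(0.0, 0.0, 0.0), bearing=0.0, distance=1.0)
```

Repeating this for every measured distance, frame after frame, accumulates the "current scene map" the suction step then plans over.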
According to the depth image data, the cleaning method S100 of this embodiment can measure the target distance between the robot's current pose and the preset target obstacle in the depth image data. In other words, in this embodiment the depth image data can stand in for the laser radar of a traditional sweeping robot: measuring distance with depth image data yields a more accurate target distance to the target obstacle, so the scene map that is built is more accurate and the cleaning performance of the sweeping robot 100 is improved.
Optionally, step S140 further includes:
recognizing each target object in the color image data;
matching each target object against pre-stored critical objects and, upon a match, determining the target object to be a critical object; and
performing suction cleaning of the target scene based on the current scene map while avoiding the critical objects during cleaning.
In the cleaning method S100 of this embodiment, each target object in the color image data is recognized and matched against pre-stored critical objects, and a target object is determined to be a critical object upon a match. In this way, the sweeping robot can avoid the critical objects during suction cleaning, effectively protecting the user's property.
Optionally, determining the target object to be a critical object specifically includes:
recording the location information of the critical object and outputting a prompt message.
It is to be understood that the above embodiments are merely exemplary embodiments adopted to illustrate the principle of the present invention; the present invention, however, is not limited thereto. Those of ordinary skill in the art may make various variations and improvements without departing from the spirit and essence of the present invention, and such variations and improvements are also deemed to fall within the protection scope of the present invention.
Claims (10)
1. A sweeping robot, characterized by including:
an image acquisition module for acquiring depth image data and color image data of a target scene in real time;
a locating module for determining the current pose of the robot based on the depth image data and the color image data;
a map-building module for building a current scene map of the target scene based on the depth image data and the current pose of the robot; and
a suction module for performing suction cleaning of the target scene based on the current scene map.
2. The sweeping robot according to claim 1, characterized in that the image acquisition module includes an RGBD camera.
3. The sweeping robot according to claim 1, characterized in that the map-building module is further configured to:
measure, based on the depth image data, a target distance between the current pose of the robot and a preset target obstacle in the depth image data; and
build the current scene map of the target scene based on the target distance and the current pose of the robot.
4. The sweeping robot according to any one of claims 1 to 3, characterized by further including an object recognition module and a judgment module, wherein:
the object recognition module is configured to recognize each target object in the color image data;
the judgment module is configured to match each target object against pre-stored critical objects and, upon a match, to determine that target object to be a critical object; and
the suction module is configured to perform suction cleaning of the target scene based on the current scene map while avoiding the critical objects during cleaning.
5. The sweeping robot according to claim 4, characterized by further including:
a logging module for recording the location information of the critical objects and outputting a prompt message.
6. A cleaning method of a sweeping robot, characterized by including:
step S110: acquiring depth image data and color image data of a target scene in real time;
step S120: determining the current pose of the robot based on the depth image data and the color image data;
step S130: building a current scene map of the target scene based on the depth image data and the current pose of the robot;
step S140: performing suction cleaning of the target scene based on the current scene map.
7. The cleaning method according to claim 6, characterized in that, in step S110, an RGBD camera is used to acquire the depth image data and color image data of the target scene in real time.
8. The cleaning method according to claim 6, characterized in that step S130 specifically includes:
measuring, based on the depth image data, a target distance between the current pose of the robot and a preset target obstacle in the depth image data; and
building the current scene map of the target scene based on the target distance and the current pose of the robot.
9. The cleaning method according to any one of claims 6 to 8, characterized in that step S140 further includes:
recognizing each target object in the color image data;
matching each target object against pre-stored critical objects and, upon a match, determining the target object to be a critical object; and
performing suction cleaning of the target scene based on the current scene map while avoiding the critical objects during cleaning.
10. The cleaning method according to claim 9, characterized in that determining the target object to be a critical object specifically includes:
recording the location information of the critical object and outputting a prompt message.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810450480.XA CN108634874A (en) | 2018-05-11 | 2018-05-11 | Sweeping robot and its cleaning method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810450480.XA CN108634874A (en) | 2018-05-11 | 2018-05-11 | Sweeping robot and its cleaning method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108634874A true CN108634874A (en) | 2018-10-12 |
Family
ID=63754609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810450480.XA Pending CN108634874A (en) | 2018-05-11 | 2018-05-11 | Sweeping robot and its cleaning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108634874A (en) |
- 2018-05-11: application CN201810450480.XA filed in China; published as CN108634874A, status pending
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102283616A (en) * | 2010-10-22 | 2011-12-21 | 青岛科技大学 | Domestic intelligent cleaning system based on machine vision |
CN104027040A (en) * | 2013-03-05 | 2014-09-10 | Lg电子株式会社 | Robot cleaner |
CN105849660A (en) * | 2013-12-19 | 2016-08-10 | 伊莱克斯公司 | Robotic cleaning device |
US20170010623A1 (en) * | 2015-07-08 | 2017-01-12 | SZ DJI Technology Co., Ltd | Camera configuration on movable objects |
CN107850902A (en) * | 2015-07-08 | 2018-03-27 | 深圳市大疆创新科技有限公司 | Camera configuration in loose impediment |
CN107024928A (en) * | 2016-02-01 | 2017-08-08 | 松下家电研究开发(杭州)有限公司 | A kind of Intelligent robot for sweeping floor and Intelligent robot for sweeping floor control method |
CN106020201A (en) * | 2016-07-13 | 2016-10-12 | 广东奥讯智能设备技术有限公司 | Mobile robot 3D navigation and positioning system and navigation and positioning method |
CN106200645A (en) * | 2016-08-24 | 2016-12-07 | 北京小米移动软件有限公司 | Autonomous robot, control device and control method |
CN106959691A (en) * | 2017-03-24 | 2017-07-18 | 联想(北京)有限公司 | Mobile electronic equipment and immediately positioning and map constructing method |
CN207051738U (en) * | 2017-06-12 | 2018-02-27 | 炬大科技有限公司 | A kind of mobile electronic device |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109947114A (en) * | 2019-04-12 | 2019-06-28 | 南京华捷艾米软件科技有限公司 | Robot complete coverage path planning method, device and equipment based on grating map |
CN109947114B (en) * | 2019-04-12 | 2022-03-15 | 南京华捷艾米软件科技有限公司 | Robot full-coverage path planning method, device and equipment based on grid map |
CN111179377A (en) * | 2019-12-31 | 2020-05-19 | 深圳市优必选科技股份有限公司 | Robot mapping method, corresponding robot and storage medium |
US11654572B2 (en) | 2019-12-31 | 2023-05-23 | Ubtech Robotics Corp Ltd | Robot mapping method and robot and computer readable storage medium using the same |
CN111179377B (en) * | 2019-12-31 | 2024-04-26 | 深圳市优必选科技股份有限公司 | Robot mapping method, corresponding robot and storage medium |
CN112263189A (en) * | 2020-10-26 | 2021-01-26 | 中国计量大学 | Sweeping robot and method for distinguishing and cleaning garbage |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108634874A (en) | Sweeping robot and its cleaning method | |
CN107981790B (en) | Indoor area dividing method and sweeping robot | |
CN103925923B (en) | A kind of earth magnetism indoor locating system based on adaptive particle filter device algorithm | |
CN105990876B (en) | Charging pile, identification method and device thereof and automatic cleaning equipment | |
CN104858871B (en) | Robot system and self-built map thereof and the method for navigation | |
CN106647742B (en) | Movement routine method and device for planning | |
CN103389488B (en) | A kind of multiple light courcess indoor positioning apparatus and method based on light intensity | |
US6674276B2 (en) | Surface object locator with level indicator and scribe tip | |
US20170329336A1 (en) | Method and apparatus for localization and mapping based on rfid | |
CN105892461B (en) | A kind of matching and recognition method and system of robot local environment and map | |
CN107144292A (en) | The odometer method and mileage counter device of a kind of sports equipment | |
CN107742091A (en) | A kind of method and device of curb extraction | |
CN109946715A (en) | Detection method, device, mobile robot and storage medium | |
CN106370190A (en) | Vehicle navigation method, position marking method, apparatus, and system | |
CN110928301A (en) | Method, device and medium for detecting tiny obstacles | |
CN106384355B (en) | A kind of automatic calibration method in projection interactive system | |
CN107515714A (en) | A kind of finger touch recognition methods, device and touch projection equipment | |
CN109923490A (en) | Method for running the robot automatically moved | |
CN105890580B (en) | A kind of interior space mapping system and mapping method | |
CN107645702A (en) | position calibration method, device and system | |
CN104964708A (en) | Pavement pit detecting method based on vehicular binocular vision | |
CN108286951B (en) | Hand-held laser scanner for indoor door and window measurement | |
CN109540127A (en) | Method for determining position, mobile robot, storage medium and electronic equipment | |
CN108873911A (en) | It is a kind of that luggage case and its control method are followed based on ROS automatically | |
CN109015632A (en) | A kind of robot hand end localization method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20181012 |