CN108242052A - Fire point determination method applied to farmland - Google Patents

Fire point determination method applied to farmland

Info

Publication number
CN108242052A
CN108242052A (application CN201611213217.6A)
Authority
CN
China
Prior art keywords
fire
pixels
determined
cluster
remote sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611213217.6A
Other languages
Chinese (zh)
Inventor
张丽
李会丹
李振钊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Space Star Technology (Beijing) Co Ltd
Original Assignee
Space Star Technology (Beijing) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Space Star Technology (Beijing) Co Ltd
Priority to CN201611213217.6A
Publication of CN108242052A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G06T2207/10048 - Infrared image
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20081 - Training; Learning
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30181 - Earth observation
    • G06T2207/30188 - Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides a fire point determination method applied to farmland. Remote sensing data within a preset time period are obtained, non-fire pixels are removed from the data, parameter factors are extracted from the remote sensing data, and a fire monitoring model is established as a preliminary basis for judging fire points, yielding the fire pixels to be determined. Finally, redundant fire pixels belonging to the same cluster are removed by a clustering algorithm to obtain the final fire pixels. The method avoids repeated and erroneous fire point judgments, rapidly provides accurate fire pixel data, and offers a reliable basis for strengthening the monitoring of crop straw burning.

Description

Fire point determination method applied to farmland
Technical Field
The invention relates to the technical field of fire point judgment in remote sensing images, and in particular to a fire point determination method applied to farmland.
Background
Straw occupies an important position in biomass combustion. Straw mainly refers to the above-ground biomass left after the seeds of crops such as wheat, rice, corn, potato, oil crops, cotton and sugarcane are harvested. Crop straw contains large amounts of elements such as C, H, O, N and S, and its combustion produces CO2, CO, CH4, N2O, NOx, suspended particles, organic hydrocarbons and other toxic and harmful substances; in particular, large-scale open burning in the field can cause severe air pollution within a short time. With the increase in crop yield per unit area, the total amount of agricultural straw has grown rapidly. In recent years, however, the proportion of straw used directly as household fuel and feed has fallen sharply, and open-field burning of straw has appeared in most areas. Because the fire points of straw burning are scattered across counties and villages and are difficult to inspect and count, the control of straw burning lacks focus and remains ineffective.
With the progress of science and technology, and especially the rapid development of information technology and 3S (remote sensing, GIS and GPS) technology since the early 1980s, satellite remote sensing has come to offer strong timeliness, rapid data acquisition and low cost, and can be used to monitor straw burning dynamically and accurately over large areas. Much research and application work has been carried out in this field both in China and abroad.
MODIS and SUOMI-NPP satellite data have both been applied to fire monitoring, but differences in their design and operation still lead to inconsistent fire point predictions. The Terra and Aqua satellites carrying MODIS can each guarantee at least two observations of a given region every 24 hours, so in theory four MODIS observations can be obtained per day. By contrast, for a given observation region, the SUOMI-NPP satellite has at most two observation opportunities per day. The same fire may therefore be observed multiple times. Owing to differences in viewing angle and sensor, the MODIS and SUOMI-NPP observations of the same fire point may be spatially offset, which leads to repeated and erroneous judgments.
Disclosure of Invention
In view of the above defects and shortcomings, the present invention provides a fire point determination method applied to farmland, which comprises the following steps:
step 1, obtaining remote sensing data within a preset time period;
step 2, preprocessing the remote sensing data and extracting fire influence factors applicable to the local area;
step 3, generating a fire monitoring model according to the fire influence factors;
step 4, calculating a fire monitoring index for each detection area of the local area, and marking a detection area as containing a fire point to be determined when its index value is greater than a threshold value;
step 5, acquiring the fire pixels to be determined from the areas marked as containing a fire point to be determined;
step 6, removing redundant fire pixels according to the time sequence of the fire pixels to be determined, and determining the final fire pixels.
Preferably, preprocessing the remote sensing data comprises:
taking, from the remote sensing data, near-infrared band data with a preset first spatial resolution as original remote sensing data, and determining the cloud and water body pixels in the original remote sensing data;
converting the original remote sensing data at the first spatial resolution, including the cloud and water body pixels, into remote sensing data at a second spatial resolution by cubic polynomial interpolation.
Preferably, in step 6, the redundant fire pixels among the fire pixels to be determined that belong to the same cluster are removed by a clustering algorithm according to the time sequence of the fire pixels to be determined, so as to determine the fire pixels. This specifically comprises:
step 6-1, acquiring the acquisition time of each fire pixel to be determined from the remote sensing data;
step 6-2, sorting the fire pixels to be determined by acquisition time, removing the redundant fire pixels among those belonging to the same cluster, and determining the fire pixels.
Preferably, step 6-2 is as follows: the fire pixels to be determined are sorted by acquisition time, and for the fire data within the same time period, the redundant fire pixels among those belonging to the same cluster are removed by a spatial K-means method, so as to determine the fire pixels within that time period. This specifically comprises:
step 6-2-1, acquiring the longitude and latitude of the center of each of the initial clusters, the number of initial clusters being K;
step 6-2-2, classifying the fire pixels to be determined according to their Euclidean distances to the cluster centers;
step 6-2-3, updating the center of each cluster according to the fire pixels within it, until the Euclidean distances from the fire pixels to be determined to their cluster centers no longer decrease, thereby obtaining a clustering result;
step 6-2-4, determining from the Silhouette value of the clustering result whether the number K of initial clusters needs to be changed: if the Silhouette value is greater than a threshold value, K is left unchanged, one fire pixel is retained in each cluster, and the final fire pixels are determined after the redundant fire pixels have been removed; if the Silhouette value is not greater than the threshold value, K is changed and the clustering is repeated.
According to the above technical scheme, the fire point determination method obtains remote sensing data within a preset time period, removes the non-fire pixels from the remote sensing data, extracts parameter factors from the remote sensing data, establishes a fire monitoring model as the basis for preliminarily judging fire points, obtains the fire pixels to be determined, and finally removes the redundant fire pixels belonging to the same cluster by a clustering algorithm, thereby obtaining the fire pixels.
Drawings
FIG. 1 is a flow chart of a method of the present invention.
Detailed Description
For a better understanding of the invention, the method according to the invention is further illustrated below with reference to an embodiment and the accompanying drawing.
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be understood by those skilled in the art, however, that the present invention may be practiced without these specific details. In the embodiments, well-known methods, procedures, components and the like have not been described in detail so as not to unnecessarily obscure the embodiments.
Referring to FIG. 1, the fire point determination method applied to farmland according to the present invention comprises:
step 1, obtaining remote sensing data within a preset time period;
step 2, preprocessing the remote sensing data and extracting fire influence factors applicable to the local area;
step 3, generating a fire monitoring model according to the fire influence factors;
step 4, calculating a fire monitoring index for each detection area of the local area, and marking a detection area as containing a fire point to be determined when its index value is greater than a threshold value;
step 5, acquiring the fire pixels to be determined from the areas marked as containing a fire point to be determined;
step 6, removing redundant fire pixels according to the time sequence of the fire pixels to be determined, and determining the final fire pixels.
Preferably, preprocessing the remote sensing data comprises:
taking, from the remote sensing data, near-infrared band data with a preset first spatial resolution as original remote sensing data, and determining the cloud and water body pixels in the original remote sensing data;
converting the original remote sensing data at the first spatial resolution, including the cloud and water body pixels, into remote sensing data at a second spatial resolution by cubic polynomial interpolation.
To obtain more accurate fire pixels, the remote sensing data in this embodiment are MODIS and VIIRS data. MODIS data often lead to false positives in fire monitoring. For example, the preprocessing step that discriminates and removes cloud and water bodies conventionally uses red-band and near-infrared-band data with a pixel size of 1 km, so cloud and water pixels covering less than 1 square kilometer are missed by the discrimination algorithm. This causes confusion between fire and water body pixels in the subsequent fire monitoring, and in the final result water body pixels are often misjudged as fire points. Accurate discrimination of cloud and water bodies, and their removal at the earliest stage, are therefore important for avoiding misjudgment.
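A minimal sketch of such a discrimination step is shown below. It flags cloud pixels by high visible and near-infrared reflectance and water pixels by negative NDVI; the threshold values and test forms are common heuristics assumed here for illustration only, since this description does not disclose the specific discrimination criteria.

```python
import numpy as np

def cloud_water_mask(red, nir, cloud_refl=0.3, eps=1e-6):
    """Return a boolean mask that is True for cloud or water body pixels.

    red, nir : 2-D arrays of top-of-atmosphere reflectance (0-1).
    The thresholds below are generic heuristics used for illustration,
    not the discrimination rules of this method.
    """
    ndvi = (nir - red) / (nir + red + eps)
    cloud = (red > cloud_refl) & (nir > cloud_refl)  # bright in both bands
    water = ndvi < 0.0                               # water absorbs strongly in the NIR
    return cloud | water

# Example on synthetic 250 m reflectance data
red = np.random.uniform(0.02, 0.45, size=(400, 400))
nir = np.random.uniform(0.02, 0.55, size=(400, 400))
mask = cloud_water_mask(red, nir)
print("fraction of pixels masked as cloud or water:", round(mask.mean(), 3))
```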
For finer cloud and water body discrimination, this embodiment uses the 250-meter-resolution red band and the 250-meter-resolution near-infrared band of the MODIS data as the original data. Using data at this resolution avoids missing most clouds and water bodies covering less than 1 square kilometer, which yields a finer-grained discrimination result. After the cloud and water body pixels have been discriminated, the image pixels are resampled to 1-kilometer resolution by cubic polynomial interpolation so that they can be fused with the original 1-kilometer-resolution brightness-temperature data.
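A minimal sketch of the resampling step is given below. It uses scipy.ndimage.zoom with order=3 (cubic spline interpolation) as a stand-in for the cubic polynomial interpolation described above; the zoom factor of 0.25 reflects the change in pixel size from 250 m to 1 km.

```python
import numpy as np
from scipy.ndimage import zoom

# Synthetic 250 m reflectance grid (1000 x 1000 pixels, about 250 km x 250 km)
refl_250m = np.random.uniform(0.0, 0.5, size=(1000, 1000)).astype(np.float64)

# Each 1 km output pixel covers 4 x 4 input pixels, hence a zoom factor of
# 0.25; order=3 requests cubic spline interpolation, standing in for the
# cubic polynomial interpolation mentioned in the text.
refl_1km = zoom(refl_250m, zoom=0.25, order=3)

print(refl_250m.shape, "->", refl_1km.shape)  # (1000, 1000) -> (250, 250)
```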
Preferably, in step 6, the redundant fire pixels among the fire pixels to be determined that belong to the same cluster are removed by a clustering algorithm according to the time sequence of the fire pixels to be determined, so as to determine the fire pixels. This specifically comprises:
step 6-1, acquiring the acquisition time of each fire pixel to be determined from the remote sensing data;
step 6-2, sorting the fire pixels to be determined by acquisition time, removing the redundant fire pixels among those belonging to the same cluster, and determining the fire pixels.
Preferably, step 6-2 is as follows: the fire pixels to be determined are sorted by acquisition time, and for the fire data within the same time period, the redundant fire pixels among those belonging to the same cluster are removed by a spatial K-means method, so as to determine the fire pixels within that time period. This specifically comprises the following steps (a minimal sketch is given after this list):
step 6-2-1, acquiring the longitude and latitude of the center of each of the initial clusters, the number of initial clusters being K;
step 6-2-2, classifying the fire pixels to be determined according to their Euclidean distances to the cluster centers;
step 6-2-3, updating the center of each cluster according to the fire pixels within it, until the Euclidean distances from the fire pixels to be determined to their cluster centers no longer decrease, thereby obtaining a clustering result;
step 6-2-4, determining from the Silhouette value of the clustering result whether the number K of initial clusters needs to be changed: if the Silhouette value is greater than a threshold value, K is left unchanged, one fire pixel is retained in each cluster, and the final fire pixels are determined after the redundant fire pixels have been removed; if the Silhouette value is not greater than the threshold value, K is changed and the clustering is repeated.
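The sketch below illustrates steps 6-2-1 to 6-2-4 with scikit-learn: candidate fire pixels from the same time period are clustered on longitude/latitude with K-means, the mean Silhouette value decides whether the current K is accepted, and one representative pixel per cluster is kept. The Silhouette threshold of 0.6, the strategy of increasing K when the value is too low, and the choice of the member closest to the cluster center as the retained pixel are assumptions made for illustration, not values disclosed in this description.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def deduplicate_fire_pixels(lonlat, k_init=3, sil_threshold=0.6, k_max=10):
    """Cluster candidate fire pixels (N x 2 array of lon, lat) and keep one
    representative per cluster.  The Silhouette threshold, the search over K
    and the nearest-to-center rule are illustrative assumptions."""
    for k in range(k_init, k_max + 1):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(lonlat)
        if silhouette_score(lonlat, km.labels_) > sil_threshold:
            break  # cluster count accepted, stop changing K
    kept = []
    for c in range(km.n_clusters):
        members = np.where(km.labels_ == c)[0]
        # keep the member closest to the cluster center as the fire pixel
        d = np.linalg.norm(lonlat[members] - km.cluster_centers_[c], axis=1)
        kept.append(members[np.argmin(d)])
    return np.array(kept)

# Example: three fires, each observed several times with small spatial offsets
rng = np.random.default_rng(0)
fires = np.array([[116.4, 39.9], [117.2, 40.3], [115.8, 39.5]])
obs = np.vstack([f + rng.normal(0, 0.01, size=(4, 2)) for f in fires])
print(deduplicate_fire_pixels(obs))   # indices of the retained fire pixels
```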
In step 2, the fire influence factors applicable to the local area include the moisture saturation of the combustible material, temperature, humidity and crop coverage, among which the moisture saturation of the combustible is the key factor. As the temperature rises, the moisture saturation drops rapidly, the ignition point falls, and fire sources arise easily. Air humidity is the proportion of water vapor in the air and is the physical quantity used to characterize how dry or moist the air is. When the air humidity is relatively high, the combustible absorbs moisture from the air and its moisture saturation increases, so it is difficult to ignite; conversely, when the air humidity falls, the air absorbs moisture from the combustible, reducing its moisture saturation, so it is easily ignited.
In step 3, a fire monitoring model FPM is generated from the fire influence factors. [The formulas for FPM, the combustible moisture saturation MC and the crop coverage PC are given as images in the original publication and are not reproduced here.]
In these formulas, MC represents the moisture saturation of the combustible material, PC represents the crop coverage, a represents a multiplicative adjustment factor, generally 10 to the power n with n being an integer greater than or equal to 2, T represents temperature, H represents humidity, K represents an adjustment factor, ρ1 and ρ2 represent the reflectances of MODIS bands 1 and 2 respectively, PC0 represents the crop coverage when the crops are sparsely distributed, and PC1 represents the crop coverage when the crops are densely distributed.
The larger the fire index value calculated by the fire monitoring model FPM, the greater the possibility that a fire has occurred; when the fire index value is larger than a preset fire index threshold value, the corresponding area contains a fire point to be determined.
In summary, the method obtains remote sensing data within a preset time period, removes the non-fire pixels from the remote sensing data, extracts parameter factors from the remote sensing data, establishes a fire monitoring model as the basis for preliminarily judging fire points to obtain the fire pixels to be determined, and finally removes the redundant fire pixels belonging to the same cluster through a clustering algorithm to obtain the fire pixels.
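Because the formulas of the fire monitoring model appear only as images in the original publication, the sketch below is merely an illustration of the general structure described in the text: crop coverage derived from MODIS band 1 and band 2 reflectance, a combustible moisture saturation that falls with temperature and rises with humidity, and a power-of-ten scaling factor a. Every functional form, constant and threshold in it is an assumption, not the formula of the invention.

```python
import numpy as np

def crop_coverage(rho1, rho2, pc0=0.05, pc1=0.95):
    """Illustrative crop coverage from MODIS band 1 (red) and band 2 (NIR)
    reflectance via NDVI, clipped between sparse (pc0) and dense (pc1)
    coverage.  The functional form is an assumption."""
    ndvi = (rho2 - rho1) / (rho2 + rho1 + 1e-6)
    return np.clip(ndvi, pc0, pc1)

def moisture_saturation(temperature_c, humidity, k=1.0):
    """Illustrative combustible moisture saturation: decreases with
    temperature and increases with relative humidity (0-1), scaled by an
    adjustment factor k.  Again, an assumed form for demonstration only."""
    return k * humidity / (1.0 + 0.05 * np.maximum(temperature_c, 0.0))

def fpm_index(rho1, rho2, temperature_c, humidity, a=100.0):
    """Illustrative fire monitoring index: larger when the combustible is dry
    and the crop coverage is high.  'a' plays the role of the 10**n scaling
    factor mentioned in the text."""
    mc = moisture_saturation(temperature_c, humidity)
    pc = crop_coverage(rho1, rho2)
    return a * pc / (mc + 1e-6)

# Example: a dry, densely covered pixel versus a humid, sparsely covered one
print(fpm_index(0.05, 0.45, temperature_c=32.0, humidity=0.2))  # high index
print(fpm_index(0.20, 0.25, temperature_c=18.0, humidity=0.8))  # low index
```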
Only the preferred embodiments of the invention have been described herein; they are not intended to limit the scope, applicability or configuration of the invention in any way. Rather, the detailed description of the embodiments is presented to enable any person skilled in the art to make and use them. It will be understood that various changes and modifications in detail may be effected without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (4)

1. A fire point determination method for use in farmland, comprising:
step 1, obtaining remote sensing data within a preset time period;
step 2, preprocessing the remote sensing data and extracting fire influence factors applicable to the local area;
step 3, generating a fire monitoring model according to the fire influence factors;
step 4, calculating a fire monitoring index for each detection area of the local area, and marking a detection area as containing a fire point to be determined when its index value is greater than a threshold value;
step 5, acquiring the fire pixels to be determined from the areas marked as containing a fire point to be determined; and
step 6, removing redundant fire pixels according to the time sequence of the fire pixels to be determined, and determining the final fire pixels.
2. The method of claim 1, wherein preprocessing the remote sensing data comprises:
taking, from the remote sensing data, near-infrared band data with a preset first spatial resolution as original remote sensing data, and determining the cloud and water body pixels in the original remote sensing data; and
converting the original remote sensing data at the first spatial resolution, including the cloud and water body pixels, into remote sensing data at a second spatial resolution by cubic polynomial interpolation.
3. The method according to claim 1, wherein in step 6 the redundant fire pixels among the fire pixels to be determined that belong to the same cluster are removed by a clustering algorithm according to the time sequence of the fire pixels to be determined, so as to determine the fire pixels, which specifically comprises:
step 6-1, acquiring the acquisition time of each fire pixel to be determined from the remote sensing data; and
step 6-2, sorting the fire pixels to be determined by acquisition time, removing the redundant fire pixels among those belonging to the same cluster, and determining the fire pixels.
4. The method of claim 3, wherein step 6-2 comprises: sorting the fire pixels to be determined by acquisition time, and, for the fire data within the same time period, removing the redundant fire pixels among those belonging to the same cluster by a spatial K-means method, so as to determine the fire pixels within that time period, which specifically comprises:
step 6-2-1, acquiring the longitude and latitude of the center of each of the initial clusters, the number of initial clusters being K;
step 6-2-2, classifying the fire pixels to be determined according to their Euclidean distances to the cluster centers;
step 6-2-3, updating the center of each cluster according to the fire pixels within it, until the Euclidean distances from the fire pixels to be determined to their cluster centers no longer decrease, thereby obtaining a clustering result; and
step 6-2-4, determining from the Silhouette value of the clustering result whether the number K of initial clusters needs to be changed: if the Silhouette value is greater than a threshold value, K is left unchanged, one fire pixel is retained in each cluster, and the final fire pixels are determined after the redundant fire pixels have been removed; if the Silhouette value is not greater than the threshold value, K is changed and the clustering is repeated.
CN201611213217.6A 2016-12-23 2016-12-23 Fire point determination method applied to farmland Pending CN108242052A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611213217.6A CN108242052A (en) 2016-12-23 2016-12-23 Fire point determination method applied to farmland

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611213217.6A CN108242052A (en) 2016-12-23 2016-12-23 Fire point determination method applied to farmland

Publications (1)

Publication Number Publication Date
CN108242052A true CN108242052A (en) 2018-07-03

Family

ID=62703660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611213217.6A Pending CN108242052A (en) Fire point determination method applied to farmland

Country Status (1)

Country Link
CN (1) CN108242052A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242195A (en) * 2018-09-25 2019-01-18 李琳 Prediction technique that a kind of specified region catches fire that a situation arises
CN109670556A (en) * 2018-12-27 2019-04-23 中国科学院遥感与数字地球研究所 Global heat source heavy industry region recognizer based on fire point and noctilucence data

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070000317A1 (en) * 2002-07-16 2007-01-04 Umberto Berti System and method for territory thermal monitoring
CN103455708A (en) * 2013-07-24 2013-12-18 安徽省电力科学研究院 Power transmission line disaster monitoring and risk assessment platform based on satellite and weather information
CN103854413A (en) * 2014-03-10 2014-06-11 南京林业大学 FWI early warning system and application
CN104966372A (en) * 2015-06-09 2015-10-07 四川汇源光通信有限公司 Multi-data fusion forest fire intelligent recognition system and method
CN105678237A (en) * 2015-12-31 2016-06-15 张弓 Fire point determination method and system
CN105719421A (en) * 2016-04-27 2016-06-29 丛静华 Big data mining based integrated forest fire prevention informatization system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
周利霞: "Research on a fire point monitoring index method based on MODIS data", Fire Safety Science *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242195A (en) * 2018-09-25 2019-01-18 李琳 Prediction technique that a kind of specified region catches fire that a situation arises
CN109242195B (en) * 2018-09-25 2021-10-19 杭州灿八科技有限公司 Method for predicting fire occurrence condition of specified area
CN109670556A (en) * 2018-12-27 2019-04-23 中国科学院遥感与数字地球研究所 Global heat source heavy industry region recognizer based on fire point and noctilucence data
CN109670556B (en) * 2018-12-27 2023-07-04 中国科学院遥感与数字地球研究所 Global heat source heavy industry area identification method based on fire point and noctilucent data

Similar Documents

Publication Publication Date Title
Jang et al. Detection and monitoring of forest fires using Himawari-8 geostationary satellite data in South Korea
Tarawally et al. Comparative analysis of responses of land surface temperature to long-term land use/cover changes between a coastal and Inland City: A case of Freetown and Bo Town in Sierra Leone
Marín et al. Drought and spatiotemporal variability of forest fires across Mexico
Ndalila et al. Geographic patterns of fire severity following an extreme eucalyptus forest fire in southern Australia: 2013 Forcett-Dunalley fire
Wulder et al. A national assessment of wetland status and trends for Canada’s forested ecosystems using 33 years of earth observation satellite data
Zhao et al. Long-term land cover dynamics (1986–2016) of Northeast China derived from a multi-temporal Landsat archive
Magro et al. Atmospheric trends of CO and CH4 from extreme wildfires in Portugal using Sentinel-5P TROPOMI level-2 data
CN105740817B Method and system for judging straw burning fire point data
Motta et al. Structure, spatio-temporal dynamics and disturbance regime of the mixed beech–silver fir–Norway spruce old-growth forest of Biogradska Gora (Montenegro)
Axel Burned area mapping of an escaped fire into tropical dry forest in Western Madagascar using multi-season Landsat OLI Data
Vanderhoof et al. Mapping wetland burned area from Sentinel-2 across the Southeastern United States and its contributions relative to Landsat-8 (2016–2019)
Akinyemi Vegetation trends, drought severity and land use-land cover change during the growing season in semi-arid contexts
Caseiro et al. Persistent hot spot detection and characterisation using SLSTR
Zhang et al. Spatiotemporal analysis of active fires in the Arctic region during 2001–2019 and a fire risk assessment model
Tao et al. Analysis of forest fires in Northeast China from 2003 to 2011
Nur et al. Spatial prediction of wildfire susceptibility using hybrid machine learning models based on support vector regression in sydney, australia
de Klerk et al. Evaluation of satellite-derived burned area products for the fynbos, a Mediterranean shrubland
Radeloff et al. Need and vision for global medium-resolution Landsat and Sentinel-2 data products
Hong et al. Quantification and evaluation of atmospheric emissions from crop residue burning constrained by satellite observations in China during 2016–2020
Chew et al. A Review of forest fire combating efforts, challenges and future directions in Peninsular Malaysia, Sabah, and Sarawak
Wu et al. Analysis of factors related to forest fires in different forest ecosystems in China
Hess et al. Satellite-based assessment of grassland conversion and related fire disturbance in the Kenai Peninsula, Alaska
CN106652300B Fire point monitoring method applied to forest areas
CN108242052A (en) A kind of fire point applied to farmland determines method
Mpanyaro et al. Mapping and Assessing Riparian Vegetation Response to Drought along the Buffalo River Catchment in the Eastern Cape Province, South Africa

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: 101399 No. 2 East Airport Road, Shunyi Airport Economic Core Area, Beijing (1st, 5th and 7th floors of Industrial Park 1A-4)
Applicant after: Zhongke Star Map Co.,Ltd.
Address before: 101399 Building 1A-4, National Geographic Information Technology Industrial Park, Guomen Business District, Shunyi District, Beijing
Applicant before: GEOVIS TECHNOLOGY (BEIJING) Co.,Ltd.

SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180703)