CN115760655B - Interference array imaging method based on big data technology - Google Patents

Interference array imaging method based on big data technology

Info

Publication number
CN115760655B
Authority
CN
China
Prior art keywords
data
visibility
neural network
astronomical
imaging
Prior art date
Legal status
Active
Application number
CN202211383247.7A
Other languages
Chinese (zh)
Other versions
CN115760655A (en)
Inventor
张利
覃芹
Current Assignee
Guizhou University
Original Assignee
Guizhou University
Priority date
Filing date
Publication date
Application filed by Guizhou University
Priority to CN202211383247.7A
Publication of CN115760655A
Application granted
Publication of CN115760655B
Legal status: Active
Anticipated expiration

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Image Processing (AREA)

Abstract

The invention belongs to the technical field of radio astronomical imaging and discloses an interference array imaging method based on big data technology, which comprises the following steps: S1, simulating an input non-uniform visibility data set by using rascil to obtain an output data set; S2, constructing a deep neural network that performs gridding and radio imaging at the same time; S3, training the neural network, namely training it with the output data set so that it learns the mapping relation between the non-uniform visibility data and the astronomical map; S4, performing gridded radio interference imaging with the trained neural network to obtain an output astronomical image. The invention uses big data technology to realize the gridding algorithm in radio interference imaging quickly and efficiently and to perform radio imaging, thereby solving the problems of high algorithmic complexity, high hardware requirements, long imaging time and poor image quality that gridded astronomical imaging suffers from in its algorithm implementation.

Description

Interference array imaging method based on big data technology
Technical Field
The invention relates to the technical field of radio astronomical imaging, in particular to an interference array imaging method based on a big data technology.
Background
The radio interference imaging process applies gridding, fast inverse Fourier transform, deconvolution and other operations to the sampled visibility data, based on the aperture synthesis imaging principle, and finally generates an observation image of the radio source. The interferometer scans the sky to obtain the Fourier components of the sky source, i.e. the visibility data; because of practical constraints such as the interferometer layout, these components are non-uniformly distributed. Gridding converts the non-uniformly distributed visibility data into uniformly sampled visibility data; concretely, it convolves the non-uniform visibility data with a convolution function. Gridding consists of three main steps: 1. convolve the convolution function with the measured visibilities and place the visibilities onto rectangular-coordinate samples by interpolation; 2. process the rectangular-coordinate data with an oversampling technique to improve image resolution; 3. remove the convolution effect with a grid correction function. Gridding is therefore the most time-consuming and most important step in radio imaging, and the performance of the gridding algorithm directly affects the speed and quality of the imaging.
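For illustration only (this sketch is not part of the original disclosure), the three gridding steps can be written in plain NumPy roughly as follows; the truncated Gaussian kernel and all grid and cell sizes are assumptions standing in for a proper anti-aliasing kernel such as the prolate spheroidal function, and kernel oversampling and weighting are omitted:

```python
import numpy as np

def grid_visibilities(u, v, vis, n=256, cell=1.0, support=3, beta=1.0):
    """Toy convolutional gridding: resample non-uniform visibilities onto an
    n x n regular (u, v) grid and form a dirty image.  The truncated Gaussian
    kernel is only a stand-in for an anti-aliasing kernel such as the prolate
    spheroidal function; kernel oversampling and weighting are ignored."""
    grid = np.zeros((n, n), dtype=complex)
    # step 1: convolve each visibility sample with the kernel and accumulate
    # its contribution on the neighbouring rectangular grid points
    for uk, vk, vis_k in zip(u, v, vis):
        iu = int(round(uk / cell)) + n // 2
        iv = int(round(vk / cell)) + n // 2
        for du in range(-support, support + 1):
            for dv in range(-support, support + 1):
                ju, jv = iu + du, iv + dv
                if 0 <= ju < n and 0 <= jv < n:
                    grid[jv, ju] += np.exp(-beta * (du ** 2 + dv ** 2)) * vis_k
    # inverse FFT of the gridded visibilities gives the (uncorrected) dirty image
    dirty = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(grid))).real
    # step 3: grid correction, i.e. divide by the image-plane transform of the
    # Gaussian stand-in kernel to undo the convolution applied in step 1
    x = np.fft.fftshift(np.fft.fftfreq(n))
    corr = np.exp(-(np.pi ** 2 / beta) * (x[None, :] ** 2 + x[:, None] ** 2))
    return dirty / np.maximum(corr, 1e-8)
```

In a real pipeline the kernel would be tabulated on an oversampled grid (step 2 above) before being applied; that refinement is skipped here to keep the sketch short.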
Traditional gridding is essentially a convolution resampling process, and the choice of convolution function directly affects the gridding result; the convolution function is therefore required to be well concentrated in both the time domain and the frequency domain so that aliasing is better avoided. Convolution functions commonly used in radio interference imaging include the pillbox (cylinder) function, the truncated sinc function and the prolate spheroidal function.
Existing gridding methods depend on the choice of convolution function, yet imaging studies of different sources place different requirements on that function; moreover, the gridding process is implemented mainly by convolution interpolation, so its complexity is high and its error is large. Prior work has mainly accelerated and improved the algorithm from the standpoint of high-performance computing, for example by using GPUs and the OpenGL pipeline, which is itself complex and places high demands on the hardware configuration. In summary, the existing gridding methods suffer from high complexity, difficult implementation and large errors, which results in long radio interference imaging times and poor imaging quality.
Disclosure of Invention
The invention aims to provide an interference array imaging method based on big data technology that uses big data techniques to realize the gridding algorithm in radio interference imaging quickly and efficiently and to perform radio imaging. The non-uniform visibility data obtained by the interferometer are resampled without a complicated or redundant computational flow, and gridding and the inverse Fourier transform are completed together to obtain the astronomical image, thereby solving the problems of high complexity, high hardware requirements, long imaging time and poor quality of the gridding methods in the prior art.
In order to achieve the above object, the present invention provides the following technical solutions:
An interference array imaging method based on big data technology comprises the following steps:
S1, simulating an input non-uniform visibility data set by using rascil to obtain an output data set;
S2, building a deep neural network comprising a fully connected layer, a convolution layer and a deconvolution layer, so as to realize gridding and radio imaging at the same time;
S3, training the neural network, namely training the neural network built in step S2 with the output data set from step S1, so that it learns the mapping relation between the non-uniform visibility data and the astronomical map;
S4, performing gridded radio interference imaging with the neural network trained in step S3: the non-uniform visibility data under test are input into the trained network to obtain an output astronomical image.
Further, in S1, the method of simulating the input non-uniform visibility data set using rascil is:
A1, importing an astronomical field image into the rascil imaging package;
A2, predicting non-uniform visibility data by using the prediction_2d function in the rascil imaging package; the process of predicting the visibility is the inverse of the gridding process and comprises a degridding step; abnormal-data flagging and calibration are omitted, and only the gridding and imaging method is considered;
A3, selecting the prolate spheroidal wave function (pswf) as the gridding convolution function for the non-uniform visibility data generated in step A2, and gridding the non-uniform visibility data to obtain uniform visibility data; this convolution function has a good anti-aliasing effect; when gridding is performed, an oversampled grid is pre-defined and each visibility point is then interpolated onto the grid points;
A4, imaging with the uniform visibility data obtained in step A3, which is used as the output data set of step S1.
Further, in A3, the non-uniform visibility data considers only a small field of view, and the spatial frequency data is (u, v) data.
Further, in S2, the fully connected layer is used to estimate the mapping relation from the non-uniform visibility data to the image-domain data, which includes gridding and the Fourier transform; the convolution layer is used to extract high-level feature information from the astronomical image; the deconvolution layer is used to render the features extracted by the convolution layer and then generate the astronomical image; the model is solved mainly in a linear Euclidean space, and the RMSprop optimizer is selected to fit the nonlinear topological structure;
the mapping relation between the visibility data and the image is as follows:
I = g(φ_x(V))
wherein V denotes the non-uniform visibility data, I denotes the output astronomical map, φ_x maps the intrinsic coordinate system of the non-uniform visibility data into the Euclidean space near x, and g represents the direct mapping from this representation of the non-uniform visibility data to the output astronomical map.
Further, in S2, the method by which the neural network implements gridding and radio imaging simultaneously comprises:
B1, non-uniform visibility data preprocessing
The visibility data are data over the u, v space, i.e. the spatial frequency domain, and are mathematically complex-valued; since the computation framework of the neural network requires real values, the visibility data are split into real and imaginary components, the two components are extracted and stacked with np.dstack, the stacked two-dimensional visibility is flattened into a one-dimensional array, and this one-dimensional visibility is used as the input layer;
B2, model training
The loss function minimized during training is the mean square error, and an L1 norm penalty with λ = 0.001 is applied to the feature-map activations of FC2 to promote a sparse convolutional representation;
B3, model verification
After the neural network model has been trained, image quality is evaluated with indexes such as the imaging time t, the peak signal-to-noise ratio and the structural similarity, wherein:
the mapping time t is the time taken from the non-uniform visibility data to the astronomical map;
the peak signal-to-noise ratio is the ratio of the signal noise power in the maximum power domain of the signal, and the mathematical expression is:
PSNR = 10 · log10( MAX_I² / MSE )
MSE = (1 / m²) · Σ_{i,j} [ I(i, j) - K(i, j) ]²
wherein I represents the clean astronomical map, K represents the dirty map, m represents the size (number of pixels per side) of the astronomical map, and MAX_I is the maximum pixel value of the clean map;
the structural similarity is an index for measuring the similarity of two images, and is based on comparison of brightness, contrast and structure of a clean image and a dirty image:
SSIM(x, y) = [l(x, y)]^α · [c(x, y)]^β · [s(x, y)]^γ
wherein x and y denote the two images being compared, l(x, y) is the luminance comparison term, c(x, y) is the contrast comparison term, and s(x, y) is the structure comparison term of the two images.
The beneficial effects of this technical solution are:
The invention uses the feature-extraction capability of big-data end-to-end learning to remove the dependence of gridded astronomical imaging on the algorithm implementation, thereby markedly improving imaging speed and image quality.
By constructing a suitable neural network and using it to perform the gridding operation and radio astronomical imaging, the complicated procedure of selecting a gridding convolution function is eliminated, convolution resampling is realized quickly, and fast astronomical imaging is carried out.
A manifold-learning-based network is adopted: the input data set of the network model is non-uniform visibility data simulated with the astronomical software package rascil, and the output data set of the model is the astronomical map. The network learns the mapping relation between the non-uniform visibility data and the uniform visibility data and then applies the inverse Fourier transform to obtain the astronomical map, so that astronomical imaging quality is improved while a high imaging speed is guaranteed, solving the problems of slow imaging and poor quality in existing radio astronomical imaging.
Drawings
FIG. 1 is a flow chart of the interference array imaging method based on big data technology of the present invention;
FIG. 2 is a flow chart of the method for simulating the input non-uniform visibility data set with rascil in S1 of the present invention;
FIG. 3 is a schematic diagram of the method by which the neural network in S2 performs gridding and radio imaging simultaneously;
FIG. 4 is a graph of imaging results obtained using the interference array imaging method based on big data technology of the present invention.
Detailed Description
The invention is described in further detail below with reference to the attached drawings and embodiments:
As shown in figs. 1 to 3, an interference array imaging method based on big data technology comprises the following steps:
S1, simulating an input non-uniform visibility data set by using rascil to obtain an output data set; because the measured data available in radio astronomy are insufficient for training the neural network, all data, including both the input data set and the output data set, are obtained by simulation with the astronomical software package rascil;
the method for simulating the input non-uniform visibility data set by utilizing rascil comprises the following steps:
A1, importing an astronomical field image into the rascil imaging package; in the astronomical field, data are generally stored in the FITS format to facilitate the transmission and exchange of data between observatories;
A2, predicting non-uniform visibility data by using the prediction_2d function in the rascil imaging package; the process of predicting the visibility is the inverse of gridding and comprises a degridding step; compared with optical imaging, radio interference imaging estimates the sky brightness using the aperture synthesis imaging principle and the process is complex, so the method omits abnormal-data flagging and calibration and considers only the gridding and imaging method;
A3, selecting the prolate spheroidal wave function (pswf) as the gridding convolution function for the non-uniform visibility data generated in step A2, and gridding the non-uniform visibility data to obtain uniform visibility data; this convolution function has a good anti-aliasing effect; when gridding is performed, an oversampled grid is pre-defined and each visibility point is then interpolated onto the grid points;
wherein the non-uniform visibility data only considers a small field of view, and the spatial frequency data is (u, v) data;
A4, imaging with the uniform visibility data obtained in step A3, which is used as the output data set of step S1 (an illustrative sketch of this simulation flow is given below);
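The following minimal sketch (an assumption-laden illustration, not the rascil API) shows one way the A1 to A4 simulation flow could be mimicked for toy training pairs: the Fourier transform of a model sky image is evaluated directly at non-uniform (u, v) points, playing the role of the prediction/degridding step; all sizes, coordinates and source positions are hypothetical:

```python
import numpy as np

def predict_nonuniform_vis(model_image, u, v, cell):
    """Simplified stand-in for visibility prediction (degridding): evaluate the
    Fourier transform of a model sky image directly at non-uniform (u, v)
    points with a slow direct DFT.  Used only to produce toy training pairs."""
    n = model_image.shape[0]
    coords = (np.arange(n) - n // 2) * cell          # image-plane coordinates
    l, m = np.meshgrid(coords, coords)
    vis = np.empty(len(u), dtype=complex)
    for k in range(len(u)):
        phase = np.exp(-2j * np.pi * (u[k] * l + v[k] * m))
        vis[k] = np.sum(model_image * phase)
    return vis

# hypothetical usage: random baselines and a two-point-source model image
rng = np.random.default_rng(0)
u = rng.uniform(-100.0, 100.0, 2000)
v = rng.uniform(-100.0, 100.0, 2000)
model = np.zeros((64, 64))
model[32, 32], model[20, 40] = 1.0, 0.5
vis = predict_nonuniform_vis(model, u, v, cell=1.0 / 256)
# gridding vis with an anti-aliasing kernel and applying the inverse FFT
# would then give the uniform-visibility / dirty-image side of the pair
```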
S2, building a deep neural network comprising a fully connected layer, a convolution layer and a deconvolution layer, so as to realize gridding and radio imaging at the same time;
the fully connected layer is used to estimate the mapping relation from the non-uniform visibility data to the image-domain data, which includes gridding and the Fourier transform; the convolution layer is used to extract high-level feature information from the astronomical image; the deconvolution layer is used to render the features extracted by the convolution layer and then generate the astronomical image; the model is solved mainly in a linear Euclidean space, and the RMSprop optimizer is selected to fit the nonlinear topological structure;
the mapping relation between the visibility data and the image is as follows:
I = g(φ_x(V))
wherein V denotes the non-uniform visibility data, I denotes the output astronomical map, φ_x maps the intrinsic coordinate system of the non-uniform visibility data into the Euclidean space near x, and g represents the direct mapping from this representation of the non-uniform visibility data to the output astronomical map;
the method by which the neural network realizes gridding and radio imaging simultaneously comprises the following steps:
B1, non-uniform visibility data preprocessing
The visibility data are data over the u, v space, i.e. the spatial frequency domain, and are mathematically complex-valued. Since the computation framework of the neural network requires real values, the visibility data are split into real and imaginary components; the two components are extracted and stacked with np.dstack, and the stacked two-dimensional visibility is flattened into a one-dimensional array. For example, if the original visibility block has size (64, 64), the one-dimensional visibility has length 2 × 64 × 64. This vector is used as the input layer FC1; FC1 is fully connected to a hidden layer of dimension 64 × 64 through a hyperbolic tangent activation function, this first hidden layer is fully connected, again through a hyperbolic tangent activation, to a further hidden layer, and the result is reshaped into a matrix of size 64 × 64;
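A minimal NumPy sketch of this preprocessing, with placeholder random visibilities standing in for simulated data, might look as follows:

```python
import numpy as np

# placeholder complex visibilities standing in for a simulated 64 x 64 block
vis = np.random.randn(64, 64) + 1j * np.random.randn(64, 64)
stacked = np.dstack((vis.real, vis.imag))   # real and imaginary parts, shape (64, 64, 2)
fc1_input = stacked.flatten()               # flattened input vector, shape (2 * 64 * 64,) = (8192,)
```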
B2, model training
The loss function minimized during training is the mean square error (MSE), and an additional L1 norm penalty with λ = 0.001 is applied to the feature-map activations of FC2 to promote a sparse convolutional representation;
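A minimal Keras sketch of a network of this kind is given below; layer counts, filter numbers and kernel sizes are assumptions made for illustration, and only the elements named in the text (tanh fully connected layers, an L1 activity penalty of 0.001 on FC2, convolution and deconvolution layers, the RMSprop optimizer and a mean-squared-error loss) are taken from the description:

```python
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

def build_model(n=64):
    """Sketch of a fully connected + convolutional imaging network."""
    inp = layers.Input(shape=(2 * n * n,))                 # FC1: flattened re/im visibilities
    fc2 = layers.Dense(n * n, activation='tanh',
                       activity_regularizer=regularizers.l1(0.001))(inp)   # FC2, L1 on activations
    fc3 = layers.Dense(n * n, activation='tanh')(fc2)      # further hidden layer
    x = layers.Reshape((n, n, 1))(fc3)                     # back to an image-shaped tensor
    x = layers.Conv2D(64, 5, padding='same', activation='relu')(x)          # feature extraction
    x = layers.Conv2DTranspose(64, 5, padding='same', activation='relu')(x) # deconvolution
    out = layers.Conv2D(1, 3, padding='same')(x)           # reconstructed astronomical map
    model = models.Model(inp, out)
    model.compile(optimizer=tf.keras.optimizers.RMSprop(), loss='mse')
    return model

# model = build_model()
# model.fit(train_vis, train_maps, epochs=50, batch_size=16)   # hypothetical training call
```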
B3, model verification
After the neural network model has been trained, image quality evaluation is performed with indexes such as the imaging time t, the peak signal-to-noise ratio and the structural similarity, wherein:
the imaging time t is the time taken to go from the non-uniform visibility data to the astronomical map;
the peak signal-to-noise ratio is the ratio of the maximum possible signal power to the noise power, and its mathematical expression is:
PSNR = 10 · log10( MAX_I² / MSE )
MSE = (1 / m²) · Σ_{i,j} [ I(i, j) - K(i, j) ]²
wherein I represents the clean astronomical map, K represents the dirty map, m represents the size (number of pixels per side) of the astronomical map, and MAX_I is the maximum pixel value of the clean map;
the structural similarity is an index for measuring the similarity of two images, and is based on comparison of brightness, contrast and structure of a clean image and a dirty image:
SSIM(x, y) = [l(x, y)]^α · [c(x, y)]^β · [s(x, y)]^γ
wherein x and y denote the two images being compared, l(x, y) is the luminance comparison term, c(x, y) is the contrast comparison term, and s(x, y) is the structure comparison term of the two images;
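A minimal sketch of this evaluation, assuming the Keras model from the sketch above and using scikit-image implementations of PSNR and SSIM, could look as follows:

```python
import time
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate(model, test_vis, clean_map):
    """Measure the imaging time t, PSNR and SSIM between the clean reference
    map and the map reconstructed by the (hypothetical) trained network."""
    t0 = time.time()
    dirty_map = model.predict(test_vis[None, :])[0, ..., 0]   # imaging of one visibility vector
    t = time.time() - t0                                      # imaging time t
    data_range = float(clean_map.max() - clean_map.min())
    psnr = peak_signal_noise_ratio(clean_map, dirty_map, data_range=data_range)
    ssim = structural_similarity(clean_map, dirty_map, data_range=data_range)
    return t, psnr, ssim
```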
S3, training the neural network, namely training the neural network built in step S2 with the output data set from step S1, so that it learns the mapping relation between the non-uniform visibility data and the astronomical map;
S4, performing gridded radio interference imaging with the neural network trained in step S3: the non-uniform visibility data under test are input into the trained network to obtain an output astronomical image.
As shown in fig. 4, the astronomical image obtained by the above method is of high quality.
The foregoing are merely exemplary embodiments of the present invention; technical solutions or features that are well known in the art have not been described in detail herein. It should be noted that, for those skilled in the art, several variations and modifications can be made without departing from the technical solution of the present invention, and these shall also fall within the protection scope of the present invention without affecting the effect of its implementation or the applicability of the patent. The protection scope of this application shall be determined by the content of the claims, and the description of the specific embodiments in the specification may be used to interpret the content of the claims.

Claims (5)

1. An interference array imaging method based on big data technology, characterized in that the method comprises the following steps:
S1, simulating an input non-uniform visibility data set by using rascil to obtain an output data set;
S2, building a deep neural network comprising a fully connected layer, a convolution layer and a deconvolution layer, so as to realize gridding and radio imaging at the same time;
S3, training the neural network, namely training the neural network built in step S2 with the output data set from step S1, so that it learns the mapping relation between the non-uniform visibility data and the astronomical map;
S4, performing gridded radio interference imaging with the neural network trained in step S3: the non-uniform visibility data under test are input into the trained network to obtain an output astronomical image.
2. The interference array imaging method based on big data technology according to claim 1, wherein: in S1, the method of simulating the input non-uniform visibility data set using rascil is:
A1, importing an astronomical field image into the rascil imaging package;
A2, predicting non-uniform visibility data by using the prediction_2d function in the rascil imaging package; the process of predicting the visibility is the inverse of the gridding process and comprises a degridding step; abnormal-data flagging and calibration are omitted, and only the gridding and imaging method is considered;
A3, selecting the prolate spheroidal wave function (pswf) as the gridding convolution function for the non-uniform visibility data generated in step A2, and gridding the non-uniform visibility data to obtain uniform visibility data; the gridding convolution function has a good anti-aliasing effect; when gridding is performed, an oversampled grid is pre-defined, and each visibility point is then interpolated onto the oversampled grid;
A4, imaging with the uniform visibility data obtained in step A3, which is used as the output data set of step S1.
3. The interference array imaging method based on big data technology according to claim 2, wherein: in A3, the non-uniform visibility data considers only a small field of view, and the spatial frequency data is (u, v) data.
4. The interference array imaging method based on big data technology according to claim 1, wherein: in S2, the fully connected layer is used to estimate the mapping relation from the non-uniform visibility data to the image-domain data, which includes gridding and the Fourier transform; the convolution layer is used to extract high-level feature information from the astronomical image; the deconvolution layer is used to render the features extracted by the convolution layer and then generate the astronomical image; the model is solved mainly in a linear Euclidean space, and the RMSprop optimizer is selected to fit the nonlinear topological structure;
the mapping relation between the visibility data and the image is as follows:
I = g(φ_x(V))
wherein V denotes the non-uniform visibility data, I denotes the output astronomical map, φ_x maps the intrinsic coordinate system of the non-uniform visibility data into the Euclidean space near x, and g represents the direct mapping from this representation of the non-uniform visibility data to the output astronomical map.
5. The interference array imaging method based on big data technology according to claim 4, wherein: in S2, the method by which the neural network implements gridding and radio imaging simultaneously comprises:
B1, non-uniform visibility data preprocessing
The visibility data are data over the u, v space, i.e. the spatial frequency domain, and are mathematically complex-valued; since the computation framework of the neural network requires real values, the visibility data are split into real and imaginary components, the two components are extracted and stacked with np.dstack, the stacked two-dimensional visibility is flattened into a one-dimensional array, and this one-dimensional visibility is used as the input layer;
B2, model training
The loss function minimized during training is the mean square error, and an L1 norm penalty with λ = 0.001 is applied to the feature-map activations of FC2 to promote a sparse convolutional representation;
B3, model verification
After the neural network model has been trained, image quality is evaluated with indexes such as the imaging time t, the peak signal-to-noise ratio and the structural similarity, wherein:
the imaging time t is the time taken to go from the non-uniform visibility data to the astronomical map;
the peak signal-to-noise ratio is the ratio of the maximum possible signal power to the noise power, and its mathematical expression is:
PSNR = 10 · log10( MAX_I² / MSE )
MSE = (1 / m²) · Σ_{i,j} [ I(i, j) - K(i, j) ]²
wherein I represents the clean astronomical map, K represents the dirty map, m represents the size (number of pixels per side) of the astronomical map, and MAX_I is the maximum pixel value of the clean map;
the structural similarity is an index for measuring the similarity of two images, and is based on comparison of brightness, contrast and structure of a clean image and a dirty image:
SSIM(x, y) = [l(x, y)]^α · [c(x, y)]^β · [s(x, y)]^γ
wherein x and y denote the two images being compared, l(x, y) is the luminance comparison term, c(x, y) is the contrast comparison term, and s(x, y) is the structure comparison term of the two images.
CN202211383247.7A 2022-11-07 2022-11-07 Interference array imaging method based on big data technology Active CN115760655B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211383247.7A CN115760655B (en) 2022-11-07 2022-11-07 Interference array imaging method based on big data technology

Publications (2)

Publication Number Publication Date
CN115760655A CN115760655A (en) 2023-03-07
CN115760655B true CN115760655B (en) 2023-06-23

Family

ID=85356867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211383247.7A Active CN115760655B (en) 2022-11-07 2022-11-07 Interference array imaging method based on big data technology

Country Status (1)

Country Link
CN (1) CN115760655B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111754403A (en) * 2020-06-15 2020-10-09 南京邮电大学 Image super-resolution reconstruction method based on residual learning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106408522A * 2016-06-27 2017-02-15 Shenzhen Institute of Future Media Technology Image de-noising method based on convolution pair neural network
CN112200018B * 2020-09-21 2024-05-14 Jiangsu University Phase object interferogram identification method based on residual network

Also Published As

Publication number Publication date
CN115760655A (en) 2023-03-07

Similar Documents

Publication Publication Date Title
Guo et al. An image super-resolution reconstruction method with single frame character based on wavelet neural network in internet of things
CN111025385B (en) Seismic data reconstruction method based on low rank and sparse constraint
CN110992366A (en) Image semantic segmentation method and device and storage medium
CN114119689A (en) Multi-modal medical image unsupervised registration method and system based on deep learning
CN114063168A (en) Artificial intelligence noise reduction method for seismic signals
CN112991483A (en) Non-local low-rank constraint self-calibration parallel magnetic resonance imaging reconstruction method
Zhou et al. MSAR‐DefogNet: Lightweight cloud removal network for high resolution remote sensing images based on multi scale convolution
CN114010180B (en) Magnetic resonance rapid imaging method and device based on convolutional neural network
CN115760655B (en) Interference array imaging method based on big data technology
CN111368680B (en) Wave atom transformation-based deep learning anti-aliasing seismic data regularization method
CN112241938A (en) Image restoration method based on smooth Tak decomposition and high-order tensor Hank transformation
CN114037609B (en) Terahertz image super-resolution algorithm based on learning terahertz imaging inverse process
CN116167239A (en) Infrared simulation method, system, computer and readable storage medium
Zeng et al. Focusing functions correction in Marchenko imaging with deep learning and transfer learning
CN112686807A (en) Image super-resolution reconstruction method and system
CN112150570A (en) Compressed sensing magnetic resonance imaging method based on iterative p-threshold projection algorithm
Song et al. Research on virtual color restoration of complex building system based on discrete wavelet transform
CN115375786A (en) One-dimensional synthetic aperture depth convolution neural network and image reconstruction method
CN114004764B (en) Improved sensitivity coding reconstruction method based on sparse transform learning
CN113435487B (en) Deep learning-oriented multi-scale sample generation method
CN116797682A (en) Magnetic resonance image reconstruction method based on residual non-local Fourier attention
CN115496824B (en) Multi-class object-level natural image generation method based on hand drawing
CN118211067A (en) Seismic data interpolation method based on self-supervision migration learning convex set projection network
Li et al. Infrared image super-resolution reconstruction based on residual fast fourier transform
CN117876841A (en) Deep learning data model for removing clutter of underground pipeline ground penetrating radar and construction method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant